Abstract:
This study introduces constrained forms of the Shannon information entropy and Kullback–Leibler cross-entropy functions. Applicable to a closed system, these functions incorporate the system's constraints in a generic fashion, irrespective of the form or even the number of the constraints. Because they are fully constrained, the constrained functions may be "pulled apart" to give their partial or local constrained forms, providing the means to examine the probability of each outcome or state relative to its local maximum-entropy (or minimum-cross-entropy) position, independently of the other states of the system. The Shannon entropy and Kullback–Leibler cross-entropy functions do not possess this functionality. The features of each function are examined in detail, and the similarities between the constrained and Tsallis entropy functions are discussed. The continuous constrained entropy and cross-entropy functions are also introduced; their integrands may be used to independently examine each infinitesimal state (physical element) of the system.
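The abstract refers to the standard Shannon entropy, the Kullback–Leibler cross-entropy, and maximum-entropy positions subject to constraints. The sketch below illustrates only these standard building blocks, not the constrained functions introduced in the paper itself (whose form is not given in the abstract). The function names `shannon_entropy`, `kl_divergence`, and `maxent_with_mean` are hypothetical illustrations: the last one solves the classic maximum-entropy problem under a single mean constraint, whose solution takes the Boltzmann form p_i ∝ exp(-λx_i), with λ found by bisection.

```python
import math


def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i ln p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)


def kl_divergence(p, q):
    """Kullback-Leibler cross-entropy D(p||q) = sum_i p_i ln(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)


def maxent_with_mean(xs, mu, lo=-50.0, hi=50.0, iters=200):
    """Maximum-entropy distribution over values xs subject to <x> = mu.

    Illustrative only: the maximizer has the Boltzmann form
    p_i = exp(-lam * x_i) / Z, and the constrained mean is a decreasing
    function of lam, so lam can be found by simple bisection.
    """
    def mean_for(lam):
        w = [math.exp(-lam * x) for x in xs]
        z = sum(w)
        return sum(x * wi for x, wi in zip(xs, w)) / z

    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > mu:   # mean too large -> need larger lam
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * x) for x in xs]
    z = sum(w)
    return [wi / z for wi in w]


# Example: four states with values 0..3, constrained to mean 1.5.
# By symmetry this constraint is satisfied by the uniform distribution,
# so the entropy attains its unconstrained maximum ln(4).
xs = [0, 1, 2, 3]
p = maxent_with_mean(xs, 1.5)
```

Under a tighter constraint (e.g. mean 1.0) the resulting distribution is non-uniform and its entropy falls below ln(4), which is the usual signature of an active constraint.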