Title :
Information Inequalities for Joint Distributions, With Interpretations and Applications
Author :
Madiman, Mokshay ; Tetali, Prasad
Author_Institution :
Dept. of Stat., Yale Univ., New Haven, CT, USA
Date :
1 June 2010
Abstract :
Upper and lower bounds are obtained for the joint entropy of a collection of random variables in terms of an arbitrary collection of subset joint entropies. These inequalities generalize Shannon's chain rule for entropy as well as inequalities of Han, Fujishige, and Shearer. A duality between the upper and lower bounds for joint entropy is developed. All of these results are shown to be special cases of general, new results for submodular functions; thus, the inequalities presented constitute a richly structured class of Shannon-type inequalities. The new inequalities are applied to obtain new results in combinatorics, such as bounds on the number of independent sets in an arbitrary graph and the number of zero-error source-channel codes, as well as determinantal inequalities in matrix theory. A general inequality for relative entropies is also developed. Finally, connections of the results to the literature in economics, computer science, and physics are explored.
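As a minimal numerical illustration (not taken from the paper) of the kind of subset-entropy bound the abstract describes, the sketch below checks a classical Shearer-type inequality: for three random variables covered by the subsets {1,2}, {2,3}, {1,3}, each index appears in two subsets, so H(X1,X2) + H(X2,X3) + H(X1,X3) >= 2*H(X1,X2,X3) for any joint distribution. The helper names and the random test distribution are illustrative choices, not the authors' notation.

```python
# Numerical check of a Shearer-type subset-entropy bound for three
# binary random variables with cover {1,2}, {2,3}, {1,3} (each index
# covered twice):
#   H(X1,X2) + H(X2,X3) + H(X1,X3) >= 2 * H(X1,X2,X3)
import itertools
import math
import random

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def marginal(pmf, idx):
    """Marginal pmf of the coordinates listed in idx."""
    out = {}
    for outcome, p in pmf.items():
        key = tuple(outcome[i] for i in idx)
        out[key] = out.get(key, 0.0) + p
    return out

def random_joint_pmf(n=3):
    """A random joint distribution over n binary variables."""
    weights = [random.random() for _ in range(2 ** n)]
    total = sum(weights)
    outcomes = itertools.product((0, 1), repeat=n)
    return {o: w / total for o, w in zip(outcomes, weights)}

pmf = random_joint_pmf()
lhs = sum(entropy(marginal(pmf, s)) for s in [(0, 1), (1, 2), (0, 2)])
rhs = 2 * entropy(pmf)
# Shearer's bound holds for every joint pmf; numerical slack for rounding.
assert lhs >= rhs - 1e-9
```

Running the check with many random distributions never produces a violation, consistent with the bound being a universal (Shannon-type) inequality rather than a property of a particular distribution.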
Keywords :
channel coding; entropy; graph theory; matrix algebra; source coding; Shannon-type inequalities; arbitrary graph; combinatorics; determinantal inequalities; inequalities generalize Shannon's chain rule; information inequalities; joint distributions; lower bound duality; matrix theory; relative entropy; submodular functions; subset joint entropy; upper bound duality; zero-error source-channel codes; Combinatorial mathematics; Computer science; Cramer-Rao bounds; Density measurement; Entropy; Information theory; Linear matrix inequalities; Physics; Random variables; Upper bound; Entropy-based counting; entropy inequality; inequality for minors; submodularity;
Journal_Title :
IEEE Transactions on Information Theory
DOI :
10.1109/TIT.2010.2046253