Title :
Inequalities between entropy and index of coincidence derived from information diagrams
Author :
Harremoës, Peter ; Topsøe, Flemming
Author_Institution :
Ordrup Gymnasium, Charlottenlund, Denmark
Date :
11/1/2001 12:00:00 AM
Abstract :
To any discrete probability distribution P we can associate its entropy H(P) = -Σ p_i ln p_i and its index of coincidence IC(P) = Σ p_i². The main result of the paper is the determination of the precise range of the map P → (IC(P), H(P)). The range looks much like that of the map P → (P_max, H(P)), where P_max is the maximal point probability, cf. research from Kovalevskij (1965) to Feder and Merhav (1994). The earlier results, which actually focus on the probability of error 1 - P_max rather than on P_max, can be conceived as limiting cases of results obtained by the methods presented here. Ranges of maps such as those indicated are called information diagrams. The main result gives rise to precise lower as well as upper bounds for the entropy function. Some of these bounds are essential for the exact solution of certain problems of universal coding and prediction for Bernoulli sources. Other applications concern Shannon theory (relations between various measures of divergence), statistical decision theory, and rate distortion theory. Two methods are developed. One is topological; the other involves convex analysis and is based on a "lemma of replacement" which is of independent interest in relation to problems of optimization of mixed type (concave/convex optimization).
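The two quantities studied in the abstract are straightforward to compute. The following minimal Python sketch (not from the paper itself, just an illustration of the definitions) evaluates H(P) and IC(P) for a given distribution; the uniform distribution on n points, used here as an example, is a natural test case since it maximizes entropy and minimizes the index of coincidence for fixed n.

```python
import math

def entropy(p):
    """Shannon entropy H(P) = -sum_i p_i ln p_i, using the natural log
    as in the abstract; terms with p_i = 0 contribute nothing."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def index_of_coincidence(p):
    """Index of coincidence IC(P) = sum_i p_i^2."""
    return sum(pi * pi for pi in p)

# Uniform distribution on n points: H = ln n, IC = 1/n.
n = 4
uniform = [1.0 / n] * n
print(entropy(uniform))               # ln 4
print(index_of_coincidence(uniform))  # 1/4
```

For each n, the point (1/n, ln n) traced out by the uniform distributions is a distinguished point of the information diagram, i.e. of the range of P → (IC(P), H(P)).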
Keywords :
decision theory; entropy; error statistics; optimisation; rate distortion theory; Bernoulli sources; Shannon theory; concave/convex optimization; convex analysis; discrete probability distribution; divergence measures; entropy function; error probability; exact solution; index of coincidence; inequalities; information diagrams; lemma of replacement; lower bounds; statistical decision theory; topological method; universal coding; upper bounds; Convergence; Councils; Decision theory; Distortion measurement; Entropy; Information theory; Probability distribution; Rate distortion theory; Statistical distributions; Upper bound;
Journal_Title :
IEEE Transactions on Information Theory