DocumentCode
1037556
Title
Relations between entropy and error probability
Author
Feder, Meir; Merhav, Neri
Author_Institution
Dept. of Electr. Eng.-Syst., Tel Aviv Univ., Israel
Volume
40
Issue
1
fYear
1994
fDate
1/1/1994
Firstpage
259
Lastpage
266
Abstract
The relation between the entropy of a discrete random variable and the minimum attainable probability of error made in guessing its value is examined. While Fano's inequality provides a tight lower bound on the error probability in terms of the entropy, the present authors derive a converse result: a tight upper bound on the minimal error probability in terms of the entropy. Both bounds are sharp, and they also relate the error probability of the maximum a posteriori (MAP) rule to the conditional entropy (equivocation), a useful uncertainty measure in several applications. Combining this relation with the classical channel coding theorem, the authors present a channel coding theorem for the equivocation which, unlike the channel coding theorem for error probability, is meaningful at all rates. This theorem is proved directly for discrete memoryless channels (DMCs), and from this proof it is further concluded that for R⩾C the equivocation achieves its minimal value of R-C at a rate of n^{-1/2}, where n is the block length.
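As an illustration (not part of the record itself), the following minimal Python sketch checks both bounds numerically for a random distribution. The helper names entropy, h_b, and phi are ours; the piecewise-linear form of the upper bound, connecting the points (log2 i, (i-1)/i) for i = 1, 2, ..., is assumed from the published Feder-Merhav result.

```python
# A minimal numerical check, assuming the two bounds described in the abstract:
#   Fano (lower bound):        H(X) <= h_b(eps) + eps * log2(M - 1)
#   Feder-Merhav (upper bound): eps <= phi(H), piecewise linear between the
#   points (log2 i, (i-1)/i), i = 1, 2, ...  (form assumed from the paper)
import math
import random

def entropy(p):
    # Shannon entropy in bits
    return -sum(x * math.log2(x) for x in p if x > 0)

def h_b(eps):
    # binary entropy function
    if eps in (0.0, 1.0):
        return 0.0
    return -eps * math.log2(eps) - (1 - eps) * math.log2(1 - eps)

def phi(H):
    # assumed piecewise-linear upper bound on the minimal (MAP) error probability
    i = 1
    while math.log2(i + 1) < H:
        i += 1
    lo, hi = math.log2(i), math.log2(i + 1)
    slope = (1.0 / (i * (i + 1))) / (hi - lo)
    return (i - 1) / i + slope * (H - lo)

random.seed(0)
M = 5
w = [random.random() for _ in range(M)]
p = [x / sum(w) for x in w]

eps = 1 - max(p)   # MAP error probability when guessing X with no observation
H = entropy(p)

assert H <= h_b(eps) + eps * math.log2(M - 1) + 1e-12   # Fano's inequality
assert eps <= phi(H) + 1e-12                            # converse upper bound
print(f"H = {H:.4f} bits, MAP error = {eps:.4f}, upper bound phi(H) = {phi(H):.4f}")
```

Both bounds are tight in the sense that they are met with equality by specific distributions; for example, the uniform binary distribution gives H = 1 bit and MAP error 1/2 = phi(1).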
Keywords
encoding; error statistics; parameter estimation; Fano's inequality; MAP; channel coding; conditional entropy; discrete memoryless channels; discrete random variable; entropy; equivocation; error probability; maximum a posteriori rule; minimum attainable probability of error; uncertainty measure; Channel coding; Data compression; Entropy; Error probability; Information theory; Measurement uncertainty; Random variables; Rate distortion theory; Upper bound
fLanguage
English
Journal_Title
IEEE Transactions on Information Theory
Publisher
IEEE
ISSN
0018-9448
Type
jour
DOI
10.1109/18.272494
Filename
272494