Title :
A note on Fano's inequality
Author_Institution :
Dept. of Electr. & Comput. Eng., Louisiana State Univ., Baton Rouge, LA, USA
Abstract :
Fano's inequality is a sharp upper bound on conditional entropy in terms of the probability of error. It plays a fundamental role in proving the converse parts of coding theorems in information theory. The standard textbook proof of Fano's inequality is based on properties of Shannon's information measures, together with the trick of introducing an auxiliary random variable. In this brief, it is observed that the generic ℒ1 bound on entropy straightforwardly provides an upper bound on conditional entropy in terms of the probability of error. This generic ℒ1 bound on conditional entropy is as effective as the Fano bound for applications in converse proofs. Compared with the generic ℒ1 bound, Fano's inequality can be regarded as a specific bound on conditional entropy that exploits the structural property of the two joint probability distributions. This viewpoint motivates us to derive an identity connecting conditional entropy and the probability of error. As a corollary, a necessary and sufficient condition for the tightness of the Fano bound is also obtained.
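For reference, the two bounds discussed in the abstract can be stated in their standard textbook forms; the following is a sketch, and the notation and logarithm base are assumptions, not quotations from the paper. For any estimator $\hat{X} = g(Y)$ of $X$ with error probability $P_e = \Pr(\hat{X} \neq X)$, Fano's inequality reads

$$ H(X \mid Y) \le h(P_e) + P_e \log\bigl(|\mathcal{X}| - 1\bigr), $$

where $h(\cdot)$ denotes the binary entropy function, while the generic ℒ1 bound on entropy states that for probability mass functions $p$ and $q$ on $\mathcal{X}$ with $\|p - q\|_1 \le 1/2$,

$$ |H(p) - H(q)| \le -\|p - q\|_1 \log \frac{\|p - q\|_1}{|\mathcal{X}|}. $$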
Keywords :
encoding; entropy; error statistics; Fano's inequality; Shannon information; auxiliary random variable; coding theorems; conditional entropy; error probability; information theory; probability distributions; Calculus; ℒ1 bound on entropy
Conference_Title :
2011 45th Annual Conference on Information Sciences and Systems (CISS)
Conference_Location :
Baltimore, MD
Print_ISBN :
978-1-4244-9846-8
Electronic_ISBN :
978-1-4244-9847-5
DOI :
10.1109/CISS.2011.5766186