DocumentCode :
1749035
Title :
Information transfer through classifiers and its relation to probability of error
Author :
Erdogmus, Deniz; Principe, Jose C.
Author_Institution :
Comput. NeuroEng. Lab., Florida Univ., Gainesville, FL, USA
Volume :
1
fYear :
2001
fDate :
2001
Firstpage :
50
Abstract :
Fano's (1961) bound gives a lower bound on the classification error probability and indicates how the information transfer through a classifier affects its performance. It was an important step toward linking information theory and pattern recognition. In this paper, a family of lower bounds is derived using Renyi's entropy, which yields Fano's lower bound as a special case. Using a different range of entropy orders, Renyi's definition also allows the construction of a family of upper bounds on the probability of error; this is impossible using Shannon's definition of entropy. Further analysis to obtain the tightest lower and upper bounds reveals that Fano's bound is indeed the tightest lower bound, and that the upper bounds become tighter as the entropy order approaches one from below. Numerical evaluations of the bounds are presented for three digital modulation schemes over an AWGN channel.
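For background, a minimal sketch of the standard definitions the abstract relies on (the paper's exact family of bounds is not reproduced here, so treat this as context rather than the authors' derivation): Renyi's entropy of order $\alpha$ for a discrete variable $X$ with probabilities $p_k$ is

$$H_\alpha(X) = \frac{1}{1-\alpha}\,\log\sum_k p_k^{\alpha}, \qquad \alpha > 0,\ \alpha \neq 1,$$

which recovers Shannon's entropy in the limit $\alpha \to 1$. Fano's inequality for an $N$-class problem relates the error probability $p_e$ to the conditional Shannon entropy of the true class $X$ given the classifier output $Y$:

$$H(X \mid Y) \le h_b(p_e) + p_e \log(N-1),$$

where $h_b(\cdot)$ is the binary entropy function; rearranging this for $p_e$ gives the classical lower bound that the paper generalizes to a family parameterized by the entropy order.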
Keywords :
entropy; error statistics; learning (artificial intelligence); pattern classification; probability; Fano bound; Renyi entropy; error probability; information theory; information transfer; lower bound; pattern classification; upper bounds; AWGN channels; Digital modulation; Entropy; Error probability; Information theory; Joining processes; Mutual information; Neural engineering; Pattern recognition; Upper bound;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the International Joint Conference on Neural Networks (IJCNN '01), 2001
Conference_Location :
Washington, DC
ISSN :
1098-7576
Print_ISBN :
0-7803-7044-9
Type :
conf
DOI :
10.1109/IJCNN.2001.938990
Filename :
938990