DocumentCode :
1810879
Title :
Conditional entropy minimization in neural network classifiers
Author :
Willey, J. ; Szu, H. ; Zaghloul, M.
Author_Institution :
Naval Res. Lab., Washington, DC, USA
Volume :
2
fYear :
1999
fDate :
10-16 July 1999
Firstpage :
1465
Abstract :
We explore the role of entropy manipulation during learning in supervised multilayer perceptron (MLP) classifiers. We show that, for a two-layer MLP classifier, conditional entropy minimization in the internal layer is a necessary condition for error minimization in the mapping from input to output. The relationship between entropy and the expected volume and mass of a convex hull constructed from n sample points is examined. We show that minimizing the expected hull volume may have more desirable gradient dynamics than minimizing entropy, and that entropy itself exhibits a geometrical invariance with respect to expected hull volumes. We develop closed-form expressions for the expected convex hull mass and volume in R^1 and relate these to error probability. Finally, we show that learning in an MLP may be accomplished solely by minimizing the conditional expected hull volumes and the expected volume of the “intensity of collision”.
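The record itself contains no code, and the paper's closed forms are not reproduced here. As a concrete illustration of the R^1 quantities the abstract mentions: in R^1 the convex hull of n points is the interval [min, max], so its "volume" is simply the range. For n i.i.d. U(0,1) samples the expected hull length is (n-1)/(n+1), a standard order-statistics result and one instance of the kind of closed form referenced above. The Python sketch below verifies this by Monte Carlo and defines a hypothetical per-class (conditional) hull-volume penalty in the spirit of the paper's criterion; `conditional_hull_volume` and the synthetic activations are illustrative assumptions, not the authors' learning rule.

```python
import numpy as np

def hull_volume_1d(points):
    """Convex hull 'volume' in R^1 is just the range max - min."""
    return points.max() - points.min()

def conditional_hull_volume(activations, labels):
    """Average per-class hull volume of 1-D hidden activations.

    Hypothetical penalty: shrinking this pulls same-class samples
    together in the hidden layer, in the spirit of the paper's
    conditional hull-volume criterion (not the authors' actual rule).
    """
    classes = np.unique(labels)
    return np.mean([hull_volume_1d(activations[labels == c]) for c in classes])

rng = np.random.default_rng(0)

# Monte Carlo check of a closed form in R^1: for n i.i.d. U(0,1) points,
# E[max] = n/(n+1) and E[min] = 1/(n+1), so E[hull length] = (n-1)/(n+1).
n, trials = 10, 100_000
samples = rng.random((trials, n))
mc = (samples.max(axis=1) - samples.min(axis=1)).mean()
print(f"Monte Carlo E[hull length]: {mc:.4f}, closed form: {(n - 1) / (n + 1):.4f}")

# Conditional version on synthetic 1-D 'hidden activations' for two classes.
acts = np.concatenate([rng.normal(0.0, 0.1, 50), rng.normal(1.0, 0.1, 50)])
labels = np.array([0] * 50 + [1] * 50)
print(f"Conditional expected hull volume: {conditional_hull_volume(acts, labels):.4f}")
```

Because the per-class range is differentiable almost everywhere in the extreme points, a penalty of this form admits the gradient-based treatment the abstract contrasts with direct entropy minimization.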
Keywords :
computational geometry; error statistics; learning (artificial intelligence); minimum entropy methods; multilayer perceptrons; pattern classification; probability; conditional entropy minimization; convex hull; error probability; geometrical invariance; gradient dynamics; learning; multilayer perceptron; neural network; Blind source separation; Convergence; Entropy; Error probability; Feature extraction; Intelligent networks; Mutual information; Neural networks; Neurons; State estimation
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
International Joint Conference on Neural Networks (IJCNN '99), 1999
Conference_Location :
Washington, DC
ISSN :
1098-7576
Print_ISBN :
0-7803-5529-6
Type :
conf
DOI :
10.1109/IJCNN.1999.831182
Filename :
831182