DocumentCode :
2874570
Title :
A probabilistic approach to the understanding and training of neural network classifiers
Author :
Gish, Herbert
Author_Institution :
BBN Syst. & Technol. Corp., Cambridge, MA, USA
fYear :
1990
fDate :
3-6 Apr 1990
Firstpage :
1361
Abstract :
It is shown that training a neural network classifier with a mean-square-error criterion gives network outputs that approximate the posterior class probabilities. Based on this probabilistic interpretation of the network's operation, information-theoretic training criteria such as maximum mutual information and the Kullback-Leibler measure are investigated. Both of these criteria are shown to be equivalent to maximum-likelihood estimation (MLE) of the network parameters. MLE of a network in turn allows network models to be compared using the Akaike information criterion and the minimum description length criterion.
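The abstract's central claim can be checked numerically: a minimal sketch (not the paper's code; the Gaussian class model, network size, and learning rate are assumptions chosen for illustration) trains a single sigmoid unit with a mean-square-error criterion on 0/1 class targets and verifies that its output approaches the true posterior class probability, which for this symmetric two-Gaussian problem is sigmoid(2x).

```python
# Hypothetical illustration of the paper's result: MSE training on 0/1
# targets drives the network output toward P(class 1 | x).
import numpy as np

rng = np.random.default_rng(0)

# Two unit-variance Gaussian classes at means -1 and +1, equal priors.
# The exact posterior is P(y=1 | x) = sigmoid(2x).
n = 10_000
y = rng.integers(0, 2, n)                      # 0/1 class targets
x = rng.normal(loc=np.where(y == 1, 1.0, -1.0), scale=1.0)

# Single sigmoid unit f(x) = sigmoid(w*x + b), trained by full-batch
# gradient descent on the mean-square error (f(x) - y)^2.
w, b = 0.0, 0.0
lr = 1.0
for _ in range(5_000):
    f = 1.0 / (1.0 + np.exp(-(w * x + b)))
    # Gradient of the squared error w.r.t. the pre-activation
    # (the constant factor 2 is absorbed into the learning rate).
    grad = (f - y) * f * (1.0 - f)
    w -= lr * np.mean(grad * x)
    b -= lr * np.mean(grad)

# If the MSE-trained output approximates the posterior sigmoid(2x),
# the learned parameters should be near (w, b) = (2, 0).
print(round(w, 2), round(b, 2))
```

Because the true posterior sigmoid(2x) lies inside the model class, the MSE minimizer coincides with it here; in general the paper's result says the MSE-trained output approximates the posterior as well as the network architecture allows.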
Keywords :
information theory; neural nets; parameter estimation; pattern recognition; probability; Akaike information criterion; Kullback-Leibler measure; MLE; information-theoretic training criteria; maximum likelihood estimation; maximum mutual information; mean-square-error criterion; minimum description length criterion; network parameters; neural network classifiers; posterior class probabilities; probabilistic approach; Integral equations; Maximum likelihood estimation; Mean square error methods; Mutual information; Neural networks; Parameter estimation;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Acoustics, Speech, and Signal Processing, 1990. ICASSP-90., 1990 International Conference on
Conference_Location :
Albuquerque, NM
ISSN :
1520-6149
Type :
conf
DOI :
10.1109/ICASSP.1990.115636
Filename :
115636