Title :
Mutual information training and size minimization of adaptive probabilistic neural networks
Author :
Fakhr, Waleed; Elmasry, M.I.
Author_Institution :
Dept. of Electrical and Computer Engineering, University of Waterloo, Ont., Canada
Abstract :
An upper bound on the Bayes error probability is used as a generalized performance criterion for supervised neural network classifiers. It is shown that maximization of the mutual information is equivalent to minimization of this bound, which leads to a direct implementation of the Bayes framework for classification. The criterion is used both in training neural networks and in minimizing their size by adaptive pruning, for which a top-down heuristic for pruning oversized networks is proposed. The approach is applied to adaptive probabilistic neural networks, and results on two benchmark problems verify its validity.
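The equivalence claimed in the abstract can be made concrete with a worked bound. Assuming the upper bound in question is of the classical Hellman-Raviv type (the abstract does not name it), the Bayes error P_e of a classifier observing features X for class variable C satisfies, with entropies measured in bits,

P_e \;\le\; \tfrac{1}{2}\, H(C \mid X) \;=\; \tfrac{1}{2}\,\bigl[\, H(C) - I(C; X) \,\bigr]

Since the class-prior entropy H(C) is fixed by the data, maximizing the mutual information I(C; X) is the same as minimizing this bound on P_e.

The sketch below illustrates how such a criterion could drive both evaluation and top-down pruning of a Parzen-style probabilistic neural network. It is a minimal, hypothetical illustration: the Gaussian-kernel model and the helper names pnn_posteriors, mutual_information, and prune_top_down are assumptions of this sketch, not the authors' exact formulation, and the greedy removal rule stands in for whatever heuristic the paper actually uses.

import numpy as np

def pnn_posteriors(X, kernels, n_classes, sigma=0.5):
    """Class posteriors of a Parzen-style PNN with equal-weight Gaussian kernels.
    kernels is a list of (center, class_index) pairs; returns an (N, n_classes) array."""
    scores = np.zeros((len(X), n_classes))
    for center, c in kernels:
        d2 = np.sum((X - center) ** 2, axis=1)       # squared distances to this kernel
        scores[:, c] += np.exp(-d2 / (2.0 * sigma ** 2))
    scores += 1e-12                                  # guard against all-zero rows
    return scores / scores.sum(axis=1, keepdims=True)

def mutual_information(X, y, kernels, sigma=0.5):
    """Plug-in estimate of I(C; X) = H(C) - H(C|X), with H(C|X) approximated by
    the mean negative log-posterior of the true class (natural-log units here;
    the units do not affect comparisons between candidate networks)."""
    eps = 1e-12
    n_classes = int(y.max()) + 1
    post = pnn_posteriors(X, kernels, n_classes, sigma)
    priors = np.bincount(y, minlength=n_classes) / len(y)
    h_c = -np.sum(priors * np.log(priors + eps))
    h_c_given_x = -np.mean(np.log(post[np.arange(len(y)), y] + eps))
    return h_c - h_c_given_x

def prune_top_down(X, y, kernels, sigma=0.5, tol=1e-3):
    """Greedy top-down pruning: repeatedly drop the kernel whose removal costs
    the least mutual information, stopping once any removal would cost more than tol."""
    best = mutual_information(X, y, kernels, sigma)
    while len(kernels) > 1:
        trials = [kernels[:i] + kernels[i + 1:] for i in range(len(kernels))]
        scores = [mutual_information(X, y, t, sigma) for t in trials]
        i = int(np.argmax(scores))
        if best - scores[i] > tol:                   # every removal is too costly
            break
        kernels, best = trials[i], scores[i]
    return kernels

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-1.0, 1.0, (30, 2)), rng.normal(1.0, 1.0, (30, 2))])
    y = np.array([0] * 30 + [1] * 30)
    kernels = [(x, c) for x, c in zip(X, y)]         # start oversized: one kernel per sample
    pruned = prune_top_down(X, y, kernels)
    print(f"{len(kernels)} -> {len(pruned)} kernels, "
          f"I = {mutual_information(X, y, pruned):.4f}")

Starting from one kernel per training sample and pruning greedily mirrors the oversized-network-then-shrink workflow the abstract describes; the tolerance tol trades network size against the mutual-information criterion.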
Keywords :
Bayes methods; error statistics; learning (artificial intelligence); neural nets; Bayes error probability; adaptive probabilistic neural networks; adaptive pruning; benchmark problem results; classification; generalized performance criterion; size minimization; top-down heuristic; upper bound; Adaptive systems; Bayesian methods; Error probability; Mutual information; Neural networks; Parameter estimation; Pattern recognition; Training data; Upper bound; Very large scale integration
Conference_Title :
Proceedings of the 1992 IEEE International Symposium on Circuits and Systems (ISCAS '92)
Conference_Location :
San Diego, CA
Print_ISBN :
0-7803-0593-0
DOI :
10.1109/ISCAS.1992.230014