Title :
Differentially generated neural network classifiers are efficient
Author :
Hampshire, J.B., II ; Kumar, B. V. K. Vijaya
Author_Institution :
Dept. of Electr. & Comput. Eng., Carnegie Mellon Univ., Pittsburgh, PA, USA
Abstract :
Differential learning for statistical pattern classification is based on the classification figure-of-merit (CFM) objective function. It is proved that differential learning is asymptotically efficient: it guarantees the best generalization allowed by the choice of hypothesis class as the training sample size grows large, while requiring the least classifier complexity necessary for Bayesian (i.e., minimum probability-of-error) discrimination. For small training sample sizes, differential learning almost always guarantees the best generalization allowed by the choice of hypothesis class.
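The record does not reproduce the CFM objective itself. As a minimal sketch, assuming the sigmoidal CFM form introduced in Hampshire and Waibel's earlier CFM work (1990), the objective scores the differential between the correct class's discriminant and its strongest rival; the function names and the beta parameter below are illustrative, not taken from this paper.

import numpy as np

def cfm(outputs, target, beta=4.0):
    # Classification figure-of-merit for one training sample (assumed sigmoidal form).
    # outputs: 1-D array of classifier discriminants z_j; target: index of the correct class c.
    rivals = np.delete(outputs, target)
    delta = outputs[target] - rivals.max()      # discriminant differential: z_c - max_{j != c} z_j
    return 1.0 / (1.0 + np.exp(-beta * delta))  # in (0, 1); differential learning maximizes its mean

# Example: class 1 is correct and outscores its rivals, so CFM approaches 1.
# Maximizing mean CFM rewards correct rank ordering of the discriminants
# rather than exact posterior-probability estimates.
z = np.array([0.2, 0.7, 0.1])
print(cfm(z, target=1))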
Keywords :
computational complexity; generalisation (artificial intelligence); learning (artificial intelligence); neural nets; pattern classification; statistical analysis; Bayesian discrimination; asymptotic efficiency; classification figure-of-merit; classifier complexity; differential learning; differentially generated neural network classifiers; generalization; hypothesis class; minimum probability-of-error discrimination; statistical pattern classification; Bayesian methods; Complexity theory; Inductors; Neural networks; Pattern recognition; Probability; Stochastic processes;
Conference_Title :
Neural Networks for Signal Processing [1993] III. Proceedings of the 1993 IEEE-SP Workshop
Conference_Location :
Linthicum Heights, MD
Print_ISBN :
0-7803-0928-6
DOI :
10.1109/NNSP.1993.471874