DocumentCode :
1909519
Title :
Differentially generated neural network classifiers are efficient
Author :
Hampshire, J.B., II ; Kumar, B. V. K. Vijaya
Author_Institution :
Dept. of Electr. & Comput. Eng., Carnegie Mellon Univ., Pittsburgh, PA, USA
fYear :
1993
fDate :
6-9 Sep 1993
Firstpage :
151
Lastpage :
160
Abstract :
Differential learning for statistical pattern classification is based on the classification figure-of-merit (CFM) objective function. It is proved that differential learning is asymptotically efficient, guaranteeing the best generalization allowed by the choice of hypothesis class as the training sample size grows large, while requiring the least classifier complexity necessary for Bayesian (i.e., minimum probability-of-error) discrimination. Differential learning almost always guarantees the best generalization allowed by the hypothesis class for small training sample sizes as well.
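The CFM objective underlying differential learning rewards the *differential* between the correct-class output and the strongest competing output, rather than penalizing squared error against targets. A minimal sketch of this idea follows, assuming a sigmoid-of-differential form of CFM; the steepness parameter `alpha` is a hypothetical illustration, not a value taken from the paper.

```python
import numpy as np

def cfm(outputs, labels, alpha=4.0):
    """Classification figure-of-merit (CFM), sketched as a smooth,
    monotonically increasing function of the discriminant differential
    delta = (correct-class output) - (best competing output).
    `alpha` is a hypothetical steepness parameter for illustration.
    """
    outputs = np.asarray(outputs, dtype=float)
    n = outputs.shape[0]
    correct = outputs[np.arange(n), labels]
    masked = outputs.copy()
    masked[np.arange(n), labels] = -np.inf   # exclude the correct class
    best_other = masked.max(axis=1)
    delta = correct - best_other
    # Sigmoid of the differential: near 1 when the correct class is
    # ranked first by a wide margin, near 0 when it is misranked.
    # Because it is differentiable, gradient ascent on the mean CFM
    # trains the classifier parameters directly on ranking behavior.
    return 1.0 / (1.0 + np.exp(-alpha * delta))

# First sample ranks the correct class (0) first; second misranks it.
scores = cfm(np.array([[2.0, 0.1, -1.0],
                       [0.2, 1.5,  0.0]]), np.array([0, 0]))
```

Maximizing the mean CFM over a training set is what distinguishes differential learning from probabilistic (e.g., mean-squared-error or cross-entropy) learning: only the sign and size of the ranking margin matter, which is why the paper can tie it to minimum probability-of-error discrimination with minimal classifier complexity.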
Keywords :
computational complexity; generalisation (artificial intelligence); learning (artificial intelligence); neural nets; pattern classification; statistical analysis; Bayesian discrimination; asymptotic efficiency; classification figure-of-merit; classifier complexity; differential learning; differentially generated neural network classifiers; generalization; hypothesis class; minimum error probability discrimination; minimum probability-of-error discrimination; statistical pattern classification; Bayesian methods; Complexity theory; Inductors; Neural networks; Pattern recognition; Probability; Stochastic processes;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks for Signal Processing [1993] III. Proceedings of the 1993 IEEE-SP Workshop
Conference_Location :
Linthicum Heights, MD
Print_ISBN :
0-7803-0928-6
Type :
conf
DOI :
10.1109/NNSP.1993.471874
Filename :
471874