DocumentCode :
1190610
Title :
A model for nonpolynomial decrease in error rate with increasing sample size
Author :
Barnard, Etienne
Author_Institution :
Dept. of Electron. & Comput. Eng., Pretoria Univ., South Africa
Volume :
5
Issue :
6
fYear :
1994
fDate :
11/1/1994
Firstpage :
994
Lastpage :
997
Abstract :
Much theoretical evidence exists for an inverse proportionality between the error rate of a classifier and the number of samples used to train it. However, Cohn and Tesauro (1992) discovered several problems that experimentally display an approximately exponential decrease in error rate. We present evidence that the observed exponential decrease is caused by the finite nature of the problems studied. A simple model classification problem is presented, which demonstrates how the error rate approaches zero exponentially or faster when sufficiently many training samples are used.
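The mechanism the abstract describes can be illustrated with a toy finite problem (this is a hedged sketch, not the paper's actual model): with finitely many equally likely discrete input patterns, a classifier that memorizes the labels of patterns it has seen and guesses on unseen ones errs only on patterns absent from the training set, and the probability that a pattern remains unseen shrinks exponentially in the sample size n.

```python
def expected_error(n, n_patterns=20, n_classes=2):
    """Expected error rate of a memorizing classifier on a finite problem.

    Assumes n_patterns equally likely discrete inputs, each with a fixed
    label; an input is misclassified only when it never appeared among the
    n training samples AND the uniform random guess for it is wrong.
    (Illustrative assumption, not the construction used in the paper.)
    """
    p_unseen = (1.0 - 1.0 / n_patterns) ** n   # exponential in n
    p_wrong_guess = 1.0 - 1.0 / n_classes
    return p_unseen * p_wrong_guess

# Exponential decay means the error shrinks by a constant factor over any
# fixed increment of n, unlike the ~1/n behaviour the polynomial bounds
# predict, where the ratio error(2n)/error(n) would tend to 1/2.
ratio_a = expected_error(200) / expected_error(100)
ratio_b = expected_error(300) / expected_error(200)
```

Here `ratio_a` and `ratio_b` coincide (both equal (1 - 1/20)^100), which is the signature of exponential, rather than polynomial, decrease.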
Keywords :
convergence; error analysis; neural nets; pattern recognition; statistical analysis; classifier; error rate; exponential decrease; inverse proportionality; model classification; nonpolynomial decrease; sample size; Africa; Computer science; Density functional theory; Displays; Error analysis; Sampling methods;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.329698
Filename :
329698