Title :
A model for nonpolynomial decrease in error rate with increasing sample size
Author :
Barnard, Etienne
Author_Institution :
Dept. of Electron. & Comput. Eng., Pretoria Univ., South Africa
Date :
1 November 1994
Abstract :
Much theoretical evidence exists for an inverse proportionality between the error rate of a classifier and the number of samples used to train it. Cohn and Tesauro (1992) have, however, discovered various problems which experimentally display an approximately exponential decrease in error rate. We present evidence that the observed exponential decrease is caused by the finite nature of the problems studied. A simple model classification problem is presented, which demonstrates how the error rate approaches zero exponentially or faster when sufficiently many training samples are used.
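The exponential decay described in the abstract can be illustrated with a toy finite problem of our own construction (not necessarily the paper's actual model): on a finite domain with deterministic labels, a classifier that memorizes the label of every symbol it has seen errs only on symbols absent from the training set, so the expected error is a sum of terms of the form p(x)(1 - p(x))^n, which vanishes exponentially in the sample size n.

```python
def expected_error(probs, n):
    """Expected error of a memorizing classifier on a finite domain with
    deterministic labels.  A symbol x is misclassified only if it never
    appears among the n i.i.d. training samples, which happens with
    probability (1 - p(x))**n, so

        E[error] = sum_x p(x) * (1 - p(x))**n.

    With finite support this decays exponentially in n, matching the
    behaviour the abstract attributes to finite problems."""
    return sum(p * (1.0 - p) ** n for p in probs)

# Illustrative choice: a uniform distribution over 10 symbols, so
# E[error] = (0.9)**n -- a pure exponential in the sample size.
probs = [0.1] * 10
for n in (10, 50, 100, 200):
    print(n, expected_error(probs, n))
```

For the uniform case each doubling of n squares the error, whereas an inverse-proportionality law would only halve it, which is the qualitative distinction the abstract draws.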
Keywords :
convergence; error analysis; neural nets; pattern recognition; statistical analysis; classifier; error rate; exponential decrease; inverse proportionality; model classification; nonpolynomial decrease; sample size
Journal_Title :
IEEE Transactions on Neural Networks