Title :
On rigorous derivation of learning curves
Author :
Suyari, Hiroki ; Matsuba, Ikuo
Author_Institution :
Dept. of Inf. & Image Sci., Chiba Univ., Japan
Abstract :
Concrete mathematical formulas for the average generalization error and the learning curves of a simple perceptron are derived as rigorously as possible, that is, rigorously except for one approximation, known as "self-averaging" in statistical physics. The learning curves can be plotted by numerically evaluating the obtained formulas, and their behavior is then easily observed. In particular, it is shown that in the case of binary weights, as the number of examples increases, the student perceptron suddenly freezes into the state of the reference (teacher) perceptron at a certain number of examples per weight, above which the average generalization error is constantly zero. This phenomenon is called "perfect generalization". Our results are in good agreement with those obtained by the statistical-mechanical method.
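The freezing behavior described in the abstract can be illustrated numerically. This is a minimal sketch, not the paper's derivation: it assumes Gibbs-style learning (averaging over all binary-weight students consistent with the teacher-labelled examples, found by exhaustive enumeration over a small weight vector) and the standard large-N generalization-error formula ε = arccos(R)/π, where R is the teacher-student overlap. All names and parameter values are illustrative.

```python
# Toy illustration of "perfect generalization" for binary weights.
import itertools

import numpy as np

rng = np.random.default_rng(0)
N = 11                                  # odd, so sign(x @ w) is never 0
teacher = rng.choice([-1, 1], size=N)   # reference (teacher) perceptron


def avg_gen_error(P):
    """Average generalization error over all binary-weight students
    consistent with P random teacher-labelled examples."""
    X = rng.choice([-1, 1], size=(P, N))
    y = np.sign(X @ teacher)
    errors = []
    for bits in itertools.product([-1, 1], repeat=N):
        w = np.array(bits)
        if np.array_equal(np.sign(X @ w), y):      # consistent student
            R = w @ teacher / N                    # teacher-student overlap
            errors.append(np.arccos(R) / np.pi)    # large-N error formula
    return float(np.mean(errors))                  # teacher is always in the list


# The error shrinks as alpha = P/N grows and hits zero once only the
# teacher survives in the version space.
print(avg_gen_error(2), avg_gen_error(40))
```

At small P many binary students fit the examples and the average error is large; well above the critical number of examples per weight, only the teacher remains consistent and the error drops to exactly zero, which is the discontinuous freezing the abstract refers to.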
Keywords :
generalisation (artificial intelligence); learning (artificial intelligence); perceptrons; probability; average generalization errors; binary weights; learning curves; perceptron; perfect generalization; self-averaging; Physics; Statistical analysis;
Conference_Titel :
Proceedings of the International Joint Conference on Neural Networks (IJCNN '01), 2001
Conference_Location :
Washington, DC
Print_ISBN :
0-7803-7044-9
DOI :
10.1109/IJCNN.2001.939102