DocumentCode :
3334184
Title :
Improving generalization performance in character recognition
Author :
Drucker, Harris; Le Cun, Yann
Author_Institution :
Monmouth Coll., West Long Branch, NJ, USA
fYear :
1991
fDate :
30 Sep-1 Oct 1991
Firstpage :
198
Lastpage :
207
Abstract :
One test of a new training algorithm is how well the algorithm generalizes from the training data to the test data. A new neural net training algorithm, termed double backpropagation, improves generalization in character recognition by minimizing the change in the output due to small changes in the input. This is accomplished by minimizing the normal energy term found in backpropagation and an additional energy term that is a function of the Jacobian.
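The abstract describes the double-backpropagation objective as the usual backpropagation energy plus a penalty on how sensitive that energy is to small input perturbations. The following is a minimal sketch of such a loss in JAX, not the authors' original implementation; the network apply_fn, the parameters, the toy data, and the weighting lam are illustrative assumptions.

import jax
import jax.numpy as jnp

def energy(params, x, y, apply_fn):
    # Normal backpropagation energy: squared error between output and target.
    out = apply_fn(params, x)
    return 0.5 * jnp.sum((out - y) ** 2)

def double_backprop_loss(params, x, y, apply_fn, lam=0.1):
    # Normal energy plus a penalty on the gradient of the energy with respect
    # to the input (a function of the input-output Jacobian), encouraging the
    # output to change little when the input changes slightly.
    e = energy(params, x, y, apply_fn)
    grad_x = jax.grad(energy, argnums=1)(params, x, y, apply_fn)
    return e + lam * 0.5 * jnp.sum(grad_x ** 2)

# Toy linear "network" so the sketch runs end to end (hypothetical, not from the paper).
def apply_fn(params, x):
    return params["W"] @ x + params["b"]

params = {"W": jnp.ones((3, 5)), "b": jnp.zeros(3)}
x = jnp.ones(5)
y = jnp.zeros(3)
loss, grads = jax.value_and_grad(double_backprop_loss)(params, x, y, apply_fn)
print(loss, jax.tree_util.tree_map(jnp.shape, grads))

Training then proceeds as in ordinary backpropagation, with the parameter gradients taken through both energy terms (hence "double" backpropagation).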
Keywords :
backpropagation; generalisation (artificial intelligence); neural nets; optical character recognition; AI; Jacobian; character recognition; double backpropagation; generalization performance; neural nets; training algorithm; Backpropagation algorithms; Character recognition; Educational institutions; Equations; Jacobian matrices; Neural networks; Noise level; Signal to noise ratio; Testing; Training data;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks for Signal Processing [1991], Proceedings of the 1991 IEEE Workshop
Conference_Location :
Princeton, NJ
Print_ISBN :
0-7803-0118-8
Type :
conf
DOI :
10.1109/NNSP.1991.239522
Filename :
239522