Title :
Improving generalization performance in character recognition
Author :
Drucker, Harris ; Le Cun, Yann
Author_Institution :
Monmouth Coll., West Long Branch, NJ, USA
Date :
30 Sep-1 Oct 1991
Abstract :
One test of a new training algorithm is how well it generalizes from the training data to the test data. A new neural net training algorithm, termed double backpropagation, improves generalization in character recognition by minimizing the change in the output due to small changes in the input. This is accomplished by minimizing both the usual energy term found in backpropagation and an additional energy term that is a function of the Jacobian.
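The loss described in the abstract can be illustrated with a minimal sketch. The model below is a hypothetical single linear unit, not the multilayer network used in the paper, and the penalty weight `c` is an assumed hyperparameter; for this toy model the input gradient of the standard energy is available in closed form, so no autodiff is needed.

```python
import numpy as np

def double_backprop_loss(w, x, t, c=0.1):
    """Sketch of a double-backpropagation-style loss for y = w.x.

    Standard energy:   E1 = 0.5 * (y - t)^2
    Additional energy: E2 = 0.5 * c * ||dE1/dx||^2, which penalizes
    sensitivity of the output error to small input perturbations.
    """
    y = float(np.dot(w, x))
    e1 = 0.5 * (y - t) ** 2
    # For this linear model, dE1/dx = (y - t) * w, so E2 is analytic.
    grad_x = (y - t) * w
    e2 = 0.5 * c * float(np.dot(grad_x, grad_x))
    return e1 + e2

w = np.array([1.0, -2.0])
x = np.array([0.5, 0.25])
# y = 0, t = 1: E1 = 0.5; dE1/dx = -w, ||dE1/dx||^2 = 5, E2 = 0.25
print(double_backprop_loss(w, x, t=1.0, c=0.1))  # -> 0.75
```

In a real multilayer network the input gradient is not analytic, and the paper's contribution is a second backward pass that propagates derivatives of this penalty term through the net.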
Keywords :
backpropagation; generalisation (artificial intelligence); neural nets; optical character recognition; AI; Jacobian; character recognition; double backpropagation; generalization performance; training algorithm; Backpropagation algorithms; Character recognition; Educational institutions; Equations; Jacobian matrices; Neural networks; Noise level; Signal to noise ratio; Testing; Training data;
Conference_Title :
Neural Networks for Signal Processing: Proceedings of the 1991 IEEE Workshop
Conference_Location :
Princeton, NJ
Print_ISBN :
0-7803-0118-8
DOI :
10.1109/NNSP.1991.239522