Title :
The learning rate in back-propagation systems: an application of Newton's method
Abstract :
In backpropagation learning, the internode connection strengths, or weights, are adjusted by gradient descent in weight space. The author shows how a multidimensional version of Newton's method for finding the roots of an equation can be applied to determine how far to move down the gradient in each learning cycle of backpropagation. Results from a few simulations of a fully recurrent net are presented and show an appreciable improvement, by a factor of five to ten, in the convergence rate for these hard-to-learn tests.
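The abstract describes choosing how far to move along the gradient each cycle by applying Newton's root-finding method. The paper's exact formulation for a recurrent net is not reproduced here; the following is only a minimal sketch of the general idea, assuming a one-dimensional Newton search on the directional derivative along the gradient, finite-difference estimates in place of exact second derivatives, and a made-up quadratic-plus-quartic loss standing in for the network's error function.

```python
import numpy as np

def loss(w):
    # Hypothetical error surface used only for illustration; the paper's
    # error function is that of a fully recurrent backpropagation net.
    return 0.5 * w @ np.diag([1.0, 10.0]) @ w + 0.1 * np.sum(w ** 4)

def grad(w, eps=1e-6):
    # Central-difference gradient (a real backprop net computes this exactly).
    g = np.zeros_like(w)
    for i in range(len(w)):
        e = np.zeros_like(w)
        e[i] = eps
        g[i] = (loss(w + e) - loss(w - e)) / (2 * eps)
    return g

def newton_step_length(w, d, eta0=0.1, iters=3, eps=1e-5):
    """Pick the step length eta along descent direction d by applying
    Newton's root-finding method to phi'(eta) = d/deta loss(w - eta * d) = 0."""
    eta = eta0
    for _ in range(iters):
        # Finite-difference estimates of phi'(eta) and phi''(eta).
        f0 = loss(w - (eta - eps) * d)
        f1 = loss(w - eta * d)
        f2 = loss(w - (eta + eps) * d)
        dphi = (f2 - f0) / (2 * eps)
        d2phi = (f2 - 2 * f1 + f0) / eps ** 2
        if abs(d2phi) < 1e-12:
            break
        eta -= dphi / d2phi  # Newton update toward a root of phi'
    return max(eta, 0.0)

# Gradient descent where the learning rate is re-derived every cycle.
w = np.array([3.0, -2.0])
for cycle in range(10):
    d = grad(w)
    eta = newton_step_length(w, d)
    w = w - eta * d
    print(f"cycle {cycle}: loss = {loss(w):.6f}, eta = {eta:.4f}")
```

In this sketch the learning rate is not a fixed constant but is recomputed each cycle as the Newton estimate of where the error stops decreasing along the current gradient direction, which is the behavior the abstract attributes to the proposed scheme.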
Keywords :
convergence; learning systems; neural nets; backpropagation learning; convergence rate; hard-to-learn tests; internode connection strengths; learning cycle; multidimensional Newton method; recurrent net;
Conference_Titel :
1990 IJCNN International Joint Conference on Neural Networks
Conference_Location :
San Diego, CA, USA
DOI :
10.1109/IJCNN.1990.137647