Title :
Preconditioning method to accelerate neural networks gradient training algorithms
Author :
Pérez-Ilzarbe, M.J.
Author_Institution :
Dept. de Autom. y Comput., Univ. Publica de Navarra, Spain
Abstract :
In this work a simple method for conditioning gradient training algorithms for neural networks is presented. It consists of using a different learning rate for the outgoing weights of each neuron or network input node. In the case of one-layer neural networks, the method can also be implemented by normalizing the input training examples in a certain way. The performance of the proposed method has been tested by training neural networks to solve an image recognition problem. A considerable acceleration of the training algorithms was attained in the examples tested.
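The abstract only outlines the idea, so the following is a minimal sketch, not the paper's exact procedure: for a one-layer linear network trained with the delta rule, give the outgoing weights of each input node their own learning rate. The specific scaling used here, rate proportional to 1/E[x_i^2] (equivalent to normalizing each input component), is an assumed choice for illustration; the paper's precise normalization is not stated in the abstract.

```python
import numpy as np

# Badly scaled synthetic data: each input component has a different magnitude,
# which spreads the eigenvalues of the error surface and slows plain gradient descent.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5)) * np.array([1.0, 10.0, 0.1, 5.0, 2.0])
w_true = rng.normal(size=5)
y = X @ w_true

def train(X, y, per_input_rates, base_eta, epochs=50):
    """Delta-rule training of a single linear layer with a separate
    learning rate for the weights attached to each input node."""
    w = np.zeros(X.shape[1])
    eta = base_eta * per_input_rates          # one rate per input node
    for _ in range(epochs):
        err = X @ w - y                       # prediction error
        grad = X.T @ err / len(y)             # gradient of 0.5 * MSE
        w -= eta * grad                       # per-input scaled update
    return 0.5 * np.mean((X @ w - y) ** 2)    # final mean squared error

# Uniform rate: must stay small to remain stable on the largest-scale input.
uniform = train(X, y, np.ones(X.shape[1]), base_eta=0.005)
# Preconditioned: rate for input i scaled by 1 / E[x_i^2] (assumed scaling).
precond = train(X, y, 1.0 / np.mean(X ** 2, axis=0), base_eta=0.5)
print(uniform, precond)
```

With this scaling the effective curvature seen by each weight becomes roughly uniform, so the preconditioned run converges in far fewer epochs than the uniform-rate run on the same data.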
Keywords :
backpropagation; feedforward neural nets; gradient methods; image recognition; accelerated learning; delta rule; error backpropagation; gradient learning algorithms; learning rate; multilayer neural networks; Acceleration; Backpropagation algorithms; Convergence; Eigenvalues and eigenfunctions; Image recognition; Multi-layer neural network; Neural networks; Neurons; Testing; Vectors
Conference_Title :
International Joint Conference on Neural Networks (IJCNN '99), 1999
Conference_Location :
Washington, DC
Print_ISBN :
0-7803-5529-6
DOI :
10.1109/IJCNN.1999.831165