Title :
New accelerated learning algorithm to reduce the oscillation of weights in multilayered neural networks
Author :
Ochiai, Keihiro ; Toda, Naohiro ; Usui, Shiro
Author_Institution :
Dept. of Inf. & Comput. Sci., Toyohashi Univ. of Technol., Japan
Abstract :
An accelerated learning algorithm is proposed. It is based on R.A. Jacobs' heuristics, with oscillation reduced by correcting the next weight point toward the bottom of a ravine in the error surface. The performance is evaluated through several simulation examples and compared with other acceleration algorithms, such as the DBD (delta-bar-delta) algorithm and the EDBD (extended delta-bar-delta) algorithm. Simulation results showed that the proposed algorithm converges 3.7 times faster than the DBD algorithm, 1.8 times faster than the EDBD algorithm, and 19.4 times faster than the basic backpropagation algorithm.
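For context, the DBD baseline mentioned above is Jacobs' heuristic of giving each weight its own learning rate, grown additively while the current gradient agrees in sign with an exponential trace of past gradients and shrunk multiplicatively on a sign flip (the signature of oscillation across a ravine). The sketch below is an illustrative implementation of that baseline rule on a toy ill-conditioned quadratic, not the paper's proposed algorithm; the function name and hyperparameter values are assumptions chosen for the example.

```python
import numpy as np

def delta_bar_delta(grad_fn, w, steps=200, eps0=0.01,
                    kappa=0.01, phi=0.1, theta=0.7):
    """Sketch of the delta-bar-delta (DBD) rule: per-weight learning
    rates eps, adapted by comparing the current gradient g with an
    exponentially averaged gradient trace bar.  Hyperparameters
    (kappa, phi, theta) are illustrative, not from the paper."""
    eps = np.full_like(w, eps0)   # one learning rate per weight
    bar = np.zeros_like(w)        # exponential trace of past gradients
    for _ in range(steps):
        g = grad_fn(w)
        agree = bar * g
        eps = np.where(agree > 0, eps + kappa, eps)        # consistent sign: grow additively
        eps = np.where(agree < 0, eps * (1.0 - phi), eps)  # sign flip (oscillation): shrink
        bar = (1.0 - theta) * g + theta * bar
        w = w - eps * g
    return w

# Toy ravine: E(w) = 0.5 * (100*w0**2 + w1**2), an ill-conditioned quadratic.
grad = lambda w: np.array([100.0, 1.0]) * w
w_final = delta_bar_delta(grad, np.array([1.0, 1.0]))
```

Because the two coordinates have very different curvature, a single global learning rate must stay small for the steep direction; the per-weight rates let the shallow direction accelerate independently, which is the behavior the EDBD and the proposed algorithm refine further.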
Keywords :
backpropagation; feedforward neural nets; learning (artificial intelligence); simulation; accelerated learning algorithm; acceleration algorithms; backpropagation; extended delta-bar-delta; multilayered neural networks; oscillation of weights; simulation; Acceleration; Backpropagation algorithms; Computer errors; Computer networks; Convergence; Intelligent networks; Jacobian matrices; Kalman filters; Multi-layer neural network; Neural networks;
Conference_Titel :
1992 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
Baltimore, MD
Print_ISBN :
0-7803-0559-0
DOI :
10.1109/IJCNN.1992.287070