DocumentCode :
1842664
Title :
Acceleration of learning speed in neural networks by reducing weight oscillations
Author :
Ihm, Bin-Chul ; Park, Dong-Jo
Author_Institution :
Dept. of Electr. Eng., Korea Adv. Inst. of Sci. & Technol., Seoul, South Korea
Volume :
3
fYear :
1999
fDate :
1999
Firstpage :
1729
Abstract :
We propose a novel fast learning algorithm for neural networks. The conventional backpropagation algorithm suffers from slow convergence due to weight oscillations in narrow valleys of the error surface. To overcome this difficulty, we derive a new gradient term by modifying the original gradient term with an estimated downward direction along the valley. Simulation results show that the proposed method reduces oscillations considerably and achieves fast convergence.
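The abstract does not give the authors' exact update rule. As a rough illustration of the oscillation problem the paper targets, the sketch below applies a sign-based per-weight step adaptation (in the spirit of Rprop, not the authors' method) to a toy narrow-valley quadratic error surface; the surface, step sizes, and adaptation factors are all illustrative assumptions.

```python
import numpy as np

def grad(w):
    # Toy "narrow valley" error surface: E(w) = 0.5 * (100*w0^2 + w1^2).
    # The steep w0 axis makes plain gradient descent zig-zag across the valley.
    return np.array([100.0 * w[0], w[1]])

def sign_adaptive_descent(w, iters=100, step=0.02, up=1.2, down=0.5):
    # Hypothetical stand-in for oscillation reduction: when a gradient
    # component changes sign (a zig-zag across the valley), shrink that
    # component's step; when the sign persists, grow it.
    steps = np.full(w.shape, step)
    g_prev = np.zeros_like(w)
    for _ in range(iters):
        g = grad(w)
        for i in range(len(w)):
            if g[i] * g_prev[i] < 0:    # oscillation detected
                steps[i] *= down
            elif g[i] * g_prev[i] > 0:  # consistent downhill direction
                steps[i] *= up
        w = w - steps * np.sign(g)
        g_prev = g
    return w

w_final = sign_adaptive_descent(np.array([1.0, 1.0]))
print(np.linalg.norm(w_final))
```

With these settings the iterate settles near the valley floor even on the steep axis, where a fixed step would either crawl or bounce between the valley walls.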
Keywords :
backpropagation; convergence; gradient methods; neural nets; oscillations; error surface valley; learning speed acceleration; neural networks; weight oscillation reduction; Acceleration; Computer simulation; Convergence; Jacobian matrices; Multilayer perceptrons; Neural networks; Neurons; Testing;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks, 1999. IJCNN '99. International Joint Conference on
Conference_Location :
Washington, DC
ISSN :
1098-7576
Print_ISBN :
0-7803-5529-6
Type :
conf
DOI :
10.1109/IJCNN.1999.832637
Filename :
832637