DocumentCode :
2705468
Title :
A generalized backpropagation algorithm for faster convergence
Author :
Ng, S.C. ; Leung, S.H. ; Luk, A.
Author_Institution :
Dept. of Electron. Eng., City Univ. of Hong Kong, Kowloon, Hong Kong
Volume :
1
fYear :
1996
fDate :
3-6 Jun 1996
Firstpage :
409
Abstract :
The conventional backpropagation algorithm is basically a gradient-descent method; it suffers from local minima and slow convergence. This paper introduces a new generalized backpropagation algorithm that can effectively speed up the convergence rate and reduce the chance of being trapped in local minima. The new algorithm changes the derivative of the activation function so as to magnify the backward propagated error signal; this accelerates convergence and helps the search escape local minima.
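The abstract does not give the exact form of the authors' generalized derivative. The following is a minimal sketch of the general idea only, assuming a simple stand-in: the true sigmoid derivative s(1-s) is inflated by a constant offset c (in the spirit of flat-spot elimination) so that the backward error signal stays large even when units saturate. The network size, training data, and the parameter c are illustrative choices, not the paper's formulation.

    # Sketch: backpropagation with a magnified activation-function derivative.
    # The offset c is an assumed illustrative parameter, not the paper's method.
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def magnified_deriv(s, c=0.1):
        # True sigmoid derivative is s*(1-s); adding c keeps the gradient
        # from vanishing when s saturates near 0 or 1, magnifying the
        # backpropagated error signal.
        return s * (1.0 - s) + c

    # XOR training set
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    y = np.array([[0.], [1.], [1.], [0.]])

    rng = np.random.default_rng(0)
    W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)   # 2-4-1 network
    W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)
    lr = 0.5

    for _ in range(5000):
        # forward pass
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        err = y - out
        # backward pass: magnified derivative in place of s*(1-s)
        delta2 = err * magnified_deriv(out)
        delta1 = (delta2 @ W2.T) * magnified_deriv(h)
        W2 += lr * h.T @ delta2; b2 += lr * delta2.sum(axis=0)
        W1 += lr * X.T @ delta1; b1 += lr * delta1.sum(axis=0)

    print(np.round(out.ravel(), 3))  # should approach [0, 1, 1, 0]

Setting c=0 recovers standard backpropagation; a small positive c trades exact gradient descent for a stronger error signal in saturated regions, which is the mechanism the paper credits for faster convergence and escape from local minima.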
Keywords :
backpropagation; convergence; transfer functions; activation function; backward propagated error signal; convergence rate; generalized backpropagation algorithm; Acceleration; Convergence; Error correction; Multi-layer neural network; Neural networks; Neurons; Signal processing; Supervised learning;
fLanguage :
English
Publisher :
ieee
Conference_Title :
IEEE International Conference on Neural Networks, 1996
Conference_Location :
Washington, DC
Print_ISBN :
0-7803-3210-5
Type :
conf
DOI :
10.1109/ICNN.1996.548927
Filename :
548927