DocumentCode :
2775416
Title :
Convergence Analysis of Generalized Back-propagation Algorithm with Modified Gradient Function
Author :
Ng, S.C. ; Leung, S.H. ; Luk, Andrew ; Wu, Yunfeng
Author_Institution :
Open Univ. of Hong Kong, Hong Kong
fYear :
2006
fDate :
0-0 0
Firstpage :
3672
Lastpage :
3678
Abstract :
In this paper, we further investigate the convergence properties of the generalized back-propagation algorithm with magnified gradient function (MGFPROP). The idea of MGFPROP is to increase the convergence rate by magnifying the gradient of the activation function. The algorithm effectively speeds up convergence and reduces the chance of being trapped in premature saturation. The convergence analysis shows that MGFPROP retains the gradient-descent property while converging faster and exhibiting better global search capability than the standard back-propagation algorithm.
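The core idea described in the abstract can be illustrated with a minimal sketch. The following is an assumption-laden illustration, not the authors' implementation: it assumes a sigmoid activation and models the magnification as raising the activation derivative to a power 1/S (S >= 1), following the MGFPROP idea of enlarging the gradient term so that updates do not vanish near saturation; the function and parameter names are hypothetical.

```python
import math

def sigmoid(x):
    """Standard logistic activation."""
    return 1.0 / (1.0 + math.exp(-x))

def output_delta(y, target, S=1.0):
    # Standard backprop output delta: (target - y) * y * (1 - y).
    # Illustrative MGFPROP-style magnification (assumption): raise the
    # activation derivative y*(1-y) to the power 1/S, with S >= 1.
    # S = 1 recovers plain back-propagation.
    grad = y * (1.0 - y)
    return (target - y) * grad ** (1.0 / S)

# Near saturation (y close to 0 or 1) the derivative y*(1-y) is tiny,
# which stalls standard backprop; magnification enlarges it, so the
# weight update can escape the flat region faster.
y = sigmoid(4.0)                      # output near saturation
d_std = output_delta(y, 1.0, S=1.0)   # standard delta
d_mgf = output_delta(y, 1.0, S=2.0)   # magnified delta
assert abs(d_mgf) > abs(d_std)
```

With S = 2 the delta near saturation is roughly an order of magnitude larger than the standard one, which is the mechanism the abstract credits for faster convergence and fewer premature-saturation traps.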
Keywords :
backpropagation; convergence; activation function; convergence analysis; generalized back-propagation algorithm; gradient descent property; modified gradient function; Acceleration; Algorithm design and analysis; Australia; Backpropagation algorithms; Convergence; Equations; Error correction; Investments; Neurons; Standards development;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks, 2006. IJCNN '06. International Joint Conference on
Conference_Location :
Vancouver, BC
Print_ISBN :
0-7803-9490-9
Type :
conf
DOI :
10.1109/IJCNN.2006.247381
Filename :
1716603