Title :
Convergence Analysis of Generalized Back-propagation Algorithm with Modified Gradient Function
Author :
Ng, S.C. ; Leung, S.H. ; Luk, Andrew ; Wu, Yunfeng
Author_Institution :
Open Univ. of Hong Kong, Hong Kong
Abstract :
In this paper, we further investigate the convergence properties of the generalized back-propagation algorithm with magnified gradient function (MGFPROP). The idea of MGFPROP is to increase the convergence rate by magnifying the gradient of the activation function. The algorithm can effectively speed up convergence and reduce the chance of being trapped in premature saturation. The convergence analysis shows that MGFPROP retains the gradient descent property, converges faster, and has better global searching capability than the standard back-propagation algorithm.
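Illustration :
The abstract's core idea, magnifying the near-vanishing gradient of a saturated activation, can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the specific magnification rule (raising the sigmoid derivative to a power 1/S) and the value S = 2 are assumptions chosen to show the effect.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_prime(x):
    # Standard derivative of the sigmoid activation.
    s = sigmoid(x)
    return s * (1.0 - s)

def magnified_gradient(x, S=2.0):
    """Illustrative magnified gradient: raise the sigmoid derivative
    to the power 1/S (S >= 1, an assumed form for this sketch).
    Near saturation the plain derivative is close to zero, so the
    fractional power magnifies it and keeps weight updates alive."""
    return sigmoid_prime(x) ** (1.0 / S)

# At a near-saturated pre-activation the plain derivative nearly
# vanishes, while the magnified one remains usable:
x = 6.0
print(sigmoid_prime(x))       # ~0.0025
print(magnified_gradient(x))  # ~0.0497 (square root of the above for S=2)
```

In a back-propagation update, the magnified derivative would replace the plain one in the error term, enlarging updates for saturated neurons while leaving unsaturated neurons (where the derivative is already large) nearly unchanged.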
Keywords :
backpropagation; convergence; activation function; convergence analysis; generalized back-propagation algorithm; gradient descent property; modified gradient function; acceleration; algorithm design and analysis; backpropagation algorithms; equations; error correction; neurons;
Conference_Title :
2006 International Joint Conference on Neural Networks (IJCNN '06)
Conference_Location :
Vancouver, BC, Canada
Print_ISBN :
0-7803-9490-9
DOI :
10.1109/IJCNN.2006.247381