DocumentCode :
1194774
Title :
On Adaptive Learning Rate That Guarantees Convergence in Feedforward Networks
Author :
Behera, L. ; Kumar, S. ; Patnaik, A.
Author_Institution :
Dept. of Electr. Eng., Indian Inst. of Technol., Kanpur
Volume :
17
Issue :
5
fYear :
2006
Firstpage :
1116
Lastpage :
1125
Abstract :
This paper investigates new learning algorithms (LF I and LF II), based on Lyapunov functions, for the training of feedforward neural networks. These algorithms have an interesting parallel with the popular backpropagation (BP) algorithm, in which the fixed learning rate is replaced by an adaptive learning rate computed using a convergence theorem based on Lyapunov stability theory. LF II, a modified version of LF I, is introduced with the aim of avoiding local minima; this modification also improves convergence speed in some cases. Conditions for achieving the global minimum with this class of algorithms are studied in detail. The performance of the proposed algorithms is compared with the BP algorithm and extended Kalman filtering (EKF) on three benchmark function approximation problems: XOR, 3-bit parity, and the 8-3 encoder. The comparisons are made in terms of the number of learning iterations and the computational time required for convergence. The proposed algorithms (LF I and II) are found to converge much faster than the other two algorithms for the same accuracy. Finally, a comparison is made on a complex two-dimensional (2-D) Gabor function, and the effect of the adaptive learning rate on convergence speed is verified. In a nutshell, the investigations made in this paper help us better understand the learning procedure of feedforward neural networks in terms of adaptive learning rate, convergence speed, and local minima.
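To illustrate the idea of replacing BP's fixed learning rate with one derived from a Lyapunov candidate V = ½‖e‖², the sketch below trains a small network on the XOR benchmark mentioned in the abstract. The specific rate formula used here (a normalized step η = ‖e‖²/(‖∇V‖² + ε), capped for stability) is an illustrative assumption, not the paper's exact LF I/LF II derivation.

```python
import numpy as np

# XOR benchmark: 2 inputs, 4 hidden units, 1 output.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for step in range(3000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    e = y - out                       # error; V = 0.5*||e||^2 is the Lyapunov candidate

    # Backprop: gradients of V with respect to the weights.
    d2 = -e * out * (1 - out)
    gW2 = h.T @ d2;  gb2 = d2.sum(0)
    d1 = (d2 @ W2.T) * h * (1 - h)
    gW1 = X.T @ d1;  gb1 = d1.sum(0)

    gnorm2 = (gW1**2).sum() + (gb1**2).sum() + (gW2**2).sum() + (gb2**2).sum()
    # Adaptive rate (assumed form): step size grows when the error is large
    # relative to the gradient, so progress per step tracks the Lyapunov value.
    eta = min((e**2).sum() / (gnorm2 + 1e-8), 0.5)  # cap keeps the step well-behaved

    W1 -= eta * gW1;  b1 -= eta * gb1
    W2 -= eta * gW2;  b2 -= eta * gb2
    losses.append(0.5 * (e**2).sum())
```

A fixed-rate BP run would use a constant `eta` in the same loop; here the rate adapts every iteration, which is the mechanism the abstract credits for the faster convergence of LF I and LF II.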
Keywords :
Kalman filters; Lyapunov methods; adaptive systems; backpropagation; convergence; feedforward neural nets; function approximation; nonlinear filters; stability; Gabor function; Lyapunov stability theory; adaptive learning rate; backpropagation algorithm; convergence; extended Kalman filtering; feedforward neural networks; function approximation; Approximation algorithms; Backpropagation algorithms; Concurrent computing; Convergence; Feedforward neural networks; Filtering algorithms; Function approximation; Kalman filters; Lyapunov method; Neural networks; Adaptive learning rate; Lyapunov function; Lyapunov stability theory; backpropagation (BP); extended Kalman filtering (EKF); feedforward networks; system-identification; Algorithms; Artificial Intelligence; Cluster Analysis; Computing Methodologies; Feedback; Models, Statistical; Neural Networks (Computer); Pattern Recognition, Automated; Systems Theory;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/TNN.2006.878121
Filename :
1687923