DocumentCode :
814351
Title :
Two highly efficient second-order algorithms for training feedforward networks
Author :
Ampazis, Nikolaos ; Perantonis, Stavros J.
Author_Institution :
Institute of Informatics and Telecommunications, National Center for Scientific Research "DEMOKRITOS", Athens, Greece
Volume :
13
Issue :
5
fYear :
2002
fDate :
1 September 2002
Firstpage :
1064
Lastpage :
1074
Abstract :
We present two highly efficient second-order algorithms for the training of multilayer feedforward neural networks. The algorithms are based on iterations of the form employed in the Levenberg-Marquardt (LM) method for nonlinear least squares problems, with the inclusion of an additional adaptive momentum term arising from the formulation of the training task as a constrained optimization problem. Their implementation requires minimal additional computation compared to a standard LM iteration. Simulations on large-scale classical neural-network benchmarks are presented which reveal the power of the two methods to obtain solutions in difficult problems where other standard second-order techniques (including LM) fail to converge.
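The sketch below illustrates the general form of such an update as a minimal assumption: a standard damped Gauss-Newton (LM) direction augmented with a momentum contribution from the previous weight change. The function name, the fixed momentum coefficient lam, and the damping parameter mu are illustrative placeholders; the paper derives its momentum and step-size coefficients adaptively from the constrained-optimization formulation, which is not reproduced here.
```python
import numpy as np

def lm_step_with_momentum(jacobian, residuals, prev_step, mu=1e-2, lam=0.9):
    """One LM-style weight update with an extra momentum term (sketch).

    jacobian  : (m, n) Jacobian of the residuals w.r.t. the n weights
    residuals : (m,) residual vector (network outputs minus targets)
    prev_step : (n,) previous weight update
    mu, lam   : damping and momentum coefficients (fixed here; the paper
                chooses the momentum adaptively, which this sketch omits)
    """
    J, r = jacobian, residuals
    n = J.shape[1]
    # Damped Gauss-Newton (LM) direction: (J^T J + mu I) dw = -J^T r
    A = J.T @ J + mu * np.eye(n)
    g = J.T @ r
    lm_dir = np.linalg.solve(A, -g)
    # Add a momentum contribution from the previous step.
    return lm_dir + lam * prev_step
```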
Keywords :
convergence; feedforward neural nets; learning (artificial intelligence); multilayer perceptrons; optimisation; Levenberg-Marquardt method; adaptive momentum; constrained optimization problem; convergence properties; large scale classical neural-network benchmarks; multilayer feedforward neural networks; nonlinear least squares problems; second-order algorithms; training; Artificial neural networks; Backpropagation algorithms; Cost function; Feedforward neural networks; Jacobian matrices; Large-scale systems; Least squares methods; Multi-layer neural network; Neural networks; Optimization methods;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/TNN.2002.1031939
Filename :
1031939