DocumentCode :
1166302
Title :
A new class of quasi-Newtonian methods for optimal learning in MLP-networks
Author :
Bortoletti, Alessandro ; Di Fiore, Carmine ; Fanelli, Stefano ; Zellini, Paolo
Author_Institution :
Dipt. di Matematica, Univ. di Roma "Tor Vergata", Rome, Italy
Volume :
14
Issue :
2
fYear :
2003
fDate :
3/1/2003 12:00:00 AM
Firstpage :
263
Lastpage :
273
Abstract :
In this paper, we present a new class of quasi-Newton methods for effective learning in large multilayer perceptron (MLP) networks. The algorithms introduced in this work, named LQN, utilize an iterative scheme of a generalized BFGS-type method involving a suitable family of matrix algebras L. The main advantage of these methods is that they have an O(n log n) complexity per step and require only O(n) memory allocations. Numerical experiments, performed on a set of standard MLP-network benchmarks, show the competitiveness of the LQN methods, especially for large values of n.
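The O(n log n) cost per step comes from replacing the dense BFGS Hessian approximation with a matrix drawn from an algebra L diagonalized by a fast transform. Below is a minimal sketch of one such quasi-Newton step, assuming a circulant algebra diagonalized by the unitary FFT; the function name and the eigenvalue-based representation are illustrative, not taken from the paper:

```python
import numpy as np

def circulant_qn_step(grad, eigvals):
    """Solve B d = -grad, where B is the circulant Hessian approximation
    B = F* diag(eigvals) F, with F the (unitary) DFT matrix.

    Because B is diagonalized by the FFT, the solve costs O(n log n)
    time and only the n eigenvalues need to be stored: O(n) memory.
    Illustrative sketch only; the paper's LQN update of B within the
    algebra L is not reproduced here.
    """
    # Transform the gradient, divide by the eigenvalues, transform back.
    d = np.fft.ifft(-np.fft.fft(grad) / eigvals).real
    return d
```

For example, with all eigenvalues equal to one the step reduces to plain gradient descent, `d = -grad`; a structured update of `eigvals` from gradient differences is what distinguishes an LQN-type scheme from this fixed-preconditioner sketch.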
Keywords :
computational complexity; convergence; learning (artificial intelligence); matrix algebra; minimization; multilayer perceptrons; fast discrete transforms; neural networks; optimal learning; quasi-Newton methods; Algebra; Computational complexity; Convergence; Discrete transforms; Equations; Iterative algorithms; Iterative methods; Matrices; Multilayer perceptrons; Neural networks
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/TNN.2003.809425
Filename :
1189625