Title :
Extended least squares based algorithm for training feedforward networks
Author :
Yam, Jim Y F ; Chow, Tommy W S
Author_Institution :
City Univ. of Hong Kong, Kowloon, Hong Kong
fDate :
5/1/1997
Abstract :
An extended least squares-based algorithm for training feedforward networks is proposed. The weights connecting the last hidden layer and the output layer are first evaluated by a least squares algorithm; the weights between the input and hidden layers are then evaluated using modified gradient descent algorithms. This arrangement eliminates the stalling problem experienced by pure least squares type algorithms while still maintaining their characteristic fast convergence. In the investigated problems, the total number of floating point operations required for the networks to converge using the proposed training algorithm is only 0.221%-16.0% of that using the Levenberg-Marquardt algorithm. The number of floating point operations per iteration of the proposed algorithm is only 1.517-3.521 times that of the standard backpropagation algorithm.
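As a minimal illustrative sketch (an assumption, not the paper's exact formulation), the hybrid arrangement described above can be expressed in Python for a single-hidden-layer network: the hidden-to-output weights are solved by linear least squares at each epoch, and the input-to-hidden weights are refined by gradient descent with the output weights held fixed. The toy data, network sizes, and learning rate below are all hypothetical.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy regression data: N samples, d inputs, one output (assumed setup).
N, d, h = 200, 3, 10
X = rng.normal(size=(N, d))
T = np.sin(X.sum(axis=1, keepdims=True))   # targets

W1 = rng.normal(scale=0.5, size=(d, h))    # input-to-hidden weights
lr = 0.05                                  # assumed learning rate

for epoch in range(100):
    H = sigmoid(X @ W1)                    # hidden activations
    # Step 1: hidden-to-output weights by linear least squares.
    W2, *_ = np.linalg.lstsq(H, T, rcond=None)
    Y = H @ W2                             # network output
    E = Y - T                              # output error
    # Step 2: input-to-hidden weights by gradient descent on the
    # squared error, backpropagating through the sigmoid with W2 fixed.
    dH = (E @ W2.T) * H * (1.0 - H)
    W1 -= lr * (X.T @ dH) / N

print("final MSE:", float(np.mean(E ** 2)))

Solving the output layer exactly at each step is what gives least squares type methods their fast convergence, while the gradient step on the hidden layer avoids the stalling that can occur when every layer is fitted by least squares alone.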
Keywords :
computational complexity; feedforward neural nets; learning (artificial intelligence); least squares approximations; extended least squares-based algorithm; feedforward network training; modified gradient descent algorithms; neural net; Circuits; Constraint optimization; Differential equations; Least squares methods; Linear programming; Neural networks; Quadratic programming; Stability
Journal_Title :
IEEE Transactions on Neural Networks