Title :
A simple "linearized" learning algorithm which outperforms back-propagation
Author :
Deller, J.R., Jr. ; Hunt, S.D.
Author_Institution :
Dept. of Electr. Eng., Michigan State Univ., East Lansing, MI, USA
Abstract :
A class of algorithms is presented for training multilayer perceptrons using purely linear techniques. The methods are based on linearization of the network using error-surface analysis, followed by a contemporary least-squares estimation procedure. Specific algorithms are presented for estimating weights node-wise and layer-wise, and for estimating the entire set of network weights simultaneously. In several experimental studies, the node-wise method was superior to back-propagation and to an alternative linearization method due to M. Azimi-Sadjadi et al. (1990) in terms of the number of convergences and the convergence rate. Layer-wise and network-wise updating offer further improvement.
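To make the abstract's central idea concrete, the following is a minimal sketch of node-wise linearized least-squares estimation for one sigmoidal layer: the desired node outputs are mapped back through the nonlinearity, which turns each node's weight-fitting problem into an ordinary linear least-squares problem. This is an illustrative assumption about the general technique, not the authors' exact algorithm; all function names and the logistic activation choice are assumptions for the sketch.

# Node-wise linearized least-squares training of one MLP layer.
# Illustrative sketch only; not the paper's exact formulation.
import numpy as np

def inverse_sigmoid(y, eps=1e-6):
    # Map desired node outputs back through the logistic nonlinearity,
    # converting the nonlinear fit into a linear one (an assumed step).
    y = np.clip(y, eps, 1.0 - eps)
    return np.log(y / (1.0 - y))

def nodewise_least_squares(X, D):
    # X : (n_samples, n_inputs)  layer inputs, bias column included
    # D : (n_samples, n_nodes)   desired node outputs in (0, 1)
    # Returns W : (n_inputs, n_nodes), one LS solve per node.
    Z = inverse_sigmoid(D)                  # linearized targets (pre-activations)
    W = np.zeros((X.shape[1], Z.shape[1]))
    for j in range(Z.shape[1]):             # purely linear estimation, node-wise
        W[:, j], *_ = np.linalg.lstsq(X, Z[:, j], rcond=None)
    return W

# Usage: recover the weights of a single sigmoidal layer from clean data.
rng = np.random.default_rng(0)
X = np.hstack([rng.normal(size=(200, 3)), np.ones((200, 1))])  # bias column
W_true = rng.normal(size=(4, 2))
D = 1.0 / (1.0 + np.exp(-X @ W_true))
W_est = nodewise_least_squares(X, D)
print(np.round(np.abs(W_est - W_true).max(), 4))  # near zero on noiseless data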
Keywords :
feedforward neural nets; learning (artificial intelligence); least squares approximations; linearisation techniques; convergence rate; error surface analysis; layer-wise estimation; layer-wise updating; least-squares estimation procedure; linearized learning algorithm; multilayer perceptron training; network-wise updating; neural nets; node-wise estimation; weight estimation; Control systems; Error analysis; Fuzzy control; Joining processes; Least squares approximation; Multilayer perceptrons; Neural networks; Resonance light scattering; Signal processing algorithms; Speech processing;
Conference_Title :
International Joint Conference on Neural Networks (IJCNN), 1992
Conference_Location :
Baltimore, MD
Print_ISBN :
0-7803-0559-0
DOI :
10.1109/IJCNN.1992.227180