Title :
Fast learning process of multilayer neural networks using recursive least squares method
Author :
Azimi-Sadjadi, Mahmood R. ; Liou, Ren-Jean
Author_Institution :
Dept. of Electr. Eng., Colorado State Univ., Fort Collins, CO, USA
Date :
2/1/1992
Abstract :
A new approach to the learning process of multilayer perceptron neural networks, based on a recursive least squares (RLS)-type algorithm, is proposed. The method iteratively minimizes the global sum of squared errors between the actual and desired output values. The weights in the network are updated upon the arrival of each new training sample by solving a system of normal equations recursively. To determine the desired targets in the hidden layers, an analog of the back-propagation strategy used in conventional learning algorithms is developed, which permits the learning procedure to be applied to all layers. Simulation results on 4-bit parity-checker and multiplexer networks indicate a significant reduction in the total number of iterations compared with conventional and accelerated back-propagation algorithms.
Keywords :
learning systems; least squares approximations; neural nets; 4-b parity checker; RLS algorithm; back-propagation strategy; iterations; learning process; multilayer neural networks; multiplexer networks; perceptron; recursive least squares method; Convergence; Iterative algorithms; Least squares methods; Multi-layer neural network; Multilayer perceptrons; Neural networks; Resonance light scattering; Signal processing algorithms; Signal representations; Training data;
Journal_Title :
Signal Processing, IEEE Transactions on
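The abstract describes weights being updated recursively as each training sample arrives, by solving the normal equations via an RLS-type recursion. The sketch below illustrates the standard RLS update for a single linear unit only; it is not the paper's full multilayer algorithm (which also derives hidden-layer targets via a back-propagation analog). The forgetting factor `lam` and the initialization constant `delta` are conventional RLS parameters assumed here for illustration.

```python
import numpy as np

def rls_fit(X, d, lam=1.0, delta=100.0):
    """Recursive least squares for a single linear unit.

    Illustrative sketch of the per-sample RLS recursion (one linear
    neuron, not the paper's full multilayer procedure). lam is the
    forgetting factor; delta scales the initial inverse-correlation
    matrix P.
    """
    n = X.shape[1]
    w = np.zeros(n)
    P = delta * np.eye(n)              # initial inverse correlation matrix
    for x, target in zip(X, d):
        Px = P @ x
        k = Px / (lam + x @ Px)        # gain vector
        e = target - w @ x             # a priori output error
        w = w + k * e                  # weight update on sample arrival
        P = (P - np.outer(k, Px)) / lam  # recursive inverse update
    return w

# Usage: recover the weights of a noiseless linear mapping.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([1.5, -2.0, 0.5])
d = X @ true_w
w = rls_fit(X, d)
```

With a forgetting factor of 1 this recursion converges to the batch least-squares solution of the normal equations, which is the sense in which each new sample refines the weight estimate without re-solving the full system.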