Title :
Linearized least-squares training of multilayer feedforward neural networks
Author :
Douglas, Scott C.; Meng, Teresa H.-Y.
Author_Institution :
Inf. Syst. Lab., Stanford Univ., CA, USA
Abstract :
The authors develop a linearized least-squares formulation for estimating the weight coefficients of a neural network. Linearizing the nonlinear network about the most recent weight estimates leads to a conditional least-squares criterion that can be solved recursively in time. The resulting coefficient update equations resemble those of the recursive least-squares (RLS) solution in adaptive filtering, much as the update equations of linearized stochastic gradient descent (backpropagation) resemble those of the least-mean-squares (LMS) solution in adaptive filtering. Simulations on small logic mapping problems indicate a three- to tenfold increase in training efficiency for this technique compared with gradient descent.
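The abstract gives the recipe but not the paper's exact update equations, so the following is a minimal sketch under assumed details: the network output is linearized about the current weight vector (its gradient with respect to the weights plays the role of the RLS regressor), and the standard RLS recursion is then applied to the linearized error. The network size, hyperparameters (forgetting factor, initial inverse-correlation matrix), and the XOR logic mapping task are illustrative assumptions, not details taken from the paper.

# Hypothetical sketch: linearized RLS training of a one-hidden-layer
# tanh network on XOR, following the generic recipe in the abstract.
# Not the authors' exact formulation.
import numpy as np

rng = np.random.default_rng(0)

# XOR logic mapping; targets are +/-1 to match the tanh output range.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
D = np.array([-1.0, 1.0, 1.0, -1.0])

n_in, n_hid = 2, 3
# All weights flattened into one vector w: hidden layer (n_hid x (n_in+1))
# followed by the output layer (n_hid+1), biases included.
n_w = n_hid * (n_in + 1) + (n_hid + 1)
w = 0.5 * rng.standard_normal(n_w)

def unpack(w):
    W1 = w[: n_hid * (n_in + 1)].reshape(n_hid, n_in + 1)
    W2 = w[n_hid * (n_in + 1):]
    return W1, W2

def forward_and_gradient(w, x):
    """Network output y(w, x) and its gradient dy/dw (the linearization)."""
    W1, W2 = unpack(w)
    xb = np.append(x, 1.0)                  # input with bias
    h = np.tanh(W1 @ xb)                    # hidden activations
    hb = np.append(h, 1.0)                  # hidden with bias
    y = np.tanh(W2 @ hb)
    dy = 1.0 - y**2                         # tanh derivative at the output
    g2 = dy * hb                            # gradient w.r.t. output weights
    dh = (1.0 - h**2) * (dy * W2[:n_hid])   # chain rule back through hidden layer
    g1 = np.outer(dh, xb).ravel()           # gradient w.r.t. hidden weights
    return y, np.concatenate([g1, g2])

# Standard RLS recursion applied to the linearized model (assumed values).
lam = 0.99                # forgetting factor
P = 100.0 * np.eye(n_w)   # inverse-correlation matrix estimate

for epoch in range(200):
    for x, d in zip(X, D):
        y, g = forward_and_gradient(w, x)
        Pg = P @ g
        k = Pg / (lam + g @ Pg)             # RLS gain vector
        P = (P - np.outer(k, Pg)) / lam     # inverse-correlation update
        w = w + k * (d - y)                 # step along the linearized error

for x, d in zip(X, D):
    y, _ = forward_and_gradient(w, x)
    print(f"input {x} -> {y:+.3f} (target {d:+.0f})")

Compared with backpropagation's scalar learning rate, the gain vector k here rescales each weight's correction by the (estimated) inverse correlation of the linearized regressors, which is the mechanism behind the faster convergence the abstract reports.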
Keywords :
least squares approximations; neural nets; adaptive filtering; backpropagation; coefficient update equations; conditional least-squares criterion; least mean squares; linearized least-squares; linearized stochastic gradient descent; logic mapping; multilayer feedforward neural networks; nonlinear network; recursive least-squares; training efficiency; weight coefficients; weight estimates; Adaptive filters; Feedforward neural networks; Information systems; Multi-layer neural network; Neural networks; Nonlinear equations; Parameter estimation; Recursive estimation; State estimation; Vectors;
Conference_Title :
IJCNN-91-Seattle: International Joint Conference on Neural Networks, 1991
Conference_Location :
Seattle, WA, USA
Print_ISBN :
0-7803-0164-1
DOI :
10.1109/IJCNN.1991.155195