Title :
Decoupled recursive estimation training and trainable degree of feedforward neural networks
Author :
Jin, L. ; Nikiforuk, P.N. ; Gupta, M.M.
Author_Institution :
College of Engineering, University of Saskatchewan, Saskatoon, Saskatchewan, Canada
Abstract :
A recursive estimation training algorithm for multilayered neural networks (MNNs) with feedforward connections is proposed. Sequential neuron-decoupled extended Kalman filter (SNDEKF) formulations are derived from the extended Kalman filter (EKF) algorithm. In addition to fast convergence, SNDEKF training requires significantly less computation and storage than the full EKF, and, like the conventional backpropagation algorithm, it can naturally be parallelized across the weight vector of each neuron of the network. The effectiveness of the training approach is demonstrated on a nonlinear function approximation example.
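The following is a minimal sketch of the general neuron-decoupled EKF idea the abstract describes (each neuron keeps its own weight vector and covariance block, updated sequentially from the output error), applied to a one-hidden-layer sigmoid network on a toy nonlinear function approximation task. The network size, the measurement-noise variance R, the update order, and all variable names are illustrative assumptions; this is not the paper's exact SNDEKF derivation.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_in, n_hid = 1, 6
# Per-neuron weight vectors (with bias) and per-neuron covariance blocks.
W_hid = [0.5 * rng.standard_normal(n_in + 1) for _ in range(n_hid)]   # hidden neurons
w_out = 0.5 * rng.standard_normal(n_hid + 1)                          # linear output neuron
P_hid = [100.0 * np.eye(n_in + 1) for _ in range(n_hid)]
P_out = 100.0 * np.eye(n_hid + 1)
R = 0.1    # assumed measurement-noise variance (tuning parameter)

def forward(x):
    xb = np.append(x, 1.0)                      # input with bias term
    h = sigmoid(np.array([w @ xb for w in W_hid]))
    hb = np.append(h, 1.0)                      # hidden outputs with bias term
    y = w_out @ hb
    return xb, h, hb, y

def train_step(x, d):
    """One sample: update each neuron's weights with its own local EKF block."""
    global w_out
    xb, h, hb, y = forward(x)
    e = d - y                                   # scalar output error

    # Output neuron: H = dy/dw_out = hb.
    H = hb
    a = R + H @ P_out @ H                       # scalar innovation variance
    K = (P_out @ H) / a                         # Kalman gain for this block
    w_out += K * e
    P_out[:] = P_out - np.outer(K, H @ P_out)

    # Hidden neurons: H_i = dy/dW_hid[i] = w_out[i] * h_i * (1 - h_i) * xb.
    for i in range(n_hid):
        Hi = w_out[i] * h[i] * (1.0 - h[i]) * xb
        ai = R + Hi @ P_hid[i] @ Hi
        Ki = (P_hid[i] @ Hi) / ai
        W_hid[i] += Ki * e
        P_hid[i] -= np.outer(Ki, Hi @ P_hid[i])

# Toy nonlinear function approximation, echoing the paper's test setting.
for epoch in range(50):
    for x in rng.uniform(-1.0, 1.0, size=100):
        train_step(x, np.sin(np.pi * x))

errs = [np.sin(np.pi * x) - forward(x)[3] for x in np.linspace(-1, 1, 50)]
print("RMS error:", np.sqrt(np.mean(np.square(errs))))

Because each covariance block is only as large as one neuron's weight vector, the per-sample cost and storage grow with the largest fan-in rather than with the total number of weights, which is the computational saving over the full EKF noted in the abstract.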
Keywords :
Kalman filters; approximation theory; feedforward neural nets; learning (artificial intelligence); backpropagation algorithm; decoupled recursive estimation training; feedforward connections; feedforward neural networks; nonlinear function approximation; parallel structure; sequential neuron-decoupled extended Kalman filter equations; storage requirements; trainable degree; Backpropagation algorithms; Computer networks; Concurrent computing; Convergence; Equations; Feedforward neural networks; Multi-layer neural network; Neural networks; Neurons; Recursive estimation
Conference_Title :
International Joint Conference on Neural Networks (IJCNN), 1992
Conference_Location :
Baltimore, MD
Print_ISBN :
0-7803-0559-0
DOI :
10.1109/IJCNN.1992.287073