Title :
A hybrid linear/nonlinear training algorithm for feedforward neural networks
Author :
McLoone, Sean ; Brown, Michael D. ; Irwin, George ; Lightbody, Gordon
Author_Institution :
Dept. of Electr. & Electron. Eng., Queen's Univ., Belfast, UK
Date :
7/1/1998
Abstract :
This paper presents a new hybrid optimization strategy for training feedforward neural networks. The algorithm combines gradient-based optimization of the nonlinear weights with singular value decomposition (SVD) computation of the linear weights in one integrated routine. It is described for the multilayer perceptron (MLP) and radial basis function (RBF) networks and then extended to the local model network (LMN), a new feedforward structure in which a global nonlinear model is constructed from a set of locally valid submodels. Simulation results are presented demonstrating the superiority of the new hybrid training scheme over second-order gradient methods. It is particularly effective for the LMN architecture, where the linear-to-nonlinear parameter ratio is large.
Keywords :
feedforward neural nets; learning (artificial intelligence); multilayer perceptrons; optimisation; singular value decomposition; LMN; MLP; RBF networks; SVD; feedforward neural networks; feedforward structure; gradient-based optimization; hybrid linear/nonlinear training algorithm; hybrid optimization strategy; linear weights; local model network; multilayer perceptron; nonlinear weights; radial basis function networks; singular value decomposition; Backpropagation algorithms; Computational modeling; Cost function; Feedforward neural networks; Gradient methods; Multilayer perceptrons; Neural networks; Singular value decomposition; Vectors;
Journal_Title :
Neural Networks, IEEE Transactions on