DocumentCode :
1587682
Title :
Output weight optimization for the multi-layer perceptron
Author :
Manry, M.T. ; Guan, Xiujun ; Apollo, S.J. ; Allen, L.S. ; Lyle, W.D. ; Gong, W.
Author_Institution :
Dept. of Electr. Eng., Texas Univ., Arlington, TX, USA
fYear :
1992
Firstpage :
502
Abstract :
A fast method for designing multilayer perceptron (MLP) neural networks was introduced by S.A. Barton (1991). In this method, linear equations are solved for the output weights. An analysis of the MLP based on polynomial basis functions (PBFs) is used to justify the technique. A conjugate gradient solution to the output weight equations is introduced, and a mutation technique that can be used to improve hidden unit weights is described. The output weight optimization (OWO) technique is extended to classification networks, which have nonlinear output unit activations. In several examples, OWO is shown to be significantly faster than backpropagation (BP).
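The abstract describes solving linear equations for the output weights of an MLP whose hidden-unit weights are held fixed, with a conjugate gradient solution to those equations. The sketch below illustrates that general idea under stated assumptions, not the paper's exact algorithm: a single hidden layer with tanh activations, output weights obtained from the normal equations of a linear least-squares fit, solved column by column with a plain conjugate-gradient iteration. All names (owo_output_weights, W_hidden, etc.) are illustrative.

```python
# Minimal sketch of output weight optimization (OWO) for a single-hidden-layer MLP
# with fixed hidden weights. Assumed/illustrative code, not the authors' implementation.
import numpy as np

def owo_output_weights(X, T, W_hidden, n_iter=200, tol=1e-12):
    """Solve for the output weights by linear least squares on the hidden
    activations, using conjugate gradient on the normal equations."""
    # Hidden-layer activations, with a bias column appended.
    H = np.tanh(X @ W_hidden)                       # (N, n_hidden)
    H = np.hstack([H, np.ones((H.shape[0], 1))])    # (N, n_hidden + 1)

    # Normal equations: (H^T H) W_out = H^T T
    A = H.T @ H
    B = H.T @ T

    # Conjugate gradient, run independently for each output column.
    W_out = np.zeros((A.shape[0], T.shape[1]))
    for k in range(T.shape[1]):
        w = np.zeros(A.shape[0])
        r = B[:, k] - A @ w
        p = r.copy()
        rs = r @ r
        for _ in range(n_iter):
            Ap = A @ p
            alpha = rs / (p @ Ap)
            w += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if rs_new < tol:
                break
            p = r + (rs_new / rs) * p
            rs = rs_new
        W_out[:, k] = w
    return W_out

# Example usage with random data (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))        # inputs
T = rng.normal(size=(100, 2))        # desired outputs
W_hidden = rng.normal(size=(4, 8))   # fixed hidden-unit weights
W_out = owo_output_weights(X, T, W_hidden)
```

Because only the output weights are trained here, the problem is linear in the unknowns, which is what makes a direct linear-equation (or conjugate gradient) solution possible and typically much faster than iterating backpropagation over all weights.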
Keywords :
feedforward neural nets; learning (artificial intelligence); signal processing; conjugate gradient solution; hidden unit weights; multi-layer perceptron; mutation technique; neural networks; nonlinear output unit activations; output weight optimisation; polynomial basis functions; Biomedical imaging; Design methodology; Equations; Genetic mutations; Multi-layer neural network; Multilayer perceptrons; Neural networks; Pattern recognition; Polynomials; Research and development;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Conference Record of The Twenty-Sixth Asilomar Conference on Signals, Systems and Computers, 1992
Conference_Location :
Pacific Grove, CA
ISSN :
1058-6393
Print_ISBN :
0-8186-3160-0
Type :
conf
DOI :
10.1109/ACSSC.1992.269221
Filename :
269221