DocumentCode :
2926714
Title :
A learning rule eliminating local minima in multilayer perceptrons
Author :
Burrascano, P. ; Lucci, P.
Author_Institution :
Info-Com Dept., Roma Univ., Italy
fYear :
1990
fDate :
3-6 Apr 1990
Firstpage :
865
Abstract :
Convergence problems in the case of the generalized delta rule are discussed. On the basis of the analysis performed, a modification to the nonlinearity of the processing elements is proposed; this modification is shown to smooth the cost function minimized during the learning phase. A variation of the generalized delta rule learning procedure, required by the introduced modification, is discussed. Extensive tests have been performed on several examples proposed in the technical literature. The tests show the effectiveness of the proposed procedure in improving the convergence properties of the backpropagation algorithm. In particular, it has been verified that the proposed modification virtually eliminates nonconvergence problems when a moderate η value is used.
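For reference, the sketch below illustrates the standard generalized delta rule (backpropagation with a quadratic cost and a logistic sigmoid nonlinearity) that the abstract refers to; the paper's specific modification of the processing-element nonlinearity is not described in this record, so the standard sigmoid, the XOR task, and the moderate learning rate η = 0.5 are assumptions made purely for illustration.

```python
# Minimal sketch of the generalized delta rule for a single-hidden-layer
# perceptron. Uses the standard logistic sigmoid, NOT the modified
# nonlinearity proposed in the paper (which this record does not detail).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR task: a classic example where backpropagation can stall.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 2))   # input -> hidden weights
b1 = np.zeros(2)
W2 = rng.normal(scale=0.5, size=(2, 1))   # hidden -> output weights
b2 = np.zeros(1)

eta = 0.5  # moderate learning rate (assumed value, for illustration only)

for epoch in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)

    # Backward pass: delta terms for the quadratic cost 0.5 * sum (t - y)^2.
    delta_out = (y - T) * y * (1 - y)             # output-layer deltas
    delta_hid = (delta_out @ W2.T) * h * (1 - h)  # hidden-layer deltas

    # Generalized delta rule updates: delta_w = -eta * delta * activation.
    W2 -= eta * h.T @ delta_out
    b2 -= eta * delta_out.sum(axis=0)
    W1 -= eta * X.T @ delta_hid
    b1 -= eta * delta_hid.sum(axis=0)

# Outputs should approach [0, 1, 1, 0] when training converges.
print(np.round(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2), 3))
```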
Keywords :
artificial intelligence; convergence; learning systems; neural nets; backpropagation algorithm; convergence properties; cost function smoothing; generalized delta rule; learning rule; local minima elimination; multilayer perceptrons; Artificial neural networks; Backpropagation algorithms; Convergence; Cost function; Image processing; Linearity; Multi-layer neural network; Multilayer perceptrons; Nonhomogeneous media; Performance analysis; Performance evaluation; Supervised learning; Testing; Time of arrival estimation;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
1990 International Conference on Acoustics, Speech, and Signal Processing (ICASSP-90)
Conference_Location :
Albuquerque, NM
ISSN :
1520-6149
Type :
conf
DOI :
10.1109/ICASSP.1990.115976
Filename :
115976