DocumentCode :
1031945
Title :
Can backpropagation error surface not have local minima?
Author :
Yu, Xiao-Hu
Author_Institution :
Dept. of Radio Eng., Southeast Univ., Nanjing, China
Volume :
3
Issue :
6
fYear :
1992
fDate :
11/1/1992 12:00:00 AM
Firstpage :
1019
Lastpage :
1021
Abstract :
It is shown theoretically that for an arbitrary T-element training set with t (t ≤ T) different inputs, the backpropagation error surface does not have suboptimal local minima if the network is capable of exactly implementing an arbitrary training set consisting of t different patterns. As a special case, the error surface of a backpropagation network with one hidden layer and t-1 hidden units has no local minima if the network is trained by an arbitrary T-element set with t different inputs.
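The theorem's hypothesis — that the network can exactly implement any training set of t distinct patterns — can be illustrated numerically. The sketch below (all names, weights, and values are illustrative assumptions, not taken from the paper) builds a one-hidden-layer network with t - 1 = 2 sigmoid hidden units for t = 3 distinct inputs, and solves the linear output layer exactly so the training error is zero:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# t = 3 distinct scalar inputs with arbitrary targets (here T = t).
x = np.array([0.0, 1.0, 2.0])
y = np.array([0.3, -1.2, 2.5])
t = len(x)

# Hidden layer with t - 1 = 2 sigmoid units; weights fixed, chosen so the
# hidden-activation matrix (with a bias column appended) is nonsingular.
W = np.array([1.0, -2.0])   # input-to-hidden weights (illustrative values)
b = np.array([0.0, 1.0])    # hidden biases

H = sigmoid(np.outer(x, W) + b)       # hidden activations, shape (3, 2)
A = np.hstack([H, np.ones((t, 1))])   # append bias column -> (3, 3)

# The output layer is linear in its weights, so exact implementation of
# the t patterns reduces to solving the square system A @ v = y.
v = np.linalg.solve(A, y)
residual = np.max(np.abs(A @ v - y))
print(residual)  # effectively zero: the t patterns are fit exactly
```

Under this exact-implementation condition, the paper's result says the corresponding backpropagation error surface has no suboptimal local minima.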
Keywords :
backpropagation; neural nets; backpropagation error surface; hidden layer; hidden units; learning; local minima; training set; Backpropagation algorithms; Combinatorial mathematics; Convergence; Neural networks; Surface treatment; Vectors;
fLanguage :
English
Journal_Title :
Neural Networks, IEEE Transactions on
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.165604
Filename :
165604