DocumentCode :
2629582
Title :
On the nonexistence of local minima of the backpropagation error surfaces
Author :
Yu, Xiao-Hu
Author_Institution :
Dept. of Radio Eng., Southeast Univ., Nanjing, China
fYear :
1991
fDate :
18-21 Nov 1991
Firstpage :
1272
Abstract :
It is shown from a theoretical point of view that, if a backpropagation neural network satisfies the conditions of the Kolmogorov theorem (implying that the network can represent arbitrary continuous mappings), the error surface has no local minima with an error level higher than that of the global minima. Formulas for calculating the exact values of the global minima are also provided; these are especially useful for monitoring the training process.
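A minimal sketch of the monitoring idea suggested by the abstract, in Python: if the global-minimum error level is known in closed form, training can be stopped as soon as the current error reaches it. The toy network, data, and the value global_min_error below are illustrative assumptions, not the paper's formulas.

    # Sketch: stop gradient-descent training once the loss reaches a
    # known global-minimum error level. Everything here is a toy setup.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 2))        # toy inputs
    true_W = np.array([[1.5], [-0.7]])       # assumed ground-truth weights
    y = X @ true_W                           # linearly realizable target,
                                             # so the global-minimum MSE is 0
    W = rng.standard_normal((2, 1)) * 0.1    # trainable weights
    global_min_error = 0.0                   # hypothetical closed-form value

    for step in range(10_000):
        err = X @ W - y
        mse = float(np.mean(err ** 2))       # current error level
        if mse - global_min_error < 1e-10:   # global minimum (numerically) reached
            break
        W -= 0.05 * (2.0 / len(X)) * X.T @ err  # gradient step on the MSE

For a single linear layer the error surface is convex, so reaching the global minimum here is unsurprising; the paper's result concerns the harder multilayer case under the Kolmogorov-theorem conditions.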
Keywords :
learning systems; minimax techniques; neural nets; Kolmogorov theorem; backpropagation error surfaces; global minima; learning process; neural network; Backpropagation algorithms; Computer aided software engineering; Convergence; Differential equations; Monitoring; Neural networks
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
1991 IEEE International Joint Conference on Neural Networks (IJCNN)
Print_ISBN :
0-7803-0227-3
Type :
conf
DOI :
10.1109/IJCNN.1991.170572
Filename :
170572