Title :
On the nonexistence of local minima of the backpropagation error surfaces
Author_Institution :
Dept. of Radio Eng., Southeast Univ., Nanjing, China
Abstract :
It is shown from a theoretical point of view that, if a backpropagation neural network satisfies the Kolmogorov theorem (implying that the network can realize arbitrary continuous mappings), the error surface has no local minima with an error level higher than that of the global minima. Formulas for calculating the exact value of the global minima are also provided; these are especially useful for monitoring the training process.
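The monitoring idea can be illustrated with a minimal sketch (not the paper's formulas): train a one-hidden-layer network by backpropagation, record the mean-squared error each epoch, and stop once the error reaches a target level, as one might do if the global-minimum value were known in advance. The data, architecture, learning rate, and `target_error` threshold below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x) on a small grid.
X = np.linspace(-np.pi, np.pi, 32).reshape(-1, 1)
Y = np.sin(X)

# One hidden layer of 8 sigmoid units, linear output.
W1 = rng.normal(0, 0.5, (1, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.1
target_error = 1e-3  # hypothetical known global-minimum error level
errors = []
for epoch in range(2000):
    # Forward pass.
    H = sigmoid(X @ W1 + b1)
    out = H @ W2 + b2
    err = np.mean((out - Y) ** 2)
    errors.append(err)
    if err <= target_error:  # stop once the known optimum is reached
        break
    # Backward pass: gradients of the MSE w.r.t. each parameter.
    d_out = 2.0 * (out - Y) / len(X)
    dW2 = H.T @ d_out; db2 = d_out.sum(axis=0)
    d_H = (d_out @ W2.T) * H * (1 - H)   # sigmoid derivative
    dW1 = X.T @ d_H; db1 = d_H.sum(axis=0)
    # Gradient-descent update.
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print(f"MSE after {len(errors)} epochs: {errors[-1]:.4f}")
```

Comparing the running error against a known global-minimum level gives a principled stopping criterion: under the paper's conditions, a plateau above that level cannot be a local minimum, so training can safely continue.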
Keywords :
learning systems; minimax techniques; neural nets; Kolmogorov theorem; backpropagation error surfaces; global minima; learning process; neural network; Backpropagation algorithms; Computer aided software engineering; Convergence; Differential equations; Monitoring; Neural networks;
Conference_Title :
1991 IEEE International Joint Conference on Neural Networks
Print_ISBN :
0-7803-0227-3
DOI :
10.1109/IJCNN.1991.170572