DocumentCode :
1145664
Title :
On the local minima free condition of backpropagation learning
Author :
Yu, Xiao-Hu ; Chen, Guo-An
Author_Institution :
Dept. of Radio Eng., Southeast Univ., Nanjing, China
Volume :
6
Issue :
5
fYear :
1995
fDate :
9/1/1995 12:00:00 AM
Firstpage :
1300
Lastpage :
1303
Abstract :
It is shown that if there are P noncoincident input patterns to learn, and a two-layered feedforward neural network with P-1 sigmoidal hidden neurons and one dummy hidden neuron is used for the learning, then any suboptimal equilibrium point of the corresponding error surface is unstable in the sense of Lyapunov. This result leads to a sufficient local-minima-free condition for backpropagation learning.
Keywords :
backpropagation; feedforward neural nets; minimisation; backpropagation learning; dummy hidden neuron; error surface; local minima; local minima free condition; noncoincident input patterns; sigmoidal hidden neuron; suboptimal equilibrium point; two-layered feedforward neural network; Backpropagation algorithms; Binary sequences; Communication channels; Feedforward neural networks; Multi-layer neural network; Neural networks; Neurons; Sufficient conditions; Supervised learning; Training data;
fLanguage :
English
Journal_Title :
Neural Networks, IEEE Transactions on
Publisher :
ieee
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.410380
Filename :
410380