Title :
On the local minima free condition of backpropagation learning
Author :
Yu, Xiao-Hu ; Chen, Guo-An
Author_Institution :
Dept. of Radio Eng., Southeast Univ., Nanjing, China
Date :
9/1/1995 12:00:00 AM
Abstract :
It is shown that if there are P noncoincident input patterns to learn, and a two-layered feedforward neural network having P-1 sigmoidal hidden neurons and one dummy hidden neuron is used for the learning, then any suboptimal equilibrium point of the corresponding error surface is unstable in the sense of Lyapunov. This result leads to a sufficient condition for the backpropagation learning problem to be free of local minima.
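The construction described in the abstract can be illustrated numerically. The sketch below (an illustration, not the paper's proof) trains a two-layer network on P = 4 noncoincident patterns using P-1 = 3 sigmoidal hidden neurons plus one dummy hidden neuron with constant output 1, and checks that plain gradient descent (backpropagation) drives the training error essentially to zero, consistent with the claimed absence of attracting suboptimal minima. All data, learning rate, and iteration count are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# P noncoincident input patterns and arbitrary scalar targets
P = 4
X = rng.normal(size=(P, 2))     # P distinct input patterns
t = rng.normal(size=P)          # targets to learn

H = P - 1                       # P-1 sigmoidal hidden neurons

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# small random initial weights
W = 0.5 * rng.normal(size=(2, H))   # input -> hidden weights
b = 0.5 * rng.normal(size=H)        # hidden biases
v = 0.5 * rng.normal(size=H)        # hidden -> output weights
v0 = 0.5 * rng.normal()             # weight of the dummy hidden neuron (constant output 1)

lr = 0.3
for _ in range(30000):
    a = X @ W + b                   # hidden pre-activations, shape (P, H)
    h = sigmoid(a)                  # sigmoidal hidden activations
    y = h @ v + v0                  # linear output; v0 * 1 is the dummy neuron's contribution
    e = y - t                       # per-pattern errors
    # backpropagated gradients of E = 0.5 * sum(e**2)
    gv = h.T @ e
    gv0 = e.sum()
    gh = np.outer(e, v) * h * (1.0 - h)   # sigmoid derivative is h*(1-h)
    gW = X.T @ gh
    gb = gh.sum(axis=0)
    W -= lr * gW; b -= lr * gb; v -= lr * gv; v0 -= lr * gv0

mse = float(np.mean((sigmoid(X @ W + b) @ v + v0 - t) ** 2))
print(mse)
```

For generic noncoincident inputs, the P-1 sigmoid activation vectors together with the dummy neuron's constant vector span the P-dimensional pattern space, so the error can be driven toward zero and the observed MSE ends up very small.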
Keywords :
backpropagation; feedforward neural nets; minimisation; backpropagation learning; dummy hidden neuron; error surface; local minima; local minima free condition; noncoincident input patterns; sigmoidal hidden neuron; suboptimal equilibrium point; two-layered feedforward neural network; Backpropagation algorithms; Binary sequences; Communication channels; Feedforward neural networks; Multi-layer neural network; Neural networks; Neurons; Sufficient conditions; Supervised learning; Training data;
Journal_Title :
Neural Networks, IEEE Transactions on