Title :
Convergence of gradient method with momentum for two-layer feedforward neural networks
Author :
Zhang, Naimin ; Wu, Wei ; Zheng, Gaofeng
Author_Institution :
Math. & Inf. Sci. Coll., Wenzhou Univ., China
Date :
3/1/2006
Abstract :
A gradient method with momentum for two-layer feedforward neural networks is considered. The learning rate is set to be a constant, and the momentum factor is an adaptive variable. Both weak and strong convergence results are proved, as well as convergence rates for the error function and for the weights. Compared with existing convergence results, ours are more general, since we do not require the error function to be quadratic.
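To illustrate the kind of iteration the abstract describes, here is a minimal sketch of gradient descent with momentum on a non-quadratic error function, using a constant learning rate and an adaptive momentum factor. The one-parameter sigmoid loss stands in for the two-layer network's error function, and the specific adaptive rule for the momentum factor (shrinking it with the gradient magnitude) is an illustrative assumption, not the paper's exact scheme.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# A one-parameter stand-in for the two-layer network's error function:
# squared loss after a sigmoid output, which is NOT quadratic in w.
TARGET = 0.8

def error(w):
    return 0.5 * (sigmoid(w) - TARGET) ** 2

def grad(w):
    y = sigmoid(w)
    return (y - TARGET) * y * (1.0 - y)

eta = 0.5        # constant learning rate
mu = 0.2         # base momentum coefficient (illustrative value)
w = 0.0
step_prev = 0.0
g_prev = grad(w)
e0 = error(w)

for _ in range(2000):
    g = grad(w)
    # Hypothetical adaptive momentum factor: scale the base coefficient by
    # the current-to-previous gradient magnitude ratio, so the momentum term
    # fades as a minimizer is approached. (Not the paper's exact rule.)
    mu_k = mu * abs(g) / (abs(g_prev) + 1e-12)
    step = -eta * g + mu_k * step_prev
    w += step
    step_prev, g_prev = step, g
```

After the loop, `error(w)` is far below its starting value `e0` and the gradient is near zero, matching the qualitative behavior (convergence of the error function and of the weights) that the paper establishes rigorously.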
Keywords :
convergence; feedforward neural networks; gradient methods; method of moments; adaptive variable; error function; learning rate; momentum factor; momentum; two-layer feedforward neural networks; multi-layer neural network; neural networks; minimization methods; Defense industry; Information science; Mathematics; Algorithms; Artificial Intelligence; Computer Simulation; Models, Theoretical; Neural Networks (Computer); Numerical Analysis, Computer-Assisted; Pattern Recognition, Automated
Journal_Title :
IEEE Transactions on Neural Networks
DOI :
10.1109/TNN.2005.863460