DocumentCode :
2495256
Title :
A BPGAD neural network algorithm and its efficiency analysis
Author :
Ni, Yuanping
Author_Institution :
Sch. of Inf. & Autom., Kunming Univ. of Sci. & Technol., Kunming
fYear :
2008
fDate :
25-27 June 2008
Firstpage :
6958
Lastpage :
6962
Abstract :
To improve the learning efficiency of the back-propagation (BP) neural network, this study develops an improved BP algorithm based on gradient ascent and descent (BPGAD) and discusses its convergence conditions, namely that the second derivative of the cost function E with respect to both HMT and the weight W must be positive. Simulation tests show that BPGAD clearly outperforms standard BP in convergence speed, learning ability, and stability, and that it can be applied in engineering practice.
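The abstract compares BPGAD against standard gradient-descent back propagation. The paper's BPGAD update rule is not given in this record, so the following is only a minimal sketch of the baseline BP algorithm it improves upon (network sizes, learning rate, and the XOR task are illustrative assumptions):

```python
import numpy as np

# Minimal sketch of standard gradient-descent back propagation for a
# one-hidden-layer network. BPGAD's mixed ascent/descent step is not
# specified in this record, so only the plain BP baseline is shown.

rng = np.random.default_rng(0)

# Toy task (assumed for illustration): learn XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weight matrices and learning rate (assumed values).
W1 = rng.normal(scale=0.5, size=(2, 4))
W2 = rng.normal(scale=0.5, size=(4, 1))
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for _ in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    err = out - y
    losses.append(float(np.mean(err ** 2)))  # cost function E
    # Backward pass: gradient of E w.r.t. each weight matrix.
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out  # descend along the gradient
    W1 -= lr * X.T @ d_h
```

After training, `losses` should be decreasing, which is the convergence behavior the paper's efficiency analysis measures against.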
Keywords :
backpropagation; gradient methods; neural nets; back propagation neural network; converging speed; gradient ascend algorithm; gradient descend algorithm; learning ability; learning efficiency; neural network algorithm; Algorithm design and analysis; Automation; Cost function; Electronic mail; Information analysis; Intelligent control; Neural networks; Stability; Testing; efficiency analysis; gradient ascend and descend; neural network;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
7th World Congress on Intelligent Control and Automation (WCICA 2008)
Conference_Location :
Chongqing
Print_ISBN :
978-1-4244-2113-8
Electronic_ISBN :
978-1-4244-2114-5
Type :
conf
DOI :
10.1109/WCICA.2008.4593995
Filename :
4593995