Title :
A BPGAD neural network algorithm and its efficiency analysis
Author_Institution :
Sch. of Inf. & Autom., Kunming Univ. of Sci. & Technol., Kunming
Abstract :
To enhance the learning efficiency of the back propagation (BP) neural network, this study develops an improved BP algorithm based on gradient ascent and descent (BPGAD) and discusses the conditions for its convergence. These conditions require that the second derivative of the cost function E with respect to both HMT and the weights W be positive. BPGAD was tested by simulation; the results show that it clearly outperforms standard BP in convergence speed, learning ability, and stability, and that it can be applied in engineering practice.
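The abstract does not give the BPGAD update rule itself, so as background, the following is a minimal sketch of the standard BP gradient-descent baseline that BPGAD improves on: a one-hidden-layer sigmoid network trained by back-propagating the gradient of the squared-error cost E. The network size, learning rate, toy data, and all identifiers are illustrative assumptions, not taken from the paper; the gradient-ascent component of BPGAD is not shown because its form is not specified here.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_bp(X, y, hidden=4, lr=0.5, epochs=2000):
    """Plain gradient-descent BP; hyperparameters are illustrative."""
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
    W2 = rng.normal(scale=0.5, size=(hidden, 1))
    for _ in range(epochs):
        # Forward pass through the hidden and output layers.
        h = sigmoid(X @ W1)
        out = sigmoid(h @ W2)
        # Cost E = 0.5 * sum((out - y)^2); back-propagate its gradient.
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        # Gradient-descent weight updates (the step BPGAD modifies).
        W2 -= lr * h.T @ d_out
        W1 -= lr * X.T @ d_h
    return W1, W2

# Toy problem: the logical OR function.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [1]], dtype=float)
W1, W2 = train_bp(X, y)
pred = sigmoid(sigmoid(X @ W1) @ W2)
```

BPGAD's reported gain is in how the weight update is driven, so in practice only the two update lines above would change.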
Keywords :
backpropagation; gradient methods; neural nets; back propagation neural network; convergence speed; gradient ascent algorithm; gradient descent algorithm; learning ability; learning efficiency; neural network algorithm; Algorithm design and analysis; Automation; Cost function; Electronic mail; Information analysis; Intelligent control; Neural networks; Stability; Testing; efficiency analysis; gradient ascent and descent; neural network
Conference_Titel :
2008 7th World Congress on Intelligent Control and Automation (WCICA 2008)
Conference_Location :
Chongqing
Print_ISBN :
978-1-4244-2113-8
Electronic_ISBN :
978-1-4244-2114-5
DOI :
10.1109/WCICA.2008.4593995