Title :
Evolution strategies on connection weights into modified gradient function for multi-layer neural networks
Author :
Ng, S.C. ; Leung, S.H. ; Luk, Andrew
Author_Institution :
Sch. of Sci. & Tech., Open Univ. of Hong Kong, China
Abstract :
In this paper, two modifications to the conventional back-propagation algorithm for feedforward multi-layer neural networks are presented. The first modification changes the calculation of the gradient function, while the second incorporates evolution strategies on the connection weights into the gradient search algorithm. Simulation results show that the new modified algorithm always converges to the globally optimal solution and gives better performance than other fast learning algorithms and global search methods.
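The abstract does not give the paper's actual modified gradient function or its exact evolution-strategy scheme, so the following is only a minimal sketch of the general idea: a standard back-propagation gradient step on a single-hidden-layer network, combined with a (1+1)-style Gaussian mutation of the connection weights that is accepted only when it lowers the training error. All names, constants, and the XOR benchmark are illustrative assumptions, not the authors' method.

    # Hypothetical sketch: gradient descent hybridized with an evolution
    # strategy on the connection weights (not the paper's exact algorithm).
    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # XOR problem: a common small benchmark for multi-layer networks.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    # One hidden layer with 3 units.
    W1 = rng.normal(scale=0.5, size=(2, 3))
    W2 = rng.normal(scale=0.5, size=(3, 1))

    def forward(W1, W2):
        H = sigmoid(X @ W1)
        Y = sigmoid(H @ W2)
        return H, Y

    def error(W1, W2):
        _, Y = forward(W1, W2)
        return 0.5 * np.sum((T - Y) ** 2)

    eta, sigma = 0.5, 0.05          # learning rate and ES mutation strength
    for epoch in range(5000):
        # Gradient step: standard back-propagation for this architecture.
        H, Y = forward(W1, W2)
        delta2 = (Y - T) * Y * (1 - Y)          # output-layer error signal
        delta1 = (delta2 @ W2.T) * H * (1 - H)  # hidden-layer error signal
        W2 -= eta * H.T @ delta2
        W1 -= eta * X.T @ delta1

        # Evolution-strategy step: Gaussian mutation of the weights,
        # accepted only if it reduces the training error.
        cand1 = W1 + rng.normal(scale=sigma, size=W1.shape)
        cand2 = W2 + rng.normal(scale=sigma, size=W2.shape)
        if error(cand1, cand2) < error(W1, W2):
            W1, W2 = cand1, cand2

    print("final SSE:", error(W1, W2))

The mutation step gives the search a chance to escape shallow local minima that a pure gradient step would get stuck in, which is the motivation the abstract gives for combining the two approaches.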
Keywords :
backpropagation; evolutionary computation; feedforward neural nets; gradient methods; multilayer perceptrons; search problems; evolution strategy; fast learning; feedforward multilayer neural network; global search method; gradient search algorithm; modified gradient function; multilayer neural networks; acceleration; convergence; neurons; signal processing
Conference_Title :
Proceedings of the 2005 IEEE International Joint Conference on Neural Networks (IJCNN '05)
Print_ISBN :
0-7803-9048-2
DOI :
10.1109/IJCNN.2005.1556074