Title :
A New Fast Learning Algorithm for Multi-Layer Feedforward Neural Networks
Author :
Zhang, De-Xian ; Liu, Can ; Wang, Zi-qiang ; Liu, Nan-bo
Author_Institution :
Sch. of Inf. Sci. & Eng., Henan Univ. of Technol., Zhengzhou
Abstract :
The strongly nonlinear relation between a training sample's contribution to the error and the error's derivatives is the fundamental reason for the low learning efficiency of multi-layer feedforward neural networks. Effectively reducing the degree of this nonlinear relation and its impact on network learning is therefore critical to improving training efficiency. Based on this idea, this paper proposes a new approach to accelerating learning, comprising a linearization technique for the nonlinear relation, a convergence technique based on local equalization of the training samples' errors, and a rotation adjustment of the weights. A new fast learning algorithm for multi-layer feedforward neural networks is also presented. Experimental results show that, compared with conventional algorithms, the new algorithm shortens training time by hundreds of times and remarkably improves the generalization of the neural networks.
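For orientation, the following is a minimal sketch (in NumPy, an assumption of this record, not the authors' code) of the conventional gradient-descent training of a multi-layer feedforward network that the paper treats as its baseline. The linearization, local error-equalization, and weight-rotation steps named in the abstract are not detailed here and appear only as a hypothetical hook comment.

# Minimal sketch (assumptions: plain NumPy, one hidden layer, sigmoid units,
# squared-error loss) of conventional gradient-descent training of a
# multi-layer feedforward network. The paper's acceleration steps --
# linearization, local error equalization, and rotation adjustment of the
# weights -- are NOT reproduced; they are marked only as a hypothetical hook,
# since their details are not given in this abstract.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_mlp(X, y, hidden=8, lr=0.5, epochs=5000, seed=0):
    rng = np.random.default_rng(seed)
    n_in, n_out = X.shape[1], y.shape[1]
    W1 = rng.normal(scale=0.5, size=(n_in, hidden))
    W2 = rng.normal(scale=0.5, size=(hidden, n_out))
    for _ in range(epochs):
        # Forward pass
        h = sigmoid(X @ W1)
        out = sigmoid(h @ W2)
        # Backward pass (squared-error loss, sigmoid derivatives)
        err = out - y
        d_out = err * out * (1.0 - out)
        d_h = (d_out @ W2.T) * h * (1.0 - h)
        # Conventional gradient-descent update; the paper's fast algorithm
        # would modify/augment this step (hypothetical hook only).
        W2 -= lr * (h.T @ d_out)
        W1 -= lr * (X.T @ d_h)
    return W1, W2

if __name__ == "__main__":
    # XOR toy problem as a usage example
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)
    W1, W2 = train_mlp(X, y)
    print(np.round(sigmoid(sigmoid(X @ W1) @ W2), 2))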
Keywords :
convergence; feedforward neural nets; learning (artificial intelligence); linearisation techniques; convergence technique; linearization technique; local error equalization; multilayer feedforward neural network; neural network learning; nonlinear relation; Acceleration; Computer errors; Convergence; Cybernetics; Electronic mail; Feedforward neural networks; Feedforward systems; Information science; Machine learning; Machine learning algorithms; Multi-layer neural network; Neural networks; Nonhomogeneous media; Neural network learning; faster learning algorithms; local error's equalization; rotation modification of weights;
Conference_Titel :
Machine Learning and Cybernetics, 2006 International Conference on
Conference_Location :
Dalian, China
Print_ISBN :
1-4244-0061-9
DOI :
10.1109/ICMLC.2006.259140