DocumentCode :
2489997
Title :
Enhanced Two-Phase method in fast learning algorithms
Author :
Cheung, Chi-Chung ; Ng, Sin-Chun ; Lui, Andrew K. ; Xu, Sean Shensheng
Author_Institution :
Dept. of Electron. & Inf. Eng., Hong Kong Polytech. Univ., Hong Kong, China
fYear :
2010
fDate :
18-23 July 2010
Firstpage :
1
Lastpage :
7
Abstract :
The backpropagation (BP) learning algorithm is the most widely used supervised learning technique for training multi-layer feed-forward neural networks. Many modifications of BP have been proposed to speed up learning, but their performance remains limited by two problems: the local minimum problem and the error overshooting problem. This paper proposes an Enhanced Two-Phase method that addresses these two problems to improve the performance of existing fast learning algorithms. The proposed method effectively detects when either problem occurs and assigns an appropriate fast learning algorithm to resolve it. In our investigation, the proposed method significantly improved the performance of different fast learning algorithms, in terms of both convergence rate and global convergence capability, across different problems. The convergence rate can be increased by up to 100 times compared with existing fast learning algorithms.
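The two symptoms the abstract names can be illustrated with a minimal sketch (not the authors' exact algorithm): a gradient-descent loop that monitors the training error and switches behaviour in two phases, damping the step when the error rises (overshoot symptom) and enlarging it when the error stalls (plateau / local-minimum symptom). All thresholds and factors below are hypothetical.

```python
def two_phase_descent(grad, loss, w, lr=0.4, epochs=60,
                      overshoot_cut=0.5, plateau_boost=1.5, plateau_tol=1e-6):
    """Illustrative two-phase step-size control on a scalar parameter w."""
    prev = loss(w)
    for _ in range(epochs):
        w_new = w - lr * grad(w)
        cur = loss(w_new)
        if cur > prev:                    # error overshooting: reject the step
            lr *= overshoot_cut           # and retry with a damped step size
            continue
        if prev - cur < plateau_tol:      # error stalled: boost the step size
            lr *= plateau_boost           # to help escape a flat region
        w, prev = w_new, cur
    return w, prev

# Toy quadratic loss with its minimum at w = 0 (stand-in for a training error).
loss = lambda w: w * w
grad = lambda w: 2.0 * w
w_final, loss_final = two_phase_descent(grad, loss, w=5.0)
```

On this toy loss the loop converges quickly; the same monitoring idea, with real fast learning algorithms plugged into each phase, is the spirit of the method described in the abstract.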
Keywords :
backpropagation; convergence; multilayer perceptrons; problem solving; recurrent neural nets; backpropagation learning algorithm; convergence rate; enhanced two-phase method; error overshooting problem; fast learning algorithms; local minimum problem; multilayer feedforward neural network training; supervised learning technique; Machine learning;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
The 2010 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
Barcelona
ISSN :
1098-7576
Print_ISBN :
978-1-4244-6916-1
Type :
conf
DOI :
10.1109/IJCNN.2010.5596519
Filename :
5596519