Title :
Further enhancements in WOM algorithm to solve the local minimum and flat-spot problem in feed-forward neural networks
Author :
Chi-Chung Cheung ; Sin-Chun Ng ; Andrew Lui ; Sendren Sheng-Dong Xu
Author_Institution :
Dept. of Electron. & Inf. Eng., Hong Kong Polytech. Univ., Hong Kong, China
Abstract :
The backpropagation (BP) algorithm is very popular in supervised learning for feed-forward neural networks. However, it is sometimes slow and is easily trapped in a local minimum or a flat-spot area (known as the local minimum problem and the flat-spot problem, respectively). Many modifications have been proposed to speed up its convergence rate, but they seldom improve its global convergence capability. Some fast learning algorithms have recently been proposed to solve these two problems; Wrong Output Modification (WOM) is one new algorithm that improves global convergence capability significantly. However, WOM has limitations that prevent it from solving the local minimum and flat-spot problems effectively. In this paper, some enhancements are proposed to further improve the performance of WOM by (a) changing the mechanism for escaping from a local minimum or a flat-spot area and (b) adding a fast checking procedure to identify the existence of a local minimum or a flat-spot area. The performance investigation shows that the proposed enhancements improve the performance of WOM significantly when applied to different fast learning algorithms. Moreover, WOM with these enhancements is also applied to a very popular second-order learning algorithm, the Levenberg-Marquardt (LM) algorithm, and is shown to improve the performance of LM significantly.
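The two-part strategy described in the abstract, a cheap check that flags stagnation followed by an explicit escape move, can be sketched in a toy one-weight setting. This is a hypothetical illustration, not the paper's actual WOM rules: the function `train_escape`, the thresholds, and the random-jump escape are all assumptions made for the sketch.

```python
import random

def train_escape(f, grad, x0, lr=0.1, steps=500, grad_tol=1e-6,
                 err_tol=1e-3, stall_limit=20, jump=1.0, seed=0):
    """Gradient descent on a single weight with a stagnation check.

    Illustrative stand-in only: the stall counter plays the role of a
    'fast checking procedure' (flag a flat spot or local minimum when
    the gradient is tiny but the error is still high), and the random
    jump plays the role of an escape mechanism. Neither matches the
    paper's WOM rules; all thresholds are arbitrary assumptions.
    """
    rng = random.Random(seed)
    x, stalled = x0, 0
    for _ in range(steps):
        g = grad(x)
        # Checking step: a tiny gradient combined with a large residual
        # error suggests a flat spot or a poor local minimum.
        if abs(g) < grad_tol and f(x) > err_tol:
            stalled += 1
        else:
            stalled = 0
        if stalled >= stall_limit:
            x += rng.uniform(-jump, jump)  # escape: perturb the weight
            stalled = 0
            continue
        x -= lr * g  # ordinary gradient descent step
    return x
```

On a smooth error surface the check never fires and the routine behaves as plain gradient descent; the interesting case is when the gradient vanishes while the error remains high, which triggers the perturbation.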
Keywords :
convergence; feedforward neural nets; gradient methods; learning (artificial intelligence); LM algorithm; Levenberg-Marquardt algorithm; WOM algorithm; WOM performance; fast checking procedure; fast learning algorithm; flat-spot problem; global convergence capability; local minimum problem; second-order gradient descent learning algorithm; wrong output modification; Breast cancer; Convergence; Databases; Educational institutions; Neural networks; Neurons; Training;
Conference_Titel :
2014 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
Beijing, China
Print_ISBN :
978-1-4799-6627-1
DOI :
10.1109/IJCNN.2014.6889622