Title :
An enhanced backpropagation training algorithm
Author :
Agba, Lawrence C. ; Tucker, Jerry H.
Author_Institution :
Div. of Sci. & Math., Bethune-Cookman Coll., USA
Abstract :
The enhanced backpropagation (EBP) algorithm presented in this paper addresses the problems encountered while training a layered neural network with the classical backpropagation (BP) algorithm. These problems include slow convergence and possible termination at a non-global solution. The EBP algorithm alleviates these problems by employing incremental training and gradual error reduction as a means of scheduling the sequence in which the vectors of the training set are deployed. The advantages of the EBP algorithm are a speedup of up to 46 times, the ability to avoid local minima, and the prevention of overlearning. Moreover, it requires fewer computations than other proposed enhancements to the BP algorithm.
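Algorithm_Sketch :
The abstract describes the EBP schedule only at a high level, so the following is a minimal sketch, not the authors' implementation: it assumes a one-hidden-layer sigmoid network trained with classical online BP, interprets "incremental training" as gradually enlarging the active subset of training vectors, and interprets "gradual error reduction" as a per-pattern error target that tightens stage by stage. The function and parameter names (train_ebp_sketch, stages, err_target) are illustrative assumptions.

```python
# Hedged sketch of a staged BP schedule; the specific schedule below is an
# illustrative reading of "incremental training and gradual error reduction",
# not the EBP algorithm as published.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, W1, b1, W2, b2):
    h = sigmoid(W1 @ x + b1)        # hidden-layer activations
    y = sigmoid(W2 @ h + b2)        # output-layer activations
    return h, y

def train_ebp_sketch(X, T, n_hidden=4, lr=1.0, stages=5, epochs_per_stage=2000):
    """Classical online BP wrapped in a staged training-set schedule.

    X: (n_samples, n_inputs) inputs, T: (n_samples, n_outputs) targets."""
    n_in, n_out = X.shape[1], T.shape[1]
    W1 = rng.uniform(-0.5, 0.5, (n_hidden, n_in)); b1 = np.zeros(n_hidden)
    W2 = rng.uniform(-0.5, 0.5, (n_out, n_hidden)); b2 = np.zeros(n_out)
    for stage in range(1, stages + 1):
        # Incremental training: only a growing prefix of the set is active.
        n_active = max(1, stage * len(X) // stages)
        # Gradual error reduction: the error target shrinks at each stage.
        err_target = 0.25 / stage
        for _ in range(epochs_per_stage):
            worst = 0.0
            for x, t in zip(X[:n_active], T[:n_active]):
                h, y = forward(x, W1, b1, W2, b2)
                e = t - y
                worst = max(worst, float(np.max(np.abs(e))))
                # Standard BP weight updates with sigmoid derivatives.
                d_out = e * y * (1.0 - y)
                d_hid = (W2.T @ d_out) * h * (1.0 - h)
                W2 += lr * np.outer(d_out, h); b2 += lr * d_out
                W1 += lr * np.outer(d_hid, x); b1 += lr * d_hid
            if worst < err_target:  # stage target met; move to the next stage
                break
    return W1, b1, W2, b2

# Toy usage: XOR, a classic test problem for layered networks.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)
params = train_ebp_sketch(X, T)
print([round(float(forward(x, *params)[1][0]), 2) for x in X])
```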
Keywords :
backpropagation; multilayer perceptrons; enhanced backpropagation training algorithm; gradual error reduction; incremental training; layered neural network; slow convergence; Computer networks; Educational institutions; Flowcharts; Humans; Mathematics; NASA; Neural networks; Processor scheduling; Scheduling algorithm; Termination of employment;
Conference_Title :
Proceedings of the 1995 IEEE International Conference on Neural Networks (ICNN '95)
Conference_Location :
Perth, WA, Australia
Print_ISBN :
0-7803-2768-3
DOI :
10.1109/ICNN.1995.488179