Title :
Simulated annealing and weight decay in adaptive learning: the SARPROP algorithm
Author :
Treadgold, Nicholas K. ; Gedeon, Tamas D.
Author_Institution :
Dept. of Inf. Eng., New South Wales Univ., Kensington, NSW, Australia
Date :
7/1/1998
Abstract :
A problem with gradient descent algorithms is that they can converge to poorly performing local minima. Global optimization algorithms address this problem, but at the cost of greatly increased training times. This work examines combining gradient descent with the global optimization technique of simulated annealing (SA). Simulated annealing in the form of noise and weight decay is added to resilient backpropagation (RPROP), a powerful gradient descent algorithm for training feedforward neural networks. The resulting algorithm, SARPROP, is shown through various simulations not only to escape local minima, but also to maintain, and often improve, the training times of the RPROP algorithm. In addition, SARPROP may be used with a restart training phase, which allows a more thorough search of the error surface and provides an automatic annealing schedule.
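The abstract describes augmenting RPROP's sign-based step-size adaptation with annealed noise and weight decay. Below is a minimal sketch of one such update step, assuming a NumPy setting; the function name sarprop_step, the parameter names, and all constant values (eta_plus, eta_minus, decay, temperature, etc.) are illustrative assumptions, not the paper's exact rules or constants.

```python
import numpy as np

def sarprop_step(w, grad, prev_grad, step, epoch, rng,
                 eta_plus=1.2, eta_minus=0.5,
                 step_min=1e-6, step_max=50.0,
                 decay=1e-4, temperature=0.01):
    """One RPROP-like weight update with annealed noise and weight decay.

    Illustrative sketch only; the published SARPROP update differs in detail.
    """
    # Weight decay folded into the gradient, as in a penalized cost function.
    grad = grad + decay * w

    sign_change = grad * prev_grad
    # Grow the step size where the gradient sign is stable...
    step = np.where(sign_change > 0,
                    np.minimum(step * eta_plus, step_max), step)
    # ...and shrink it where the sign flipped (the last step overshot).
    step = np.where(sign_change < 0,
                    np.maximum(step * eta_minus, step_min), step)

    # Annealing schedule: noise magnitude decays exponentially with epochs,
    # so early training can escape local minima while late training settles.
    noise_scale = 2.0 ** (-temperature * epoch)
    noise = noise_scale * step * rng.standard_normal(w.shape)

    # Sign-based RPROP update plus the annealed noise term.
    w = w - np.sign(grad) * step + noise

    # RPROP convention: zero the stored gradient after a sign change so the
    # step size is not shrunk twice for the same overshoot.
    prev_grad = np.where(sign_change < 0, 0.0, grad)
    return w, prev_grad, step


# Hypothetical usage: anneal over epochs, restarting the step sizes for a
# fresh search phase as the abstract's restart scheme suggests.
rng = np.random.default_rng(0)
w = rng.standard_normal(10)
prev_grad, step = np.zeros(10), np.full(10, 0.1)
for epoch in range(100):
    grad = 2 * w  # stand-in gradient of a toy quadratic loss
    w, prev_grad, step = sarprop_step(w, grad, prev_grad, step, epoch, rng)
```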
Keywords :
backpropagation; feedforward neural nets; simulated annealing; RPROP; SA; SARPROP algorithm; adaptive learning; automatic annealing schedule; error surface; feedforward neural network training; global optimization algorithms; gradient descent algorithms; local minima; noise; resilient backpropagation; restart training phase; weight decay; Backpropagation algorithms; Computer networks; Convergence; Cost function; Feedforward neural networks; Feedforward systems; Gradient methods; Neural networks; Optimization methods; Simulated annealing;
Journal_Title :
IEEE Transactions on Neural Networks