DocumentCode
1905239
Title
Global descent replaces gradient descent to avoid local minima problem in learning with artificial neural networks
Author
Cetin, Bedri C.; Burdick, Joel W.; Barhen, Jacob
Author_Institution
Jet Propulsion Lab., California Inst. of Technol., Pasadena, CA, USA
fYear
1993
fDate
1993
Firstpage
836
Abstract
One of the fundamental limitations of artificial neural network learning by gradient descent is its susceptibility to local minima during training. A new approach to learning is presented in which the gradient descent rule of the backpropagation learning algorithm is replaced with a novel global descent formalism. The methodology is based on a global optimization scheme known by the acronym TRUST (terminal repeller unconstrained subenergy tunneling), which formulates optimization in terms of the flow of a special deterministic dynamical system. The ability of the new dynamical system to overcome local minima is tested on common benchmark examples and a pattern recognition example. The results demonstrate that the new method does indeed escape the local minima it encounters and thus finds the global minimum solution to these problems.
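Illustrative Sketch
The global descent flow described in the abstract can be outlined concretely. Below is a minimal one-dimensional Python sketch of a TRUST-style sweep that alternates a gradient descent phase with a repeller-driven tunneling phase. The subenergy factor of the form 1/(1 + exp(f(x) - f_best + a)), the cube-root terminal repeller, all constants (rho, a, dt, step counts, restart offsets), the helper names trust_sweep and local_descent, and the test function are illustrative assumptions for this sketch, not values or code from the paper.

import numpy as np

def local_descent(f, grad_f, x, lo, hi, lr=1e-2, steps=5000):
    # Below the best level found so far, the TRUST flow reduces to
    # ordinary gradient descent on f; here, plain fixed-step descent.
    for _ in range(steps):
        x = float(np.clip(x - lr * grad_f(x), lo, hi))
    return x

def trust_sweep(f, grad_f, lo, hi, rho=2.0, a=2.0, dt=1e-3, tunnel_steps=200000):
    # One-dimensional TRUST-style sweep of [lo, hi] from left to right:
    # descend to a local minimum, then tunnel rightward over the flattened
    # landscape until the flow drops below the best level found so far.
    x = local_descent(f, grad_f, lo, lo, hi)
    x_best, f_best = x, f(x)
    while x < hi:
        x = x_best + 1e-3          # restart the flow just past the best point
        tunneled = False
        for _ in range(tunnel_steps):
            g = f(x) - f_best
            if g < 0.0:            # dropped below the best level: lower basin
                tunneled = True
                break
            # Subenergy factor: damps the gradient term increasingly as the
            # flow rises above f_best, flattening the landscape there.
            sub = 1.0 / (1.0 + np.exp(np.clip(g + a, -50.0, 50.0)))
            # The non-Lipschitz cube root plays the terminal repeller,
            # driving the state away from x_best in finite time.
            step = -grad_f(x) * sub + rho * np.cbrt(x - x_best)
            x = float(np.clip(x + dt * step, lo, hi))
            if x >= hi:
                break
        if not tunneled:
            break                  # swept past hi with no lower basin found
        x = local_descent(f, grad_f, x, lo, hi)
        if f(x) >= f_best:
            break
        x_best, f_best = x, f(x)
    return x_best, f_best

# Hypothetical multimodal test function (not one of the paper's benchmarks):
f  = lambda x: 0.1 * x**2 - np.cos(3.0 * x)
df = lambda x: 0.2 * x + 3.0 * np.sin(3.0 * x)
print(trust_sweep(f, df, lo=-10.0, hi=10.0))   # ends near the global minimum at x = 0

The left-to-right sweep is a one-dimensional simplification; the paper applies the dynamical system to network weight spaces of higher dimension.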
Keywords
backpropagation; learning (artificial intelligence); neural nets; TRUST; artificial neural networks; backpropagation learning algorithm; global descent formalism; local minima; pattern recognition; terminal repeller unconstrained subenergy tunneling; Artificial neural networks; Backpropagation algorithms; Convergence; Intelligent networks; Jacobian matrices; Laboratories; Mechanical engineering; Optimization methods; Pattern recognition; Propulsion
fLanguage
English
Publisher
ieee
Conference_Titel
IEEE International Conference on Neural Networks, 1993
Conference_Location
San Francisco, CA
Print_ISBN
0-7803-0999-5
Type
conf
DOI
10.1109/ICNN.1993.298667
Filename
298667