DocumentCode :
1841594
Title :
Improved second-order training algorithms for globally and partially recurrent neural networks
Author :
Santos, Euripedes P dos ; Von Zuben, Fernando J.
Author_Institution :
Dept. of Comput. Eng. & Ind. Autom., State Univ. of Campinas, Brazil
Volume :
3
fYear :
1999
fDate :
1999
Firstpage :
1501
Abstract :
Recurrent neural networks are dynamic nonlinear systems that can exhibit a wide range of behaviors. However, the practical usefulness of recurrent neural networks depends on the existence of efficient supervised learning algorithms, based on optimization procedures, for adjusting their parameters. To improve performance, second-order information should be considered to minimize the error in the training process. The first objective of this work is to describe systematic ways of obtaining exact second-order information, at low computational cost, for a range of recurrent neural network configurations. The second objective is to present an improved version of the conjugate gradient algorithm that can effectively explore the available second-order information.
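For context on how the conjugate gradient method exploits second-order information, the curvature term d^T A d appears directly in its step-size computation. The sketch below is a minimal illustration on a quadratic model, not the authors' improved algorithm; the matrix A stands in for the exact Hessian information the paper describes.

```python
import numpy as np

def conjugate_gradient(A, b, w0, tol=1e-10, max_iter=100):
    """Minimize f(w) = 0.5 * w^T A w - b^T w (A symmetric positive definite)
    with the linear conjugate gradient method and exact line search."""
    w = w0.astype(float)
    r = b - A @ w          # residual = negative gradient of f at w
    d = r.copy()           # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ad = A @ d
        alpha = rs_old / (d @ Ad)   # exact step: uses curvature d^T A d
        w = w + alpha * d
        r = r - alpha * Ad
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        beta = rs_new / rs_old      # conjugacy update (Fletcher-Reeves form)
        d = r + beta * d
        rs_old = rs_new
    return w

# Example: minimizing the quadratic is equivalent to solving A w = b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
w = conjugate_gradient(A, b, np.zeros(2))
```

For an n-dimensional quadratic, the method converges in at most n iterations in exact arithmetic; for the nonquadratic error surfaces of recurrent networks, a nonlinear variant with line search plays the analogous role.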
Keywords :
conjugate gradient methods; learning (artificial intelligence); nonlinear systems; optimisation; recurrent neural nets; conjugate gradient algorithm; dynamic nonlinear systems; optimization; recurrent neural networks; second-order learning; Computer networks; Industrial training; Neural networks; Neurofeedback; Neurons; Nonlinear dynamical systems; Nonlinear equations; Nonlinear systems; Recurrent neural networks; State-space methods;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
International Joint Conference on Neural Networks (IJCNN '99), 1999
Conference_Location :
Washington, DC
ISSN :
1098-7576
Print_ISBN :
0-7803-5529-6
Type :
conf
DOI :
10.1109/IJCNN.1999.832591
Filename :
832591