DocumentCode :
3190813
Title :
On the structure of the Hessian matrix in feedforward networks and second derivative methods
Author :
Wille, Jörg
Author_Institution :
Dept. of Math., Cottbus Univ. of Technol., Germany
Volume :
3
fYear :
1997
fDate :
9-12 Jun 1997
Firstpage :
1851
Abstract :
Adaptation in feedforward networks based on backpropagation learning is one of the most important techniques in the area of artificial neural networks. By exploiting properties of backpropagation learning, efficient adaptive first-derivative algorithms can be constructed. This adaptation can be improved further by using second derivatives of the error function, but a number of problems arise when such algorithms are applied. How can optimized adaptive methods with second derivatives be applied? This paper investigates the Hessian matrix in feedforward networks and its properties. Furthermore, a formulation of a separated online learning algorithm using second derivatives is presented.
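The following is a minimal illustrative sketch, not taken from the paper: it estimates the Hessian of a squared-error function for a tiny single-hidden-layer feedforward network by central finite differences of a numerical gradient. The network size, data, activation function, and step sizes are assumptions chosen only to make the example self-contained.

```python
import numpy as np

def loss(w, x, t, n_hidden=3):
    # Unpack a flat parameter vector into hidden and output weights (assumed layout).
    n_in = x.shape[0]
    W1 = w[:n_hidden * n_in].reshape(n_hidden, n_in)
    w2 = w[n_hidden * n_in:]
    h = np.tanh(W1 @ x)          # hidden-layer activations
    y = w2 @ h                   # single linear output unit
    return 0.5 * (y - t) ** 2    # squared error for one training pattern

def numerical_gradient(f, w, eps=1e-6):
    g = np.zeros_like(w)
    for i in range(w.size):
        e = np.zeros_like(w); e[i] = eps
        g[i] = (f(w + e) - f(w - e)) / (2 * eps)
    return g

def numerical_hessian(f, w, eps=1e-5):
    # Hessian columns as finite differences of the gradient.
    n = w.size
    H = np.zeros((n, n))
    for j in range(n):
        e = np.zeros_like(w); e[j] = eps
        H[:, j] = (numerical_gradient(f, w + e) - numerical_gradient(f, w - e)) / (2 * eps)
    return 0.5 * (H + H.T)       # symmetrize to suppress numerical noise

rng = np.random.default_rng(0)
x, t = rng.normal(size=4), 0.7
w = rng.normal(size=3 * 4 + 3)   # 3 hidden units, 4 inputs, 1 output
H = numerical_hessian(lambda v: loss(v, x, t), w)
print("Hessian shape:", H.shape)
```

In practice, second-derivative learning methods of the kind the paper discusses would compute or approximate such Hessian information analytically rather than by finite differences; the sketch above only makes the object of study concrete.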
Keywords :
Hessian matrices; backpropagation; feedforward neural nets; minimisation; Hessian matrix; backpropagation learning; error function; feedforward networks; optimized adaptive methods; second derivative methods; separated online learning algorithm; Artificial neural networks; Backpropagation algorithms; Gradient methods; Intelligent networks; Iterative methods; Mathematics; Network topology; Neurons; Optimization methods;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
International Conference on Neural Networks, 1997
Conference_Location :
Houston, TX
Print_ISBN :
0-7803-4122-8
Type :
conf
DOI :
10.1109/ICNN.1997.614180
Filename :
614180