DocumentCode
288335
Title
The second derivative of a recurrent network
Author
Piché, Stephen W.
Author_Institution
Microelectron. & Comput. Technol. Corp., Austin, TX, USA
Volume
1
fYear
1994
fDate
27 Jun-2 Jul 1994
Firstpage
245
Abstract
The equations for the exact calculation of the second derivative of an error function with respect to the weights (the Hessian matrix) of a recurrent network are presented in this paper. The second derivative of feedforward networks has proven useful for fast retraining, weight pruning, and output error estimation. However, until now, techniques based upon the Hessian could not be used for recurrent networks because no exact equations for the second derivative existed. It is the author's hope that the equations presented, which allow for the exact calculation of the second derivative, will prove useful in the development of new methods for designing recurrent networks.
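The sketch below is a minimal illustration of the quantity the abstract refers to: the exact Hessian of a recurrent network's error function with respect to its weights. It uses nested automatic differentiation in JAX rather than the paper's explicit recurrence equations, and the network sizes, sum-of-squares error, and all variable names are illustrative assumptions, not taken from the paper.

import jax
import jax.numpy as jnp

def rnn_error(weights, inputs, targets):
    # Sum-of-squares error of a simple tanh recurrent network unrolled over time.
    W_in, W_rec, W_out = weights           # input, recurrent, and output weight matrices
    h = jnp.zeros(W_rec.shape[0])          # initial hidden state
    err = 0.0
    for x, t in zip(inputs, targets):
        h = jnp.tanh(W_in @ x + W_rec @ h)   # recurrent state update
        y = W_out @ h                        # linear readout
        err += jnp.sum((y - t) ** 2)         # accumulate squared output error
    return err

# Illustrative sizes and random data (assumptions, not from the paper).
key = jax.random.PRNGKey(0)
n_in, n_hid, n_out, T = 3, 4, 2, 5
W_in  = jax.random.normal(key, (n_hid, n_in)) * 0.1
W_rec = jax.random.normal(key, (n_hid, n_hid)) * 0.1
W_out = jax.random.normal(key, (n_out, n_hid)) * 0.1
inputs  = jax.random.normal(key, (T, n_in))
targets = jax.random.normal(key, (T, n_out))

# Exact second derivative of the error with respect to the recurrent weights,
# obtained here by nested autodiff (jax.hessian), not the paper's equations.
hess = jax.hessian(lambda W: rnn_error((W_in, W, W_out), inputs, targets))(W_rec)
print(hess.shape)  # (n_hid, n_hid, n_hid, n_hid): one second derivative per weight pair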
Keywords
Hessian matrices; iterative methods; learning (artificial intelligence); recurrent neural nets; Hessian matrix; error function; fast retraining; output error estimation; recurrent network; second derivative; weight pruning; Computational efficiency; Computer errors; Computer networks; Design methodology; Electronic mail; Equations; Error analysis; Estimation error; Microelectronics; Taylor series;
fLanguage
English
Publisher
ieee
Conference_Titel
1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Conference_Location
Orlando, FL
Print_ISBN
0-7803-1901-X
Type
conf
DOI
10.1109/ICNN.1994.374169
Filename
374169