Title :
Computing second derivatives in feed-forward networks: a review
Author :
Buntine, Wray L. ; Weigend, Andreas S.
Author_Institution :
Res. Inst. for Adv. Comput. Sci., NASA Ames Res. Center, Moffett Field, CA, USA
Date :
5/1/1994
Abstract :
The calculation of second derivatives is required by recent techniques for training and analyzing connectionist networks, such as the elimination of superfluous weights and the estimation of confidence intervals for both weights and network outputs. We review and develop exact and approximate algorithms for calculating second derivatives. For networks with |w| weights, simply writing out the full matrix of second derivatives requires O(|w|^2) operations. For networks of radial basis units or sigmoid units, exact calculation of the necessary intermediate terms requires on the order of 2h + 2 backward/forward-propagation passes, where h is the number of hidden units in the network. We also review and compare three approximations: ignoring some components of the second derivative, numerical differentiation, and scoring. The algorithms apply to arbitrary activation functions, networks, and error functions.
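One of the three approximations the abstract mentions, numerical differentiation, can be sketched in a few lines. The example below is an illustrative sketch only, not the paper's algorithm: a hypothetical 2-2-1 sigmoid network (no biases) with a sum-of-squares error, whose |w| x |w| matrix of second derivatives is approximated by central differences. Even this naive scheme makes the O(|w|^2) cost of the full matrix visible: one entry per weight pair.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def error(w, x, t):
    # Illustrative 2-2-1 feed-forward network (biases omitted for brevity);
    # the weight vector w is assumed flattened, length 6.
    W1 = w[:4].reshape(2, 2)   # input -> hidden
    W2 = w[4:6]                # hidden -> output
    h = sigmoid(W1 @ x)
    y = W2 @ h
    return 0.5 * (y - t) ** 2  # sum-of-squares error, one training case

def hessian_fd(f, w, eps=1e-5):
    # Central-difference approximation of the second-derivative matrix.
    # Storage and work are O(|w|^2), matching the cost noted in the abstract.
    n = w.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            wpp = w.copy(); wpp[i] += eps; wpp[j] += eps
            wpm = w.copy(); wpm[i] += eps; wpm[j] -= eps
            wmp = w.copy(); wmp[i] -= eps; wmp[j] += eps
            wmm = w.copy(); wmm[i] -= eps; wmm[j] -= eps
            H[i, j] = (f(wpp) - f(wpm) - f(wmp) + f(wmm)) / (4 * eps**2)
    return H

rng = np.random.default_rng(0)
w = rng.normal(size=6)
x = np.array([0.5, -1.0])
t = 0.3
H = hessian_fd(lambda w_: float(error(w_, x, t)), w)
```

The resulting matrix is symmetric by construction of the central-difference formula; the exact and scoring-based methods the paper reviews avoid the step-size sensitivity that finite differences introduce.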
Keywords :
backpropagation; differential equations; feedforward neural nets; function approximation; approximations; backward propagation; confidence interval estimation; connectionist networks; error functions; feedforward neural networks; forward propagation; full matrix; radial basis units; second derivatives; sigmoid units; Approximation algorithms; Backpropagation algorithms; Computer networks; Cost function; Feedforward systems; Intelligent networks; Iterative algorithms; Least squares approximation; Neural networks; Writing;
Journal_Title :
Neural Networks, IEEE Transactions on