Title :
Tree-structured neural networks: efficient evaluation of higher-order derivatives and integrals
Author_Institution :
Institut für Informatik, Universität Freiburg, Germany
Abstract :
Tree-structured neural networks (TSNNs) are universal approximators with extremely fast evaluation procedures when so-called lazy activation functions are used. This paper shows how to choose a lazy activation function such that the existence of continuous derivatives of order n and lower is guaranteed. A fast algorithm is presented that evaluates the n-th order derivative of a univariate, multidimensional TSNN function using Taylor expansions of a functional decomposition. The same technique is used to derive a fast algorithm for the evaluation of definite integrals.
Keywords :
differential equations; function approximation; integral equations; mathematics computing; neural nets; Taylor expansions; functional decomposition; higher-order derivatives; higher-order integrals; lazy activation functions; tree-structured neural networks; binary trees; computer applications; feedforward systems; integrodifferential equations; multidimensional systems; neural networks; performance evaluation; polynomials; Taylor series
Conference_Titel :
Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000)
Conference_Location :
Como, Italy
Print_ISBN :
0-7695-0619-4
DOI :
10.1109/IJCNN.2000.857884