Title :
Fast bounded smooth regression with lazy neural trees
Author_Institution :
Inst. für Inf., Freiburg Univ., Germany
Date :
27 Jun-2 Jul 1994
Abstract :
Proposes the lazy neural tree (LNT) as an appropriate architecture for realizing smooth regression systems. The LNT is a hybrid of a decision tree and a neural network. From the neural network it inherits smoothness of the generated function, incremental adaptability, and conceptual simplicity. From the decision tree it inherits the topology and initial parameter setting, as well as a very efficient sequential implementation that outperforms traditional neural network simulations by orders of magnitude. The high speed is achieved by lazy evaluation. A further speed-up can be obtained by applying a windowing scheme if the region of interest is restricted.
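To illustrate the lazy-evaluation idea described in the abstract, the following minimal sketch shows a sigmoid-gated (soft) decision tree in which a subtree is skipped whenever its gate is saturated, so its contribution would be negligible. This is an assumption-laden illustration, not the paper's actual LNT algorithm; all names (`Node`, `evaluate`, the `eps` threshold, the single-feature gate) are hypothetical.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class Node:
    """Hypothetical soft decision-tree node.

    A leaf carries a constant `value`; an internal node carries a
    sigmoid gate on a single scalar input (weight, bias).
    """
    def __init__(self, weight=None, bias=None, left=None, right=None, value=None):
        self.weight, self.bias = weight, bias
        self.left, self.right = left, right
        self.value = value

def evaluate(node, x, eps=1e-6):
    """Lazily evaluate the tree at scalar input x.

    When the gate saturates (g < eps or g > 1 - eps), only the
    dominant subtree is visited; the other subtree's evaluation is
    skipped entirely, which is the source of the speed-up.
    """
    if node.value is not None:              # leaf
        return node.value
    g = sigmoid(node.weight * x + node.bias)
    if g < eps:                             # lazy: skip right subtree
        return evaluate(node.left, x, eps)
    if g > 1.0 - eps:                       # lazy: skip left subtree
        return evaluate(node.right, x, eps)
    # Non-saturated gate: blend both subtrees (smooth output)
    return (1.0 - g) * evaluate(node.left, x, eps) + g * evaluate(node.right, x, eps)
```

For a steep gate, most inputs saturate it, so evaluation touches only one root-to-leaf path instead of the whole tree, while the blended case keeps the overall function smooth.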
Keywords :
decision theory; neural nets; statistical analysis; trees (mathematics); conceptual simplicity; decision tree; fast bounded smooth regression; incremental adaptability; lazy evaluation; lazy neural trees; neural network; sequential implementation; topology; windowing scheme; Artificial neural networks; Classification tree analysis; Decision trees; Feedforward neural networks; Multidimensional systems; Network topology; Neural networks; Neurons; Regression tree analysis; Transfer functions;
Conference_Titel :
1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Conference_Location :
Orlando, FL
Print_ISBN :
0-7803-1901-X
DOI :
10.1109/ICNN.1994.374421