• DocumentCode
    288540
  • Title
    Fast bounded smooth regression with lazy neural trees
  • Author
    Heinz, Alois P.
  • Author_Institution
    Inst. für Inf., Freiburg Univ., Germany
  • Volume
    3
  • fYear
    1994
  • fDate
    27 Jun-2 Jul 1994
  • Firstpage
    1755
  • Abstract
    Proposes the lazy neural tree (LNT) as the appropriate architecture for the realization of smooth regression systems. The LNT is a hybrid of a decision tree and a neural network. From the neural network it inherits smoothness of the generated function, incremental adaptability, and conceptual simplicity. From the decision tree it inherits the topology and initial parameter setting, as well as a very efficient sequential implementation that outperforms traditional neural network simulations by orders of magnitude. The enormous speed is achieved by lazy evaluation. A further speed-up can be obtained by applying a windowing scheme when the region of interesting results is restricted.
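    The following minimal Python sketch illustrates the lazy-evaluation idea summarized in the abstract: a sigmoid-gated regression tree that skips a subtree whenever its blend weight falls below a threshold. The node classes, the threshold eps, and the gating form are illustrative assumptions, not the paper's exact formulation.

```python
import math

# Hypothetical node classes for a sigmoid-gated regression tree (illustration only).
class Leaf:
    def __init__(self, value):
        self.value = value          # constant prediction at this leaf

    def predict(self, x, eps):
        return self.value

class Node:
    def __init__(self, weights, bias, left, right):
        self.weights = weights      # linear split: w . x + b
        self.bias = bias
        self.left = left            # subtree weighted by (1 - gate)
        self.right = right          # subtree weighted by gate

    def predict(self, x, eps):
        # Smooth gate in (0, 1); blending the children keeps the regression smooth.
        z = sum(w * xi for w, xi in zip(self.weights, x)) + self.bias
        g = 1.0 / (1.0 + math.exp(-z))
        # Lazy evaluation: a subtree whose blend weight is below eps is never visited.
        if g < eps:
            return self.left.predict(x, eps)
        if g > 1.0 - eps:
            return self.right.predict(x, eps)
        return (1.0 - g) * self.left.predict(x, eps) + g * self.right.predict(x, eps)

# Usage: a depth-1 tree over a one-dimensional input.
tree = Node(weights=[4.0], bias=-2.0, left=Leaf(0.0), right=Leaf(1.0))
print(tree.predict([5.0], eps=1e-3))   # gate saturates, only the right leaf is evaluated
print(tree.predict([0.5], eps=1e-3))   # gate is mid-range, both leaves are blended
```

    In this reading, deep trees stay cheap to evaluate because for most inputs only one root-to-leaf path is active, while the sigmoid blending near the split boundaries preserves smoothness of the regression function.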
  • Keywords
    decision theory; neural nets; statistical analysis; trees (mathematics); conceptual simplicity; decision tree; fast bounded smooth regression; incremental adaptability; lazy evaluation; lazy neural trees; neural network; sequential implementation; topology; windowing scheme; Artificial neural networks; Classification tree analysis; Decision trees; Feedforward neural networks; Multidimensional systems; Network topology; Neural networks; Neurons; Regression tree analysis; Transfer functions;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
  • Conference_Location
    Orlando, FL
  • Print_ISBN
    0-7803-1901-X
  • Type
    conf
  • DOI
    10.1109/ICNN.1994.374421
  • Filename
    374421