• DocumentCode
    1928034
  • Title
    A study on on-line learning of NNTrees
  • Author
    Takeda, Takaharu; Zhao, Qiangfu; Liu, Yong
  • Author_Institution
    Univ. of Aizu, Aizuwakamatsu, Japan
  • Volume
    4
  • fYear
    2003
  • fDate
    20-24 July 2003
  • Firstpage
    2540
  • Abstract
    A neural network tree (NNTree) is a hybrid learning model whose overall structure is a decision tree (DT), with each nonterminal node containing a neural network (NN). Using NNTrees, it is possible to learn new knowledge online by adjusting the NNs in the nonterminal nodes. It is also possible to understand the learned knowledge online, because the NNs in the nonterminal nodes are usually very small and can be interpreted easily. So far, we have studied retraining of NNTrees by adjusting the NNs in the nonterminal nodes while keeping the tree structure fixed. We found that this kind of retraining is effective for size reduction in offline learning when the training set is highly redundant; however, updating the NNs alone is not enough for online learning. In this paper, we introduce two methods for online learning of NNTrees: SGU (simple growing up) and GUWL (growing up with learning). The effectiveness of the two methods is compared through experiments on several public databases. (An illustrative sketch of the NNTree structure is given after this record.)
  • Keywords
    decision trees; learning (artificial intelligence); neural nets; NNTrees; decision tree; growing up with learning; hybrid learning model; neural network tree; nonterminal nodes; online learning; public databases; simple growing up; Computational complexity; Decision trees; Neural networks; Spatial databases;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Proceedings of the International Joint Conference on Neural Networks, 2003
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-7898-9
  • Type
    conf
  • DOI
    10.1109/IJCNN.2003.1223965
  • Filename
    1223965
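
The abstract describes the NNTree only structurally: a decision tree whose nonterminal nodes each contain a small neural network that can be adjusted online. The following is a minimal sketch of that structure, assuming a hypothetical single-layer perceptron as the node network and a perceptron-style online update; the class names, update rule, and example data are illustrative and are not the paper's SGU or GUWL procedures.

```python
# Minimal NNTree-style structure: a decision tree whose nonterminal nodes
# each hold a small neural network that routes a sample to a child branch.
# The routing network below is a hypothetical single-layer perceptron;
# the paper's actual node model and training rules are not given in the
# abstract, so everything here is an illustrative assumption.
import numpy as np

class LeafNode:
    def __init__(self, label):
        self.label = label

    def classify(self, x):
        return self.label

class NNNode:
    """Nonterminal node: a tiny one-layer network selects a child subtree."""
    def __init__(self, n_inputs, children, lr=0.1):
        self.children = children                      # list of subtrees
        self.w = np.zeros((len(children), n_inputs))  # one output per child
        self.b = np.zeros(len(children))
        self.lr = lr

    def branch(self, x):
        # Child with the largest network output wins.
        return int(np.argmax(self.w @ x + self.b))

    def classify(self, x):
        return self.children[self.branch(x)].classify(x)

    def update(self, x, target_branch):
        # Online adjustment of this node's NN (perceptron-style update),
        # in the spirit of "adjusting the NNs in the nonterminal nodes".
        chosen = self.branch(x)
        if chosen != target_branch:
            self.w[target_branch] += self.lr * x
            self.b[target_branch] += self.lr
            self.w[chosen] -= self.lr * x
            self.b[chosen] -= self.lr

# Usage: a depth-1 tree over 2-D inputs separating two classes online.
tree = NNNode(n_inputs=2, children=[LeafNode(0), LeafNode(1)])
for x, y in [(np.array([0.9, 0.1]), 0), (np.array([0.1, 0.9]), 1)] * 20:
    tree.update(x, target_branch=y)   # here the class label is the branch
print(tree.classify(np.array([0.8, 0.2])), tree.classify(np.array([0.2, 0.7])))
```

This sketch only covers updating the node networks while keeping the tree structure fixed; SGU and GUWL, as described in the abstract, additionally grow the tree during online learning, which is not attempted here.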