• DocumentCode
    285200
  • Title
    Partially trained neural networks based on partition of unity
  • Author
    Choi, Chong-Ho; Choi, Jin Young
  • Author_Institution
    Dept. of Control & Instrum. Eng., Seoul Nat. Univ., South Korea
  • Volume
    3
  • fYear
    1992
  • fDate
    7-11 Jun 1992
  • Firstpage
    811
  • Abstract
    The authors propose partially trained neural networks (PTNNs), in which only a part of the connection weights is trained at a time, to improve generalization, learning speed, computational cost, and incremental learning capability. A PTNN is composed of many small neural network fractions and firing neurons; the firing neuron fires a fraction of the PTNN depending on the input pattern. The main features of the PTNN are partial updating of weights, self-determined network size, no corruption of previously learned knowledge, reduced computational time, fewer connections, and fast convergence on complicated problems. Simulations showed that the learning speed and computational time of PTNNs were superior to those of standard neural networks on a complicated continuous function and the two-spiral problem. (An illustrative code sketch of this scheme follows the record.)
  • Keywords
    learning (artificial intelligence); neural nets; PTNNs; connection weights; continuous function; firing neurons; neural network fractions; partially trained neural networks; partition of unity; two-spiral problem; Artificial neural networks; Backpropagation; Computational modeling; Computer networks; Convergence; Instruments; Multi-layer neural network; Neural networks; Neurons; Partitioning algorithms
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    International Joint Conference on Neural Networks (IJCNN), 1992
  • Conference_Location
    Baltimore, MD
  • Print_ISBN
    0-7803-0559-0
  • Type
    conf
  • DOI
    10.1109/IJCNN.1992.227052
  • Filename
    227052
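
Below is a minimal Python/NumPy sketch of the scheme described in the abstract: small network fractions blended by partition-of-unity weights, with only the fraction "fired" for a given input being updated. The one-hidden-layer fraction structure, normalized Gaussian bumps, center placement, learning rate, and toy target function are assumptions chosen for illustration, not the authors' formulation.

    import numpy as np

    rng = np.random.default_rng(0)

    class LocalNet:
        """One small network fraction (one hidden layer; structure is an assumption)."""
        def __init__(self, n_in, n_hidden, n_out):
            self.W1 = rng.normal(scale=0.5, size=(n_hidden, n_in))
            self.b1 = np.zeros(n_hidden)
            self.W2 = rng.normal(scale=0.5, size=(n_out, n_hidden))
            self.b2 = np.zeros(n_out)

        def forward(self, x):
            self.h = np.tanh(self.W1 @ x + self.b1)  # cache hidden activations for backward
            return self.W2 @ self.h + self.b2

        def backward(self, x, grad_out, lr):
            # Gradient step on this fraction's weights only (partial update).
            dh = (self.W2.T @ grad_out) * (1.0 - self.h ** 2)
            self.W2 -= lr * np.outer(grad_out, self.h)
            self.b2 -= lr * grad_out
            self.W1 -= lr * np.outer(dh, x)
            self.b1 -= lr * dh

    def pou_weights(x, centers, width):
        """Normalized Gaussian bumps: non-negative weights summing to one (a partition of unity)."""
        d2 = np.sum((centers - x) ** 2, axis=1)
        phi = np.exp(-d2 / (2.0 * width ** 2))
        return phi / phi.sum()

    # Toy task: approximate a 1-D continuous function with four local fractions.
    centers = np.linspace(-1.0, 1.0, 4)[:, None]
    nets = [LocalNet(n_in=1, n_hidden=8, n_out=1) for _ in centers]
    width, lr = 0.4, 0.05

    def target(x):
        return np.sin(3.0 * np.pi * x)

    for _ in range(200):                               # training epochs
        for x in rng.uniform(-1.0, 1.0, size=64):      # random training inputs
            xv = np.array([x])
            w = pou_weights(xv, centers, width)
            k = int(np.argmax(w))                      # "firing": pick the fraction for this input
            y = sum(wi * net.forward(xv) for wi, net in zip(w, nets))
            err = y - target(x)                        # gradient of 0.5 * err**2 w.r.t. y
            # Only the fired fraction is trained; its output gradient is scaled
            # by its partition-of-unity weight (chain rule through the blend).
            nets[k].backward(xv, w[k] * err, lr)

Because the blending weights are non-negative and sum to one, updating only the fired fraction changes the overall function mainly near that fraction's region of the input space, which is consistent with the features the abstract lists (partial weight updates, no corruption of old learning, reduced computational time).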