• DocumentCode
    3408545
  • Title
    Superimposing memory by dynamic and spatial changing synaptic weights
  • Author
    Homma, Noriyasu ; Gupta, Madan M. ; Abe, Kenichi ; Takeda, Hiroshi
  • Author_Institution
    Tohoku Univ., Sendai, Japan
  • Volume
    5
  • fYear
    2002
  • fDate
    5-7 Aug. 2002
  • Firstpage
    3100
  • Abstract
    A novel neural network model is presented for incremental learning tasks, in which networks are required to learn new knowledge without forgetting the old. The essential core of the proposed network structure is its dynamic and spatial changing weights (DSCWs). A learning scheme is developed for the formulation of the dynamically changing weights, while structural adaptation is formulated through the spatially changing (growing) connection weights. As new synaptic connections are formed, a new network structure is superimposed on the previous one. To avoid disturbing past knowledge when new connections are created in this superimposition, a restoration mechanism is introduced using the DSCWs. The usefulness of the proposed model is demonstrated on pattern classification and system identification tasks.
  • Keywords
    learning (artificial intelligence); neural nets; pattern classification; DSCWs; dynamic spatial changing synaptic weights; function approximation; incremental learning tasks; learning scheme; long-term memory; neural network model; past knowledge; pattern classification; restoration mechanism; structural adaptation; synaptic connections; system identification tasks; Backpropagation; Computational modeling; Joining processes; Nerve fibers; Neural networks; Neurons;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    SICE 2002. Proceedings of the 41st SICE Annual Conference
  • Print_ISBN
    0-7803-7631-5
  • Type
    conf
  • DOI
    10.1109/SICE.2002.1195603
  • Filename
    1195603
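
The abstract above describes incremental learning by superimposing new (growing) connections on an existing network while a restoration mechanism, realized through the DSCWs, keeps previously learned weights from being disturbed. The paper's exact formulation is not reproduced in this record; the following is a minimal NumPy sketch of the general idea only, in which new weights are grown for a second task and a simple pull-back term relaxes the protected old weights toward their stored values after every update. All names (train, protect_mask, lam, W_memory) and the restoration rule itself are illustrative assumptions, not the authors' equations.

    import numpy as np

    rng = np.random.default_rng(0)

    def train(W, X, Y, protect_mask=None, W_memory=None, lr=0.05, lam=0.5, epochs=200):
        # Gradient descent on squared error; weights flagged by protect_mask are
        # relaxed back toward W_memory after every step (stand-in "restoration").
        for _ in range(epochs):
            err = X @ W - Y
            W = W - lr * (X.T @ err) / len(X)
            if protect_mask is not None:
                W[protect_mask] += lam * (W_memory[protect_mask] - W[protect_mask])
        return W

    # Task 1: learn a 2-input linear mapping, then memorize the trained weights.
    X1 = rng.normal(size=(200, 2))
    Y1 = X1 @ np.array([[1.0], [-2.0]])
    W = train(rng.normal(scale=0.1, size=(2, 1)), X1, Y1)
    W_memory = W.copy()

    # Task 2 introduces a third input: grow (superimpose) a new connection weight
    # while the old weights are protected by the restoration term.
    X2 = rng.normal(size=(200, 3))
    Y2 = X2 @ np.array([[1.0], [-2.0], [0.5]])
    W = np.vstack([W, rng.normal(scale=0.1, size=(1, 1))])
    W_memory = np.vstack([W_memory, np.zeros((1, 1))])
    protect = np.zeros(W.shape, dtype=bool)
    protect[:2] = True
    W = train(W, X2, Y2, protect_mask=protect, W_memory=W_memory)

    # The old task should still be solved after learning the new one.
    print("task-1 residual error:", float(np.abs(X1 @ W[:2] - Y1).mean()))

In the paper itself the protection of old knowledge follows from the dynamics of the DSCWs rather than from a fixed pull-back coefficient as used in this sketch.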