• DocumentCode
    303305
  • Title
    A neural model of sequential memory
  • Author
    Wang, DeLiang; Yuwono, Budi
  • Author_Institution
    Dept. of Comput. & Inf. Sci., Ohio State Univ., Columbus, OH, USA
  • Volume
    2
  • fYear
    1996
  • fDate
    3-6 Jun 1996
  • Firstpage
    834
  • Abstract
    A neural model for temporal pattern generation is analyzed for learning multiple complex sequences in a sequential manner. The network exhibits a degree of interference when new sequences are acquired. It is proven that the model is capable of incrementally learning a finite number of complex sequences. The model is then evaluated with a large set of highly correlated sequences. While the number of intact sequences increases linearly, the amount of retraining due to interference appears to be independent of the size of the existing memory. The idea of chunking helps to substantially reduce the amount of retraining in sequential learning. The network investigated here constitutes an effective sequential memory.
  • Keywords
    learning (artificial intelligence); neural nets; chunking; multiple complex sequences; neural model; retraining; sequential memory; temporal pattern generation; Artificial intelligence; Associative memory; Biological neural networks; Cognitive science; Context modeling; Information analysis; Information science; Interference; Multilayer perceptrons; Pattern analysis
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Title
    IEEE International Conference on Neural Networks, 1996
  • Conference_Location
    Washington, DC
  • Print_ISBN
    0-7803-3210-5
  • Type
    conf
  • DOI
    10.1109/ICNN.1996.549005
  • Filename
    549005