• DocumentCode
    1209129
  • Title
    The self-trapping attractor neural network-part II: properties of a sparsely connected model storing multiple memories
  • Author
    Pavloski, Raymond; Karimi, Majid
  • Author_Institution
    Dept. of Psychol., Indiana Univ. of Pennsylvania, PA, USA
  • Volume
    16
  • Issue
    6
  • fYear
    2005
  • Firstpage
    1427
  • Lastpage
    1439
  • Abstract
    In a previous paper, the self-trapping network (STN) was introduced as more biologically realistic than attractor neural networks (ANNs) based on the Ising model. This paper extends the previous analysis of a one-dimensional (1-D) STN storing a single memory to a model that stores multiple memories and that possesses generalized sparse connectivity. The energy, Lyapunov function, and partition function derived for the 1-D model are generalized to the case of an attractor network with only near-neighbor synapses, coupled to a system that computes memory overlaps. Simulations reveal that 1) the STN dramatically reduces intra-ANN connectivity without severely affecting the size of basins of attraction, with fast self-trapping able to sustain attractors even in the absence of intra-ANN synapses; 2) the basins of attraction can be controlled by a single free parameter, providing natural attention-like effects; 3) the same parameter determines the memory capacity of the network, and the latter is much less dependent than a standard ANN on the noise level of the system; 4) the STN serves as a useful memory for some correlated memory patterns for which the standard ANN totally fails; 5) the STN can store a large number of sparse patterns; and 6) a Monte Carlo procedure, a competitive neural network, and binary neurons with thresholds can be used to induce self-trapping.
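    The abstract compares the STN against the standard Hopfield-style ANN and refers to "memory overlaps" between the network state and the stored patterns. The minimal sketch below illustrates only that baseline (Hebbian couplings, asynchronous zero-temperature updates, and the overlap quantities), not the STN itself; the network size, pattern count, and corruption level are illustrative choices, not values from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative parameters (assumptions, not from the paper)
    N, P = 200, 3  # number of neurons, number of stored patterns

    # Random binary (+/-1) memory patterns xi^mu
    patterns = rng.choice([-1, 1], size=(P, N))

    # Hebbian coupling matrix of a standard fully connected Hopfield ANN
    J = (patterns.T @ patterns).astype(float) / N
    np.fill_diagonal(J, 0.0)  # no self-coupling

    def overlaps(s):
        """Memory overlaps m_mu = (1/N) * sum_i xi^mu_i * s_i."""
        return patterns @ s / N

    def retrieve(s, sweeps=10):
        """Deterministic (zero-noise) asynchronous spin updates."""
        s = s.copy()
        for _ in range(sweeps):
            for i in rng.permutation(N):
                s[i] = 1 if J[i] @ s >= 0 else -1
        return s

    # Cue the network with a corrupted copy of pattern 0 (30% of units flipped)
    cue = patterns[0].copy()
    flipped = rng.choice(N, size=int(0.3 * N), replace=False)
    cue[flipped] *= -1

    final = retrieve(cue)
    print(overlaps(final))  # overlap with pattern 0 approaches 1 on retrieval
    ```

    With P well below the classical capacity limit (roughly 0.14N for this model), the dynamics fall into the basin of the cued pattern and its overlap approaches 1; the paper's point is that the STN achieves comparable basins with far sparser intra-ANN connectivity.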
  • Keywords
    Hopfield neural nets; Ising model; Lyapunov methods; Monte Carlo methods; content-addressable storage; pattern recognition; self-focusing; storage area networks; ANN; Hopfield model; Ising model; Lyapunov function; Monte Carlo methods; associative memory; binary neuron; competitive neural network; correlated memory pattern; coupled system; memory overlap; multiple memories; network memory capacity; partition function; self-trapping attractor neural network; single memory; Artificial neural networks; Biological system modeling; Computational modeling; Computer networks; Lyapunov method; Monte Carlo methods; Neural networks; Neurons; Noise level; Size control; Associative memory; Hopfield model; Ising model; attractor neural network (ANN); connectivity; coupled systems; self-trapping network (STN); Algorithms; Biomimetics; Computer Simulation; Information Storage and Retrieval; Memory; Models, Theoretical; Neural Networks (Computer); Signal Processing, Computer-Assisted;
  • fLanguage
    English
  • Journal_Title
    Neural Networks, IEEE Transactions on
  • Publisher
    IEEE
  • ISSN
    1045-9227
  • Type
    jour
  • DOI
    10.1109/TNN.2005.852969
  • Filename
    1528521