• DocumentCode
    659187
  • Title
    Maximum likelihood associative memories
  • Author
    Gripon, Vincent; Rabbat, Michael
  • Author_Institution
    Electron. Dept., Telecom Bretagne, Brest, France
  • fYear
    2013
  • fDate
    9-13 Sept. 2013
  • Firstpage
    1
  • Lastpage
    5
  • Abstract
    Associative memories are structures that store data in such a way that it can later be retrieved given only part of its content, a form of error/erasure resilience. They are used in applications ranging from caches and memory management in CPUs to database engines. In this work we study associative memories built on the maximum likelihood principle. First, we derive minimum residual error rates when the stored data comes from a uniform binary source. Second, we determine the minimum amount of memory required to store the same data. Finally, we bound the computational complexity of message retrieval. We then compare these bounds with two existing associative memory architectures: the celebrated Hopfield neural networks and a neural network architecture introduced more recently by Gripon and Berrou.
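
A minimal sketch of the retrieval principle summarized in the abstract, assuming uniformly drawn binary messages queried through erasures (the function and variable names below are illustrative, not taken from the paper): under these assumptions, maximum-likelihood retrieval reduces to returning the stored messages that agree with the query on every non-erased position.

import random

def ml_retrieve(stored, query):
    # stored: list of binary tuples (the memorized messages).
    # query:  tuple with 0/1 at known positions and None at erased positions.
    # With a uniform binary source and erasure-only queries, ML retrieval
    # keeps every stored message consistent with the observed positions.
    return [m for m in stored
            if all(q is None or q == b for q, b in zip(query, m))]

# Toy usage: store four random 8-bit messages, then query with three erased bits.
random.seed(0)
messages = [tuple(random.randint(0, 1) for _ in range(8)) for _ in range(4)]
probe = list(messages[0])
probe[2] = probe[5] = probe[7] = None   # erase three positions
print(ml_retrieve(messages, tuple(probe)))

Retrieval fails only when several stored messages match the observed positions; that ambiguity is the kind of residual error whose minimum rate the abstract refers to.
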
  • Keywords
    Hopfield neural nets; computational complexity; content-addressable storage; maximum likelihood decoding; Hopfield neural networks; computational complexity; error-erasure-resilience property; maximum likelihood associative memories; maximum likelihood decoding principle; message retrieval; minimum residual error rates; uniform binary source; Associative memory; Biological neural networks; Computational complexity; Computer architecture; Error analysis
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Information Theory Workshop (ITW), 2013 IEEE
  • Conference_Location
    Sevilla
  • Print_ISBN
    978-1-4799-1321-3
  • Type
    conf
  • DOI
    10.1109/ITW.2013.6691310
  • Filename
    6691310