Title :
The capacity of the Kanerva associative memory
Author_Institution :
Dept. of Electr. Eng., Stanford Univ., CA, USA
Date :
1 March 1989
Abstract :
Asymptotic expressions for the capacity of an associative memory proposed by P. Kanerva (1984) are derived. Capacity is defined as the maximum number of random binary words that can be stored at random addresses so that the probability that a word is in error is arbitrarily small when it is retrieved by an n-bit address containing fewer than δn errors, δ⩽1/2. Sphere-packing arguments show that the capacity of any associative memory can grow exponentially in n at a rate of at most 1-h2(δ), where h2(δ) is the binary entropy function in bits. It turns out that the Kanerva associative memory achieves this upper bound when its parameters are optimally set. Thus, the capacity of the Kanerva associative memory has an exponential growth rate equal to the rate of the best information-theoretic codes, that is, 1-h2(δ). However, the Kanerva memory achieves its exponential growth in capacity at the expense of an exponential growth in hardware.
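As a rough illustration of the sphere-packing bound stated in the abstract, the following Python sketch evaluates the binary entropy function h2(δ) and the resulting capacity exponent 1-h2(δ). The function names and the numeric example (n = 1000, δ = 0.1) are illustrative choices, not taken from the paper.

    import math

    def binary_entropy(p: float) -> float:
        """Binary entropy h2(p) in bits; h2(0) = h2(1) = 0 by convention."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

    def capacity_exponent(delta: float) -> float:
        """Sphere-packing growth-rate exponent 1 - h2(delta), for delta <= 1/2."""
        assert 0.0 <= delta <= 0.5
        return 1.0 - binary_entropy(delta)

    # Example: with n = 1000 address bits and up to delta*n = 100 address errors,
    # the number of storable words grows roughly like 2**(n * (1 - h2(delta))).
    n, delta = 1000, 0.1
    rate = capacity_exponent(delta)  # 1 - h2(0.1) is about 0.531 bits per address bit
    print(f"growth-rate exponent: {rate:.3f}")
    print(f"capacity on the order of 2^{n * rate:.0f} words")

Since h2(δ) approaches 1 as δ approaches 1/2, the exponent vanishes there: tolerating nearly half the address bits in error leaves essentially no exponential capacity, consistent with the δ⩽1/2 restriction above.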
Keywords :
content-addressable storage; information theory; neural nets; Kanerva associative memory; asymptotic expressions; binary entropy function; capacity; information-theoretic codes; neural networks; random addresses; random binary words; sphere packing arguments; upper bound; Artificial neural networks; Associative memory; Brain modeling; Capacity planning; Computational modeling; Entropy; Hardware; Humans; Upper bound; Vector quantization;
Journal_Title :
IEEE Transactions on Information Theory