Author_Institution :
Dept. of Comput. Sci., Technion-Israel Inst. of Technol., Haifa, Israel
Abstract :
A neural network that retrieves stored binary vectors when probed by possibly corrupted versions of them is presented. It employs sparse ternary internal coding and autocorrelation (Hebbian) storage. It is symmetrically structured and can consequently be folded into a feedback configuration. Bounds on the network parameters are derived from probabilistic considerations. It is shown that when the input dimension is n, the proportional activation radius is ρ, and the network size is 2^{νn} with ν > 1 − h₂(ρ), the equilibrium capacity is at least 2^{αn}/(8nρ(1−ρ)) for any α < 1 − h₂(ρ), where h₂(·) is the binary entropy function. A similar capacity bound is derived for the correction of errors of proportional size ρ or less, when ρ ⩽ 0.3. The performance of a finite-size symmetric network is examined by simulation and found to exceed, at the cost of higher connectivity, that of the Kanerva (1988) model operating as a content-addressable memory.
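The capacity statement above is purely combinatorial, so it can be checked numerically. The following Python sketch evaluates the binary entropy h₂(ρ) and the stated lower bound 2^{αn}/(8nρ(1−ρ)) under the condition α < 1 − h₂(ρ); the function names and the parameter values n = 100, ρ = 0.1 are illustrative assumptions, not values taken from the paper.

```python
import math

def binary_entropy(p):
    """h2(p) = -p*log2(p) - (1-p)*log2(1-p), with h2(0) = h2(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def equilibrium_capacity_bound(n, rho, alpha):
    """Lower bound 2^(alpha*n) / (8*n*rho*(1-rho)) quoted in the abstract,
    valid for any alpha < 1 - h2(rho)."""
    if not alpha < 1.0 - binary_entropy(rho):
        raise ValueError("bound requires alpha < 1 - h2(rho)")
    return 2.0 ** (alpha * n) / (8.0 * n * rho * (1.0 - rho))

# Illustrative parameters (assumptions, not from the paper):
n, rho = 100, 0.1
h2 = binary_entropy(rho)
alpha = 0.9 * (1.0 - h2)   # any alpha below 1 - h2(rho)
nu = 1.1 * (1.0 - h2)      # network size 2^(nu*n) requires nu > 1 - h2(rho)

print(f"h2(rho)         = {h2:.4f}")
print(f"required nu     > {1.0 - h2:.4f}  (chosen nu = {nu:.4f})")
print(f"capacity bound >= {equilibrium_capacity_bound(n, rho, alpha):.3e}")
```

For n = 100 and ρ = 0.1 this gives h₂(0.1) ≈ 0.469, so the exponent α can approach 0.531 and the bound grows exponentially in n, as the abstract asserts.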
Keywords :
Hebbian learning; content-addressable storage; correlation methods; encoding; error correction codes; neural nets; autocorrelation Hebbian storage; binary entropy; capacity bound; content-addressable memory; corrupted versions; equilibrium capacity; feedback configuration; finite-size symmetric network; input dimension; network parameter bounds; network size; neural network; proportional activation radius; sparse ternary internal coding; stored binary vectors retrieval; symmetric sparsely encoded network; associative memory; autocorrelation; entropy; error correction; neural networks; neurons