Title :
Lower bounds on the capacities of binary and ternary networks storing sparse random vectors
Author :
Baram, Yoram; Salée, D.
Author_Institution :
Technion-Israel Inst. of Technol., Haifa, Israel
fDate :
November 1992
Abstract :
It is shown that the memory capacity of networks of binary neurons storing, by the Hebbian rule, sparse random vectors over the field {0,1}^N is at least cN/(p log N), where c is a positive scalar involving the input error probabilities and p is the probability of an element being nonzero. A similar bound is derived for networks of ternary neurons storing sparse vectors over {-1,0,1}^N. These results, pertaining to stability and error correction with probability tending to one as the number of neurons tends to infinity, generalize and extend previously known capacity bounds for binary networks storing vectors of equally probable {±1} bits. Lower bounds on the capacities of binary and ternary networks of finite sizes are also derived. These bounds suggest critical network sizes that guarantee high gains in capacity per neuron for given sparsities.
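The storage scheme described in the abstract can be illustrated with a minimal sketch: sparse random {0,1}^N patterns are stored by a Hebbian outer-product rule and checked for stability under a threshold update. The covariance form of the rule (subtracting the mean activity p) and the particular threshold choice below are illustrative assumptions for sparse patterns, not the authors' exact model; the parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 500   # number of neurons
p = 0.05  # probability that a component is 1 (sparsity)
M = 30    # number of stored patterns (illustrative)

# Sparse random vectors over {0,1}^N
X = (rng.random((M, N)) < p).astype(float)

# Hebbian (covariance) rule: subtracting the mean activity p keeps the
# weights balanced when the patterns are sparse; no self-connections.
W = (X - p).T @ (X - p)
np.fill_diagonal(W, 0.0)

def update(v):
    """One synchronous threshold update of the binary network."""
    h = W @ (v - p)
    # Threshold set midway between the expected fields of "on" and
    # "off" neurons for the current activity level (an assumption here).
    theta = 0.5 * (1.0 - 2.0 * p) * np.sum((v - p) ** 2)
    return (h > theta).astype(float)

# Stability: each stored pattern should be a fixed point of the update.
stable = sum(np.array_equal(update(x), x) for x in X)
print(f"{stable}/{M} stored patterns are fixed points")
```

At this loading (M well below N/(p log N) up to the constant), the cross-talk between patterns is small relative to the signal term, so the stored vectors remain fixed points; pushing M higher degrades stability, which is the regime the capacity bound characterizes.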
Keywords :
Hebbian learning; content-addressable storage; error correction; neural nets; vectors; Hebbian rule; associative memory; binary networks; binary neurons; input error probabilities; lower bounds; memory capacity; neural networks; sparse random vectors; stability; ternary networks; ternary neurons; Associative memory; Capacity planning; Character recognition; Error correction; Error probability; H infinity control; Neurons; Sampling methods; Sensor arrays; Stability
Journal_Title :
IEEE Transactions on Information Theory