DocumentCode :
1029734
Title :
Lower bounds on the capacities of binary and ternary networks storing sparse random vectors
Author :
Baram, Yoram; Sal'ee, D.
Author_Institution :
Technion-Israel Inst. of Technol., Haifa, Israel
Volume :
38
Issue :
6
fYear :
1992
fDate :
11/1/1992
Firstpage :
1633
Lastpage :
1647
Abstract :
It is shown that the memory capacity of networks of binary neurons storing, by the Hebbian rule, sparse random vectors over the field {0, 1}^N is at least cN/(p log N), where c is a positive scalar involving the input error probabilities and p is the probability of an element being nonzero. A similar bound is derived for networks of ternary neurons storing sparse vectors over {-1, 0, 1}^N. These results, pertaining to stability and error correction with probability tending to one as the number of neurons tends to infinity, generalize and extend previously known capacity bounds for binary networks storing vectors of equally probable {±1} bits. Lower bounds on the capacities of binary and ternary networks of finite sizes are also derived. These bounds suggest critical network sizes that guarantee high gains in capacity per neuron for given sparsities.
Keywords :
Hebbian learning; content-addressable storage; error correction; neural nets; vectors; Hebbian rule; associative memory; binary networks; binary neurons; input error probabilities; lower bounds; memory capacity; neural networks; sparse random vectors; stability; ternary networks; ternary neurons; Associative memory; Capacity planning; Character recognition; Error correction; Error probability; H infinity control; Neurons; Sampling methods; Sensor arrays; Stability;
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/18.165439
Filename :
165439