DocumentCode :
1171674
Title :
Neural networks for high-storage content-addressable memory: VLSI circuit and learning algorithm
Author :
Verleysen, Michel ; Sirletti, B. ; Vandemeulebroecke, André M. ; Jespers, Paul G.A.
Author_Institution :
Lab. of Microelectron., Univ. Catholique de Louvain, Louvain-la-Neuve, Belgium
Volume :
24
Issue :
3
fYear :
1989
fDate :
6/1/1989
Firstpage :
562
Lastpage :
569
Abstract :
An implementation of a VLSI fully interconnected neural network with only two binary memory points per synapse is described. The small area of a single synaptic cell allows the implementation of neural networks with hundreds of neurons. Classical learning algorithms such as Hebb's rule show poor storage capacity, especially in VLSI neural networks where the range of the synapse weights is limited by the number of memory points contained in each connection. An algorithm for programming a Hopfield neural network as a high-storage content-addressable memory is therefore proposed. The storage capacity obtained with this algorithm is very promising for pattern-recognition applications.
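Example (illustrative only) :
The following minimal Python sketch is not the authors' algorithm; it only illustrates the point the abstract makes about Hebb's rule under limited-precision synapses, by clipping the Hebbian weight matrix of a Hopfield network to a small integer range (as would result from a few memory points per connection). The function names hebb_store and recall and the parameter w_limit are hypothetical.

    # Sketch: Hebbian storage in a Hopfield network with clipped synapse weights.
    import numpy as np

    def hebb_store(patterns, w_limit=1):
        """Build a Hopfield weight matrix from bipolar (+1/-1) patterns,
        clipping each weight to [-w_limit, +w_limit] to mimic synapses
        realized with only a few binary memory points."""
        n = patterns.shape[1]
        w = np.zeros((n, n), dtype=int)
        for p in patterns:
            w += np.outer(p, p)          # Hebb's rule: w_ij += p_i * p_j
        np.fill_diagonal(w, 0)           # no self-connections
        return np.clip(w, -w_limit, w_limit)

    def recall(w, probe, steps=10):
        """Synchronous retrieval from a noisy probe (content-addressable recall)."""
        s = probe.copy()
        for _ in range(steps):
            s = np.where(w @ s >= 0, 1, -1)
        return s

    # Store 3 random 64-bit patterns, then recall the first one from a corrupted cue.
    rng = np.random.default_rng(0)
    pats = rng.choice([-1, 1], size=(3, 64))
    w = hebb_store(pats, w_limit=1)
    cue = pats[0].copy()
    cue[:8] *= -1                        # flip 8 bits of the cue
    print(np.array_equal(recall(w, cue), pats[0]))

With w_limit set to a small value, the number of patterns that can be stored and retrieved reliably drops quickly, which is the limitation the paper's proposed programming algorithm addresses.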
Keywords :
VLSI; computerised pattern recognition; content-addressable storage; integrated memory circuits; learning systems; neural nets; programming; CAM; Hopfield neural network; VLSI circuit; analogue VLSI implementation; content-addressable memory; fully interconnected neural network; learning algorithm; pattern-recognition applications; programming; storage capacity; synapse weights; synaptic cells; Artificial intelligence; Artificial neural networks; Biological neural networks; Biology computing; Circuits; Hopfield neural networks; Neural networks; Neurons; Pattern recognition; Very large scale integration;
fLanguage :
English
Journal_Title :
Solid-State Circuits, IEEE Journal of
Publisher :
ieee
ISSN :
0018-9200
Type :
jour
DOI :
10.1109/4.32008
Filename :
32008