DocumentCode :
1501938
Title :
Stochastic Gradient Descent Inspired Training Technique for a CMOS/Nano Memristive Trainable Threshold Gate Array
Author :
Manem, Harika ; Rajendran, Jeyavijayan ; Rose, Garrett S.
Author_Institution :
Dept. of Electr. & Comput. Eng., New York Univ., Brooklyn, NY, USA
Volume :
59
Issue :
5
fYear :
2012
fDate :
5/1/2012
Firstpage :
1051
Lastpage :
1060
Abstract :
Neuromorphic computing is an attractive avenue of research for processing and learning complex real-world data. With technology migration into the nano and molecular scales, several area- and power-efficient approaches to the design and implementation of artificial neural networks have been proposed. The discovery of the memristor has further enabled the realization of denser nanoscale logic and memory systems by facilitating the implementation of multilevel logic. Specifically, the innate reconfigurability of memristors can be exploited to realize synapses in artificial neural networks. This work focuses on the development of a variation-tolerant training methodology to efficiently reconfigure memristive synapses in a Trainable Threshold Gate Array (TTGA) system. The training process is inspired by the gradient descent machine learning algorithm commonly used to train artificial threshold neural networks (perceptrons). The design and CMOS/nano implementation of the TTGA system from trainable perceptron-based threshold gates is detailed, and results are provided to showcase the training process and performance characteristics of the proposed system. Also shown are results for training a 1T1M (one-transistor, one-memristor) multilevel memristive memory, along with its performance characteristics.
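For context, the sketch below illustrates the standard stochastic-gradient-descent (perceptron) training rule that the abstract cites as its inspiration. It is a minimal software illustration only: the quantized weight levels standing in for multilevel memristive conductance states, and the level range, are illustrative assumptions, not the circuit-level update scheme described in the paper.

# Minimal sketch of stochastic gradient descent training of a perceptron
# (threshold gate). Weights are quantized to a finite set of levels to
# mimic discrete memristive conductance states (an assumption for
# illustration, not the paper's hardware update rule).
import numpy as np

def train_perceptron(X, y, levels=8, lr=1.0, epochs=20, seed=0):
    """X: (n_samples, n_inputs) array of 0/1 inputs; y: (n_samples,) 0/1 targets."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])  # synaptic weights (conductance analogues)
    b = 0.0                   # threshold / bias term
    for _ in range(epochs):
        for i in rng.permutation(len(X)):      # stochastic: one sample at a time
            out = 1 if X[i] @ w + b > 0 else 0
            err = y[i] - out                   # error term driving the update
            w += lr * err * X[i]               # update only the active inputs
            b += lr * err
            # quantize to a finite number of levels, mimicking multilevel
            # memristive states (assumed symmetric range [-levels, levels])
            w = np.clip(np.round(w), -levels, levels)
    return w, b

# Example: train a 2-input AND threshold gate
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
print(w, b, [int(x @ w + b > 0) for x in X])   # expected outputs: 0, 0, 0, 1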
Keywords :
CMOS logic circuits; electronic engineering computing; gradient methods; learning (artificial intelligence); logic gates; memristors; multivalued logic; neural nets; CMOS; artificial neural network; artificial threshold neural network training; complex real-world data learning; gradient descent machine learning algorithm; memory system; memristor reconfigurability; multilevel logic; multilevel memristive memory; nanomemristive trainable threshold gate array; nanoscale logic; neuromorphic computing; perceptron training; reconfigure memristive synapses; stochastic gradient descent inspired training technique; technology migration; transistor; variation-tolerant training methodology; Boolean functions; CMOS integrated circuits; Hardware; Logic gates; Memristors; Training; Vectors; Digital integrated circuits; VLSI; machine learning; memristor; nanoelectronics; neural networks;
fLanguage :
English
Journal_Title :
Circuits and Systems I: Regular Papers, IEEE Transactions on
Publisher :
IEEE
ISSN :
1549-8328
Type :
jour
DOI :
10.1109/TCSI.2012.2190665
Filename :
6189064