• DocumentCode
    597237
  • Title
    RRAM-based adaptive neural logic block for implementing non-linearly separable functions in a single layer
  • Author
    Soltiz, M.; Merkel, Cory; Kudithipudi, Dhireesha; Rose, Garrett S.
  • Author_Institution
    Dept. of Comput. Eng., Rochester Inst. of Technol., Rochester, NY, USA
  • fYear
    2012
  • fDate
    4-6 July 2012
  • Firstpage
    218
  • Lastpage
    225
  • Abstract
    As the efficiency of neuromorphic systems improves, biologically inspired learning techniques are becoming increasingly appealing for a range of computing applications, from pattern and character recognition to general-purpose reconfigurable logic. Due to their functional similarity to synapses in the brain, memristors are becoming a key element in the hardware realization of Hebbian learning systems. By pairing such devices with a perceptron-based neuron model that uses a threshold activation function, previous work has shown that a neural logic block capable of learning any linearly separable function in real time can be developed. In this configuration, however, no function with two or more decision boundaries can be learned in a single layer. While previous memristor-based neural logic block designs achieve very low area and high performance compared to Look-Up Tables (LUTs) and Capacitive Threshold Logic (CTL), the limitation on the set of learnable functions has made networks of these logic blocks impractical to scale to realistic applications. By integrating an additional layer of memristors into the neural logic block, this paper proposes a logic block with an adaptive activation function. The resulting logic block is capable of learning any function in a single layer, reducing the number of logic blocks required to implement a single 4-input function by up to 10 and significantly improving training time. When used as a building block for ISCAS-85 benchmark circuits, the proposed logic block achieves an Energy-Delay Product (EDP) up to 97.8% lower than a neural logic block with a threshold activation function. Furthermore, the performance improvement over a CMOS LUT implementation ranges from 78.08% to 97.43% across all ISCAS-85 circuits.
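    The limitation the abstract refers to (a single-layer neuron with a hard threshold activation can only learn linearly separable functions) can be illustrated with a minimal sketch. The code below is not from the paper; it assumes a standard 2-input perceptron trained with the classic perceptron rule, and all names are illustrative. It shows AND (linearly separable) converging while XOR, which requires two decision boundaries, does not.
    ```python
    # Minimal sketch (not from the paper): a single neuron with a hard
    # threshold activation, trained with the classic perceptron rule.
    # AND converges; XOR never does, since it needs two decision boundaries.
    import itertools

    def train_threshold_perceptron(truth_table, lr=0.1, epochs=100):
        """Train a 2-input threshold neuron; return (weights, bias, converged)."""
        w, b = [0.0, 0.0], 0.0
        for _ in range(epochs):
            errors = 0
            for (x0, x1), target in truth_table.items():
                y = 1 if (w[0] * x0 + w[1] * x1 + b) > 0 else 0
                err = target - y
                if err != 0:
                    errors += 1
                    w[0] += lr * err * x0
                    w[1] += lr * err * x1
                    b += lr * err
            if errors == 0:
                return w, b, True   # all patterns correct: function learned
        return w, b, False          # never converged within the epoch budget

    inputs = list(itertools.product([0, 1], repeat=2))
    AND = {x: int(x[0] and x[1]) for x in inputs}   # linearly separable
    XOR = {x: int(x[0] ^ x[1]) for x in inputs}     # two decision boundaries

    for name, table in [("AND", AND), ("XOR", XOR)]:
        _, _, learned = train_threshold_perceptron(table)
        print(f"{name}: learnable in a single layer = {learned}")
    # Expected: AND is learned; XOR is not.
    ```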
  • Keywords
    CMOS integrated circuits; Hebbian learning; brain; memristors; random-access storage; table lookup; CMOS LUT implementation; Hebbian learning systems; ISCAS-85 benchmark circuits; RRAM-based adaptive neural logic block; adaptive activation function; biologically-inspired learning techniques; decision boundaries; energy-delay product; neuromorphic systems; nonlinearly separable functions; perceptron-based neuron; single 4-input function; single layer; threshold activation function; Multiplexing; Nanobioscience; Neurons
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    2012 IEEE/ACM International Symposium on Nanoscale Architectures (NANOARCH)
  • Conference_Location
    Amsterdam
  • Print_ISBN
    978-1-4503-1671-2
  • Type
    conf
  • Filename
    6464166