DocumentCode :
6819
Title :
Efficient VLSI Implementation of Neural Networks With Hyperbolic Tangent Activation Function
Author :
Zamanlooy, Babak ; Mirhassani, Mitra
Author_Institution :
Dept. of Electr. & Comput. Eng., Univ. of Windsor, Windsor, ON, Canada
Volume :
22
Issue :
1
fYear :
2014
fDate :
Jan. 2014
Firstpage :
39
Lastpage :
48
Abstract :
The nonlinear activation function is one of the main building blocks of artificial neural networks. The hyperbolic tangent and sigmoid are the most widely used nonlinear activation functions, and accurate implementation of these transfer functions in digital hardware faces certain challenges. In this paper, an efficient approximation scheme for the hyperbolic tangent function is proposed. The approximation is based on a mathematical analysis that treats the maximum allowable error as a design parameter. A hardware implementation of the proposed approximation scheme is presented, which shows that the proposed structure compares favorably with previous architectures in terms of area and delay. The proposed structure requires fewer output bits for the same maximum allowable error than the state-of-the-art. Because the number of output bits of the activation function determines the bit width of the multipliers and adders in the network, the proposed activation function reduces area, delay, and power in VLSI implementations of artificial neural networks with the hyperbolic tangent activation function.
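The abstract's core idea, approximating tanh under a maximum-allowable-error budget, can be illustrated with a simple sketch. This is a generic uniform piecewise-linear scheme written for clarity, not the paper's actual architecture (the paper's method and its hardware structure are not reproduced here); the function names, the breakpoint placement, and the search loop are all illustrative assumptions.

```python
import math

def pwl_tanh(x, n_seg=10, x_max=3.0):
    """Illustrative piecewise-linear tanh (hypothetical, not the paper's scheme):
    uniform breakpoints on [0, x_max], odd symmetry, saturation beyond x_max."""
    s = -1.0 if x < 0 else 1.0
    a = min(abs(x), x_max)             # exploit tanh(-x) = -tanh(x)
    h = x_max / n_seg                  # uniform segment width
    i = min(int(a / h), n_seg - 1)     # segment index (clamped at the last segment)
    x0, x1 = i * h, (i + 1) * h
    y0, y1 = math.tanh(x0), math.tanh(x1)
    return s * (y0 + (y1 - y0) * (a - x0) / h)

def segments_for_error(eps, x_max=3.0):
    """Smallest uniform segment count whose measured error stays within the
    maximum allowable error eps (checked numerically on a dense grid;
    assumes eps > 1 - tanh(x_max) so the saturated tail also qualifies)."""
    grid = [k * 0.001 for k in range(0, 6001)]  # sample 0..6
    n = 1
    while max(abs(pwl_tanh(x, n, x_max) - math.tanh(x)) for x in grid) > eps:
        n += 1
    return n
```

In this sketch, tightening `eps` increases the segment count and, in a hardware realization, the precision needed at the output; this mirrors the abstract's point that the error budget drives the output bit width and hence the multiplier and adder widths downstream.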
Keywords :
VLSI; adders; integrated circuit design; integrated logic circuits; neural nets; transfer functions; VLSI implementation; adder bit width; artificial neural networks; building blocks; design parameter; digital networks; hardware implementation; hyperbolic tangent; hyperbolic tangent activation function; mathematical analysis; maximum allowable error; multiplier bit width; neural networks; nonlinear activation function; nonlinear activation functions; sigmoid; transfer functions; Hardware; Linear approximation; Mathematical analysis; Neural networks; Piecewise linear approximation; Very large scale integration; Hyperbolic tangent; VLSI implementation; neural networks; nonlinear activation function;
fLanguage :
English
Journal_Title :
IEEE Transactions on Very Large Scale Integration (VLSI) Systems
Publisher :
IEEE
ISSN :
1063-8210
Type :
jour
DOI :
10.1109/TVLSI.2012.2232321
Filename :
6409494