Title :
High Speed, Programmable Implementation of a Tanh-like Activation Function and Its Derivative for Digital Neural Networks
Author :
Marra, S. ; Iachino, M.A. ; Morabito, F.C.
Author_Institution :
Univ. Mediterranea of Reggio Calabria, Reggio Calabria
Abstract :
Digital hardware implementations of neural networks demand efficient computation of the neuron activation function. In this paper, two new circuits that implement a programmable tanh-like activation function and its derivative are presented. The function exhibits learning and generalization abilities fully comparable with those achieved by the typical activation functions and, moreover, can be implemented in hardware through binary shift operations only. The first derivative, unlike classical solutions, requires only a simple constant-coefficient multiplier, yielding a considerable advantage in terms of area occupancy and performance. The accuracy analysis, carried out in terms of average and maximum error, together with the high computational speed and the small amount of hardware resources, shows that the proposed approach is fully competitive with the most recent implementations of activation functions capable of learning.
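Illustrative_Sketch :
As a rough illustration of the kind of shift-only datapath the abstract describes, the C sketch below approximates tanh with a piecewise-linear function whose segment slopes are powers of two, so each segment needs only shifts and adds. The fixed-point format (Q4.12), the breakpoints and slopes, and the derivative computed via the identity f'(x) = 1 - f(x)^2 are illustrative assumptions, not the circuits or coefficients proposed in the paper (whose derivative circuit instead uses a constant-coefficient multiplier not detailed in the abstract).

#include <stdint.h>
#include <stdio.h>

/* Q4.12 fixed point: 1.0 is represented as 1 << 12. */
#define ONE  (1 << 12)
#define HALF (ONE >> 1)

/* Illustrative tanh-like piecewise-linear approximation.
 * Every segment slope is a power of two, so the datapath needs
 * only shifts and adds, no general multiplier. Breakpoints and
 * slopes are assumptions for this sketch, not the paper's values. */
static int32_t tanh_like(int32_t x)
{
    int32_t a = (x < 0) ? -x : x;    /* |x| */
    int32_t y;

    if (a < HALF)                    /* |x| < 0.5 : slope 1          */
        y = a;
    else if (a < ONE)                /* 0.5..1    : slope 1/2 (>> 1) */
        y = (a >> 1) + (ONE >> 2);
    else if (a < 2 * ONE)            /* 1..2      : slope 1/4 (>> 2) */
        y = (a >> 2) + HALF;
    else                             /* saturation region            */
        y = ONE;

    return (x < 0) ? -y : y;
}

/* Derivative via the tanh identity f'(x) = 1 - f(x)^2; the Q4.12
 * square is rescaled back with a shift. This identity-based form
 * is an assumption of the sketch, not the paper's derivative circuit. */
static int32_t tanh_like_deriv(int32_t x)
{
    int32_t f = tanh_like(x);
    return ONE - ((f * f) >> 12);
}

int main(void)
{
    for (int i = -3; i <= 3; i++) {
        int32_t x = i * ONE;
        printf("x=%d  f=%f  f'=%f\n", i,
               tanh_like(x) / (double)ONE,
               tanh_like_deriv(x) / (double)ONE);
    }
    return 0;
}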
Keywords :
neural nets; binary shift operations; digital hardware; digital neural networks; neuron activation function; programmable tanh-like activation function; Approximation methods; Arithmetic; Circuits; Clocks; Computer networks; Field programmable gate arrays; Microcomputers; Neural network hardware; Neural networks; Piecewise linear techniques;
Conference_Title :
2007 International Joint Conference on Neural Networks (IJCNN 2007)
Conference_Location :
Orlando, FL
Print_ISBN :
978-1-4244-1379-9
Electronic_ISSN :
1098-7576
DOI :
10.1109/IJCNN.2007.4371008