Abstract:
An alternative approach to implementing nonlinear activation functions for digital neural networks is presented. Unlike other methods, this approach has the advantage that the activation is evaluated by the common arithmetic unit already required for the network computations, so no look-up table is needed.
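As a minimal illustration of the general idea (computing a sigmoid-like activation with only the arithmetic operations a multiply-accumulate unit already provides, rather than a look-up table), the following C sketch uses the rational approximation f(x) = 0.5·x/(1+|x|) + 0.5. This particular approximation is an assumption chosen purely for illustration and is not necessarily the function or method described in the paper.

```c
#include <stdio.h>
#include <math.h>

/* Illustrative sketch only: a sigmoid-like activation computed with ordinary
 * arithmetic operations (absolute value, add, multiply, divide), so it can be
 * evaluated by the same arithmetic unit used for the network's weighted sums,
 * with no look-up table. The specific formula is an assumption for this
 * example, not the paper's method. */
static double activation_no_lut(double x)
{
    return 0.5 * x / (1.0 + fabs(x)) + 0.5;
}

int main(void)
{
    /* Compare the arithmetic-only approximation against the exact
     * logistic sigmoid at a few sample points. */
    for (double x = -4.0; x <= 4.0; x += 2.0) {
        double approx = activation_no_lut(x);
        double exact  = 1.0 / (1.0 + exp(-x));
        printf("x = %+5.1f  approx = %.4f  logistic = %.4f\n", x, approx, exact);
    }
    return 0;
}
```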