Title :
Speeding up MLP execution by approximating neural network activation functions
Author :
Cancelliere, Rossella
Author_Institution :
Dipt. di Matematica, Torino Univ., Italy
Date :
31 Aug-2 Sep 1998
Abstract :
At present the multilayer perceptron (MLP) is, without doubt, the neural network most widely used in applications, so it is important to design and test methods that improve MLP efficiency at run time. This paper analyzes the error introduced by a simple but effective method for cutting down the execution time of MLP networks that process sequential input. This is a very common case, covering all kinds of temporal processing, such as speech, video, and, in general, signals varying in time. The technique requires neither specialized hardware nor large quantities of additional memory, and it is based on the ubiquitous idea of difference transmission, widely used in signal coding. It requires the introduction of a sort of quantization of the unit activation function; this causes an error, which is analyzed in this paper from a theoretical point of view.
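What follows is a minimal sketch, in Python/NumPy, of the difference-transmission scheme the abstract describes, under assumed details the abstract does not fix: a one-hidden-layer MLP with sigmoid units whose activations are rounded to a uniform grid of n_levels values, so that from one step of the input sequence to the next only hidden units whose quantized activation has changed propagate a nonzero difference to the output layer. All names here (quantized_sigmoid, DiffMLP, n_levels) are illustrative, not taken from the paper.

    import numpy as np

    def quantized_sigmoid(x, n_levels=64):
        # Sigmoid rounded to multiples of 1/n_levels: this is the
        # "sort of quantization" whose error the paper analyzes.
        s = 1.0 / (1.0 + np.exp(-x))
        return np.round(s * n_levels) / n_levels

    class DiffMLP:
        # One-hidden-layer MLP that processes a sequence step by step,
        # updating the output layer only for hidden units whose
        # quantized activation differs from the previous step.
        def __init__(self, W1, b1, W2, b2, n_levels=64):
            self.W1, self.b1, self.W2, self.b2 = W1, b1, W2, b2
            self.n_levels = n_levels
            self.h_prev = None   # quantized hidden activations at t-1
            self.out = None      # running output accumulator

        def step(self, x):
            h = quantized_sigmoid(self.W1 @ x + self.b1, self.n_levels)
            if self.h_prev is None:
                # First input of the sequence: one full forward pass.
                self.out = self.W2 @ h + self.b2
            else:
                # Later inputs: unchanged units transmit a zero
                # difference and cost nothing downstream.
                changed = h != self.h_prev
                self.out = self.out + self.W2[:, changed] @ (h - self.h_prev)[changed]
            self.h_prev = h
            return self.out

Coarser quantization (a smaller n_levels) leaves more activations unchanged between consecutive inputs and so saves more output-layer multiply-accumulates, at the price of a larger approximation error; that trade-off is exactly what the paper studies theoretically. (On vectorized NumPy the masked update may not beat the full product; the savings are meant for serial or embedded execution.)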
Keywords :
approximation theory; encoding; multilayer perceptrons; signal processing; time-varying systems; transfer functions; MLP execution; difference transmission; multilayer perceptron model; neural network activation function approximation; quantization; run-time efficiency; sequential input; signal coding; speech; temporal processing; time-varying signals; unit activation function; video; Design methodology; Error analysis; Multi-layer neural network; Multilayer perceptrons; Neural networks; Signal analysis; Signal processing; Speech analysis; Speech processing; Testing;
Conference_Title :
Neural Networks for Signal Processing VIII, 1998. Proceedings of the 1998 IEEE Signal Processing Society Workshop
Conference_Location :
Cambridge
Print_ISBN :
0-7803-5060-X
DOI :
10.1109/NNSP.1998.710659