Title :
Strategies for reducing the complexity of an RNN-based speech recognizer
Author :
Kasper, Klaus ; Reininger, H. ; Wüst, H.
Author_Institution :
Inst. für Angewandte Phys., Frankfurt Univ., Germany
Abstract :
Recurrent neural networks (RNN) provide a solution for low-cost speech recognition systems (SRS) in mass products or in products with energetic constraints, provided their inherent parallelism can be exploited in a hardware realization. At present, the computational complexity of SRS based on fully recurrent neural networks (FRNN), e.g. the large number of connections, prevents such a hardware realization. We introduce locally recurrent neural networks (LRNN) in order to retain the properties of RNN on the one hand and to reduce the connectivity density of the network on the other. Simulation experiments show that the recognition capability of LRNN is equivalent to that of FRNN and superior to other proposed network architectures. Furthermore, it is shown that, with an appropriate representation of the network parameters and a retraining of the network, 5-bit quantization of the weights and activities is possible without significant loss in recognition performance.
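The abstract's two complexity-reduction ideas can be illustrated with a short sketch. This is a hypothetical illustration, not the authors' implementation: a generic uniform 5-bit quantizer for network weights (the paper's specific parameter representation and retraining procedure are not reproduced here), plus a simple count showing how local recurrence (self-feedback only, an assumption about the LRNN topology) cuts recurrent connectivity relative to a fully recurrent layer.

```python
import numpy as np

def quantize_5bit(w, bits=5):
    """Uniformly quantize an array to 2**bits symmetric levels.

    Hypothetical sketch: maps each weight to the nearest of 32 evenly
    spaced values spanning the weights' symmetric dynamic range.
    """
    levels = 2 ** bits                     # 32 levels for 5 bits
    w_max = np.max(np.abs(w))              # symmetric range [-w_max, w_max]
    step = 2.0 * w_max / levels            # quantization step size
    idx = np.clip(np.round(w / step),     # nearest level index,
                  -levels // 2,            # clipped so exactly 2**bits
                  levels // 2 - 1)         # codes are used
    return idx * step

def recurrent_connections(n_units, local=False):
    """Recurrent-connection count: N*N for a fully recurrent layer,
    N for a layer with only local (self-) feedback."""
    return n_units if local else n_units * n_units
```

For example, a 64-unit fully recurrent layer carries 4096 recurrent connections, while the same layer with only self-feedback carries 64, a 64-fold reduction in connectivity density.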
Keywords :
computational complexity; recurrent neural nets; speech recognition; connectivity density; locally recurrent neural networks; low cost speech recognition systems; recognition performance; speech recognizer; Computer networks; Costs; Hardware; Hidden Markov models; Network topology; Neurons; Parallel processing; Recurrent neural networks; Speech recognition;
Conference_Titel :
1996 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP-96), Conference Proceedings
Conference_Location :
Atlanta, GA
Print_ISBN :
0-7803-3192-3
DOI :
10.1109/ICASSP.1996.550596