Title :
Converting Neural Network Language Models into back-off language models for efficient decoding in automatic speech recognition
Author :
Arisoy, Ebru; Chen, Stanley F.; Ramabhadran, Bhuvana; Sethy, Abhinav
Author_Institution :
IBM T.J. Watson Research Center, Yorktown Heights, NY, USA
Abstract :
Neural Network Language Models (NNLMs) have achieved very good performance in large-vocabulary continuous speech recognition (LVCSR) systems. Because decoding with NNLMs is computationally expensive, there is interest in methods that approximate NNLMs with simpler language models suitable for fast decoding. In this work, we propose an approximate method for converting a feedforward NNLM into a back-off n-gram language model that can be used directly in existing LVCSR decoders. We convert NNLMs of increasing order to pruned back-off language models, using lower-order models to constrain the n-grams allowed in higher-order models. In experiments on Broadcast News data, we find that the resulting back-off models retain the bulk of the gain achieved by NNLMs over conventional n-gram language models, and give significant accuracy improvements over existing methods for converting NNLMs to back-off models. In addition, the proposed approach can be applied to any type of non-back-off language model to enable efficient decoding.
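The gist of the conversion described in the abstract can be sketched as follows. This is a minimal Python illustration under stated assumptions, not the authors' implementation: nnlm_prob is a hypothetical stand-in for querying the trained network's softmax output, and the toy vocabulary and pruning threshold are invented so the sketch runs self-contained.

VOCAB = ["</s>", "the", "cat", "sat"]

def nnlm_prob(word, history):
    # Toy stand-in for the NNLM's softmax output P(word | history);
    # a real conversion would run the feedforward network here.
    return 1.0 / len(VOCAB)

def lower_order_prob(model, ngram):
    # Back off to the longest lower-order entry covering the same word.
    for start in range(1, len(ngram)):
        shorter = ngram[start:]
        if shorter in model.get(len(shorter), {}):
            return model[len(shorter)][shorter]
    return 1.0 / len(VOCAB)

def build_backoff_model(max_order, prob_threshold=1e-3):
    # Grow the model order by order: n-grams of order k are generated only
    # for histories kept at order k-1, so lower-order models constrain the
    # n-grams allowed in higher-order models.
    model = {1: {(w,): nnlm_prob(w, ()) for w in VOCAB}}
    for order in range(2, max_order + 1):
        model[order] = {}
        for hist in model[order - 1]:
            for w in VOCAB:
                p = nnlm_prob(w, hist)
                if p >= prob_threshold:   # prune low-probability n-grams
                    model[order][hist + (w,)] = p
    return model

def backoff_weight(model, hist):
    # ARPA-style back-off weight so that P(. | hist) still sums to one
    # after pruning: leftover probability mass divided by the lower-order
    # mass of the words not kept at this order.
    order = len(hist) + 1
    kept = [ng for ng in model.get(order, {}) if ng[:-1] == hist]
    mass_kept = sum(model[order][ng] for ng in kept)
    mass_lower = sum(lower_order_prob(model, ng) for ng in kept)
    return (1.0 - mass_kept) / (1.0 - mass_lower) if mass_lower < 1.0 else 0.0

model = build_backoff_model(3)
print(len(model[2]), "bigrams kept;", "backoff(('the',)) =", backoff_weight(model, ("the",)))

Because each order only enumerates words following histories already kept at the previous order, the n-gram list stays finite even though the NNLM assigns a probability to every possible history, which is what makes the result usable as an ordinary back-off model in an existing decoder.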
Keywords :
decoding; feedforward neural nets; speech recognition; LVCSR decoders; LVCSR systems; automatic speech recognition; back-off n-gram language model; broadcast news data; feedforward NNLM; higher-order models; large-vocabulary continuous speech recognition systems; lower-order models; neural network language models; non-back-off language model; Artificial neural networks; Computational modeling; Data models; Lattices; Speech; Vocabulary; decoding with neural network language models
Conference_Title :
2013 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Conference_Location :
Vancouver, BC, Canada
DOI :
10.1109/ICASSP.2013.6639272