DocumentCode :
1696095
Title :
Multiple parallel hidden layers and other improvements to recurrent neural network language modeling
Author :
Caseiro, Diamantino ; Ljolje, Andrej
Author_Institution :
AT&T Labs. Res., Florham Park, NJ, USA
fYear :
2013
Firstpage :
8426
Lastpage :
8429
Abstract :
Recurrent neural network language models (RNNLMs) have been shown to outperform most other advanced language modeling techniques; however, they suffer from high computational complexity. In this paper, we present techniques for building faster and more accurate RNNLMs. In particular, we show that Brown clustering of the vocabulary is much more effective than other techniques. We also present an algorithm for converting an ensemble of RNNLMs into a single model that can be further tuned or adapted. The resulting models have significantly lower perplexity than single models with the same number of parameters. An error rate reduction of 5.9% was observed on a state-of-the-art multi-pass voice-mail-to-text ASR system using RNNLMs trained with the proposed algorithm.
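A minimal sketch of the general idea behind class-based output factorization with word clusters (e.g. Brown clusters), which is the standard way clustering the vocabulary reduces RNNLM output cost; this is an illustration only, not the authors' implementation, and all sizes and variable names below are invented for the example. The word probability factors as P(w | h) = P(class(w) | h) * P(w | class(w), h), so each step normalizes over the classes plus one class's members rather than the full vocabulary.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for the sketch.
vocab_size, n_classes, hidden_size = 1000, 20, 64
word_to_class = rng.integers(0, n_classes, size=vocab_size)  # stand-in for Brown classes
class_members = [np.flatnonzero(word_to_class == c) for c in range(n_classes)]

# Output weights: one block scores classes, one scores words.
W_class = rng.standard_normal((n_classes, hidden_size)) * 0.1
W_word = rng.standard_normal((vocab_size, hidden_size)) * 0.1

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def word_log_prob(hidden, word):
    """log P(word | hidden) under the class factorization."""
    c = word_to_class[word]
    # P(class | hidden): normalized over the small set of classes.
    p_class = softmax(W_class @ hidden)
    # P(word | class, hidden): normalized only over words in that class.
    members = class_members[c]
    p_word_in_class = softmax(W_word[members] @ hidden)
    idx = np.searchsorted(members, word)
    return np.log(p_class[c]) + np.log(p_word_in_class[idx])

# Example: score one word given a random hidden state.
h = rng.standard_normal(hidden_size)
print(word_log_prob(h, word=42))

With well-balanced classes, the per-step output cost drops from O(|V|) to roughly O(|C| + |V|/|C|); the paper's contribution is, in part, showing that Brown clustering yields better classes for this purpose than alternative clustering schemes.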
Keywords :
computational linguistics; recurrent neural nets; speech recognition; Brown clustering; RNNLM; automatic speech recognition; error rate reduction; multipass voice-mail; multiple parallel hidden layer; recurrent neural network language modeling; text ASR system; Adaptation models; Computational modeling; Data models; Hidden Markov models; Interpolation; Training; Vocabulary; Automatic Speech Recognition; Language Modeling; Multiple Parallel Hidden Layers; Recurrent Neural Network Language Model;
fLanguage :
English
Publisher :
ieee
Conference_Title :
2013 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Conference_Location :
Vancouver, BC
ISSN :
1520-6149
Type :
conf
DOI :
10.1109/ICASSP.2013.6639309
Filename :
6639309