Title :
Scaling recurrent neural network language models
Author :
Williams, Will ; Prasad, Niranjani ; Mrva, David ; Ash, Tom ; Robinson, Tony
Author_Institution :
Cantab Research, Cambridge, UK
Abstract :
This paper investigates the scaling properties of Recurrent Neural Network Language Models (RNNLMs). We discuss how to train very large RNNs on GPUs and address the questions of how RNNLMs scale with respect to model size, training-set size, computational cost and memory. Our analysis shows that, despite being more costly to train, RNNLMs obtain much lower perplexities on standard benchmarks than n-gram models. We train the largest known RNNs and present relative word error rate gains of 18% on an ASR task. We also present the new lowest perplexities on the recently released billion-word language modelling benchmark, a 1 BLEU point gain on machine translation, and a 17% relative hit rate gain in word prediction.
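The abstract compares RNNLMs with n-gram baselines via perplexity. The sketch below (not the authors' code; the vocabulary size, hidden size, and toy corpus are hypothetical) shows a minimal vanilla RNN language model forward pass and how perplexity is computed from per-token log-probabilities.

```python
# Illustrative sketch only: a minimal vanilla RNN language model in NumPy,
# showing the forward pass and the perplexity metric used to compare
# RNNLMs with n-gram models. All sizes and data below are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
vocab_size, hidden_size = 50, 32          # hypothetical model dimensions

# Parameters: input (embedding), recurrent, and output (softmax) weights.
W_xh = rng.normal(0, 0.1, (hidden_size, vocab_size))
W_hh = rng.normal(0, 0.1, (hidden_size, hidden_size))
W_hy = rng.normal(0, 0.1, (vocab_size, hidden_size))
b_h = np.zeros(hidden_size)
b_y = np.zeros(vocab_size)

def softmax(z):
    z = z - z.max()                        # numerical stability
    e = np.exp(z)
    return e / e.sum()

def sentence_log_prob(token_ids):
    """Sum of log P(w_t | w_<t); the hidden state h carries the full history."""
    h = np.zeros(hidden_size)
    log_prob = 0.0
    for prev, nxt in zip(token_ids[:-1], token_ids[1:]):
        x = np.zeros(vocab_size)
        x[prev] = 1.0                      # one-hot input for the previous word
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
        p = softmax(W_hy @ h + b_y)
        log_prob += np.log(p[nxt])
    return log_prob

def perplexity(corpus):
    """Perplexity = exp(-average log-likelihood per predicted token)."""
    total_lp, total_tokens = 0.0, 0
    for sent in corpus:
        total_lp += sentence_log_prob(sent)
        total_tokens += len(sent) - 1
    return np.exp(-total_lp / total_tokens)

# Toy corpus of integer token ids (hypothetical); an untrained model scores
# close to the uniform-distribution perplexity, i.e. roughly vocab_size.
corpus = [rng.integers(0, vocab_size, size=12).tolist() for _ in range(5)]
print(f"perplexity of untrained RNNLM: {perplexity(corpus):.1f}")
```

Scaling such a model to the sizes discussed in the paper would require minibatched GPU training rather than this per-token NumPy loop; the sketch is only meant to make the perplexity comparison concrete.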
Keywords :
graphics processing units; language translation; prediction theory; recurrent neural nets; speech recognition; word processing; ASR; GPU; RNN; RNNLM scaling properties; machine translation; n-gram model; scaling recurrent neural network language model; word error rate; word prediction; Benchmark testing; Computational modeling; Entropy; Graphics processing units; Recurrent neural networks; Training; RNNLM; language modelling; recurrent neural network
Conference_Title :
2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Conference_Location :
South Brisbane, QLD, Australia
DOI :
10.1109/ICASSP.2015.7179001