DocumentCode :
3166716
Title :
Performance analysis of Neural Networks in combination with n-gram language models
Author :
Oparin, Ilya ; Sundermeyer, Martin ; Ney, Hermann ; Gauvain, Jean-Luc
Author_Institution :
Spoken Language Process. Group, LIMSI, Orsay, France
fYear :
2012
fDate :
25-30 March 2012
Firstpage :
5005
Lastpage :
5008
Abstract :
Neural Network language models (NNLMs) have recently become an important complement to conventional n-gram language models (LMs) in speech-to-text systems. However, little is known about the behavior of NNLMs. The analysis presented in this paper aims to understand which types of events are better modeled by NNLMs than by n-gram LMs, in which cases the improvements are most substantial, and why this is the case. Such an analysis is important for deriving further benefit from NNLMs used in combination with conventional n-gram models. The analysis is carried out for different types of neural network LMs (feed-forward and recurrent). The results, showing for which types of events NNLMs provide better probability estimates, are validated on two setups that differ in size and in degree of data homogeneity.
Keywords :
feedforward neural nets; recurrent neural nets; speech synthesis; NNLM; data homogeneity; feedforward neural network; n-gram language models; neural network language models; performance analysis; recurrent neural network; speech-to-text systems; Analytical models; Artificial neural networks; History; Interpolation; Training data; Vocabulary; Neural network; STT; language model;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Acoustics, Speech and Signal Processing (ICASSP), 2012 IEEE International Conference on
Conference_Location :
Kyoto
ISSN :
1520-6149
Print_ISBN :
978-1-4673-0045-2
Electronic_ISBN :
1520-6149
Type :
conf
DOI :
10.1109/ICASSP.2012.6289044
Filename :
6289044