Title :
Language-model look-ahead for large vocabulary speech recognition
Author :
Ortmanns, S. ; Ney, H. ; Eiden, A.
Author_Institution :
Lehrstuhl fur Inf., Tech. Hochschule Aachen, Germany
Abstract :
Presents an efficient look-ahead technique that incorporates language model knowledge at the earliest possible stage of the search process. This so-called language model look-ahead is built into a time-synchronous beam search algorithm that uses a tree-organized pronunciation lexicon together with a bigram language model. The technique exploits the full knowledge of the bigram language model by distributing the language model probabilities over the nodes of the lexical tree for each predecessor word. A method for handling the resulting memory requirements is also presented. Recognition experiments on the 20,000-word North American Business task (Nov. 1996) demonstrate that, in comparison with unigram look-ahead, the acoustic search effort can be reduced by a factor of 5 without loss in recognition accuracy.
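The following is a minimal sketch, not the authors' implementation, of the core idea described in the abstract: for a fixed predecessor word v, the look-ahead score of each node in the tree-organized pronunciation lexicon is the maximum bigram probability p(w|v) over all words w reachable from that node. The names TreeNode, bigram, and compute_lookahead are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional


@dataclass
class TreeNode:
    """Node of a tree-organized pronunciation lexicon (prefix tree)."""
    children: List["TreeNode"] = field(default_factory=list)
    word: Optional[str] = None   # set only on word-end nodes
    lookahead: float = 0.0       # pi_v(n): LM look-ahead score for predecessor v


def compute_lookahead(node: TreeNode, bigram: Dict[str, float]) -> float:
    """Fill node.lookahead with max p(w|v) over words w reachable from node.

    `bigram` maps each word w to p(w|v) for one fixed predecessor word v;
    the pass is repeated (or cached) per active predecessor word.
    """
    best = bigram.get(node.word, 0.0) if node.word is not None else 0.0
    for child in node.children:
        best = max(best, compute_lookahead(child, bigram))
    node.lookahead = best
    return best
```

During time-synchronous beam search, a hypothesis entering a tree node can then be charged the anticipated language model score of that node (in practice, the change relative to its parent node), so that bigram knowledge prunes the search long before a word end is reached; the memory issue the paper addresses arises because, in principle, one such look-ahead array is needed per predecessor word.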
Keywords :
acoustics; languages; nomograms; probability; speech recognition; tree searching; vocabulary; North American Business task; acoustic search effort; bigram language model; language model probabilities; language-model look-ahead technique; large-vocabulary speech recognition; lexical tree; memory requirements; predecessor words; recognition accuracy; time-synchronous beam search algorithm; tree-organized pronunciation lexicon; Acoustic beams; Costs; Dynamic programming; Natural languages; Search methods; Speech recognition; Structural beams; Testing; Vocabulary;
Conference_Title :
Proceedings of the Fourth International Conference on Spoken Language Processing (ICSLP 96), 1996
Conference_Location :
Philadelphia, PA
Print_ISBN :
0-7803-3555-4
DOI :
10.1109/ICSLP.1996.607215