DocumentCode :
2875357
Title :
Integrating a non-probabilistic grammar into large vocabulary continuous speech recognition
Author :
Beutler, René ; Kaufmann, Tobias ; Pfister, Beat
Author_Institution :
Comput. Eng. & Networks Lab., ETH Zurich
fYear :
2005
fDate :
27 Nov. 2005
Firstpage :
104
Lastpage :
109
Abstract :
We propose a method of incorporating a non-probabilistic grammar into large vocabulary continuous speech recognition (LVCSR). Our basic assumption is that the utterances to be recognized are grammatical to a sufficient degree, which enables us to decrease the word error rate by favouring grammatical phrases. We use a parser and a handcrafted grammar to identify grammatical phrases in word lattices produced by a speech recognizer. This information is then used to rescore the word lattice. We measured the benefit of our method by extending an LVCSR baseline system (based on hidden Markov models and a 4-gram language model) with our rescoring component. We achieved a statistically significant reduction in word error rate compared to the baseline system.
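To illustrate the rescoring idea described in the abstract, the sketch below re-ranks recognizer hypotheses by adding a bonus for words covered by grammatical phrases. This is a simplified, hypothetical rendering, not the authors' implementation: the paper rescores word lattices directly, whereas the sketch works on an n-best list, and the `find_phrases` parser interface, the coverage measure, and the weight `alpha` are assumptions introduced here for illustration.

```python
# Minimal sketch (assumed interfaces, not the paper's code): favour hypotheses
# whose words are covered by phrases that a handcrafted grammar accepts.
from typing import Callable, List, Tuple

# A hypothetical parser callback: given a word sequence, it returns
# (start, end) spans that it could parse as grammatical phrases.
PhraseFinder = Callable[[List[str]], List[Tuple[int, int]]]


def grammatical_coverage(words: List[str], find_phrases: PhraseFinder) -> float:
    """Fraction of word positions covered by grammatical phrases."""
    covered = set()
    for start, end in find_phrases(words):
        covered.update(range(start, end))
    return len(covered) / len(words) if words else 0.0


def rescore(nbest: List[Tuple[float, List[str]]],
            find_phrases: PhraseFinder,
            alpha: float = 2.0) -> List[Tuple[float, List[str]]]:
    """Add a grammaticality bonus to each recognizer score and re-rank.

    `alpha` is an illustrative weight; in practice such a weight would be
    tuned on development data.
    """
    rescored = [(score + alpha * grammatical_coverage(words, find_phrases), words)
                for score, words in nbest]
    return sorted(rescored, key=lambda item: item[0], reverse=True)
```

In this simplified view, a hypothesis whose words are largely spanned by parseable phrases receives a higher combined score and can overtake an acoustically or n-gram-preferred but less grammatical competitor, which is the mechanism by which the paper reduces word error rate.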
Keywords :
grammars; hidden Markov models; natural languages; speech recognition; vocabulary; grammatical phrases; hidden Markov models; large vocabulary continuous speech recognition; nonprobabilistic grammar; word error rate; Computer networks; Error analysis; Hidden Markov models; Laboratories; Lattices; Natural languages; Speech processing; Speech recognition; Statistics; Vocabulary;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Automatic Speech Recognition and Understanding, 2005 IEEE Workshop on
Conference_Location :
San Juan
Print_ISBN :
0-7803-9478-X
Electronic_ISBN :
0-7803-9479-8
Type :
conf
DOI :
10.1109/ASRU.2005.1566496
Filename :
1566496