DocumentCode :
2029153
Title :
N-gram language models for offline handwritten text recognition
Author :
Zimmermann, Matthias ; Bunke, Horst
Author_Institution :
Dept. of Comput. Sci., Bern Univ., Switzerland
fYear :
2004
fDate :
26-29 Oct. 2004
Firstpage :
203
Lastpage :
208
Abstract :
This paper investigates the impact of bigram and trigram language models on the performance of a hidden Markov model (HMM) based offline recognition system for handwritten sentences. The language models are trained on the LOB corpus, which is supplemented by various additional sources of text, including sentences from additional corpora and random sentences produced by a stochastic context-free grammar (SCFG). Experimental results are provided in terms of test set perplexity and the performance of the corresponding recognition systems. For the text recognition experiments, handwritten material from the IAM database was used.
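The test set perplexity mentioned in the abstract measures how well a language model predicts held-out text. A minimal sketch of computing it for a bigram model follows; the add-one smoothing and toy tokenization here are illustrative assumptions only, not the training or smoothing scheme used in the paper:

```python
import math
from collections import Counter

def bigram_perplexity(train_tokens, test_tokens):
    """Perplexity of an add-one-smoothed bigram model on a test sequence.

    Toy illustration: the paper trains on the LOB corpus with standard
    language-modeling techniques, not this simplistic smoothing.
    """
    vocab = set(train_tokens) | set(test_tokens)
    v = len(vocab)
    unigram_counts = Counter(train_tokens)
    bigram_counts = Counter(zip(train_tokens, train_tokens[1:]))

    log_prob = 0.0
    n = 0
    for w1, w2 in zip(test_tokens, test_tokens[1:]):
        # Add-one (Laplace) smoothed conditional probability P(w2 | w1)
        p = (bigram_counts[(w1, w2)] + 1) / (unigram_counts[w1] + v)
        log_prob += math.log2(p)
        n += 1

    # Perplexity = 2^(-average log2 probability per predicted token)
    return 2 ** (-log_prob / n)
```

Lower perplexity indicates the model assigns higher probability to the test text; the paper reports this quantity alongside recognition performance for each language model.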
Keywords :
handwritten character recognition; hidden Markov models; stochastic processes; N-gram language models; hidden Markov model; offline handwritten text recognition; stochastic context-free grammar; test set perplexity; Context modeling; Databases; Handwriting recognition; Hidden Markov models; Natural languages; Probability; Speech recognition; Stochastic processes; System testing; Text recognition;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Frontiers in Handwriting Recognition, 2004. IWFHR-9 2004. Ninth International Workshop on
ISSN :
1550-5235
Print_ISBN :
0-7695-2187-8
Type :
conf
DOI :
10.1109/IWFHR.2004.71
Filename :
1363911