DocumentCode :
3082794
Title :
A novel divergence measure for hidden Markov models
Author :
Mohammad, Maruf ; Tranter, W.H.
Author_Institution :
Mobile & Portable Radio Res. Group, Virginia Tech, Blacksburg, VA, USA
fYear :
2005
fDate :
8-10 April 2005
Firstpage :
240
Lastpage :
243
Abstract :
In this paper, a novel divergence measure for hidden Markov models (HMMs) is introduced. The most widely used distance measure between two HMMs is the Kullback-Leibler divergence (KLD), which is usually computed by the Monte-Carlo method; its computational complexity is prohibitive in practical applications. Numerical examples show that the proposed divergence measure closely approximates the KLD while reducing the computational cost by a factor of several hundred.
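The Monte-Carlo KLD computation that the abstract refers to can be sketched as follows. This is a minimal illustration for discrete-observation HMMs parameterized as (pi, A, B), not the authors' proposed measure: sequences are sampled from one model, and the per-symbol log-likelihood ratio is averaged via the scaled forward algorithm. Function names and parameters here are illustrative assumptions.

```python
import numpy as np

def sample_hmm(pi, A, B, T, rng):
    """Sample an observation sequence of length T from a discrete HMM (pi, A, B)."""
    s = rng.choice(len(pi), p=pi)
    obs = []
    for _ in range(T):
        obs.append(rng.choice(B.shape[1], p=B[s]))
        s = rng.choice(A.shape[1], p=A[s])
    return np.array(obs)

def log_likelihood(obs, pi, A, B):
    """Log-likelihood of obs under (pi, A, B) via the scaled forward algorithm."""
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()
    ll = np.log(c)
    alpha /= c
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        c = alpha.sum()
        ll += np.log(c)
        alpha /= c
    return ll

def mc_kld_rate(hmm1, hmm2, T=200, n_seq=100, seed=0):
    """Monte-Carlo estimate of the KLD rate D(hmm1 || hmm2) per observation symbol."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_seq):
        obs = sample_hmm(*hmm1, T, rng)
        total += log_likelihood(obs, *hmm1) - log_likelihood(obs, *hmm2)
    return total / (n_seq * T)
```

The cost of this estimator scales with the number and length of the sampled sequences, which is the expense the paper's closed-form approximation avoids.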
Keywords :
computational complexity; hidden Markov models; HMM divergence measure; KLD; Kullback-Leibler divergence; Monte-Carlo method; Automatic speech recognition; Iterative algorithms; Probability distribution; Signal processing algorithms; Speech recognition; Statistics; Upper bound; Vocabulary;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
SoutheastCon, 2005. Proceedings. IEEE
Print_ISBN :
0-7803-8865-8
Type :
conf
DOI :
10.1109/SECON.2005.1423253
Filename :
1423253