DocumentCode :
2268739
Title :
Hidden Markov models estimation via the most informative stopping times for Viterbi algorithm
Author :
Kogan, Joseph A.
Author_Institution :
Courant Inst. of Math. Sci., New York Univ., NY, USA
fYear :
1995
fDate :
17-22 Sep 1995
Firstpage :
178
Abstract :
We propose a sequential approach to studying the Viterbi algorithm via a renewal sequence of the most informative stopping times. In particular, this approach yields new asymptotic "single-letter" decoding conditions for the equivalence of the Baum-Welch, segmental K-means, and vector quantization algorithms for hidden Markov model parameter estimation, which have important applications in speech recognition.
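For reference, the Viterbi algorithm that the abstract builds on computes the most likely hidden state path of an HMM by dynamic programming. The following is a minimal sketch with hypothetical toy parameters (`pi`, `A`, `B` are illustrative, not from the paper); the paper's stopping-time construction is not reproduced here:

```python
# Minimal Viterbi decoder for a discrete-observation HMM (illustrative sketch).
import numpy as np

def viterbi(obs, pi, A, B):
    """Return the most likely hidden state path for observation sequence `obs`.
    pi: initial state probabilities (n,)
    A:  state transition matrix (n, n), A[i, j] = P(j | i)
    B:  emission matrix (n, m), B[i, k] = P(obs k | state i)
    """
    n, T = len(pi), len(obs)
    # Log-domain delta at t = 0.
    logd = np.log(pi) + np.log(B[:, obs[0]])
    back = np.zeros((T, n), dtype=int)  # back[t, j] = best predecessor of j at t
    for t in range(1, T):
        scores = logd[:, None] + np.log(A)        # scores[i, j]: from i to j
        back[t] = scores.argmax(axis=0)
        logd = scores.max(axis=0) + np.log(B[:, obs[t]])
    # Backtrack from the best final state.
    path = [int(logd.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```

Segmental K-means (also named in the abstract) alternates this decoding step with re-estimation of `pi`, `A`, `B` from the decoded path, whereas Baum-Welch averages over all paths; the paper's "single-letter" conditions concern when these procedures coincide asymptotically.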
Keywords :
decoding; hidden Markov models; maximum likelihood estimation; parameter estimation; sequential estimation; speech recognition; vector quantisation; Baum-Welch algorithm; Viterbi algorithm; asymptotic single-letter decoding; hidden Markov models estimation; most informative stopping times; parameters estimation; renewal sequence; segmental K-means algorithm; sequential approach; speech recognition; vector quantization algorithm; Decoding; Distortion measurement; Dynamic programming; Entropy; Hidden Markov models; Parameter estimation; Sequential analysis; Speech recognition; Vector quantization; Viterbi algorithm;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 1995 IEEE International Symposium on Information Theory
Conference_Location :
Whistler, BC
Print_ISBN :
0-7803-2453-6
Type :
conf
DOI :
10.1109/ISIT.1995.531527
Filename :
531527