Title :
EM algorithms of Gaussian mixture model and hidden Markov model
Author :
Xuan, Guorong ; Zhang, Wei ; Chai, Peiqi
Author_Institution :
Dept. of Comput. Sci., Tongji Univ., Shanghai, China
Abstract :
The HMM (hidden Markov model) is a probabilistic model of the joint probability of a collection of random variables with both observations and states. The GMM (Gaussian mixture model) is a finite mixture probability distribution model. Although the two models are closely related, they are usually discussed independently and separately. The EM (expectation-maximization) algorithm is a general iterative method for finding maximum likelihood estimates. The EM formulae for the HMM and for the GMM are similar. Two points are proposed in this paper. One is that the EM algorithm of the GMM can be regarded as a special case of the EM algorithm of the HMM. The other is that an EM algorithm for the GMM based on symbols is faster in implementation than the traditional EM algorithm for the GMM based on samples (i.e., on individual observations).
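The symbol-based speed-up claimed in the abstract can be sketched as follows. This is a minimal 1-D illustration, not the paper's implementation: the function name `gmm_em_histogram` and the use of rounded values as symbols are assumptions. The idea is that when many samples share the same symbol (e.g. 8-bit pixel intensities, as suggested by the "Histograms" keyword), each EM iteration can loop over the distinct symbols weighted by their counts instead of over every sample, giving an exact update at lower cost.

```python
import numpy as np

def gmm_em_histogram(values, counts, k=2, iters=50):
    """EM for a 1-D Gaussian mixture fitted to histogram data.

    `values` holds the distinct symbols (bin centres) and `counts`
    their frequencies.  Each iteration costs O(k * len(values))
    rather than O(k * n_samples), which is the speed-up when many
    samples share a symbol (illustrative sketch, not the paper's code).
    """
    values = np.asarray(values, dtype=float)
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    # Deterministic initialisation: spread the means over the symbol range.
    mu = np.linspace(values.min(), values.max(), k)
    var = np.full(k, values.var() + 1e-6)
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of component j for each symbol, shape (k, m).
        dens = np.exp(-0.5 * (values - mu[:, None]) ** 2 / var[:, None])
        dens /= np.sqrt(2.0 * np.pi * var[:, None])
        resp = pi[:, None] * dens
        resp /= resp.sum(axis=0, keepdims=True) + 1e-300
        # M-step: weight each symbol's responsibility by its count, so the
        # update equals the sample-based update exactly.
        w = resp * counts
        nk = w.sum(axis=1)
        pi = nk / n
        mu = (w * values).sum(axis=1) / nk
        var = (w * (values - mu[:, None]) ** 2).sum(axis=1) / nk + 1e-6
    return pi, mu, var
```

Running the sample-based EM on the same data would visit all n observations per iteration; here the cost depends only on the number of distinct symbols, which for quantized image data is bounded (e.g. 256 grey levels) regardless of image size.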
Keywords :
Gaussian processes; hidden Markov models; image processing; iterative methods; maximum likelihood estimation; EM algorithms; GMM; Gaussian mixture model; HMM; descent algorithm; expectation-maximization algorithm; finite mixture probability distribution model; hidden Markov model; maximum likelihood estimation; random variables; Computer science; Covariance matrix; Electronic mail; Gaussian distribution; Hidden Markov models; Histograms; Maximum likelihood estimation; Parameter estimation; Probability distribution; Random variables;
Conference_Titel :
Proceedings of the 2001 International Conference on Image Processing (ICIP 2001)
Conference_Location :
Thessaloniki, Greece
Print_ISBN :
0-7803-6725-1
DOI :
10.1109/ICIP.2001.958974