DocumentCode :
2207271
Title :
On the likelihood function of HMMs for a long data sequence
Author :
Yamazaki, Keisuke
Author_Institution :
Precision & Intell. Lab., Tokyo Inst. of Technol., Yokohama, Japan
fYear :
2009
fDate :
1-4 Sept. 2009
Firstpage :
1
Lastpage :
6
Abstract :
Hidden Markov models (HMMs) are widely applied to the analysis of time-dependent data sequences in fields such as nonlinear signal processing, natural language processing, and bioinformatics. Training data for HMMs come in two possible formats: a large set of time-dependent sequential data, or a single infinitely long sequence. The learning process is one of the main concerns in machine learning. For a large set of time-dependent sequential data, the generalization ability can be determined based on algebraic geometry. However, there has been no theoretical analysis for the case of an infinitely long sequence. The present paper therefore experimentally identifies a number of unique properties of the likelihood function and explains these properties theoretically. The results indicate that the likelihood function implicitly includes a local-maximum factor, which can make the learning process slow, and that this slow learning enables high performance in stationary-state evaluation.
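For context on the quantity the abstract studies, the likelihood of a long observation sequence under an HMM is conventionally computed with the scaled forward algorithm, which avoids numerical underflow as the sequence grows. The sketch below is a standard textbook implementation, not the analysis method of the paper; the two-state parameters at the bottom are purely hypothetical examples.

```python
import numpy as np

def hmm_log_likelihood(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the scaled forward algorithm (stable for long sequences).

    obs: sequence of observation symbol indices
    pi:  (K,)   initial state distribution
    A:   (K, K) transition matrix, A[i, j] = P(next state j | state i)
    B:   (K, M) emission matrix,   B[i, o] = P(symbol o | state i)
    """
    alpha = pi * B[:, obs[0]]      # unnormalized forward message at t = 0
    log_lik = np.log(alpha.sum())
    alpha /= alpha.sum()           # rescale so alpha stays O(1)
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # propagate, then weight by emission
        log_lik += np.log(alpha.sum())  # accumulate the scaling factors
        alpha /= alpha.sum()
    return log_lik

# Hypothetical two-state, two-symbol example (illustrative values only)
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],
              [0.2, 0.8]])
print(hmm_log_likelihood([0, 1, 0, 0], pi, A, B))
```

Because the per-step scaling factors are summed in log space, the routine can be run over arbitrarily long sequences without the raw likelihood underflowing to zero.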
Keywords :
data handling; hidden Markov models; learning (artificial intelligence); HMM; algebraic geometry; hidden Markov models; learning process; long data sequence; machine learning; time-dependent data sequences; time-dependent sequential data; Bioinformatics; Biomedical signal processing; Data analysis; Geometry; Hidden Markov models; Machine learning; Natural language processing; Signal analysis; Stationary state; Training data;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Machine Learning for Signal Processing, 2009. MLSP 2009. IEEE International Workshop on
Conference_Location :
Grenoble
Print_ISBN :
978-1-4244-4947-7
Electronic_ISBN :
978-1-4244-4948-4
Type :
conf
DOI :
10.1109/MLSP.2009.5306222
Filename :
5306222