DocumentCode :
2724125
Title :
Discriminative training of GMM based on Maximum Mutual Information for language identification
Author :
Dan, Qu ; Bingxi, Wang ; Honggang, Yan ; Guannan, Dai
Author_Institution :
Dept. of Signal Analyzing Eng., Inf. Eng. Univ., Zhengzhou
Volume :
1
fYear :
2006
fDate :
0-0 0
Firstpage :
1576
Lastpage :
1579
Abstract :
In this paper, a discriminative training procedure based on maximum mutual information (MMI) for a Gaussian mixture model (GMM) language identification system is described. The idea is to find the model parameters λ that minimize the conditional entropy H_λ(C | X) of the random variable C given the random variable X, that is, to minimize the uncertainty about which language was spoken given access to the utterance X. The implementation of the proposal is based on the generalized probabilistic descent (GPD) algorithm, formulated to estimate the GMM parameters. The evaluation is conducted using the OGI multi-language telephone speech corpus. The experimental results show that such a system is effective in language identification tasks.
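The training criterion described in the abstract, minimizing H_λ(C | X), is equivalent to maximizing the posterior probability of the correct language under per-language GMMs. The following is an illustrative sketch only, not the authors' implementation: the model sizes, step size `eps`, and the finite-difference gradient are assumptions (the paper derives analytic GPD updates), and all function names are hypothetical.

```python
import math

def gmm_loglik(x, gmm):
    """Log-likelihood of a 1-D frame sequence x under a diagonal GMM.
    gmm is a list of (weight, mean, variance) components."""
    total = 0.0
    for xt in x:
        comp = [math.log(w) - 0.5 * math.log(2 * math.pi * v)
                - (xt - m) ** 2 / (2 * v)
                for (w, m, v) in gmm]
        mx = max(comp)  # log-sum-exp for numerical stability
        total += mx + math.log(sum(math.exp(c - mx) for c in comp))
    return total

def posteriors(x, gmms):
    """P(language | x): softmax over equal-prior GMM log-likelihoods."""
    ll = [gmm_loglik(x, g) for g in gmms]
    mx = max(ll)
    e = [math.exp(l - mx) for l in ll]
    z = sum(e)
    return [v / z for v in e]

def gpd_step(x, label, gmms, eps=0.05, h=1e-4):
    """One GPD-style descent step on the per-sample MMI loss
    -log P(label | x) (the sample's contribution to H(C|X)).
    Only the means are updated here, via finite differences for
    clarity; the real GPD update uses analytic gradients."""
    for g in gmms:
        for i, (w, m, v) in enumerate(g):
            base = -math.log(posteriors(x, gmms)[label])
            g[i] = (w, m + h, v)  # perturb this mean
            grad = (-math.log(posteriors(x, gmms)[label]) - base) / h
            g[i] = (w, m - eps * grad, v)  # descend from the original mean
    return -math.log(posteriors(x, gmms)[label])
```

Each step pushes the correct language's model toward the utterance and competing models away from it, which is what distinguishes this discriminative criterion from per-class maximum-likelihood training.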
Keywords :
Gaussian processes; entropy; linguistics; minimisation; parameter estimation; probability; speech processing; Gaussian mixture model; conditional entropy minimization; discriminative training; generalized probabilistic descent; language identification; maximum mutual information; Gaussian distribution; information analysis; maximum likelihood estimation; mutual information; natural languages; random variables; signal analysis; signal processing; speech recognition; Gaussian Mixture Model (GMM); Generalized Probabilistic Descent (GPD); Language Identification; Maximum Mutual Information (MMI)
fLanguage :
English
Publisher :
ieee
Conference_Title :
The Sixth World Congress on Intelligent Control and Automation (WCICA 2006)
Conference_Location :
Dalian
Print_ISBN :
1-4244-0332-4
Type :
conf
DOI :
10.1109/WCICA.2006.1712616
Filename :
1712616