Title :
A covariance kernel for SVM language recognition
Author_Institution :
Lincoln Lab., MIT, Lexington, MA
Date :
March 31, 2008 - April 4, 2008
Abstract :
Discriminative training for language recognition has been a key tool for improving system performance. In addition, recognition directly from shifted-delta cepstral features has proven effective. A successful example of this paradigm is SVM-based discrimination of languages using GMM mean supervectors (GSVs), which are created through MAP adaptation of a universal background model (UBM) GMM. This work extends the supervector framework to the covariances of the UBM, and we demonstrate a new SVM kernel incorporating this covariance structure. In addition, we propose a method for pushing SVM model parameters back to GMM models, which can then be used as an alternative form of scoring. The new approach is demonstrated on a fourteen-language task and yields substantial performance improvements over prior techniques.
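As background to the abstract, the baseline GSV construction it builds on works as follows: MAP-adapt the UBM means to each utterance, stack the normalized adapted means into a supervector, and compare utterances with a linear kernel on those supervectors. The sketch below illustrates only this mean-only baseline, not the paper's covariance extension or the SVM-to-GMM pushback; the function names, the relevance factor value, and the diagonal-covariance assumption are illustrative choices, not details taken from the paper.

```python
import numpy as np

def map_adapt_means(ubm_means, ubm_covs, ubm_weights, feats, relevance=16.0):
    """Mean-only MAP adaptation of a diagonal-covariance UBM to one utterance.

    ubm_means   : (M, D) UBM component means
    ubm_covs    : (M, D) UBM diagonal covariances
    ubm_weights : (M,)   UBM mixture weights
    feats       : (T, D) feature frames (e.g. shifted-delta cepstra)
    """
    M, D = ubm_means.shape
    # Per-frame, per-component log-likelihoods for diagonal Gaussians.
    log_post = np.zeros((feats.shape[0], M))
    for m in range(M):
        diff = feats - ubm_means[m]
        log_post[:, m] = (np.log(ubm_weights[m])
                          - 0.5 * np.sum(np.log(2 * np.pi * ubm_covs[m]))
                          - 0.5 * np.sum(diff ** 2 / ubm_covs[m], axis=1))
    # Normalize to component posteriors for each frame.
    log_post -= log_post.max(axis=1, keepdims=True)
    post = np.exp(log_post)
    post /= post.sum(axis=1, keepdims=True)

    n = post.sum(axis=0)                           # zeroth-order statistics, (M,)
    f = post.T @ feats                             # first-order statistics,  (M, D)
    alpha = (n / (n + relevance))[:, None]         # data-dependent adaptation weights
    ex = f / np.maximum(n, 1e-10)[:, None]         # per-component data means
    return alpha * ex + (1.0 - alpha) * ubm_means  # adapted means, (M, D)

def gsv(adapted_means, ubm_covs, ubm_weights):
    """Stack weight/covariance-normalized adapted means into a GMM mean supervector."""
    scale = np.sqrt(ubm_weights)[:, None] / np.sqrt(ubm_covs)
    return (scale * adapted_means).ravel()

def linear_gsv_kernel(sv_a, sv_b):
    """Linear (inner-product) kernel between two supervectors."""
    return float(sv_a @ sv_b)
```

In this baseline, each utterance is mapped to one fixed-length supervector and an SVM is trained on those vectors; the paper's contribution is to extend this representation and kernel to include the UBM covariance structure as well.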
Keywords :
Gaussian processes; covariance analysis; natural language processing; support vector machines; GMM models; Gaussian mixture model mean supervectors; MAP adaptation; SVM-based language discrimination; covariance kernel; discriminative training; shifted-delta cepstral features; support vector machine language recognition; universal background model; Cepstral analysis; Contracts; Kernel; Labeling; Laboratories; Mutual information; Polynomials; Support vector machine classification; System performance; language recognition
Conference_Title :
2008 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2008)
Conference_Location :
Las Vegas, NV
Print_ISBN :
978-1-4244-1483-3
ISSN :
1520-6149
DOI :
10.1109/ICASSP.2008.4518566