Title :
Incorporating Training Errors for Large Margin HMMs Under Semi-Definite Programming Framework
Author :
Hui Jiang ; Xinwei Li
Author_Institution :
Dept. of Comput. Sci. & Eng., York Univ., Toronto, Ont., Canada
Abstract :
In this paper, we study how to incorporate training errors into large margin estimation (LME) under the semi-definite programming (SDP) framework. Like the soft-margin SVM, we propose to optimize a new objective function that linearly combines the minimum margin among positive tokens with an average error function over all negative tokens. The new method is named soft-LME. It is shown that the soft-LME problem can still be converted into an SDP problem if the average error function of the negative tokens is properly defined in terms of their discriminant functions. Preliminary results on TIDIGITS show that the soft-LME/SDP method yields a modest performance gain when training error rates are significant. Moreover, soft-LME/SDP achieves much faster convergence in all cases we have investigated.
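As a rough illustrative sketch only (the symbols below are not taken from the paper), the soft-LME objective described in the abstract can be pictured as combining the minimum margin over the set of correctly classified (positive) tokens $\mathcal{S}$ with an averaged error term over the misclassified (negative) tokens $\mathcal{E}$:

$$\max_{\Lambda}\;\Big[\,\min_{i \in \mathcal{S}} d_i(\Lambda)\;-\;\frac{\lambda}{|\mathcal{E}|}\sum_{j \in \mathcal{E}} \ell\!\big(d_j(\Lambda)\big)\Big],$$

where $d_i(\Lambda)$ is the separation margin of token $i$ computed from the discriminant functions of the HMM parameters $\Lambda$, $\ell(\cdot)$ is an error function of the negative-token margins, and $\lambda \ge 0$ trades off the two terms. The notation for $\mathcal{S}$, $\mathcal{E}$, $\ell$, and $\lambda$ is assumed for illustration; the abstract only states that, with a suitable choice of the average error function, the resulting problem can still be relaxed into an SDP.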
Keywords :
hidden Markov models; mathematical programming; speech recognition; HMM; average error function; large margin estimation; semi-definite programming framework; soft-LME; speech recognition; training errors; Computer errors; Computer science; Convergence; Databases; Error analysis; Hidden Markov models; Optimization methods; Performance gain; Speech recognition; Support vector machines; CDHMM; discriminative training; large margin estimation (LME); semi-definite programming (SDP); soft-LME;
Conference_Titel :
Acoustics, Speech and Signal Processing, 2007. ICASSP 2007. IEEE International Conference on
Conference_Location :
Honolulu, HI
Print_ISBN :
1-4244-0727-3
DOI :
10.1109/ICASSP.2007.366991