• DocumentCode
    2838437
  • Title
    MCE-based training of subspace distribution clustering HMM

  • Author
    Li, Xiao-Bing; Dai, Li-Rong; Wang, Ren-Hua

  • Author_Institution
    USTC iFly Speech Lab, Univ. of Sci. & Technol. of China, Hefei, China
  • fYear
    2004
  • fDate
    15-18 Dec. 2004
  • Firstpage
    113
  • Lastpage
    116
  • Abstract
    For resource-limited platforms, the subspace distribution clustering hidden Markov model (SDCHMM) is preferable to the continuous density hidden Markov model (CDHMM) because it requires less storage and computation while maintaining decent recognition performance. However, the conventional method of obtaining an SDCHMM does not ensure optimality of the classifier design. To obtain an optimal classifier, this paper proposes a new SDCHMM training algorithm that adjusts the SDCHMM parameters according to the minimum classification error (MCE) criterion. Experimental results on the TiDigits and RM tasks show that the MCE-based SDCHMM training algorithm provides a 15-80% word error rate reduction (WERR) compared with the conventional SDCHMM converted from a CDHMM.
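    The MCE criterion referenced in the abstract is commonly formulated as a smoothed, differentiable approximation of the classification error count. A minimal sketch of that loss is given below; the function name, parameter names (eta, gamma), and the use of raw log-likelihood scores are illustrative assumptions, not details taken from this paper.

    ```python
    import math

    def mce_loss(g_correct, g_competing, eta=1.0, gamma=1.0):
        """Smoothed minimum-classification-error loss (illustrative sketch).

        g_correct: discriminant score (e.g. log-likelihood) of the true class.
        g_competing: discriminant scores of the competing classes.
        eta, gamma: smoothing constants (assumed names, not from the paper).
        """
        # Anti-discriminant: a soft maximum over the competing class scores.
        anti = (1.0 / eta) * math.log(
            sum(math.exp(eta * g) for g in g_competing) / len(g_competing)
        )
        # Misclassification measure: positive when a competitor outscores
        # the correct class, negative otherwise.
        d = anti - g_correct
        # Sigmoid smoothing turns the hard 0/1 error into a differentiable
        # quantity that gradient-based training can minimize.
        return 1.0 / (1.0 + math.exp(-gamma * d))
    ```

    When the correct class clearly wins (d well below 0) the loss approaches 0; when a competitor wins it approaches 1, so minimizing the average loss over training tokens approximately minimizes the classification error rate.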
  • Keywords
    error statistics; hidden Markov models; optimisation; pattern classification; speech recognition; statistical distributions; MCE criterion; RM task; SDCHMM training algorithm; TiDigits task; minimum classification error; optimal classifier; resource-limited platforms; speech recognition performance; subspace distribution clustering hidden Markov model; word error rate reduction; Arithmetic; Degradation; Distributed computing; Error analysis; Gaussian distribution; Hidden Markov models; Prototypes; Speech recognition; System performance; System testing;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    2004 International Symposium on Chinese Spoken Language Processing
  • Print_ISBN
    0-7803-8678-7
  • Type
    conf
  • DOI
    10.1109/CHINSL.2004.1409599
  • Filename
    1409599