• DocumentCode
    1506046
  • Title
    A minimum cross-entropy approach to hidden Markov model adaptation
  • Author
    Afify, Mohamed; Gong, Yifan; Haton, Jean-Paul
  • Author_Institution
    Dept. of Electr. Eng., Cairo Univ., Fayoum, Egypt
  • Volume
    6
  • Issue
    6
  • fYear
    1999
  • fDate
    6/1/1999 12:00:00 AM
  • Firstpage
    132
  • Lastpage
    134
  • Abstract
    An adaptation algorithm that uses the theoretically optimal maximum a posteriori (MAP) formulation while also accounting for parameter correlation between different classes is desirable, especially when the adaptation data are sparse. However, a direct implementation of such an approach may be prohibitive in many practical situations. We present an algorithm that approximates this correlated MAP estimation by iteratively maximizing the set of posterior marginals. Under some simplifying assumptions, expressions for these marginals are derived using the principle of minimum cross-entropy. The resulting algorithm is simple and includes conventional MAP estimation as a special case. The utility of the proposed method is tested in adaptation experiments on an alphabet recognition task. (An illustrative code sketch of the iterative update follows this record.)
  • Keywords
    Gaussian processes; correlation methods; hidden Markov models; maximum likelihood estimation; minimum entropy methods; speech recognition; Gaussian mixture densities; MAP estimation; adaptation algorithm; adaptation experiments; alphabet recognition task; correlated MAP algorithm; hidden Markov model adaptation; minimum cross-entropy; optimal maximum a posteriori formulation; parameter correlation; posterior marginals; sparse adaptation data; speech recognition; Adaptation model; Hidden Markov models; Iterative algorithms; Laboratories; Random variables; Speech recognition; Testing
  • fLanguage
    English
  • Journal_Title
    Signal Processing Letters, IEEE
  • Publisher
    IEEE
  • ISSN
    1070-9908
  • Type
    jour
  • DOI
    10.1109/97.763143
  • Filename
    763143
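
The abstract describes an iterative scheme that approximates correlated MAP estimation by repeatedly maximizing posterior marginals, with conventional MAP adaptation recovered as a special case. The following is a minimal, hypothetical Python sketch of that general idea for adapting HMM Gaussian mean vectors under a jointly Gaussian prior that couples the classes: each class mean is updated in turn from its conditional prior (given the current estimates of the other class means) combined with that class's adaptation statistics. All names (map_adapt_means, prior_cov, obs_sums, obs_counts, obs_var), the shared diagonal observation variance, and the single coupling covariance across dimensions are assumptions for illustration; this is not the paper's exact minimum cross-entropy derivation.

    import numpy as np

    def map_adapt_means(prior_means, prior_cov, obs_sums, obs_counts, obs_var,
                        n_iter=10):
        """Iteratively re-estimate K class mean vectors (D-dimensional).

        prior_means : (K, D) prior means of the Gaussian classes
        prior_cov   : (K, K) prior covariance coupling the class means
                      (shared across dimensions -- a simplifying assumption)
        obs_sums    : (K, D) occupancy-weighted sums of adaptation frames
        obs_counts  : (K,)   occupancy counts per class
        obs_var     : (D,)   diagonal observation variance (assumed shared)
        """
        K, _ = prior_means.shape
        means = prior_means.copy()
        for _ in range(n_iter):
            for k in range(K):
                # Conditional prior of mean k given the current estimates of
                # the other means (standard Gaussian conditioning).
                rest = [j for j in range(K) if j != k]
                s12 = prior_cov[k, rest]                          # (K-1,)
                s22_inv = np.linalg.inv(prior_cov[np.ix_(rest, rest)])
                cond_mean = prior_means[k] + s12 @ s22_inv @ (means[rest] - prior_means[rest])
                cond_var = prior_cov[k, k] - s12 @ s22_inv @ s12

                # Combine the conditional prior with the adaptation statistics;
                # with an uncorrelated (diagonal) prior this collapses to
                # conventional MAP mean adaptation.
                prior_prec = 1.0 / cond_var
                data_prec = obs_counts[k] / obs_var               # (D,)
                means[k] = (prior_prec * cond_mean + obs_sums[k] / obs_var) / (prior_prec + data_prec)
        return means

With a diagonal prior_cov, cond_mean reduces to prior_means[k] and each update becomes the familiar count-weighted interpolation between the prior mean and the sample mean of the adaptation data, which is the conventional MAP special case mentioned in the abstract.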