• DocumentCode
    3501023
  • Title
    Noise benefits in the expectation-maximization algorithm: NEM theorems and models
  • Author
    Osoba, Osonde; Mitaim, Sanya; Kosko, Bart
  • Author_Institution
    Dept. of Electr. Eng., Univ. of Southern California, Los Angeles, CA, USA
  • fYear
    2011
  • fDate
    July 31 2011-Aug. 5 2011
  • Firstpage
    3178
  • Lastpage
    3183
  • Abstract
    We prove a general sufficient condition for a noise benefit in the expectation-maximization (EM) algorithm. Additive noise speeds the average convergence of the EM algorithm to a local maximum of the likelihood surface when the noise condition holds. The sufficient condition states when additive noise makes the signal more probable on average. The performance measure is Kullback relative entropy. A Gaussian-mixture problem demonstrates the EM noise benefit. Corollary results give other special cases when noise improves performance in the EM algorithm.
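    The noise-injected EM scheme the abstract describes can be sketched as follows. This is a minimal illustration only: EM for a two-component 1-D Gaussian mixture where zero-mean Gaussian noise with a decaying (annealed) variance is added to the data at each iteration. The 1/k² annealing schedule, the function names, and the initialization are assumptions for the sketch; the paper's actual NEM theorem gives a sufficient positivity condition on the noise, not this particular schedule.

    ```python
    import math
    import random

    def _gauss_pdf(y, m, s):
        """Gaussian density N(m, s^2) evaluated at y."""
        return math.exp(-0.5 * ((y - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))

    def em_gmm_1d(data, iters=50, noise_scale=0.0, seed=0):
        """EM for a two-component 1-D Gaussian mixture with optional
        additive noise injected into the data each iteration.

        The noise variance decays as noise_scale / k^2 so the perturbed
        model converges to the noiseless EM fixed point (an assumed
        annealing schedule, not the paper's NEM condition)."""
        rng = random.Random(seed)
        # Crude initialization from the data range.
        mu = [min(data), max(data)]
        sigma = [1.0, 1.0]
        pi = [0.5, 0.5]
        for k in range(1, iters + 1):
            scale = noise_scale / (k * k)  # annealed noise level
            noisy = [y + rng.gauss(0.0, scale) for y in data] if scale > 0 else data
            # E-step: posterior responsibility of each component for each sample.
            resp = []
            for y in noisy:
                w = [pi[j] * _gauss_pdf(y, mu[j], sigma[j]) for j in range(2)]
                s = sum(w) or 1e-300
                resp.append([wj / s for wj in w])
            # M-step: re-estimate weights, means, and standard deviations.
            for j in range(2):
                nj = sum(r[j] for r in resp) or 1e-300
                pi[j] = nj / len(noisy)
                mu[j] = sum(r[j] * y for r, y in zip(resp, noisy)) / nj
                var = sum(r[j] * (y - mu[j]) ** 2 for r, y in zip(resp, noisy)) / nj
                sigma[j] = math.sqrt(max(var, 1e-6))
        return mu, sigma, pi
    ```

    Running with `noise_scale=0` recovers ordinary EM; a positive `noise_scale` injects the annealed perturbation. Whether the perturbation actually speeds average convergence depends on the sufficient condition proved in the paper.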
  • Keywords
    Gaussian noise; entropy; expectation-maximisation algorithm; Gaussian-mixture problem; Kullback relative entropy; NEM model; NEM theorem; additive noise speeds; expectation-maximization algorithm; noise benefits; Maximum likelihood estimation; Noise; Noise measurement; Probability density function; Random variables; Signal processing algorithms; Stochastic resonance;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    The 2011 International Joint Conference on Neural Networks (IJCNN)
  • Conference_Location
    San Jose, CA
  • ISSN
    2161-4393
  • Print_ISBN
    978-1-4244-9635-8
  • Type
    conf
  • DOI
    10.1109/IJCNN.2011.6033642
  • Filename
    6033642