Title :
Noise benefits in the expectation-maximization algorithm: NEM theorems and models
Author :
Osoba, Osonde ; Mitaim, Sanya ; Kosko, Bart
Author_Institution :
Dept. of Electr. Eng., Univ. of Southern California, Los Angeles, CA, USA
Date :
July 31, 2011 - Aug. 5, 2011
Abstract :
We prove a general sufficient condition for a noise benefit in the expectation-maximization (EM) algorithm. When this noise condition holds, additive noise speeds the average convergence of the EM algorithm to a local maximum of the likelihood surface. The sufficient condition states when additive noise makes the signal more probable on average. The performance measure is Kullback relative entropy. A Gaussian-mixture problem demonstrates the EM noise benefit. Corollary results give other special cases in which noise improves EM performance.
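The Gaussian-mixture demonstration in the abstract can be sketched in code. The sketch below is an illustrative assumption, not the paper's exact experiment: it runs standard EM on a 1-D two-component Gaussian mixture and, for the noisy variant, simply adds annealed Gaussian noise to the samples each iteration. The paper's NEM theorem injects only noise realizations that satisfy its sufficient condition; this sketch approximates that with a decaying noise scale. The function name `em_gmm` and the quantile initialization are choices made here for the example.

```python
import numpy as np

def em_gmm(x, K=2, iters=50, noise_scale=0.0, decay=2.0, seed=0):
    """EM for a 1-D Gaussian mixture model.

    If noise_scale > 0, adds annealed additive Gaussian noise to the
    samples at each iteration (a simplified NEM-style variant; the
    paper's NEM condition screens individual noise samples, which this
    sketch omits in favor of a decaying noise schedule).
    """
    rng = np.random.default_rng(seed)
    n = len(x)
    # Deterministic initialization: spread initial means over the data quantiles.
    mu = np.quantile(x, np.linspace(0.25, 0.75, K))
    var = np.full(K, x.var())      # component variances
    pi = np.full(K, 1.0 / K)       # mixing weights
    for t in range(iters):
        # Additive noise, cooled as 1/(t+1)^decay so it vanishes over time.
        xn = x + noise_scale * rng.standard_normal(n) / (t + 1) ** decay
        # E-step: posterior responsibilities (log-domain for stability).
        d = xn[:, None] - mu[None, :]
        logp = -0.5 * d**2 / var - 0.5 * np.log(2 * np.pi * var) + np.log(pi)
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: reestimate weights, means, and variances.
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r * xn[:, None]).sum(axis=0) / nk
        var = (r * (xn[:, None] - mu) ** 2).sum(axis=0) / nk
    return mu, var, pi

# Usage: fit a well-separated two-component mixture with and without noise.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2.0, 1.0, 500), rng.normal(2.0, 1.0, 500)])
mu_plain, _, _ = em_gmm(x)                      # ordinary EM
mu_noisy, _, _ = em_gmm(x, noise_scale=0.1)     # NEM-style noisy EM
```

Both runs should recover means near the true values of -2 and 2; the paper's claim is that suitably chosen noise reaches such a local likelihood maximum in fewer iterations on average.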
Keywords :
Gaussian noise; entropy; expectation-maximisation algorithm; Gaussian-mixture problem; Kullback relative entropy; NEM model; NEM theorem; additive noise; noise benefits; Maximum likelihood estimation; Noise; Noise measurement; Probability density function; Random variables; Signal processing algorithms; Stochastic resonance
Conference_Titel :
The 2011 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
San Jose, CA
Print_ISBN :
978-1-4244-9635-8
DOI :
10.1109/IJCNN.2011.6033642