Title :
Towards EM-style algorithms for a posteriori optimization of normal mixtures
Author :
Ristad, Eric Sven ; Yianilos, Peter N.
Author_Institution :
Mnemonic Technol. Inc., Princeton, NJ, USA
Abstract :
Expectation maximization (EM) provides a simple and elegant approach to the problem of optimizing the parameters of a normal mixture on an unlabeled dataset. To accomplish this, EM iteratively reweights the elements of the dataset until a locally optimal normal mixture is obtained. This paper explores the intriguing question of whether such an EM-style algorithm exists for the related and apparently more difficult problem of finding a normal mixture that maximizes the a posteriori class probabilities of a labeled dataset. We expose a fundamental degeneracy in the relationship between a normal mixture and the a posteriori class probability functions it induces, and use this degeneracy to prove that reweighting a dataset can almost always give rise to a normal mixture exhibiting any desired class-function behavior. This establishes that EM-style approaches are sufficiently expressive for a posteriori optimization problems and opens the way to the design of new algorithms for them.
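For illustration only, the following is a minimal sketch of the unsupervised EM reweighting loop that the abstract refers to, written in Python with NumPy. It is not the paper's a posteriori algorithm; the function name em_normal_mixture, the one-dimensional two-component setting, the initialization, and the fixed iteration count are all assumptions made for this example.

# Minimal sketch of EM for a 1-D normal mixture (illustrative only).
# The E-step computes per-component responsibilities, i.e. the
# reweighting of the dataset mentioned in the abstract; the M-step
# re-estimates the mixture parameters from the reweighted data.
import numpy as np

def em_normal_mixture(x, k=2, iters=100, seed=0):
    """Fit a k-component 1-D normal mixture to unlabeled data x by EM."""
    rng = np.random.default_rng(seed)
    n = len(x)
    # Assumed initialization: random means drawn from the data,
    # common data variance, uniform mixing weights.
    mu = rng.choice(x, size=k, replace=False)
    var = np.full(k, np.var(x))
    pi = np.full(k, 1.0 / k)

    for _ in range(iters):
        # E-step: responsibilities (reweighting of each data point).
        dens = (pi / np.sqrt(2 * np.pi * var)
                * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var))  # shape (n, k)
        resp = dens / dens.sum(axis=1, keepdims=True)

        # M-step: weighted re-estimation of means, variances, weights.
        nk = resp.sum(axis=0)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        pi = nk / n
    return pi, mu, var

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = np.concatenate([rng.normal(-2, 1, 500), rng.normal(3, 0.5, 500)])
    print(em_normal_mixture(data))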
Keywords :
iterative methods; maximum likelihood estimation; normal distribution; optimisation; pattern classification; EM-style algorithms; a posteriori optimization; algorithm design; class function behavior; expectation maximization; fundamental degeneracy; iterative reweighting; normal mixtures; probabilities; unlabeled dataset; Algorithm design and analysis; Design optimization; Iterative algorithms; Markov processes; National electric code; Pattern classification; Pattern recognition; Predictive models; Speech recognition; Vector quantization
Conference_Title :
Proceedings of the 1998 IEEE International Symposium on Information Theory
Conference_Location :
Cambridge, MA
Print_ISBN :
0-7803-5000-6
DOI :
10.1109/ISIT.1998.708846