Title :
Bayesian growing and pruning strategies for MAP-optimal estimation of Gaussian mixture models
Author_Institution :
Center for Sensor Signal & Inf. Process., Univ. of South Australia, The Levels, SA, Australia
Abstract :
Real-time learning requires on-line complexity estimation. Expectation-maximisation (EM) and sampling techniques are presented that enable simultaneous estimation of the complexity and continuous parameters of Gaussian mixture models (GMMs), which can be used for density estimation, classification and feature extraction. The solution is a maximum a posteriori probability (MAP) estimator that is convergent for fixed data and adaptive as data accrue. Issues resolved include estimating the priors for the component covariances, means and weights, and calculating the local integrated likelihood (evidence) of the solution. The EM algorithm for MAP estimation of GMM parameters is established and extended to include complexity estimation (i.e. iterative pruning). The EMS algorithm is introduced, which incorporates a sampling stage that enables iterative growth of the GMM. Early trials on speech data indicate that the likelihood of hidden Markov speech models can be very substantially increased using this approach.
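The abstract describes EM-based MAP estimation of GMM parameters with an added pruning step. The sketch below is a minimal illustration of that general idea, not the authors' algorithm: it assumes a Dirichlet prior on the weights and an inverse-Wishart prior on the covariances, and prunes components by a simple weight threshold rather than the paper's Bayesian evidence criterion; the function name, prior hyperparameters (alpha, nu, psi) and prune_tol are all hypothetical.

import numpy as np

def map_em_gmm(X, K, n_iter=100, alpha=2.0, nu=None, psi=None,
               prune_tol=1e-3, seed=0):
    """Illustrative MAP-EM for a GMM with threshold-based pruning (sketch)."""
    rng = np.random.default_rng(seed)
    N, D = X.shape
    nu = D + 2 if nu is None else nu                # inverse-Wishart degrees of freedom
    psi = 0.1 * np.eye(D) if psi is None else psi   # inverse-Wishart scale matrix
    # Initialise: means at random data points, shared covariance, equal weights.
    mu = X[rng.choice(N, K, replace=False)].copy()
    sigma = np.stack([np.cov(X.T).reshape(D, D) + 1e-6 * np.eye(D)] * K)
    w = np.full(K, 1.0 / K)

    for _ in range(n_iter):
        # E-step: responsibilities under the current parameters.
        log_r = np.stack([np.log(w[k]) + _log_gauss(X, mu[k], sigma[k])
                          for k in range(len(w))], axis=1)
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)

        # M-step: MAP updates (Dirichlet prior on weights, inverse-Wishart on covariances).
        Nk = r.sum(axis=0)
        w = (Nk + alpha - 1.0) / (N + len(w) * (alpha - 1.0))
        for k in range(len(w)):
            mu[k] = r[:, k] @ X / Nk[k]
            diff = X - mu[k]
            S = (r[:, k, None] * diff).T @ diff
            sigma[k] = (S + psi) / (Nk[k] + nu + D + 1)  # posterior mode of the covariance

        # Pruning: drop components whose posterior weight has collapsed.
        keep = w > prune_tol
        if not keep.all():
            mu, sigma, w = mu[keep], sigma[keep], w[keep]
            w /= w.sum()
    return w, mu, sigma

def _log_gauss(X, mu, sigma):
    """Log-density of each row of X under N(mu, sigma)."""
    D = X.shape[1]
    L = np.linalg.cholesky(sigma)
    z = np.linalg.solve(L, (X - mu).T)
    return -0.5 * (D * np.log(2 * np.pi)
                   + 2 * np.log(np.diag(L)).sum()
                   + (z ** 2).sum(axis=0))

The paper's EMS algorithm additionally grows the mixture via a sampling stage and scores solutions by the local integrated likelihood (evidence); neither step is reproduced in this sketch.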
Keywords :
Bayes methods; Gaussian distribution; convergence; hidden Markov models; learning (artificial intelligence); maximum likelihood estimation; neural nets; parameter estimation; Bayesian growing; Bayesian pruning; Gaussian mixture models; MAP estimation; MAP-optimal estimation; classification; density estimation; expectation-maximisation; feature extraction; hidden Markov speech models; iterative growth; iterative pruning; local integrated likelihood; maximum a posteriori probability estimator; neural networks; online complexity estimation; real-time learning; sampling techniques
Conference_Title :
Fourth International Conference on Artificial Neural Networks, 1995
Conference_Location :
Cambridge
Print_ISBN :
0-85296-641-5
DOI :
10.1049/cp:19950583