Abstract:
In this paper, a Bayesian approach is proposed for parameter inference in mixture models. Computational cost is, however, a difficulty, since no standard conjugate prior is available in this case. Recently, the Variational Bayes (VB) algorithm has become a practical solution, owing to its computational efficiency. The objective of this paper is to present the full derivation of the VB approximation and to explain how VB reduces the dimensional expansion of the posterior distribution at each Bayesian inference step, especially in the case of the Hidden Markov Model (HMM). Two applications, model order inference and inference of an HMM, illustrate this procedure.