Abstract:
To cluster or partition data or signals, expectation-maximisation (EM) or variational approximation with a mixture model (MM), a parametric probability density function represented as a weighted sum of K̂ component densities, is often used. However, model selection, that is, finding the underlying number of components K̂, is a key concern in MM clustering, since the desired clusters can be obtained only when K̂ is known. A new model selection algorithm that explores K̂ in a Bayesian framework is proposed. The proposed algorithm reconstructs the density of the model order, which information criteria such as AIC and BIC, as well as other heuristic algorithms, essentially fail to do. In addition, by using the integrated nested Laplace approximation (INLA), the algorithm reconstructs this density quickly compared with time-consuming Monte Carlo simulation.
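As context for the abstract, the baseline it argues against can be sketched as follows: fitting a Gaussian mixture by EM for several candidate orders K and picking the one minimising BIC. This is a minimal illustrative sketch, not the paper's INLA-based method; the 1-D EM implementation, the synthetic two-cluster data, and all function names are assumptions for illustration. It also shows the limitation the abstract points out: an information criterion yields only a single point estimate K̂, not a density over the model order.

```python
import numpy as np

def em_gmm_1d(x, K, iters=100, seed=0):
    """Fit a K-component 1-D Gaussian mixture by EM; return the final log-likelihood.
    Illustrative sketch only, not the paper's algorithm."""
    rng = np.random.default_rng(seed)
    n = x.size
    w = np.full(K, 1.0 / K)               # mixing weights
    mu = rng.choice(x, K, replace=False)  # initialise means at random data points
    var = np.full(K, x.var())             # initialise variances at the overall variance
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and (floored) variances
        nk = r.sum(axis=0)
        w, mu = nk / n, (r * x[:, None]).sum(axis=0) / nk
        var = np.maximum((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk, 1e-3)
    dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    return np.log(dens.sum(axis=1)).sum()

def bic_for_K(x, K, restarts=5):
    """BIC = p*log(n) - 2*max_ll, with p = (K-1) weights + K means + K variances."""
    ll = max(em_gmm_1d(x, K, seed=s) for s in range(restarts))
    return (3 * K - 1) * np.log(x.size) - 2 * ll

# Synthetic data from two well-separated Gaussian clusters (an assumption).
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(6.0, 1.0, 200)])
bics = {K: bic_for_K(x, K) for K in (1, 2, 3)}
best_K = min(bics, key=bics.get)  # a single point estimate of the model order
print(best_K)
```

Note that the criterion returns only `best_K`; the relative BIC values carry no calibrated probabilistic meaning, which is the gap a Bayesian density over the model order is meant to fill.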
Keywords:
Bayes methods; Gaussian processes; approximation theory; expectation-maximisation algorithm; mixture models; pattern clustering; signal reconstruction; Bayesian framework; EM method; Gaussian MM clustering; INLA; Monte Carlo simulation; integrated nested Laplace approximation; mixture model; model selection; parametric probability density function; signal clustering; signal partition; variational approximation