Title :
Laplacian Regularized Gaussian Mixture Model for Data Clustering
Author :
He, Xiaofei ; Cai, Deng ; Shao, Yuanlong ; Bao, Hujun ; Han, Jiawei
Author_Institution :
State Key Lab. of CAD & CG, Zhejiang Univ., Hangzhou, China
Abstract :
Gaussian Mixture Models (GMMs) are among the most statistically mature methods for clustering. Each cluster is represented by a Gaussian distribution, and clustering thereby reduces to estimating the parameters of the Gaussian mixture, usually via the Expectation-Maximization (EM) algorithm. In this paper, we consider the case where the probability distribution that generates the data is supported on a submanifold of the ambient space. It is natural to assume that if two points are close in the intrinsic geometry of the probability distribution, then their conditional probability distributions are similar. Specifically, we introduce a regularized probabilistic model based on the manifold structure for data clustering, called the Laplacian regularized Gaussian Mixture Model (LapGMM). The data manifold is modeled by a nearest-neighbor graph, and the graph structure is incorporated into the maximum-likelihood objective function. As a result, the obtained conditional probability distributions vary smoothly along the geodesics of the data manifold. Experimental results on real data sets demonstrate the effectiveness of the proposed approach.
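To make the construction concrete, the following is a minimal sketch of a regularized objective consistent with the abstract's description; the trade-off parameter \lambda, the k-nearest-neighbor affinity matrix W, and the graph Laplacian L = D - W are assumed notation for illustration, not quantities quoted from the paper's text.

% Sketch of a Laplacian-regularized GMM objective (assumed notation):
% x_1,...,x_N are data points; K mixture components with weights \pi_k,
% means \mu_k, and covariances \Sigma_k; W is a k-NN affinity matrix,
% D its diagonal degree matrix, L = D - W the graph Laplacian, \lambda >= 0.
\mathcal{O}(\Theta)
  = \sum_{i=1}^{N} \log \sum_{k=1}^{K} \pi_k \, \mathcal{N}\!\left(x_i \mid \mu_k, \Sigma_k\right)
  \;-\; \frac{\lambda}{2} \sum_{k=1}^{K} \sum_{i,j=1}^{N} W_{ij}
        \bigl( P(k \mid x_i) - P(k \mid x_j) \bigr)^{2}

Writing \mathbf{p}_k = \bigl(P(k \mid x_1), \ldots, P(k \mid x_N)\bigr)^{\top}, the penalty term equals \lambda \sum_{k} \mathbf{p}_k^{\top} L \mathbf{p}_k, so points connected in the nearest-neighbor graph are encouraged to have similar posterior (conditional) distributions, which is what the abstract means by the conditional probability distribution varying smoothly along the data manifold.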
Keywords :
Gaussian distribution; Laplace equations; expectation-maximisation algorithm; graph theory; pattern clustering; Gaussian distribution; LapGMM; Laplacian regularized Gaussian mixture model; conditional probability distribution; data clustering; data manifold geodesics; expectation-maximization algorithm; nearest neighbor graph model; parameter estimation; Clustering algorithms; Data models; Geometry; Laplace equations; Manifolds; Nearest neighbor searches; Probability distribution; Gaussian mixture model; clustering; graph Laplacian; manifold structure
Journal_Title :
Knowledge and Data Engineering, IEEE Transactions on
DOI :
10.1109/TKDE.2010.259