Title :
Metric Learning from Relative Comparisons by Minimizing Squared Residual
Author :
Eric Yi Liu; Zhishan Guo; Xiang Zhang; Vladimir Jojic; Wei Wang
Author_Institution :
Dept. of Comput. Sci., Univ. of North Carolina at Chapel Hill, Chapel Hill, NC, USA
Abstract :
Recent studies [1]–[5] have suggested using constraints in the form of relative distance comparisons to represent domain knowledge: d(a, b) < d(c, d), where d(·) is the distance function and a, b, c, d are data objects. Such constraints are readily available in many problems where pairwise constraints are not naturally available. In this paper we consider the problem of learning a Mahalanobis distance metric from supervision in the form of relative distance comparisons. We propose a simple yet effective algorithm that minimizes a convex objective function corresponding to the sum of squared residuals of the constraints. We also extend our model and algorithm to promote sparsity in the learned metric matrix. Experimental results suggest that our method consistently outperforms existing methods in terms of clustering accuracy. Furthermore, the sparsity extension leads to more stable estimation when the dimension is high and only a small amount of supervision is given.
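To make the setup concrete, below is a minimal sketch of learning a Mahalanobis metric from relative comparisons by driving down squared constraint residuals. This is an illustrative simplification, not the paper's algorithm: it parameterizes M = LᵀL and runs plain gradient descent on L (so M stays positive semidefinite), penalizing each violated constraint d_M(a, b) < d_M(c, d) by its squared residual. The function name and hyperparameters are invented for the example.

```python
import numpy as np

def learn_metric(X, constraints, n_iters=500, lr=0.01):
    """Learn a PSD metric matrix M = L.T @ L from relative comparisons.

    Each constraint (a, b, c, d) encodes d_M(x_a, x_b) < d_M(x_c, x_d).
    We minimize the sum of squared hinge residuals by gradient descent
    on L (a simplified sketch; the paper instead solves a convex
    problem in M directly).
    """
    dim = X.shape[1]
    L = np.eye(dim)  # start from the Euclidean metric
    for _ in range(n_iters):
        grad = np.zeros((dim, dim))
        for a, b, c, d in constraints:
            v1, v2 = X[a] - X[b], X[c] - X[d]
            d1 = np.sum((L @ v1) ** 2)  # squared distance d_M(x_a, x_b)
            d2 = np.sum((L @ v2) ** 2)  # squared distance d_M(x_c, x_d)
            r = d1 - d2                 # residual: positive => violated
            if r > 0:
                # gradient of r**2 with respect to L
                grad += 4 * r * (L @ (np.outer(v1, v1) - np.outer(v2, v2)))
        L -= lr * grad
    return L.T @ L  # the learned metric matrix

# Toy usage: under the Euclidean metric the constraint below is violated
# (point 1 is farther from point 0 than point 2 is), so learning must
# down-weight the first coordinate.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 0.5]])
constraints = [(0, 1, 0, 2)]  # want d_M(x0, x1) < d_M(x0, x2)
M = learn_metric(X, constraints)
```

Satisfied constraints contribute zero residual, so the objective only pushes on violations, mirroring the hinge-style loss described in the abstract.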
Keywords :
convex programming; learning (artificial intelligence); minimisation; pattern clustering; Mahalanobis distance metric learning; clustering accuracy; convex objective function minimisation; distance function; learned metric matrix; pairwise constraint; relative distance comparison; squared residual minimisation; Accuracy; Clustering algorithms; Convergence; Covariance matrix; Linear programming; Measurement; Symmetric matrices; Mahalanobis metric; metric learning; relative comparisons
Conference_Titel :
2012 IEEE 12th International Conference on Data Mining (ICDM)
Conference_Location :
Brussels, Belgium
Print_ISBN :
978-1-4673-4649-8
DOI :
10.1109/ICDM.2012.38