DocumentCode :
3724138
Title :
Logdet Divergence Based Sparse Non-Negative Matrix Factorization for Stable Representation
Author :
Qing Liao;Naiyang Guan;Qian Zhang
Year :
2015
Firstpage :
871
Lastpage :
876
Abstract :
Non-negative matrix factorization (NMF) decomposes any non-negative matrix into the product of two low-dimensional non-negative matrices. Since NMF learns an effective parts-based representation, it has been widely applied in computer vision and data mining. However, traditional NMF risks learning a rank-deficient basis on high-dimensional datasets with few examples, especially when some examples are heavily corrupted by outliers. In this paper, we propose a Logdet divergence based sparse NMF method (LDS-NMF) to deal with the rank-deficiency problem. In particular, LDS-NMF reduces the risk of rank deficiency by minimizing the Logdet divergence between the product of the basis matrix with its transpose and the identity matrix, while penalizing the density of the coefficients. Since the objective function of LDS-NMF is nonconvex, it is difficult to optimize. In this paper, we develop a multiplicative update rule to optimize LDS-NMF in the framework of block coordinate descent, and theoretically prove its convergence. Experimental results on popular datasets show that LDS-NMF can learn more stable representations than those learned by representative NMF methods.
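The approach described in the abstract can be illustrated with a minimal sketch. This is not the paper's exact update rule: it combines standard multiplicative-update NMF with an L1 penalty on the coefficients and a Logdet-style regularizer tr(WᵀW) − logdet(WᵀW) on the basis, splitting the regularizer's gradient into its negative and positive parts across the numerator and denominator of the update. The function name, parameter names, and the choice of weights `alpha` and `lam` are illustrative assumptions.

```python
import numpy as np

def lds_nmf_sketch(X, k, alpha=0.01, lam=0.01, iters=200, eps=1e-9, seed=0):
    """Sketch of Logdet-regularized sparse NMF (not the paper's exact rule):
        min ||X - WH||_F^2
            + alpha * (tr(W^T W) - logdet(W^T W))   # discourages rank deficiency
            + lam * ||H||_1                          # sparse coefficients
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, k)) + eps
    H = rng.random((k, n)) + eps
    for _ in range(iters):
        # Coefficient update: the L1 penalty adds lam to the denominator.
        H *= (W.T @ X) / (W.T @ W @ H + lam + eps)
        # Basis update: the Logdet term's gradient is 2W - 2W(W^T W)^{-1};
        # its negative part goes to the numerator, its positive part to the
        # denominator, preserving non-negativity of W.
        G = W @ np.linalg.inv(W.T @ W + eps * np.eye(k))
        W *= (X @ H.T + alpha * G) / (W @ H @ H.T + alpha * W + eps)
    return W, H
```

Because every factor in both updates is non-negative, the iterates stay non-negative without any projection step, which is the appeal of multiplicative rules in the block coordinate descent framework the abstract mentions.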
Keywords :
"Sparse matrices","Principal component analysis","Linear programming","Data mining","Training","Nuclear magnetic resonance","Computer science"
Publisher :
ieee
Conference_Titel :
2015 IEEE International Conference on Data Mining (ICDM)
ISSN :
1550-4786
Type :
conf
DOI :
10.1109/ICDM.2015.52
Filename :
7373404