DocumentCode :
82830
Title :
On $l_{q}$ Optimization and Sparse Inverse Covariance Selection
Author :
Marjanovic, Goran ; Solo, Victor
Author_Institution :
Dept. of Electr. Eng. & Comput. Sci., Univ. of Michigan, Ann Arbor, MI, USA
Volume :
62
Issue :
7
fYear :
2014
fDate :
1-Apr-14
Firstpage :
1644
Lastpage :
1654
Abstract :
Graphical models are well established in providing meaningful conditional probability descriptions of complex multivariable interactions. In the Gaussian case, the conditional independencies between different variables correspond to zero entries in the precision (inverse covariance) matrix. Hence, there has been much recent interest in sparse precision matrix estimation in areas such as statistics, machine learning, computer vision, pattern recognition, and signal processing. A popular estimation method involves optimizing a penalized log-likelihood problem. The penalty is responsible for inducing sparsity, and a common choice is the convex l1 norm. Even though the l0 penalty is the natural choice guaranteeing maximum sparsity, it has been avoided due to its lack of convexity. As a result, in this paper we bridge the gap between these two penalties and propose the non-concave lq penalized log-likelihood problem for sparse precision matrix estimation, where 0 ≤ q < 1. A novel algorithm is developed for the optimization, and we provide some of its theoretical properties that are useful in sparse linear regression. We illustrate on synthetic and real data, showing reconstruction quality comparisons of the sparsity-inducing penalties: l0, lq with 0 < q < 1, l1, and SCAD.
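For reference, below is a minimal LaTeX sketch of the kind of lq-penalized log-likelihood objective the abstract describes; the paper's exact formulation, sign conventions, and penalized entries may differ, and the symbols (Omega, S, lambda) are illustrative rather than taken from the record.

% Illustrative l_q-penalized negative log-likelihood for sparse precision
% matrix estimation; S is a sample covariance and \lambda > 0 a tuning
% parameter (both assumed here, not taken from the record).
\begin{equation*}
  \hat{\Omega}
  = \operatorname*{arg\,min}_{\Omega \succ 0}
    \;\; -\log\det\Omega + \operatorname{tr}(S\Omega)
    + \lambda \sum_{i \neq j} \lvert \Omega_{ij} \rvert^{q},
  \qquad 0 \le q < 1,
\end{equation*}
% where the q = 0 case is read as the l_0 counting penalty
% (|x|^0 := 1 for x \neq 0 and 0 for x = 0).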
Keywords :
computer vision; covariance matrices; estimation theory; learning (artificial intelligence); optimisation; regression analysis; signal processing; Gaussian case; complex multivariable interactions; computer vision; conditional independency; conditional probability descriptions; graphical models; inverse covariance matrix; log-likelihood problem; machine learning; optimization; pattern recognition; reconstruction quality; signal processing; sparse inverse covariance selection; sparse linear regression; sparse precision matrix estimation; Covariance matrices; Estimation; Graphical models; Optimization; Signal processing algorithms; Sparse matrices; Symmetric matrices; $l_{0}$ penalty; $l_{1}$ penalty; $l_{q}$ penalty; Inverse covariance matrix; non-convex optimization; penalized log-likelihood; sparse;
fLanguage :
English
Journal_Title :
Signal Processing, IEEE Transactions on
Publisher :
IEEE
ISSN :
1053-587X
Type :
jour
DOI :
10.1109/TSP.2014.2303429
Filename :
6728736