• DocumentCode
    78087
  • Title
    $l_q$ Sparsity Penalized Linear Regression With Cyclic Descent
  • Author
    Marjanovic, Goran; Solo, Victor
  • Author_Institution
    Dept. of Electr. Eng. & Comput. Sci., Univ. of Michigan, Ann Arbor, MI, USA
  • Volume
    62
  • Issue
    6
  • fYear
    2014
  • fDate
    15-Mar-14
  • Firstpage
    1464
  • Lastpage
    1475
  • Abstract
    Recently, there has been a lot of focus on penalized least squares problems for noisy sparse signal estimation. The penalty induces sparsity, and a very common choice has been the convex $l_1$ norm. However, to improve sparsity and reduce the biases associated with the $l_1$ norm, one must move to non-convex penalties such as the $l_q$ norm. In this paper we present a novel cyclic descent algorithm for optimizing the resulting $l_q$ penalized least squares problem. Optimality conditions for this problem are derived and competing ones clarified. Convergence of the algorithm, both coordinate-wise and to a local minimizer, is proved; this analysis is highly non-trivial. We illustrate with simulations comparing the signal reconstruction quality under three penalty functions: $l_0$, $l_1$, and $l_q$ with 0 < q < 1.
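    For reference, the $l_q$ penalized least squares problem mentioned in the abstract is commonly written in the following form (the symbols $y$, $A$, $x$, and $\lambda$ below are generic notation, not taken from the record itself):

    \[
    \min_{x \in \mathbb{R}^n} \; \tfrac{1}{2}\,\lVert y - A x \rVert_2^2 \;+\; \lambda \sum_{i=1}^{n} \lvert x_i \rvert^{q}, \qquad 0 < q < 1, \; \lambda > 0,
    \]

    where $y$ is the noisy observation vector, $A$ the regression (design) matrix, and $\lambda$ the sparsity penalty weight. Cyclic (coordinate) descent minimizes such an objective one coordinate $x_i$ at a time while holding the remaining coordinates fixed.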
  • Keywords
    concave programming; inverse problems; least squares approximations; regression analysis; signal reconstruction; convex $l_1$ norm; coordinate-wise convergence; cyclic descent algorithm; $l_q$ sparsity penalized linear regression; noisy sparse signal estimation; penalized least squares problems; penalty functions; signal reconstruction quality; Convex optimization; Inverse problems; $l_q$ optimization; Sparsity; inverse problem; non-convex
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Signal Processing
  • Publisher
    IEEE
  • ISSN
    1053-587X
  • Type
    jour
  • DOI
    10.1109/TSP.2014.2302740
  • Filename
    6725680