• DocumentCode
    2602247
  • Title
    Sparse regression as a sparse eigenvalue problem

  • Author
    Moghaddam, Baback ; Gruber, Amit ; Weiss, Yair ; Avidan, Shai

  • Author_Institution
    Jet Propulsion Lab., California Inst. of Technol., Pasadena, CA
  • fYear
    2008
  • fDate
    Jan. 27, 2008 - Feb. 1, 2008
  • Firstpage
    219
  • Lastpage
    225
  • Abstract
    We extend the ℓ0-norm "subspectral" algorithms developed for sparse-LDA (Moghaddam, 2006) and sparse-PCA (Moghaddam, 2006) to more general quadratic costs such as MSE in linear (or kernel) regression. The resulting "sparse least squares" (SLS) problem is also NP-hard, by way of its equivalence to a rank-1 sparse eigenvalue problem. Specifically, for minimizing general quadratic cost functions we use a highly efficient method for direct eigenvalue computation based on partitioned matrix inverse techniques that leads to 10³× speed-ups over standard eigenvalue decomposition. This increased efficiency mitigates the O(n⁴) complexity that limited the previous algorithms' utility for high-dimensional problems. Moreover, the new computation prioritizes the role of the less-myopic backward elimination stage, which becomes even more efficient than forward selection. Similarly, branch-and-bound search for exact sparse least squares (ESLS) also benefits from partitioned matrix techniques. Our greedy sparse least squares (GSLS) algorithm generalizes Natarajan's algorithm (Natarajan, 1995), also known as order-recursive matching pursuit (ORMP). Specifically, the forward pass of GSLS is exactly equivalent to ORMP but is more efficient, and by including the backward pass, which only doubles the computation, we can achieve a lower MSE than ORMP. In experimental comparisons with LARS (Efron, 2004), forward-GSLS is shown to be not only more efficient and accurate but more flexible in terms of choice of regularization.
  • Keywords
    computational complexity; eigenvalues and eigenfunctions; iterative methods; least squares approximations; matrix inversion; sparse matrices; tree searching; NP-hard problem; branch-and-bound search; eigenvalue decomposition; exact sparse least squares; greedy sparse least squares algorithm; less-myopic backward elimination stage; matrix inverse techniques; order-recursive matching pursuit; partitioned matrix techniques; quadratic cost functions; sparse eigenvalue problem; sparse least squares problem; sparse regression; subspectral algorithms; Cost function; Eigenvalues and eigenfunctions; Kernel; Laser sintering; Least squares methods; Matching pursuit algorithms; Matrix decomposition; Partitioning algorithms; Pursuit algorithms; Sparse matrices
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Information Theory and Applications Workshop, 2008
  • Conference_Location
    San Diego, CA
  • Print_ISBN
    978-1-4244-2670-6
  • Type
    conf
  • DOI
    10.1109/ITA.2008.4601051
  • Filename
    4601051