• DocumentCode
    6407
  • Title
    Accurate Prediction of Phase Transitions in Compressed Sensing via a Connection to Minimax Denoising
  • Author
    Donoho, David L.; Johnstone, Iain M.; Montanari, Andrea
  • Author_Institution
    Dept. of Stat., Stanford Univ., Stanford, CA, USA
  • Volume
    59
  • Issue
    6
  • fYear
    2013
  • fDate
    June 2013
  • Firstpage
    3396
  • Lastpage
    3433
  • Abstract
    Compressed sensing posits that, within limits, one can undersample a sparse signal and yet reconstruct it accurately. Knowing the precise limits to such undersampling is important both for theory and practice. We present a formula that characterizes the allowed undersampling of generalized sparse objects. The formula applies to approximate message passing (AMP) algorithms for compressed sensing, which are here generalized to employ denoising operators besides the traditional scalar soft-thresholding denoiser. This paper gives several examples, including scalar denoisers not derived from convex penalization (the firm shrinkage nonlinearity and the minimax nonlinearity) and also nonscalar denoisers (block thresholding, monotone regression, and total variation minimization). Let the variables ε = k/N and δ = n/N denote the generalized sparsity and undersampling fractions for sampling the k-generalized-sparse N-vector x_0 according to y = Ax_0. Here, A is an n × N measurement matrix whose entries are i.i.d. standard Gaussian. The formula states that the phase transition curve δ = δ(ε) separating successful from unsuccessful reconstruction of x_0 by AMP is given by δ = M(ε | Denoiser), where M(ε | Denoiser) denotes the per-coordinate minimax mean squared error (MSE) of the specified, optimally tuned denoiser in the directly observed problem y = x + z. In short, the phase transition of a noiseless undersampling problem is identical to the minimax MSE in a denoising problem. We prove that this formula follows from state evolution and present numerical results validating it in a wide range of settings. The above formula generates numerous new insights, both in the scalar and in the nonscalar cases.
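  • Example
    As a quick illustration of the abstract's central formula δ = M(ε | Denoiser), the following minimal sketch (assuming SciPy; not code from the paper) numerically evaluates the predicted phase-transition curve for the classical soft-thresholding denoiser. It uses the standard closed-form worst-case risk of soft thresholding over ε-sparse ("nearly black") signals observed in unit-variance Gaussian noise, minimizing over the threshold λ to obtain the minimax MSE M(ε).

    ```python
    # Minimal sketch (assumes SciPy; not the paper's own code): compute
    # delta(eps) = M(eps | soft thresholding), the predicted AMP phase
    # transition, from the closed-form risk of the soft-thresholding denoiser.
    from scipy.stats import norm
    from scipy.optimize import minimize_scalar

    def worst_case_risk(lam, eps):
        """Worst-case per-coordinate MSE of soft thresholding at threshold
        lam, over eps-sparse signals observed as y = x + z, z ~ N(0, 1)."""
        # MSE at x = 0:  E[eta(z; lam)^2] = 2[(1 + lam^2) Phi(-lam) - lam phi(lam)]
        risk_at_zero = 2.0 * ((1.0 + lam ** 2) * norm.cdf(-lam)
                              - lam * norm.pdf(lam))
        # MSE as |x| -> infinity:  variance + squared bias = 1 + lam^2
        risk_at_inf = 1.0 + lam ** 2
        return eps * risk_at_inf + (1.0 - eps) * risk_at_zero

    def minimax_mse(eps):
        """M(eps | soft thresholding): optimally tune lam, report the MSE."""
        res = minimize_scalar(worst_case_risk, bounds=(0.0, 10.0),
                              args=(eps,), method="bounded")
        return res.fun

    # Per the abstract, AMP reconstruction succeeds iff delta > M(eps | Denoiser).
    for eps in (0.05, 0.10, 0.25, 0.50):
        print(f"eps = {eps:.2f}  ->  predicted delta(eps) = {minimax_mse(eps):.4f}")
    ```

    For small ε the computed curve behaves like 2ε log(1/ε), the classical minimax risk over nearly black objects, consistent with the keyword list below.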
  • Keywords
    Gaussian processes; compressed sensing; least mean squares methods; matrix algebra; message passing; minimisation; regression analysis; signal denoising; signal reconstruction; signal sampling; AMP algorithm; MMSE; approximate message passing (AMP); block thresholding; firm shrinkage nonlinearity; generalized sparse N-vector; generalized sparsity; i.i.d. standard Gaussian; measurement matrix; minimax denoising; minimax mean squared error; minimax nonlinearity; monotone regression; noiseless undersampling problem; nonscalar denoiser; optimally tuned denoiser; phase transition curve; phase transition prediction; scalar soft thresholding denoiser; sparse signal reconstruction; total variation minimization; undersampling fraction; approximation algorithms; noise reduction; partitioning algorithms; vectors; James–Stein; Lasso; group Lasso; joint sparsity; minimax risk of firm thresholding; minimax risk of soft thresholding; minimax risk over nearly black objects; minimax shrinkage; nonconvex penalization; state evolution
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/TIT.2013.2239356
  • Filename
    6409458