• DocumentCode
    1324846
  • Title
    Information Theoretic Bounds for Compressed Sensing
  • Author
    Aeron, Shuchin; Saligrama, Venkatesh; Zhao, Manqi
  • Author_Institution
    Dept. of Electr. & Comput. Eng., Boston Univ., Boston, MA, USA
  • Volume
    56
  • Issue
    10
  • fYear
    2010
  • Firstpage
    5111
  • Lastpage
    5130
  • Abstract
    In this paper, we derive information theoretic performance bounds for sensing and reconstruction of sparse phenomena from noisy projections. We consider two settings: output noise models, where the noise enters after the projection, and input noise models, where the noise enters before the projection. We consider two types of distortion for reconstruction: support errors and mean-squared errors. Our goal is to relate the number of measurements, m, and the SNR to the signal sparsity, k, distortion level, d, and signal dimension, n. We consider support errors in a worst-case setting. We employ different variations of Fano's inequality to derive necessary conditions on the number of measurements and the SNR required for exact reconstruction. To derive sufficient conditions, we develop new insights on max-likelihood analysis based on a novel superposition property. In particular, this property implies that small support errors are the dominant error events. Consequently, our ML analysis does not suffer the conservatism of the union bound and leads to a tighter analysis of max-likelihood. These results provide order-wise tight bounds. For output noise models, we show that asymptotically an SNR of O(log(n)) together with O(k log(n/k)) measurements is necessary and sufficient for exact support recovery. Furthermore, if a small fraction of support errors can be tolerated, a constant SNR turns out to be sufficient in the linear sparsity regime. In contrast, for input noise models, we show that support recovery fails if the number of measurements scales as o(n log(n)/SNR), implying poor compression performance for such cases. Motivated by the fact that the worst-case setup requires significantly high SNR and a substantial number of measurements for input and output noise models, we consider a Bayesian setup. To derive necessary conditions, we develop novel extensions to Fano's inequality to handle continuous domains and arbitrary distortions.
    We then develop a new max-likelihood analysis over the set of rate-distortion quantization points to characterize tradeoffs between mean-squared distortion and the number of measurements using rate-distortion theory. We show that with constant SNR the number of measurements scales linearly with the rate-distortion function of the sparse phenomena.
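    The two sensing models and the support-recovery task described in the abstract can be illustrated with a minimal NumPy sketch (not taken from the paper; the dimensions, amplitudes, noise level, and the exhaustive search are illustrative assumptions chosen so the brute-force max-likelihood decoder is feasible):

    ```python
    import itertools
    import numpy as np

    # Illustrative sketch of the abstract's two models:
    #   output noise: y = A x + w   (noise enters after the projection)
    #   input noise:  y = A (x + w) (noise enters before the projection)
    # with exhaustive max-likelihood support recovery for tiny n and k.
    rng = np.random.default_rng(0)
    n, k, m = 12, 2, 8                       # signal dimension, sparsity, measurements

    A = rng.standard_normal((m, n)) / np.sqrt(m)   # random Gaussian projections
    x = np.zeros(n)
    true_support = (1, 7)                    # assumed example support
    x[list(true_support)] = 3.0              # well-separated nonzero amplitudes

    sigma = 0.1
    y_out = A @ x + sigma * rng.standard_normal(m)    # output noise model
    y_in = A @ (x + sigma * rng.standard_normal(n))   # input noise model

    def ml_support(y, A, k):
        """Exhaustive ML decoder: the size-k support minimizing residual energy."""
        best, best_res = None, np.inf
        for S in itertools.combinations(range(A.shape[1]), k):
            AS = A[:, S]
            x_hat, *_ = np.linalg.lstsq(AS, y, rcond=None)
            res = np.linalg.norm(y - AS @ x_hat) ** 2
            if res < best_res:
                best, best_res = S, res
        return best

    print(ml_support(y_out, A, k))
    ```

    The brute-force search over all size-k supports is only tractable for toy dimensions; the paper's point is the asymptotic analysis of exactly this ML decoder, not its direct implementation.
    
    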
  • Keywords
    maximum likelihood estimation; mean square error methods; rate distortion theory; signal reconstruction; Bayesian setup; Fano inequality; SNR; compressed sensing; distortion level; information theoretic bounds; input noise models; linear sparsity regime; max-likelihood analysis; mean-squared error distortion; noisy projections; order-wise tight bounds; output noise models; rate distortion quantization points; rate-distortion theory; signal dimension; signal sparsity; sparse phenomena reconstruction; superposition property; worst-case setup; Distortion; Distortion measurement; Noise measurement; Rate-distortion; Sensors; Signal to noise ratio; Compressed sensing; Fano's inequality; sensing capacity; sparsity; support recovery;
  • fLanguage
    English
  • Journal_Title
    Information Theory, IEEE Transactions on
  • Publisher
    ieee
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/TIT.2010.2059891
  • Filename
    5571873