  • DocumentCode
    1365548
  • Title
    Shannon-Theoretic Limits on Noisy Compressive Sampling
  • Author
    Akçakaya, Mehmet; Tarokh, Vahid
  • Author_Institution
    Sch. of Eng. & Appl. Sci., Harvard Univ., Cambridge, MA, USA
  • Volume
    56
  • Issue
    1
  • fYear
    2010
  • Firstpage
    492
  • Lastpage
    504
  • Abstract
    In this paper, we study the number of measurements required to recover a sparse signal in ℂ^M with L nonzero coefficients from compressed samples in the presence of noise. We consider a number of different recovery criteria, including the exact recovery of the support of the signal, which was previously considered in the literature, as well as new criteria for the recovery of a large fraction of the support of the signal and the recovery of a large fraction of the energy of the signal. For these recovery criteria, we prove that O(L) (an asymptotically linear multiple of L) measurements are necessary and sufficient for signal recovery whenever L grows linearly as a function of M. This improves on the existing literature, which is mostly focused on variants of a specific recovery algorithm based on convex programming, for which O(L log(M - L)) measurements are required. By contrast, an implementation of our proof method would have higher complexity. We also show that O(L log(M - L)) measurements are required in the sublinear regime (L = o(M)). For our sufficiency proofs, we introduce a Shannon-theoretic decoder based on joint typicality, which allows error events to be defined in terms of a single random variable, in contrast to previous information-theoretic work, where comparisons of random variables are required. We also prove concentration results for our error bounds, implying that a randomly selected Gaussian matrix will suffice with high probability. For our necessity proofs, we rely on results from channel coding and rate-distortion theory.
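    The measurement model described in the abstract can be sketched in a few lines: a signal with L nonzero coefficients is observed through a randomly drawn Gaussian matrix with additive noise, and the exact-support-recovery criterion asks whether an estimate identifies the true index set. This is a minimal illustrative sketch only (real-valued instead of ℂ^M, with assumed dimensions M, L, N and a naive magnitude-based support estimate), not the paper's decoder.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    M = 256  # ambient dimension of the signal (assumed for illustration)
    L = 8    # number of nonzero coefficients (sparsity)
    N = 64   # number of noisy compressed measurements

    # Sparse signal x with exactly L nonzero coefficients.
    x = np.zeros(M)
    support = rng.choice(M, size=L, replace=False)
    x[support] = rng.normal(size=L)

    # Randomly selected Gaussian measurement matrix, as in the
    # paper's concentration results.
    A = rng.normal(scale=1.0 / np.sqrt(N), size=(N, M))

    # Noisy compressed samples: y = A x + n.
    n = 0.01 * rng.normal(size=N)
    y = A @ x + n

    def support_recovered(x_hat, true_support, L):
        """Exact support recovery criterion: the L largest-magnitude
        indices of the estimate must equal the true support."""
        est = set(np.argsort(np.abs(x_hat))[-L:].tolist())
        return est == set(true_support.tolist())
    ```

    Under the criteria in the abstract, one would instead ask for recovery of a large fraction of `support` (or of the signal energy), which relaxes the exact set-equality check above.
    
    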
  • Keywords
    Gaussian processes; channel coding; computational complexity; convex programming; rate distortion theory; signal sampling; Gaussian matrix; Shannon-theoretic limits; noisy compressive sampling; signal recovery; sparse signal; Additive noise; Data acquisition; Decoding; Linear matrix inequalities; Noise measurement; Pollution measurement; Random variables; Sampling methods; Sparse matrices; Compressed sensing; Fano's inequality; Shannon theory; compressive sampling; estimation error; joint typicality; linear regime; sublinear regime; support recovery
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/TIT.2009.2034796
  • Filename
    5361481