  • DocumentCode
    60421
  • Title
    On the Theorem of Uniform Recovery of Random Sampling Matrices
  • Author
    Andersson, Jon; Strömberg, Jan-Olov
  • Author_Institution
    Department of Mathematics, Royal Institute of Technology, Stockholm, Sweden
  • Volume
    60
  • Issue
    3
  • fYear
    2014
  • fDate
    March 2014
  • Firstpage
    1700
  • Lastpage
    1710
  • Abstract
    We consider two theorems from the theory of compressive sensing, mainly a theorem concerning uniform recovery of random sampling matrices, where the number of samples needed to recover an s-sparse signal from linear measurements (with high probability) is known to be m ≳ s(ln s)^3 ln N. We present new and improved constants together with what we consider to be a more explicit proof, one that also allows for a slightly larger class of m × N matrices by considering what is called effective sparsity. We also present a condition on the so-called restricted isometry constants, δ_s, ensuring sparse recovery via ℓ^1-minimization. We show that δ_{2s} < 4/√41 is sufficient and that this can be improved further to almost allow for a sufficient condition of the type δ_{2s} < 2/3.
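    As a rough illustration of the ℓ^1-minimization recovery discussed in the abstract, the following is a minimal sketch and is not taken from the paper: it recovers an s-sparse signal from m random linear measurements by solving the basis-pursuit linear program with scipy. A Gaussian measurement matrix is used purely for simplicity, whereas the paper treats bounded orthonormal systems such as randomly sampled rows of a Fourier matrix, and the sizes N, s, m below are arbitrary demo choices rather than values from the paper.

        import numpy as np
        from scipy.optimize import linprog

        rng = np.random.default_rng(0)
        N, s, m = 128, 5, 40  # ambient dimension, sparsity, number of samples (demo values)

        # s-sparse ground-truth signal
        x_true = np.zeros(N)
        support = rng.choice(N, size=s, replace=False)
        x_true[support] = rng.standard_normal(s)

        # Random measurement matrix (Gaussian for simplicity) and samples b = A x_true
        A = rng.standard_normal((m, N)) / np.sqrt(m)
        b = A @ x_true

        # Basis pursuit: min ||x||_1 subject to A x = b,
        # written as a linear program in (u, v) >= 0 with x = u - v.
        c = np.ones(2 * N)
        A_eq = np.hstack([A, -A])
        res = linprog(c, A_eq=A_eq, b_eq=b,
                      bounds=[(0, None)] * (2 * N), method="highs")
        x_hat = res.x[:N] - res.x[N:]
        print("recovery error:", np.linalg.norm(x_hat - x_true))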
  • Keywords
    compressed sensing; matrix algebra; compressive sensing; effective sparsity; explicit proof; linear measurements; random sampling matrices; restricted isometry constants; s-sparse signal; sparse recovery; uniform recovery; Linear matrix inequalities; Materials; Null space; Random variables; Sparse matrices; Vectors; $\ell^{1}$-minimization; Bounded orthogonal systems; restricted isometry property
  • fLanguage
    English
  • Journal_Title
    Information Theory, IEEE Transactions on
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/TIT.2014.2300092
  • Filename
    6712132