Title :
Approximate Sparsity Pattern Recovery: Information-Theoretic Lower Bounds
Author :
Reeves, G.; Gastpar, Michael C.
Author_Institution :
Dept. of Electr. Eng. & Comput. Sci., Univ. of California, Berkeley, Berkeley, CA, USA
Abstract :
Recovery of the sparsity pattern (or support) of an unknown sparse vector from a small number of noisy linear measurements is an important problem in compressed sensing. In this paper, the high-dimensional setting is considered. It is shown that if the measurement rate and per-sample signal-to-noise ratio (SNR) are finite constants independent of the length of the vector, then the optimal sparsity pattern estimate will have a constant fraction of errors. Lower bounds on the measurement rate needed to attain a desired fraction of errors are given in terms of the SNR and various key parameters of the unknown vector. The tightness of the bounds in a scaling sense, as a function of the SNR and the fraction of errors, is established by comparison with existing achievable bounds. Near optimality is shown for a wide variety of practically motivated signal models.
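To make the problem setting concrete, the following is a minimal numpy sketch of the measurement model discussed in the abstract (noisy linear measurements y = Ax + w of a sparse vector x) together with the fraction-of-support-errors metric. The sparsity level, SNR normalization, and the use of orthogonal matching pursuit as the estimator are illustrative assumptions, not the paper's method; the paper studies information-theoretic limits of the optimal estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m, k = 256, 100, 8          # vector length, number of measurements, sparsity (illustrative)
support = rng.choice(n, size=k, replace=False)
x = np.zeros(n)
x[support] = rng.choice([-1.0, 1.0], size=k)   # unit-magnitude nonzeros (an assumption)

# i.i.d. Gaussian measurement matrix, columns scaled so each measurement
# has signal power roughly k/m; the SNR normalization here is illustrative.
A = rng.standard_normal((m, n)) / np.sqrt(m)
snr = 10.0
noise_std = np.sqrt((k / m) / snr)
y = A @ x + noise_std * rng.standard_normal(m)

# Orthogonal matching pursuit: greedily select the column most correlated
# with the residual, then refit by least squares on the selected columns.
residual, est_support = y.copy(), []
for _ in range(k):
    j = int(np.argmax(np.abs(A.T @ residual)))
    est_support.append(j)
    sub = A[:, est_support]
    coef, *_ = np.linalg.lstsq(sub, y, rcond=None)
    residual = y - sub @ coef

# Fraction of support errors: size of the symmetric difference between the
# true and estimated supports, normalized by 2k.
err = len(set(support) ^ set(est_support)) / (2 * k)
print(f"fraction of support errors: {err:.3f}")
```

With the measurement rate m/n and per-sample SNR held fixed as n grows, the paper's lower bounds imply that this error fraction cannot be driven to zero by any estimator, which is the regime the abstract describes.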
Keywords :
approximation theory; information theory; SNR; approximate sparsity pattern recovery; compressed sensing; constant fraction; information theoretic lower bounds; noisy linear measurements; optimal sparsity pattern estimation; signal-to-noise ratio; unknown sparse vector; Distortion measurement; Entropy; Noise measurement; Rate-distortion; Signal to noise ratio; Vectors; Compressed sensing; information-theoretic bounds; random matrix theory; sparsity; support recovery;
Journal_Title :
IEEE Transactions on Information Theory
DOI :
10.1109/TIT.2013.2253852