DocumentCode :
2024397
Title :
Information-theoretic bounds on sparsity recovery in the high-dimensional and noisy setting
Author :
Wainwright, M.
Author_Institution :
UC Berkeley, Berkeley
fYear :
2007
fDate :
24-29 June 2007
Firstpage :
961
Lastpage :
965
Abstract :
The problem of recovering the sparsity pattern of a fixed but unknown vector β* ∈ ℝ^p based on a set of n noisy observations arises in a variety of settings, including subset selection in regression, graphical model selection, signal denoising, compressive sensing, and constructive approximation. Of interest are conditions on the model dimension p, the sparsity index s (the number of non-zero entries in β*), and the number of observations n that are necessary and/or sufficient to ensure asymptotically perfect recovery of the sparsity pattern. This paper focuses on the information-theoretic limits of sparsity recovery: in particular, for a noisy linear observation model based on measurement vectors drawn from the standard Gaussian ensemble, we derive both a set of sufficient conditions for asymptotically perfect recovery using the optimal decoder and a set of necessary conditions that any decoder must satisfy for perfect recovery. This analysis of optimal decoding limits complements our previous work on thresholds for the behavior of ℓ1-constrained quadratic programming for Gaussian measurement ensembles.
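The abstract's observation model and the notion of an optimal decoder can be sketched as follows. This is an illustrative simulation assuming the standard setup y = Xβ* + w with i.i.d. N(0,1) measurement vectors; the exhaustive search over supports stands in for the optimal decoder and is feasible only for tiny p (all parameter values below are arbitrary choices, not from the paper).

```python
# Illustrative sketch of the noisy linear observation model studied here:
# y = X beta* + w, where X (n x p) has i.i.d. N(0,1) entries (the standard
# Gaussian ensemble), beta* is s-sparse, and w is Gaussian noise. The
# "optimal decoder" is approximated by exhaustive search over all size-s
# supports, picking the one with the smallest least-squares residual.
from itertools import combinations

import numpy as np

rng = np.random.default_rng(0)
p, s, n, sigma = 8, 2, 40, 0.1  # dimension, sparsity, observations, noise level

# Build an s-sparse beta* (non-zero entries set to 1 for illustration).
support = rng.choice(p, size=s, replace=False)
beta_star = np.zeros(p)
beta_star[support] = 1.0

# Noisy linear observations from the standard Gaussian ensemble.
X = rng.standard_normal((n, p))
y = X @ beta_star + sigma * rng.standard_normal(n)

def residual(S):
    """Least-squares residual of y projected onto the columns indexed by S."""
    coef, *_ = np.linalg.lstsq(X[:, list(S)], y, rcond=None)
    return np.linalg.norm(y - X[:, list(S)] @ coef)

# Exhaustive (brute-force) decoding over all size-s supports.
best_support = min(combinations(range(p), s), key=residual)
recovered = set(best_support) == set(support.tolist())
print("recovered sparsity pattern:", recovered)
```

With n large relative to s log(p − s), as in this toy configuration, the exhaustive decoder recovers the true support with high probability; the paper's results characterize exactly how n must scale with (p, s) for this to remain possible.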
Keywords :
Gaussian noise; decoding; information-theoretic bounds; measurement vectors; noisy linear observation model; optimal decoding; sparsity recovery; standard Gaussian ensemble; Graphical models; Information analysis; Measurement standards; Pollution measurement; Quadratic programming; Signal denoising; Statistics; Sufficient conditions;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2007 IEEE International Symposium on Information Theory (ISIT 2007)
Conference_Location :
Nice
Print_ISBN :
978-1-4244-1397-3
Type :
conf
DOI :
10.1109/ISIT.2007.4557348
Filename :
4557348