DocumentCode :
3122487
Title :
Applications of the Shannon-Hartley theorem to data streams and sparse recovery
Author :
Price, Eric ; Woodruff, David P.
fYear :
2012
fDate :
1-6 July 2012
Firstpage :
2446
Lastpage :
2450
Abstract :
The Shannon-Hartley theorem bounds the maximum rate at which information can be transmitted over a Gaussian channel in terms of the ratio of signal power to noise power. We show two unexpected applications of this theorem in computer science: (1) we give a much simpler proof of an Ω(n^(1-2/p)) bound on the number of linear measurements required to approximate the p-th frequency moment in a data stream, and show a new distribution which is hard for this problem; (2) we show that the number of measurements needed to solve the k-sparse recovery problem on an n-dimensional vector x with the C-approximate ℓ2/ℓ2 guarantee is Ω(k log(n/k)/log C). We complement this result with an almost matching O(k log* k log(n/k)/log C) upper bound.
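As background for the abstract above, the Shannon-Hartley theorem states that a Gaussian channel of bandwidth B hertz and signal-to-noise ratio S/N has capacity C = B log2(1 + S/N) bits per second. A minimal sketch of this formula (the function name and example figures are illustrative, not from the paper):

```python
import math

def shannon_hartley_capacity(bandwidth_hz: float, snr: float) -> float:
    """Channel capacity in bits/second for an AWGN channel.

    bandwidth_hz: channel bandwidth B in hertz.
    snr: linear (not dB) signal-to-noise power ratio S/N.
    """
    return bandwidth_hz * math.log2(1.0 + snr)

# Illustrative example: a 3 kHz channel at 30 dB SNR (linear ratio 1000)
capacity = shannon_hartley_capacity(3000.0, 1000.0)
print(f"{capacity:.0f} bits/s")  # roughly 30 kbit/s
```

Note that capacity grows only logarithmically in the SNR; this logarithmic dependence is what drives the log C factors in the sparse-recovery bounds quoted in the abstract.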
Keywords :
Gaussian channels; compressed sensing; computational complexity; information theory; signal restoration; Gaussian channel; Shannon-Hartley theorem; data stream; sparse recovery problem; Approximation methods; Complexity theory; Frequency estimation; Noise; Random variables; Upper bound; Vectors;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Information Theory Proceedings (ISIT), 2012 IEEE International Symposium on
Conference_Location :
Cambridge, MA
ISSN :
2157-8095
Print_ISBN :
978-1-4673-2580-6
Electronic_ISBN :
2157-8095
Type :
conf
DOI :
10.1109/ISIT.2012.6283954
Filename :
6283954