DocumentCode :
1340219
Title :
Minimax Rates of Estimation for High-Dimensional Linear Regression Over ℓq-Balls
Author :
Raskutti, Garvesh ; Wainwright, Martin J. ; Yu, Bin
Author_Institution :
Dept. of Stat., Univ. of California at Berkeley, Berkeley, CA, USA
Volume :
57
Issue :
10
fYear :
2011
Firstpage :
6976
Lastpage :
6994
Abstract :
Consider the high-dimensional linear regression model y = Xβ* + w, where y ∈ ℝ^n is an observation vector, X ∈ ℝ^{n×d} is a design matrix with d > n, β* ∈ ℝ^d is an unknown regression vector, and w ~ N(0, σ²I) is additive Gaussian noise. This paper studies the minimax rates of convergence for estimating β* in either ℓ2-loss or ℓ2-prediction loss, assuming that β* belongs to an ℓq-ball B_q(R_q) for some q ∈ [0,1]. It is shown that under suitable regularity conditions on the design matrix X, the minimax optimal rate in both ℓ2-loss and ℓ2-prediction loss scales as Θ(R_q (log d / n)^{1−q/2}). The analysis in this paper reveals that conditions on the design matrix X enter into the rates for ℓ2-error and ℓ2-prediction error in complementary ways in the upper and lower bounds. Our proofs of the lower bounds are information-theoretic in nature, based on Fano's inequality and results on the metric entropy of the balls B_q(R_q), whereas our proofs of the upper bounds are constructive, involving direct analysis of least squares over ℓq-balls. For the special case q = 0, corresponding to models with an exact sparsity constraint, our results show that although computationally efficient ℓ1-based methods can achieve the minimax rates up to constant factors, they require slightly stronger assumptions on the design matrix X than optimal algorithms involving least squares over the ℓ0-ball.
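For reference, the rate quoted above can be written out as a display; this is a sketch of the statement under the standard conventions for this line of work (squared ℓ2-loss and the usual ℓq-ball definition), which are assumed here rather than copied verbatim from the record:

\[
\min_{\widehat{\beta}} \; \max_{\beta^* \in B_q(R_q)} \; \mathbb{E}\,\bigl\|\widehat{\beta} - \beta^*\bigr\|_2^2
\;=\; \Theta\!\left( R_q \left( \frac{\log d}{n} \right)^{1 - q/2} \right),
\qquad
B_q(R_q) \;=\; \Bigl\{ \beta \in \mathbb{R}^d : \textstyle\sum_{j=1}^{d} |\beta_j|^q \le R_q \Bigr\}.
\]

For q = 0, R_0 plays the role of the sparsity level s (number of nonzero coordinates), and the display reduces to the familiar s log d / n scaling for exactly sparse models mentioned at the end of the abstract.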
Keywords :
Gaussian noise; entropy; matrix algebra; regression analysis; signal reconstruction; Fano inequality; additive Gaussian noise; compressed sensing; high-dimensional linear regression; least-squares method; ℓq-balls; matrix; metric entropy; optimal algorithms; sparsity constraint; Eigenvalues and eigenfunctions; Linear regression; Measurement; Noise; Null space; Upper bound; Vectors; Compressed sensing; minimax techniques; regression analysis;
fLanguage :
English
Journal_Title :
Information Theory, IEEE Transactions on
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.2011.2165799
Filename :
6034739