Title :
Nearly optimal sample size in hypothesis testing for high-dimensional regression
Author :
Javanmard, Adel ; Montanari, Andrea
Author_Institution :
Dept. of Electr. Eng., Stanford Univ., Stanford, CA, USA
Abstract :
We consider the problem of fitting the parameters of a high-dimensional linear regression model. In the regime where the number of parameters p is comparable to or exceeds the sample size n, a successful approach uses an ℓ1-penalized least squares estimator, known as the Lasso. Unfortunately, unlike for linear estimators (e.g. ordinary least squares), no well-established method exists to compute confidence intervals or p-values on the basis of the Lasso estimator. Very recently, a line of work [8], [7], [13] has addressed this problem by constructing a debiased version of the Lasso estimator. We propose a special debiasing method that is well suited for random designs with sparse inverse covariance. Our approach improves over the state of the art in that it yields nearly optimal average testing power if the sample size n asymptotically dominates s0 (log p)^2, with s0 being the sparsity level (number of non-zero coefficients). Earlier work achieved similar performance only for much larger sample sizes, namely requiring n to asymptotically dominate (s0 log p)^2. We evaluate our method on synthetic data and compare it with earlier proposals.
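The debiasing construction referenced in the abstract takes a Lasso fit θ̂ and adds a correction term M Xᵀ(y − Xθ̂)/n, where M approximates the inverse of the sample covariance Σ̂ = XᵀX/n; the debiased coordinates are then approximately Gaussian, which yields p-values. The sketch below is a generic, numpy-only illustration of this idea, not the paper's specific construction: it uses a plain coordinate-descent Lasso and a crude ridge-regularized inverse for M (the paper instead builds M to exploit a sparse inverse covariance), and all function names and parameter choices here are illustrative assumptions.

```python
import numpy as np
from math import erfc

def lasso_cd(X, y, lam, n_iter=200):
    """Minimize (1/2n)||y - X t||^2 + lam ||t||_1 by coordinate descent."""
    n, p = X.shape
    t = np.zeros(p)
    col_norm = (X ** 2).sum(axis=0) / n   # per-coordinate curvature
    r = y.copy()                          # residual y - X t (t starts at 0)
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * t[j]           # remove coordinate j's contribution
            rho = X[:, j] @ r / n
            t[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_norm[j]
            r -= X[:, j] * t[j]           # add back updated contribution
    return t

def debiased_lasso(X, y, lam, ridge=0.05):
    """Debiased Lasso with a crude approximate inverse covariance M.

    Returns the debiased estimate and two-sided p-values for H0: theta_i = 0.
    The ridge-regularized inverse below is an illustrative stand-in for the
    paper's sparsity-adapted choice of M.
    """
    n, p = X.shape
    theta = lasso_cd(X, y, lam)
    Sigma = X.T @ X / n
    M = np.linalg.inv(Sigma + ridge * np.eye(p))        # approximate inverse
    theta_d = theta + M @ X.T @ (y - X @ theta) / n     # debiasing step
    resid = y - X @ theta
    sigma2 = resid @ resid / n                          # noise-level estimate
    var = sigma2 * np.diag(M @ Sigma @ M.T) / n         # asymptotic variances
    z = theta_d / np.sqrt(var)
    pvals = np.array([erfc(abs(zi) / np.sqrt(2)) for zi in z])  # N(0,1) tails
    return theta_d, pvals
```

On synthetic data in the regime the abstract describes (s0 non-zero coefficients, Gaussian design), the p-values of the truly non-zero coordinates are driven toward zero while null coordinates remain approximately uniform, which is what enables valid hypothesis testing from the Lasso fit.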
Keywords :
covariance analysis; estimation theory; regression analysis; ℓ1-penalized least squares estimator; confidence intervals; debiasing method; high-dimensional linear regression model; high-dimensional regression; hypothesis testing; lasso estimator; nearly optimal sample size; p-values; random designs; sparse inverse covariance; testing;
Conference_Titel :
Communication, Control, and Computing (Allerton), 2013 51st Annual Allerton Conference on
Conference_Location :
Monticello, IL
Print_ISBN :
978-1-4799-3409-6
DOI :
10.1109/Allerton.2013.6736695