  • DocumentCode
    112069
  • Title
    Hypothesis Testing in High-Dimensional Regression Under the Gaussian Random Design Model: Asymptotic Theory
  • Author
    Javanmard, Adel ; Montanari, Andrea
  • Author_Institution
    Dept. of Electr. Eng., Stanford Univ., Stanford, CA, USA
  • Volume
    60
  • Issue
    10
  • fYear
    2014
  • fDate
    Oct. 2014
  • Firstpage
    6522
  • Lastpage
    6554
  • Abstract
    We consider linear regression in the high-dimensional regime where the number of observations n is smaller than the number of parameters p. A very successful approach in this setting uses ℓ1-penalized least squares (also known as the Lasso) to search for a subset of s0 < n parameters that best explain the data, while setting the other parameters to zero. A considerable amount of work has been devoted to characterizing the estimation and model selection problems within this approach. In this paper, we consider instead the fundamental, but far less understood, question of statistical significance. More precisely, we address the problem of computing p-values for single regression coefficients. On one hand, we develop a general upper bound on the minimax power of tests with a given significance level. We show that rigorous guarantees for earlier methods do not allow one to achieve this bound, except in special cases. On the other hand, we prove that this upper bound is (nearly) achievable through a practical procedure in the case of random design matrices with independent entries. Our approach is based on a debiasing of the Lasso estimator. The analysis builds on a rigorous characterization of the asymptotic distribution of the Lasso estimator and its debiased version. Our result holds for optimal sample size, i.e., when n is at least on the order of s0 log(p/s0). We generalize our approach to random design matrices with independent identically distributed Gaussian rows xi ~ N(0, Σ). In this case, we prove that a similar distributional characterization (termed the standard distributional limit) holds for n much larger than s0 (log p)^2. Our analysis assumes Σ is known. To cope with unknown Σ, we suggest a plug-in estimator for sparse covariances Σ and validate the method through numerical simulations. Finally, we show that for optimal sample size, n being at least of order s0 log(p/s0), the standard distributional limit for general Gaussian designs can be derived from the replica heuristics in statistical physics. This derivation suggests a stronger conjecture than the result we prove, and near-optimality of the statistical power for a large class of Gaussian designs.
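    The debiasing step summarized in the abstract lends itself to a short prototype. The sketch below (Python with numpy, scipy, and scikit-learn; the function name debiased_lasso_pvalues, the penalty choice, and the toy data are illustrative assumptions, not the authors' code) fits the Lasso, adds the correction (1/n) Σ^{-1} X^T (y - X θ̂) using the known covariance Σ, and forms approximate two-sided p-values from a Gaussian limit; it ignores the effective-noise correction in the paper's exact asymptotic variance.

    # Illustrative sketch of the debiasing idea; not the authors' procedure.
    import numpy as np
    from scipy.stats import norm
    from sklearn.linear_model import Lasso

    def debiased_lasso_pvalues(X, y, Sigma, sigma, lam):
        """Debias a Lasso fit under a Gaussian design with known covariance Sigma.

        X: n x p design with i.i.d. rows ~ N(0, Sigma); sigma: noise std; lam: Lasso penalty.
        """
        n, p = X.shape
        theta_hat = Lasso(alpha=lam, fit_intercept=False).fit(X, y).coef_
        M = np.linalg.inv(Sigma)                                   # known precision matrix
        theta_d = theta_hat + M @ X.T @ (y - X @ theta_hat) / n    # debiased estimator
        # Approximate per-coordinate standard errors from the Gaussian limit
        # (the paper's exact asymptotic variance includes an effective-noise term).
        se = sigma * np.sqrt(np.diag(M) / n)
        pvals = 2.0 * (1.0 - norm.cdf(np.abs(theta_d) / se))       # test H0: theta_i = 0
        return theta_d, pvals

    # Toy usage with a synthetic sparse signal (illustrative parameter choices).
    rng = np.random.default_rng(0)
    n, p, s0, sigma = 400, 800, 10, 1.0
    Sigma = np.eye(p)
    X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
    theta0 = np.zeros(p)
    theta0[:s0] = 1.0
    y = X @ theta0 + sigma * rng.standard_normal(n)
    lam = 2.0 * sigma * np.sqrt(np.log(p) / n)
    theta_d, pvals = debiased_lasso_pvalues(X, y, Sigma, sigma, lam)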
  • Keywords
    Gaussian distribution; covariance matrices; minimax techniques; random processes; regression analysis; sparse matrices; statistical testing; ℓ1-penalized least squares; Gaussian design; Gaussian random design model; Gaussian row distribution; Lasso estimator debiasing; asymptotic distribution; asymptotic theory; hypothesis testing; linear regression; minimax power; model selection problem; numerical simulations; optimal sample size; p-value computation; plug-in estimator; random design matrices; regression coefficient; replica heuristics; sparse covariance; standard distributional limit; statistical physics; upper bound; Covariance matrices; Estimation; Linear regression; Noise; Standards; Testing; Upper bound; High-dimensional regression; Lasso; hypothesis testing; p-value; uncertainty assessment
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/TIT.2014.2343629
  • Filename
    6866880