The Lasso with general Gaussian designs with applications to hypothesis testing

07/27/2020
by Michael Celentano, et al.

The Lasso is a method for high-dimensional regression, now commonly used when the number of covariates p is of the same order as, or larger than, the number of observations n. Classical asymptotic normality theory does not apply to this model for two fundamental reasons: (1) the regularized risk is non-smooth; (2) the distance between the estimator θ̂ and the true parameter vector θ^⋆ cannot be neglected. As a consequence, the standard perturbative arguments that are the traditional basis for asymptotic normality fail. On the other hand, the Lasso estimator can be precisely characterized in the regime in which both n and p are large and n/p is of order one. This characterization was first obtained for standard Gaussian designs, and subsequently generalized to other high-dimensional estimators. Here we extend the same characterization to correlated Gaussian designs with non-singular covariance structure. The characterization is expressed in terms of a simpler "fixed design" model. We establish non-asymptotic bounds on the distance between the distributions of various quantities in the two models, which hold uniformly over signals θ^⋆ in a suitable sparsity class and over values of the regularization parameter. As applications, we study the distribution of the debiased Lasso and show that a degrees-of-freedom correction is necessary for computing valid confidence intervals.
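To make the last point concrete, the sketch below illustrates a degrees-of-freedom adjusted debiased Lasso of the kind the abstract refers to. It is not code from the paper: it assumes the design covariance Σ is known and non-singular, uses scikit-learn's Lasso solver, and the function and variable names are hypothetical. The adjustment consists of rescaling the one-step correction by 1/(n − ‖θ̂‖₀) instead of the usual 1/n.

```python
import numpy as np
from sklearn.linear_model import Lasso

def debiased_lasso_dof(X, y, Sigma_inv, alpha):
    """Illustrative degrees-of-freedom adjusted debiased Lasso.

    Assumes the precision matrix Sigma_inv of the Gaussian design is known.
    Returns the Lasso estimate and its debiased version, where the residual
    correction is rescaled by 1 / (n - ||theta_hat||_0) rather than 1 / n.
    """
    n, p = X.shape
    theta_hat = Lasso(alpha=alpha, fit_intercept=False).fit(X, y).coef_
    df = np.count_nonzero(theta_hat)        # degrees of freedom of the Lasso fit
    residual = y - X @ theta_hat
    # One-step correction with the degrees-of-freedom adjustment
    theta_debiased = theta_hat + Sigma_inv @ X.T @ residual / (n - df)
    return theta_hat, theta_debiased

# Example usage on synthetic correlated Gaussian data (illustrative only)
rng = np.random.default_rng(0)
n, p = 300, 600
Sigma = 0.5 ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))  # AR(1)-type covariance
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
theta_star = np.zeros(p)
theta_star[:20] = 1.0
y = X @ theta_star + rng.normal(size=n)
theta_hat, theta_debiased = debiased_lasso_dof(X, y, np.linalg.inv(Sigma), alpha=0.1)
```

With the plain 1/n rescaling, confidence intervals built from the debiased estimate can be miscalibrated in the proportional regime n/p = O(1); the factor 1/(n − ‖θ̂‖₀) is the correction whose necessity the paper establishes.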
