Non-Asymptotic Bounds for the ℓ_∞ Estimator in Linear Regression with Uniform Noise

08/17/2021
by Yufei Yi, et al.

The Chebyshev or ℓ_∞ estimator is an unconventional alternative to ordinary least squares for solving linear regression. It is defined as the minimizer of the ℓ_∞ objective, β̂ := argmin_β ‖Y - 𝐗β‖_∞. The asymptotic distribution of the Chebyshev estimator with a fixed number of covariates was recently studied (Knight, 2020), yet finite-sample guarantees and generalizations to high-dimensional settings remain open. In this paper, we develop non-asymptotic upper bounds on the estimation error ‖β̂ - β^*‖_2 of the Chebyshev estimator β̂ in a regression setting with uniformly distributed noise ε_i ∼ U([-a, a]), where a is either known or unknown. Under relatively mild assumptions on the (random) design matrix 𝐗, we bound the error rate by C_p/n with high probability, for some constant C_p depending on the dimension p and the law of the design. Furthermore, we illustrate that there exist designs for which the Chebyshev estimator is (nearly) minimax optimal. In addition, we show that "Chebyshev's LASSO" has advantages over the regular LASSO in high-dimensional settings, provided that the noise is uniform. Specifically, we argue that it achieves a much faster rate of estimation under certain assumptions on the growth rate of the sparsity level and the ambient dimension with respect to the sample size.
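Because the ℓ_∞ objective is piecewise linear, both the Chebyshev estimator and its ℓ_1-penalized "Chebyshev's LASSO" variant can be computed exactly as linear programs. The sketch below is my own illustration of this reformulation (it is not code from the paper); the function names and the use of scipy.optimize.linprog are assumptions made for the example.

```python
# Sketch: compute the Chebyshev (l_inf) estimator and an l_1-penalized variant
# by rewriting each problem as a linear program and solving with scipy's linprog.
import numpy as np
from scipy.optimize import linprog


def chebyshev_estimator(X, y):
    """argmin_beta ||y - X beta||_inf, as an LP over z = (beta, t):
    minimize t subject to -t <= y_i - x_i' beta <= t for all i."""
    n, p = X.shape
    c = np.zeros(p + 1)
    c[-1] = 1.0                      # objective: minimize t
    ones = np.ones((n, 1))
    A_ub = np.vstack([np.hstack([X, -ones]),     #  X beta - t <=  y
                      np.hstack([-X, -ones])])   # -X beta - t <= -y
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * p + [(0, None)]    # beta free, t >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:p]


def chebyshev_lasso(X, y, lam):
    """Sketch of a 'Chebyshev LASSO' objective:
    argmin_beta ||y - X beta||_inf + lam * ||beta||_1,
    as an LP over z = (beta, u, t) with u_j >= |beta_j|."""
    n, p = X.shape
    c = np.concatenate([np.zeros(p), lam * np.ones(p), [1.0]])
    ones = np.ones((n, 1))
    Zn = np.zeros((n, p))
    I = np.eye(p)
    zp = np.zeros((p, 1))
    A_ub = np.vstack([
        np.hstack([X, Zn, -ones]),    #  X beta - t <=  y
        np.hstack([-X, Zn, -ones]),   # -X beta - t <= -y
        np.hstack([I, -I, zp]),       #  beta_j - u_j <= 0
        np.hstack([-I, -I, zp]),      # -beta_j - u_j <= 0
    ])
    b_ub = np.concatenate([y, -y, np.zeros(2 * p)])
    bounds = [(None, None)] * p + [(0, None)] * (p + 1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:p]


# Toy usage with uniform noise eps_i ~ U([-a, a]).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p, a = 200, 5, 0.5
    X = rng.standard_normal((n, p))
    beta_star = rng.standard_normal(p)
    y = X @ beta_star + rng.uniform(-a, a, size=n)
    print(np.linalg.norm(chebyshev_estimator(X, y) - beta_star))
```

In this toy setup the choice of design, dimensions, and noise level is arbitrary; the paper's theory concerns how ‖β̂ - β^*‖_2 scales with n, p, and the design's law, not any particular instance.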
