Finite sample rates for logistic regression with small noise or few samples

05/25/2023
by Felix Kuchelmeister et al.

The logistic regression estimator is known to inflate the magnitude of its coefficients if the sample size n is small, the dimension p is (moderately) large, or the signal-to-noise ratio 1/σ is large (so that the probabilities of observing a label are close to 0 or 1). With this in mind, we study the logistic regression estimator with p ≪ n/log n, assuming Gaussian covariates and labels generated by the Gaussian link function, with a mild optimization constraint on the estimator's length to ensure existence. We provide finite sample guarantees for its direction, which serves as a classifier, and for its Euclidean norm, which is an estimator of the signal-to-noise ratio. We distinguish between two regimes. In the low-noise/small-sample regime (nσ ≲ p log n), we show that the estimator's direction (and consequently the classification error) achieves the rate (p log n)/n, as if the problem were noiseless. In this case, the norm of the estimator is at least of order n/(p log n). If instead nσ ≳ p log n, the estimator's direction achieves the rate √(σ p log n/n), whereas its norm converges to the true norm at the rate √(p log n/(nσ^3)). As a corollary, the data are not linearly separable with high probability in this regime. The logistic regression estimator allows one to conclude which regime occurs with high probability. Therefore, inference for logistic regression is possible in the regime nσ ≳ p log n. In either case, logistic regression provides a competitive classifier.
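The two regimes can be illustrated with a minimal simulation sketch, not taken from the paper: with Gaussian covariates and labels drawn through the Gaussian (probit) link, the norm of a length-constrained logistic regression estimate inflates far beyond the true norm 1/σ when nσ ≲ p log n, but stays close to it when nσ ≳ p log n. All concrete choices below (n, p, the two values of σ, the constraint radius B, and the projected gradient descent optimizer) are illustrative assumptions, not the paper's construction.

```python
import numpy as np
from math import erf

rng = np.random.default_rng(0)

def probit_data(n, p, sigma, rng):
    # Gaussian covariates; labels from the Gaussian (probit) link,
    # so that ||beta*|| = 1/sigma is the signal-to-noise ratio.
    beta = rng.standard_normal(p)
    beta *= (1.0 / sigma) / np.linalg.norm(beta)
    X = rng.standard_normal((n, p))
    probs = 0.5 * (1.0 + np.array([erf(t) for t in (X @ beta) / np.sqrt(2.0)]))
    y = np.where(rng.random(n) < probs, 1.0, -1.0)
    return X, y, beta

def fit_logistic(X, y, B=30.0, lr=1.0, iters=5000):
    # Logistic MLE via projected gradient descent; the norm constraint
    # ||beta|| <= B plays the role of the length constraint that
    # ensures the estimator exists even on separable data.
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(iters):
        m = np.clip(y * (X @ beta), -700.0, 700.0)   # margins y_i <x_i, beta>
        grad = -(X.T @ (y / (1.0 + np.exp(m)))) / n  # grad of mean log-loss
        beta -= lr * grad
        nb = np.linalg.norm(beta)
        if nb > B:
            beta *= B / nb                           # project back onto the ball
    return beta

n, p = 400, 4
# Low-noise/small-sample regime: n*sigma = 20, below p*log(n) ~ 24.
Xl, yl, _ = probit_data(n, p, sigma=0.05, rng=rng)
# Higher-noise regime: n*sigma = 800 >> p*log(n); the true norm is 0.5.
Xh, yh, _ = probit_data(n, p, sigma=2.0, rng=rng)

norm_low = np.linalg.norm(fit_logistic(Xl, yl))
norm_high = np.linalg.norm(fit_logistic(Xh, yh))
print(f"estimated norm, low-noise regime:    {norm_low:.2f}")
print(f"estimated norm, higher-noise regime: {norm_high:.2f}")
```

In the low-noise run the fitted norm blows up toward the constraint radius, consistent with the lower bound of order n/(p log n); in the higher-noise run it stays near the true norm 1/σ = 0.5 (up to the usual logistic-versus-probit scaling of the coefficients).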
