Near Optimal Heteroscedastic Regression with Symbiotic Learning

06/25/2023 ∙ by Dheeraj Baby, et al.

We consider the problem of heteroscedastic linear regression, where, given n samples (𝐱_i, y_i) drawn from y_i = ⟨𝐰^*, 𝐱_i⟩ + ϵ_i · ⟨𝐟^*, 𝐱_i⟩ with 𝐱_i ∼ N(0, 𝐈) and ϵ_i ∼ N(0, 1), we aim to estimate 𝐰^*. Beyond classical applications of such models in statistics, econometrics, and time series analysis, heteroscedastic regression is particularly relevant in machine learning when data is collected from multiple sources of varying but a priori unknown quality. Our work shows that we can estimate 𝐰^* in squared norm up to an error of Õ(‖𝐟^*‖² · (1/n + (d/n)²)) and prove a matching lower bound (up to log factors). This represents a substantial improvement over the previous best known upper bound of Õ(‖𝐟^*‖² · d/n). Our algorithm is an alternating minimization procedure with two key subroutines: (1) an adaptation of the classical weighted least squares heuristic to estimate 𝐰^*, for which we provide the first non-asymptotic guarantee; (2) a nonconvex pseudogradient descent procedure for estimating 𝐟^*, inspired by phase retrieval. As corollaries, we obtain fast non-asymptotic rates for two important problems, linear regression with multiplicative noise and phase retrieval with multiplicative noise, both of which are of independent interest. Beyond this, the proof of our lower bound, which involves a novel adaptation of Le Cam's method for handling infinite mutual information quantities (thereby preventing a direct application of standard techniques such as Fano's method), could also be of broader interest for establishing lower bounds for other heteroscedastic or heavy-tailed statistical problems.
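The alternating scheme described in the abstract is straightforward to prototype. Below is a minimal NumPy sketch, not the paper's implementation: the function name heteroscedastic_alt_min, the ordinary-least-squares warm start, the phase-retrieval-style spectral initialization, the normalized gradient steps, the iteration counts, and the variance floor eps are all illustrative assumptions.

```python
import numpy as np

def heteroscedastic_alt_min(X, y, n_rounds=10, pr_steps=100, lr=0.05, eps=1e-6):
    """Minimal sketch of alternating minimization for the model
    y_i = <w*, x_i> + eps_i * <f*, x_i>.

    Illustrative only: the warm start, initialization, step size `lr`,
    iteration counts, and variance floor `eps` are assumptions, not the
    paper's tuned choices. Note that f* is identifiable only up to sign.
    """
    n, d = X.shape
    # Ordinary least squares warm start for w (an assumed initialization).
    w = np.linalg.lstsq(X, y, rcond=None)[0]

    # Spectral initialization for f, as is standard in phase retrieval:
    # top eigenpair of (1/n) sum_i r_i^2 x_i x_i^T, whose expectation is
    # ||f*||^2 I + 2 f* f*^T when w = w* (top eigenvalue 3 ||f*||^2).
    r2 = (y - X @ w) ** 2
    M = (X * r2[:, None]).T @ X / n
    vals, vecs = np.linalg.eigh(M)
    f = np.sqrt(max(vals[-1], 0.0) / 3.0) * vecs[:, -1]

    for _ in range(n_rounds):
        # Subroutine 2: pseudogradient descent for f on the
        # phase-retrieval-style loss (1/n) sum_i (r_i^2 - <f, x_i>^2)^2;
        # E[r_i^2 | x_i] = <f*, x_i>^2 when the residuals are exact.
        r2 = (y - X @ w) ** 2
        for _ in range(pr_steps):
            p = X @ f
            grad = (4.0 / n) * (((p ** 2 - r2) * p) @ X)
            f = f - lr * grad / max(1.0, np.linalg.norm(grad))  # normalized step

        # Subroutine 1: weighted least squares for w with per-sample
        # weights 1 / <f, x_i>^2 (the classical heteroscedastic heuristic);
        # eps floors the estimated variances for numerical stability.
        s2 = (X @ f) ** 2 + eps
        Xs = X / s2[:, None]
        w = np.linalg.solve(X.T @ Xs, Xs.T @ y)

    return w, f
```

A quick synthetic check under the model's own distributional assumptions:

```python
rng = np.random.default_rng(0)
n, d = 20000, 20
X = rng.standard_normal((n, d))
w_star, f_star = rng.standard_normal(d), rng.standard_normal(d)
y = X @ w_star + rng.standard_normal(n) * (X @ f_star)
w_hat, f_hat = heteroscedastic_alt_min(X, y)
print(np.linalg.norm(w_hat - w_star) ** 2)  # squared-norm error in w
```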
