Near Optimal Heteroscedastic Regression with Symbiotic Learning
We consider the problem of heteroscedastic linear regression, where, given n samples (𝐱_i, y_i) from y_i = ⟨𝐰^*, 𝐱_i⟩ + ϵ_i · ⟨𝐟^*, 𝐱_i⟩ with 𝐱_i ∼ N(0, 𝐈), ϵ_i ∼ N(0, 1), we aim to estimate 𝐰^*. Beyond classical applications of such models in statistics, econometrics, and time series analysis, this problem is also particularly relevant in machine learning when data is collected from multiple sources of varying but a priori unknown quality. Our work shows that we can estimate 𝐰^* in squared norm up to an error of Õ(‖𝐟^*‖^2 · (1/n + (d/n)^2)) and proves a matching lower bound (up to log factors). This represents a substantial improvement upon the previous best known upper bound of Õ(‖𝐟^*‖^2 · d/n). Our algorithm is an alternating minimization procedure with two key subroutines: (1) an adaptation of the classical weighted least squares heuristic to estimate 𝐰^*, for which we provide the first non-asymptotic guarantee; (2) a nonconvex pseudogradient descent procedure for estimating 𝐟^*, inspired by phase retrieval. As corollaries, we obtain fast non-asymptotic rates for two important problems, linear regression with multiplicative noise and phase retrieval with multiplicative noise, both of which are of independent interest. Beyond this, the proof of our lower bound, which involves a novel adaptation of Le Cam's method for handling infinite mutual information quantities (thereby preventing a direct application of standard techniques like Fano's method), could also be of broader interest for establishing lower bounds for other heteroscedastic or heavy-tailed statistical problems.
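The abstract only sketches the two subroutines, so the following is a rough, hedged illustration rather than the paper's exact algorithm: a minimal alternating scheme in Python in which the w-step solves a weighted least squares problem with weights 1/⟨𝐟, 𝐱_i⟩² and the f-step takes a few pseudogradient steps on a phase-retrieval-style loss matching ⟨𝐟, 𝐱_i⟩² to the squared residuals. All function names, initialization choices, step sizes, and iteration counts below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def generate_data(n, d, w_star, f_star, rng):
    """Synthetic data from the heteroscedastic model
    y_i = <w*, x_i> + eps_i * <f*, x_i>, x_i ~ N(0, I), eps_i ~ N(0, 1)."""
    X = rng.standard_normal((n, d))
    eps = rng.standard_normal(n)
    y = X @ w_star + eps * (X @ f_star)
    return X, y

def alternating_minimization(X, y, n_outer=10, n_inner=5, lr=0.01, var_floor=1e-6):
    """Illustrative alternating scheme (a sketch, not the paper's algorithm):
    w-step: weighted least squares with weights 1 / <f, x_i>^2;
    f-step: pseudogradient steps fitting <f, x_i>^2 to squared residuals,
    in the spirit of phase retrieval."""
    n, d = X.shape
    # Crude initialization (an assumption): OLS for w, then regress
    # absolute residuals on X to get a starting point for f.
    w = np.linalg.lstsq(X, y, rcond=None)[0]
    f = np.linalg.lstsq(X, np.abs(y - X @ w), rcond=None)[0]
    for _ in range(n_outer):
        # w-step: reweight each sample by its estimated inverse noise variance.
        weights = 1.0 / np.maximum((X @ f) ** 2, var_floor)
        XtWX = X.T @ (weights[:, None] * X)
        XtWy = X.T @ (weights * y)
        w = np.linalg.solve(XtWX, XtWy)
        # f-step: gradient of (1/n) * sum_i (<f, x_i>^2 - r_i^2)^2,
        # where r_i = y_i - <w, x_i> is the current residual.
        r2 = (y - X @ w) ** 2
        for _ in range(n_inner):
            grad = (4.0 / n) * (X.T @ (((X @ f) ** 2 - r2) * (X @ f)))
            f = f - lr * grad
    return w, f

# Example usage on synthetic data (dimensions and seeds are arbitrary).
rng = np.random.default_rng(0)
d, n = 20, 5000
w_star, f_star = rng.standard_normal(d), 0.5 * rng.standard_normal(d)
X, y = generate_data(n, d, w_star, f_star, rng)
w_hat, f_hat = alternating_minimization(X, y)
print("squared error in w:", np.linalg.norm(w_hat - w_star) ** 2)
```

The weighted least squares step mirrors the classical heuristic the abstract refers to (downweighting samples whose estimated noise scale ⟨𝐟, 𝐱_i⟩ is large), while the f-step uses the squared residuals as noisy phase-retrieval-style measurements of ⟨𝐟^*, 𝐱_i⟩²; the specific loss, step size, and stopping rule here are placeholders.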