Max-Linear Regression by Scalable and Guaranteed Convex Programming

03/12/2021
by Seonho Kim, et al.

We consider the multivariate max-linear regression problem where the model parameters β_1, …, β_k ∈ ℝ^p need to be estimated from n independent samples of the (noisy) observations y = max_{1 ≤ j ≤ k} β_j^𝖳 x + noise. The max-linear model vastly generalizes the conventional linear model, and it can approximate any convex function to arbitrary accuracy when the number of linear models k is large enough. However, the inherent nonlinearity of the max-linear model renders the estimation of the regression parameters computationally challenging. In particular, no estimator based on convex programming is known in the literature. We formulate and analyze a scalable convex program as the estimator for the max-linear regression problem. Under the standard Gaussian observation setting, we present a non-asymptotic performance guarantee showing that the convex program recovers the parameters with high probability. When the k linear components are equally likely to achieve the maximum, our result shows that a sufficient number of observations scales as k^2 p, up to a logarithmic factor. This significantly improves on the analogous prior result based on alternating minimization (Ghosh et al., 2019). Finally, through a set of Monte Carlo simulations, we illustrate that our theoretical result is consistent with the empirical behavior, and that the convex estimator for max-linear regression is competitive with the alternating minimization algorithm in practice.
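To make the observation model concrete, the following is a minimal NumPy sketch of the data-generating process described above (standard Gaussian covariates, k linear components, additive Gaussian noise), together with an illustrative alternating-minimization baseline in the spirit of Ghosh et al. (2019). The function names, the noise level, and the random initialization are assumptions for illustration; the paper's convex program itself is not reproduced here.

```python
import numpy as np

def sample_max_linear(betas, n, noise_std=0.1, rng=None):
    # Draw n samples of y = max_{1 <= j <= k} beta_j^T x + noise
    # under a standard Gaussian design, x ~ N(0, I_p).
    rng = np.random.default_rng() if rng is None else rng
    k, p = betas.shape
    X = rng.standard_normal((n, p))        # Gaussian covariates
    y_clean = (X @ betas.T).max(axis=1)    # max over the k linear models
    return X, y_clean + noise_std * rng.standard_normal(n)

def alternating_minimization(X, y, k, iters=50, rng=None):
    # Baseline sketch (in the spirit of Ghosh et al., 2019): alternate
    # between assigning each sample to the component attaining the max
    # and refitting each component by least squares on its samples.
    rng = np.random.default_rng() if rng is None else rng
    n, p = X.shape
    B = rng.standard_normal((k, p))        # random initialization (assumption)
    for _ in range(iters):
        assign = (X @ B.T).argmax(axis=1)  # component attaining the max
        for j in range(k):
            idx = assign == j
            if idx.sum() >= p:             # skip underdetermined groups
                B[j] = np.linalg.lstsq(X[idx], y[idx], rcond=None)[0]
    return B

# Example: k = 3 components in dimension p = 5, n = 2000 samples.
rng = np.random.default_rng(0)
true_B = rng.standard_normal((3, 5))
X, y = sample_max_linear(true_B, n=2000, rng=rng)
est_B = alternating_minimization(X, y, k=3, rng=rng)
```

Note that the parameters of a max-linear model are identifiable only up to a permutation of the k components, so any comparison of est_B against true_B should first match components (e.g., by nearest rows).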
