Efficient algorithms for multivariate shape-constrained convex regression problems

02/26/2020
by   Meixia Lin, et al.

The shape-constrained convex regression problem deals with fitting a convex function to observed data under additional constraints, such as component-wise monotonicity and uniform Lipschitz continuity. This paper provides a comprehensive mechanism for computing the least squares estimator of a multivariate shape-constrained convex regression function in R^d. We prove that the least squares estimator is computable by solving a constrained convex quadratic programming (QP) problem with (d+1)n variables and at least n(n-1) linear inequality constraints, where n is the number of data points. For solving this generally very large-scale convex QP, we design two efficient algorithms: the symmetric Gauss-Seidel based alternating direction method of multipliers (sGS-ADMM), and the proximal augmented Lagrangian method (pALM) with its subproblems solved by the semismooth Newton method (SSN). Comprehensive numerical experiments, including applications to the pricing of basket options and the estimation of production functions in economics, demonstrate that both of our proposed algorithms outperform the state-of-the-art algorithm. The pALM is more efficient than the sGS-ADMM, but the latter has the advantage of being simpler to implement.
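To make the problem dimensions concrete, the following minimal sketch (my own illustration, not code from the paper) builds the standard convexity constraints of the least squares estimator: with function values theta_i and subgradients xi_i at each data point x_i, convexity is encoded as theta_j >= theta_i + xi_i · (x_j - x_i) for all i != j, giving n(n-1) inequalities in (d+1)n variables.

```python
import numpy as np

def convexity_constraints(X):
    """Build the matrix A such that A @ z >= 0 encodes
    theta_j >= theta_i + xi_i . (x_j - x_i) for all i != j,
    where z = [theta_1, ..., theta_n, xi_1, ..., xi_n] with xi_i in R^d.
    Returns A of shape (n(n-1), (d+1)n)."""
    n, d = X.shape
    rows = []
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            row = np.zeros(n * (1 + d))
            row[j] += 1.0                                 # + theta_j
            row[i] -= 1.0                                 # - theta_i
            row[n + i * d : n + (i + 1) * d] = -(X[j] - X[i])  # - xi_i . (x_j - x_i)
            rows.append(row)
    return np.vstack(rows)

# Sanity check: f(x) = ||x||^2 with subgradients 2x is convex, so its
# (theta, xi) values must satisfy every constraint (each residual equals
# ||x_j - x_i||^2 >= 0).
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 2))                 # n = 5 points in R^2
A = convexity_constraints(X)
z = np.concatenate([np.sum(X**2, axis=1), (2 * X).ravel()])
print(A.shape)                                  # (20, 15): n(n-1) by (d+1)n
print(bool(np.all(A @ z >= -1e-9)))
```

The least squares estimator then minimizes ||theta - y||^2 over (theta, xi) subject to A z >= 0; the paper's sGS-ADMM and pALM solvers target exactly this large-scale QP structure, which grows quadratically in n.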
