On Non-Negative Quadratic Programming in Geometric Optimization

07/16/2022
by Siu-Wing Cheng, et al.

We present experimental and theoretical results on a method that applies a numerical solver iteratively to solve several non-negative quadratic programming problems in geometric optimization. The method gains efficiency by exploiting the potential sparsity of the intermediate solutions. We implemented the method to call quadprog of MATLAB iteratively. In comparison with a single call of quadprog, we obtain a 10-fold speedup on two proximity graph problems in ℝ^d on some public data sets, a 10-fold speedup on the minimum enclosing ball problem on random points in a unit cube in ℝ^d, and a 5-fold speedup on the polytope distance problem on random points from a cube in ℝ^d when the input size is significantly larger than the dimension; we also obtain a 2-fold or greater speedup on deblurring some gray-scale space and thermal images via non-negative least squares. We compare with two minimum enclosing ball solvers, by Gärtner and by Fischer et al.; for 1000 nearly cospherical points or random points in a unit cube, the iterative method overtakes the solver by Gärtner at 20 dimensions and the solver by Fischer et al. at 170 dimensions. In the image deblurring experiments, the iterative method compares favorably with other solvers for non-negative least squares, including FISTA with backtracking, SBB, FNNLS, and lsqnonneg of MATLAB. We analyze theoretically the number of iterations needed by the iterative scheme to reduce the gap between the current solution value and the optimum by a factor of e. Under certain assumptions, we prove a bound proportional to the square root of the number of variables.

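The abstract does not spell out the iterative scheme, but one common way to exploit sparsity in non-negative quadratic programming is a working-set loop: solve a small restricted subproblem, then enlarge the support using the KKT conditions. The MATLAB sketch below illustrates that general idea around quadprog under the assumption of a convex objective min 0.5*x'*H*x + f'*x with x >= 0; the function name nnqp_iterative, the batch parameter, and the support-growing heuristic are our own illustrative choices, not the authors' algorithm or implementation.

function x = nnqp_iterative(H, f, batch)
% Solve min 0.5*x'*H*x + f'*x subject to x >= 0 by repeatedly calling
% quadprog on a small working set of coordinates and enlarging the set
% until the KKT conditions hold. 'batch' is a heuristic count of how
% many violating coordinates to add per iteration (e.g. 10).
% This is a sketch of one plausible scheme, not the paper's method.
n = numel(f);
S = false(n, 1);                        % working set (support of x)
[~, i0] = min(f);  S(i0) = true;        % seed with the most promising coordinate
x = zeros(n, 1);
opts = optimoptions('quadprog', 'Display', 'off');
for it = 1:n
    k = nnz(S);
    % Restricted subproblem: only the k coordinates in S, still with x >= 0.
    xS = quadprog(H(S, S), f(S), [], [], [], [], zeros(k, 1), [], [], opts);
    x(:) = 0;  x(S) = xS;
    % KKT check: a zero coordinate with a negative gradient can still improve.
    g = H * x + f;
    viol = find(~S & g < -1e-9);
    if isempty(viol), break; end        % optimal for the full problem
    [~, ord] = sort(g(viol));           % most negative gradients first
    S(viol(ord(1:min(batch, numel(ord))))) = true;
end
end

If the optimal support is much smaller than the number of variables, the cost is dominated by the small k-by-k solves rather than a single full-size call, which is consistent with the kind of speedups the abstract reports when the input size is much larger than the dimension.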