Computational Sufficiency, Reflection Groups, and Generalized Lasso Penalties

09/08/2018
by   Vincent Q. Vu, et al.

We study estimators with generalized lasso penalties within the computational sufficiency framework introduced by Vu (2018, arXiv:1807.05985). By representing these penalties as support functions of zonotopes and, more generally, Minkowski sums of line segments and rays, we show that there is a natural reflection group associated with the underlying optimization problem. A consequence of this point of view is that, for large classes of estimators sharing the same penalty, the penalized least squares estimator is computationally minimal sufficient. This means that all such estimators can be computed by refining the output of any algorithm for the least squares case. An interesting technical component is our analysis of coordinate descent on the dual problem. A key insight is that the iterates are obtained by reflecting and averaging, so they converge to an element of the dual feasible set that is minimal with respect to an ordering induced by the group associated with the penalty. Our main applications are fused lasso/total variation denoising and isotonic regression on arbitrary graphs. In those cases the associated group is a permutation group.
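To make the dual coordinate-descent idea concrete, here is a minimal sketch for the simplest instance, 1D fused lasso/total variation denoising: minimize (1/2)||y - x||² + λ Σ|x_{i+1} - x_i|. The dual is a box-constrained least squares problem in variables u with |u_i| ≤ λ, where x = y - Dᵀu. Each coordinate update averages two adjacent primal coordinates and then clips, which loosely mirrors the "reflecting and averaging" structure described in the abstract. This is an illustrative sketch under our own notation, not the paper's algorithm verbatim; the function name and iteration budget are assumptions.

```python
def tv_denoise_dual_cd(y, lam, n_iter=500):
    """1D total-variation (fused lasso) denoising via coordinate
    descent on the box-constrained dual problem (illustrative sketch).

    Primal:  min_x (1/2)||y - x||^2 + lam * sum_i |x[i+1] - x[i]|
    Dual:    min_u (1/2)||y - D^T u||^2  s.t.  |u_i| <= lam,
    with x = y - D^T u and (Dx)_i = x[i+1] - x[i].
    """
    n = len(y)
    u = [0.0] * (n - 1)        # dual variables, constrained to [-lam, lam]
    x = [float(v) for v in y]  # primal iterate, maintained as x = y - D^T u
    for _ in range(n_iter):
        for i in range(n - 1):
            # Exact coordinate minimization: the gradient in u_i is
            # -(x[i+1] - x[i]) and the curvature is ||D_i||^2 = 2,
            # so the unconstrained step is g/2, then clip to the box.
            g = x[i + 1] - x[i]
            new_u = max(-lam, min(lam, u[i] + g / 2.0))
            d = new_u - u[i]
            # Update x = y - D^T u incrementally: u_i enters x[i]
            # with sign +1 and x[i+1] with sign -1.
            x[i] += d
            x[i + 1] -= d
            u[i] = new_u
    return x
```

Note that when the box constraint is inactive, the update replaces both x[i] and x[i+1] by their average; the clipping is what keeps the iterates inside the dual feasible set.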
