Relative Lipschitzness in Extragradient Methods and a Direct Recipe for Acceleration

11/12/2020
by Michael B. Cohen, et al.

We show that standard extragradient methods (i.e., mirror prox and dual extrapolation) recover optimal accelerated rates for first-order minimization of smooth convex functions. To obtain this result, we provide a fine-grained characterization of the convergence rates of extragradient methods for solving monotone variational inequalities in terms of a natural condition we call relative Lipschitzness. We further generalize this framework to handle local and randomized notions of relative Lipschitzness, and thereby recover the rates for box-constrained ℓ_∞ regression based on area convexity and the complexity bounds achieved by accelerated (randomized) coordinate descent for smooth convex function minimization.
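To make the objects in the abstract concrete, below is a minimal sketch of the classical (Euclidean) extragradient step for a monotone operator, illustrated on a bilinear saddle-point problem. This is not the paper's analysis or its relative-Lipschitzness framework; the step size, problem instance, and helper names are illustrative assumptions.

```python
import numpy as np

def extragradient(g, z0, eta=0.1, num_iters=1000):
    """A sketch of the extragradient method: take an exploratory step,
    then correct using the operator evaluated at the midpoint.
    eta and num_iters are illustrative choices, not values from the paper."""
    z = z0.copy()
    avg = np.zeros_like(z0)
    for _ in range(num_iters):
        z_half = z - eta * g(z)       # exploration step
        z = z - eta * g(z_half)       # correction step at the midpoint
        avg += z_half
    return avg / num_iters            # ergodic average, standard for VIs

# Illustrative monotone operator: the bilinear game min_x max_y x^T A y,
# whose operator g(x, y) = (A y, -A^T x) has unique saddle point (0, 0).
A = np.array([[1.0, 2.0], [0.0, 1.0]])

def g(z):
    x, y = z[:2], z[2:]
    return np.concatenate([A @ y, -A.T @ x])

z_star = extragradient(g, z0=np.ones(4))
print(z_star)  # iterates average toward the saddle point near zero
```

The two evaluations of g per iteration (once at the current point, once at the midpoint) are the defining feature shared by mirror prox and dual extrapolation; the paper's contribution is analyzing such methods under relative Lipschitzness rather than standard Lipschitz continuity.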
