A unified variance-reduced accelerated gradient method for convex optimization

05/29/2019
by   Guanghui Lan, et al.

We propose a novel randomized incremental gradient algorithm, namely, VAriance-Reduced Accelerated Gradient (Varag), for finite-sum optimization. Equipped with a unified step-size policy that adjusts itself to the value of the condition number, Varag exhibits unified optimal rates of convergence for solving smooth convex finite-sum problems directly, regardless of their strong convexity. Moreover, Varag is the first method of its kind that benefits from the strong convexity of the data-fidelity term, and that solves a wide class of problems satisfying only an error bound condition rather than strong convexity, in both cases achieving the optimal linear rate of convergence. Varag can also be extended to solve stochastic finite-sum problems.
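To illustrate the variance-reduction idea that methods of this type build on, the sketch below runs a plain SVRG-style loop on a least-squares finite sum. It is not the accelerated Varag scheme itself; the problem data, step size, and epoch length are arbitrary illustrative choices.

```python
# Minimal sketch (assumption: generic SVRG-style variance reduction, not Varag)
# on the finite sum f(x) = (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2.
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.normal(size=(n, d))
b = rng.normal(size=n)

def grad_i(x, i):
    # Gradient of the i-th component f_i(x) = 0.5 * (a_i^T x - b_i)^2
    return A[i] * (A[i] @ x - b[i])

def full_grad(x):
    # Full gradient of the average objective
    return A.T @ (A @ x - b) / n

x = np.zeros(d)
eta = 0.01          # illustrative fixed step size (Varag uses an adaptive policy)
epochs, m = 30, n   # m inner steps per epoch

for _ in range(epochs):
    x_snap = x.copy()              # snapshot point for this epoch
    g_snap = full_grad(x_snap)     # full gradient at the snapshot
    for _ in range(m):
        i = rng.integers(n)
        # Variance-reduced estimator: unbiased, and its variance shrinks
        # as the iterate approaches the snapshot point.
        g = grad_i(x, i) - grad_i(x_snap, i) + g_snap
        x -= eta * g

print("final objective:", 0.5 * np.mean((A @ x - b) ** 2))
```

The key design point is that each inner step touches only one component gradient yet remains an unbiased estimate of the full gradient, which is what allows fast (and, with acceleration and an adaptive step-size policy as in the paper, optimal) convergence rates for finite-sum problems.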
