Direct Acceleration of SAGA using Sampled Negative Momentum

06/28/2018
by Kaiwen Zhou et al.

Variance reduction is a simple and effective technique that accelerates convex (and non-convex) stochastic optimization. Among existing variance reduction methods, SVRG and SAGA adopt unbiased gradient estimators and have become the most popular in recent years. Although various accelerated variants of SVRG (e.g., Katyusha, Acc-Prox-SVRG) have been proposed, the direct acceleration of SAGA has remained an open problem. In this paper, we propose a directly accelerated variant of SAGA using Sampled Negative Momentum (SSNM), which achieves the best known oracle complexities for strongly convex problems. Our work thus fills the gap of a directly accelerated SAGA.
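For readers unfamiliar with the method SSNM builds on, the following is a minimal Python sketch of the standard (non-accelerated) SAGA update and its unbiased gradient estimator. The function names and the least-squares test problem are illustrative choices, not the paper's SSNM algorithm or notation.

```python
import numpy as np

def saga(grad_i, n, x0, step_size, n_iters, seed=0):
    """Minimal SAGA sketch for f(x) = (1/n) * sum_i f_i(x).

    grad_i(i, x) returns the gradient of the i-th component f_i at x.
    SAGA keeps a table of the most recent gradient seen for each
    component, yielding an unbiased estimator whose variance vanishes
    as the iterates approach the optimum.
    """
    rng = rng = np.random.default_rng(seed)
    x = x0.copy()
    table = np.array([grad_i(i, x) for i in range(n)])  # stored per-component gradients
    avg = table.mean(axis=0)                            # running average of the table
    for _ in range(n_iters):
        j = rng.integers(n)
        g_new = grad_i(j, x)
        # Unbiased estimator: E[v] equals the full gradient of f at x.
        v = g_new - table[j] + avg
        x -= step_size * v
        # Refresh the table entry and its average in O(d) time.
        avg += (g_new - table[j]) / n
        table[j] = g_new
    return x

# Usage (illustrative): ridge-regularized least squares,
# f_i(x) = 0.5 * (a_i @ x - b_i)**2 + (lam / 2) * ||x||^2.
rng = np.random.default_rng(1)
n, d, lam = 200, 10, 0.1
A, b = rng.standard_normal((n, d)), rng.standard_normal(n)
grad_i = lambda i, x: (A[i] @ x - b[i]) * A[i] + lam * x
x_hat = saga(grad_i, n, np.zeros(d), step_size=0.01, n_iters=5000)
```

SSNM modifies this scheme by coupling the update with a sampled negative-momentum term; the abstract gives no further algorithmic detail, so the sketch above stops at plain SAGA.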
