Fast Incremental Method for Nonconvex Optimization

03/19/2016
by   Sashank J Reddi, et al.

We analyze a fast incremental aggregated gradient method for optimizing nonconvex problems of the form min_x (1/n) ∑_i f_i(x). Specifically, we analyze the SAGA algorithm within an Incremental First-order Oracle framework, and show that it converges to a stationary point provably faster than both gradient descent and stochastic gradient descent. We also discuss a special class of nonconvex problems satisfying the Polyak-Łojasiewicz inequality, for which SAGA converges at a linear rate to the global optimum. Finally, we analyze the practically valuable regularized and minibatch variants of SAGA. To our knowledge, this paper presents the first analysis of fast convergence for an incremental aggregated gradient method for nonconvex problems.
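The SAGA update maintained by the method keeps a table of the most recently evaluated gradient for each component f_i and corrects the stochastic gradient with the table average. A minimal sketch of this update is shown below; the toy least-squares objective, step size, and iteration count are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Toy finite-sum problem: f_i(x) = 0.5 * (a_i^T x - b_i)^2 (illustrative only;
# the paper targets general nonconvex f_i).
rng = np.random.default_rng(0)
n, d = 50, 5
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

def grad_i(x, i):
    # Gradient of the i-th component f_i at x.
    return A[i] * (A[i] @ x - b[i])

x = np.zeros(d)
eta = 0.05  # step size (assumption, not tuned per the paper's theory)

# Table of stored per-component gradients and their running average.
g = np.stack([grad_i(x, i) for i in range(n)])
g_avg = g.mean(axis=0)

for _ in range(2000):
    i = rng.integers(n)
    new_g = grad_i(x, i)
    # SAGA step: new gradient minus stored one, plus table average,
    # gives an unbiased estimate of the full gradient.
    x -= eta * (new_g - g[i] + g_avg)
    # Refresh the running average and the table entry.
    g_avg += (new_g - g[i]) / n
    g[i] = new_g

full_grad = np.mean([grad_i(x, i) for i in range(n)], axis=0)
print(np.linalg.norm(full_grad))  # stationarity measure ||∇f(x)||
```

The key design point is that the correction term (new_g - g[i] + g_avg) has lower variance than a plain stochastic gradient as the iterates stabilize, which is what enables the faster convergence rates analyzed in the paper.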
