SPIDER: Near-Optimal Non-Convex Optimization via Stochastic Path Integrated Differential Estimator

07/04/2018
by Cong Fang, et al.

In this paper, we propose a new technique named Stochastic Path-Integrated Differential EstimatoR (SPIDER), which can be used to track many deterministic quantities of interest with significantly reduced computational cost. Combining SPIDER with the method of normalized gradient descent, we propose two new algorithms, namely SPIDER-SFO and SPIDER-SSO, that solve non-convex stochastic optimization problems using stochastic gradients only. We provide sharp error-bound results on their convergence rates. Specifically, we prove that the SPIDER-SFO and SPIDER-SSO algorithms achieve a record-breaking Õ(ϵ^-3) gradient computation cost to find an ϵ-approximate first-order stationary point and an (ϵ, O(ϵ^0.5))-approximate second-order stationary point, respectively. In addition, we prove that SPIDER-SFO nearly matches the algorithmic lower bound for finding a stationary point under the gradient Lipschitz assumption in the finite-sum setting.
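The abstract's core recipe, a path-integrated gradient estimator refreshed periodically with a large batch and used to drive normalized gradient descent, lends itself to a compact sketch. Below is a minimal Python sketch of a SPIDER-SFO-style loop under stated assumptions: `grad_batch`, the batch sizes, the refresh period `q`, and the step length `eta` are illustrative placeholders, not the paper's prescribed constants or parameter schedule.

```python
import numpy as np

def spider_sfo(grad_batch, n, x0, *, q=100, big_batch=1000, small_batch=10,
               eta=0.01, n_iters=1000, seed=0):
    """Sketch of a SPIDER-SFO-style loop (illustrative constants).

    grad_batch(x, idx) is assumed to return the average gradient of the
    component functions indexed by `idx`, evaluated at x; n is the number
    of components in the finite sum.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    x_prev = x.copy()
    v = np.zeros_like(x)
    for k in range(n_iters):
        if k % q == 0:
            # Periodically refresh the estimator with a large batch
            # (or the exact full gradient in the finite-sum setting).
            idx = rng.integers(0, n, size=big_batch)
            v = grad_batch(x, idx)
        else:
            # Path-integrated update: correct v by the gradient difference
            # along the path x_prev -> x, estimated on a small batch, so v
            # stays a low-variance estimate of the current full gradient.
            idx = rng.integers(0, n, size=small_batch)
            v = grad_batch(x, idx) - grad_batch(x_prev, idx) + v
        x_prev = x.copy()
        # Normalized gradient descent: a step of fixed length eta, which
        # keeps consecutive iterates close and the estimator error small.
        x = x - eta * v / (np.linalg.norm(v) + 1e-12)
    return x
```

The normalized step is the design choice worth noting: because each iterate moves a bounded distance, the small-batch gradient differences accumulated into v remain accurate between refreshes, which is what underlies the Õ(ϵ^-3) gradient-computation bound stated above.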
