Fast Stochastic Composite Minimization and an Accelerated Frank-Wolfe Algorithm under Parallelization

05/25/2022
by Benjamin Dubois-Taine, et al.

We consider the problem of minimizing the sum of two convex functions. One of these functions has Lipschitz-continuous gradients and can be accessed via stochastic oracles, whereas the other is "simple". We provide a Bregman-type algorithm with accelerated convergence in function values to a ball containing the minimum. The radius of this ball depends on problem-dependent constants, including the variance of the stochastic oracle. We further show that this algorithmic setup naturally leads to a variant of Frank-Wolfe achieving acceleration under parallelization. More precisely, when minimizing a smooth convex function on a bounded domain, we show that one can achieve an ϵ primal-dual gap (in expectation) in Õ(1/√ϵ) iterations, by only accessing gradients of the original function and a linear maximization oracle, with O(1/√ϵ) computing units running in parallel. We illustrate this fast convergence on synthetic numerical experiments.
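Since the abstract only names the ingredients, the following is a minimal sketch of the classical Frank-Wolfe building block that the paper accelerates: gradient access to a smooth convex objective, a linear maximization oracle (LMO) over the feasible domain, and the standard 2/(t+2) open-loop step size. This is emphatically not the paper's accelerated parallel variant; the simplex domain, the least-squares objective, and the names `lmo_simplex` and `frank_wolfe` are illustrative assumptions made here for concreteness.

```python
import numpy as np

def lmo_simplex(grad):
    """Linear oracle over the probability simplex: the minimizer of a linear
    function <grad, s> over the simplex is the vertex (coordinate) where the
    gradient is smallest."""
    s = np.zeros_like(grad)
    s[np.argmin(grad)] = 1.0
    return s

def frank_wolfe(grad_f, x0, n_iters=200):
    """Vanilla Frank-Wolfe: one gradient evaluation and one LMO call per
    iteration, with the classical 2/(t+2) step size."""
    x = x0.copy()
    for t in range(n_iters):
        g = grad_f(x)
        s = lmo_simplex(g)           # single LMO call per iteration
        gamma = 2.0 / (t + 2.0)      # classical open-loop step size
        x = (1.0 - gamma) * x + gamma * s
    return x

# Example (hypothetical problem instance): minimize the smooth convex
# objective f(x) = 0.5 * ||A x - b||^2 over the probability simplex.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
b = rng.standard_normal(30)
grad_f = lambda x: A.T @ (A @ x - b)
x0 = np.full(10, 0.1)                # uniform starting point in the simplex
x_star = frank_wolfe(grad_f, x0)
print("objective:", 0.5 * np.linalg.norm(A @ x_star - b) ** 2)
```

This plain scheme needs O(1/ϵ) iterations, each issuing one LMO call. The parallel regime described in the abstract instead issues several LMO calls per iteration across computing units; the precise batching and acceleration scheme is given in the full paper.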
