Stochastic Conditional Gradient Method for Composite Convex Minimization

01/29/2019
by   Francesco Locatello, et al.

In this paper, we propose the first practical algorithm for minimizing stochastic composite optimization problems over compact convex sets. This template allows for affine constraints and therefore covers stochastic semidefinite programs (SDPs), which are widely applicable in both machine learning and statistics. In this setting, stochastic algorithms with convergence guarantees are either unknown or intractable. We tackle this general problem and propose a convergent, tractable, and easy-to-implement algorithm. We prove an O(k^{-1/3}) convergence rate in expectation on the objective residual and an O(k^{-5/12}) rate in expectation on the feasibility gap. These rates are achieved without increasing the batch size, which can consist of a single sample. We present extensive empirical evidence demonstrating the superiority of our algorithm on a broad range of applications, including the optimization of stochastic SDPs.
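To make the setting concrete, the sketch below shows a generic stochastic conditional gradient (Frank–Wolfe) loop with an averaged gradient estimator, applied to a least-squares objective over the probability simplex. This is an illustrative simplification, not the paper's algorithm: the problem instance, the step-size and averaging schedules, and all names here are assumptions, and the sketch omits the handling of affine constraints that the paper's template provides.

```python
import numpy as np

# Hypothetical problem instance: minimize E[0.5 * (a_i^T x - b_i)^2]
# over the probability simplex (a compact convex set).
rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.normal(size=(n, d))
x_true = np.zeros(d)
x_true[0], x_true[1] = 0.6, 0.4
b = A @ x_true + 0.01 * rng.normal(size=n)

def stochastic_grad(x, batch=1):
    # Unbiased gradient estimate from a small batch (a single sample works).
    idx = rng.integers(0, n, size=batch)
    Ai, bi = A[idx], b[idx]
    return Ai.T @ (Ai @ x - bi)

def lmo_simplex(g):
    # Linear minimization oracle over the simplex: the minimizing vertex.
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0
    return s

def objective(x):
    return 0.5 * np.sum((A @ x - b) ** 2)

x = np.ones(d) / d          # feasible starting point
g_avg = np.zeros(d)         # running average of stochastic gradients
f_init = objective(x)
for k in range(1, 2001):
    rho = 1.0 / k ** (2 / 3)                      # averaging weight (assumed schedule)
    g_avg = (1 - rho) * g_avg + rho * stochastic_grad(x)
    s = lmo_simplex(g_avg)                        # call the LMO, no projection needed
    gamma = 2.0 / (k + 2)                         # standard Frank-Wolfe step size
    x = (1 - gamma) * x + gamma * s               # convex combination stays feasible
f_final = objective(x)
```

Because each iterate is a convex combination of simplex vertices, feasibility is maintained for free; this projection-free structure is what makes conditional gradient methods attractive for SDP-like feasible sets, where projection is expensive but the linear minimization oracle is cheap.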
