Boosting Frank-Wolfe by Chasing Gradients

03/13/2020
by Cyrille W. Combettes, et al.

The Frank-Wolfe algorithm has become a popular first-order optimization algorithm because it is simple and projection-free, and it has been successfully applied to a variety of real-world problems. Its main drawback, however, lies in its convergence rate, which can be excessively slow due to naive descent directions. We propose to speed up the Frank-Wolfe algorithm by better aligning the descent direction with the negative gradient via a subroutine. This subroutine chases the negative gradient direction in a matching-pursuit style while still preserving the projection-free property. Although the approach is reasonably natural, it produces very significant results. We derive convergence rates ranging from O(1/t) to O(e^{-ωt^p}) for our method, where p ∈ (0, 1], and we demonstrate its competitive advantage both per iteration and in CPU time over the state of the art in a series of computational experiments.
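
To make the gradient-chasing idea concrete, below is a minimal NumPy sketch of one way such a subroutine can look; it is an illustration of the general technique, not the authors' exact algorithm. The function names (lmo_l1, chase_gradient), the ℓ1-ball feasible region, the least-squares pursuit coefficients, and the alignment threshold delta are all illustrative assumptions, not taken from the paper.

```python
import numpy as np

def lmo_l1(c, radius=1.0):
    """Linear minimization oracle over the l1 ball:
    returns argmin_{||v||_1 <= radius} <c, v>."""
    i = int(np.argmax(np.abs(c)))
    v = np.zeros_like(c)
    v[i] = -radius * np.sign(c[i])
    return v

def chase_gradient(x, grad, lmo, max_rounds=20, delta=1e-3):
    """Matching-pursuit-style subroutine: greedily builds a direction d as a
    nonnegative combination of atoms (v - x) so that d aligns with -grad.
    The result is normalized by the sum of the coefficients, which keeps
    x + gamma * d feasible for gamma in [0, 1] without any projection."""
    target = -grad
    d = np.zeros_like(x)
    coeff_sum = 0.0

    def align(u):
        nu, nt = np.linalg.norm(u), np.linalg.norm(target)
        return 0.0 if nu == 0.0 or nt == 0.0 else float(target @ u) / (nt * nu)

    for _ in range(max_rounds):
        r = target - d                        # residual still to be matched
        u = lmo(-r) - x                       # atom maximizing <r, v - x>
        uu = float(u @ u)
        if uu == 0.0:
            break
        lam = float(r @ u) / uu               # least-squares pursuit coefficient
        if lam <= 0 or align(d + lam * u) - align(d) < delta:
            break                             # alignment no longer improves enough
        d = d + lam * u
        coeff_sum += lam
    if coeff_sum == 0.0:
        return lmo(grad) - x                  # fall back to the vanilla FW direction
    return d / coeff_sum

# Toy problem: minimize f(x) = 0.5 * ||A x - b||^2 over the l1 ball.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
b = rng.standard_normal(40)
x = np.zeros(100)

for t in range(200):
    grad = A.T @ (A @ x - b)
    d = chase_gradient(x, grad, lmo_l1)
    # Exact line search along d for this quadratic, clipped to [0, 1].
    Ad = A @ d
    denom = float(Ad @ Ad)
    gamma = 0.0 if denom == 0.0 else float(np.clip(-float(grad @ d) / denom, 0.0, 1.0))
    x = x + gamma * d

print("final objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2)
```

Note how the projection-free property is preserved in this sketch: the returned direction is a convex combination of atoms v - x produced by the linear minimization oracle, so x + gamma * d remains a convex combination of feasible points for gamma in [0, 1], and no projection step is ever needed.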
