An Accelerated Directional Derivative Method for Smooth Stochastic Convex Optimization

04/08/2018
by   Pavel Dvurechensky, et al.

We consider smooth stochastic convex optimization problems in the context of algorithms based on directional derivatives of the objective function. This setting can be viewed as intermediate between derivative-free optimization and gradient-based optimization. We assume that, at any given point and for any given direction, a stochastic approximation of the directional derivative of the objective at this point and in this direction is available with some additive noise. The noise is assumed to be of unknown nature, but bounded in absolute value. We emphasize that we consider directional derivatives in arbitrary directions, as opposed to coordinate descent methods, which use only derivatives along coordinate directions. For this setting, we propose a non-accelerated and an accelerated directional derivative method and provide their complexity bounds. Although our algorithms do not use gradient information, our non-accelerated algorithm has a complexity bound which, up to a factor logarithmic in the problem dimension, is similar to the complexity bound of gradient-based algorithms. Our accelerated algorithm has a complexity bound which coincides with that of the accelerated gradient-based algorithm up to a factor of the square root of the problem dimension, whereas for existing directional derivative methods this factor is of the order of the problem dimension. We also extend these results to strongly convex problems. Finally, we treat derivative-free optimization as a particular case of directional derivative optimization with noise in the directional derivative and obtain complexity bounds for non-accelerated and accelerated derivative-free methods. The complexity bounds for these algorithms inherit the improvement in the dimension-dependent factors from our directional derivative methods.
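To make the oracle model concrete, here is a minimal sketch (not the paper's exact algorithm) of one non-accelerated random-direction step: sample a random unit direction, query a noisy directional derivative bounded in absolute value, and move along that direction. The quadratic test objective, step size, noise level, and function names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def noisy_directional_derivative(f_grad, x, e, noise_level=1e-3, rng=None):
    """Oracle model: <grad f(x), e> plus additive noise bounded in absolute value."""
    rng = np.random.default_rng() if rng is None else rng
    noise = noise_level * rng.uniform(-1.0, 1.0)  # |noise| <= noise_level
    return f_grad(x) @ e + noise

def directional_descent(f_grad, x0, step_size=1.0, n_iters=1000, seed=0):
    """Illustrative non-accelerated scheme using only directional derivatives.

    step_size plays the role of 1/L for an L-smooth objective (here L = 1).
    """
    rng = np.random.default_rng(seed)
    x = x0.copy()
    n = x.size
    for _ in range(n_iters):
        e = rng.standard_normal(n)
        e /= np.linalg.norm(e)                       # random unit direction
        g = noisy_directional_derivative(f_grad, x, e, rng=rng)
        x -= step_size * g * e                       # move along -<grad f, e> e
    return x

if __name__ == "__main__":
    # Example: minimize the strongly convex quadratic f(x) = 0.5 * ||x||^2.
    grad = lambda x: x
    x_final = directional_descent(grad, x0=np.ones(50))
    print("||x|| after descent:", np.linalg.norm(x_final))
```

In expectation each step contracts the distance to the minimizer by roughly a factor (1 - 1/n) for this quadratic, which reflects the dimension-dependent slowdown of directional-derivative methods relative to full-gradient descent discussed in the abstract.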


