Towards Unified Acceleration of High-Order Algorithms under Hölder Continuity and Uniform Convexity

06/03/2019
by Chaobing Song et al.

In this paper, starting from an intuitive vanilla proximal-method perspective, we derive accelerated high-order optimization algorithms for minimizing a convex function with Hölder continuous derivatives. In this general convex setting, we propose a unified acceleration algorithm whose iteration complexity matches the lower complexity bound given by Grapiglia and Nesterov (2019). If the function is furthermore uniformly convex, we propose a general restart scheme whose iteration complexity matches existing lower bounds in the most important cases. For practical implementation, we introduce a new and effective heuristic that significantly simplifies the binary search procedure the algorithm requires, making the algorithm as efficient in the general setting as in the special case treated by Grapiglia and Nesterov (2019). On large-scale classification datasets, our algorithm demonstrates clear and consistent advantages of high-order acceleration over first-order methods in terms of run-time complexity. Our formulation covers the more general composite setting, in which the objective may contain an additional, possibly non-smooth, convex term. Our analysis and proofs also apply when the high-order smoothness conditions are stated with respect to non-Euclidean norms.
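For context, the two regularity conditions named above are the standard ones; a minimal restatement of the setting, assuming the usual parameterization (order p, Hölder exponent ν in (0,1], uniform-convexity degree q ≥ 2 with modulus σ), is sketched below.

```latex
% Standard definitions (illustrative restatement, not quoted from the paper).
% Hölder continuity of the p-th derivative with exponent \nu:
\[
  \| \nabla^p f(x) - \nabla^p f(y) \| \le H_{\nu} \, \| x - y \|^{\nu},
  \qquad \nu \in (0, 1].
\]
% Uniform convexity of degree q with modulus \sigma:
\[
  f(y) \ge f(x) + \langle \nabla f(x),\, y - x \rangle
        + \frac{\sigma}{q} \, \| y - x \|^{q},
  \qquad q \ge 2,\ \sigma > 0.
\]
```

The restart scheme and the binary-search heuristic are only named above, not specified; the sketch below shows the generic pattern such components follow. Everything in it (warm_started_bisection, restarted_minimize, the acceptance test, the stage budgets) is a hypothetical illustration under those assumptions, not the paper's actual procedure.

```python
import numpy as np

def warm_started_bisection(accept, lo, hi, tol=1e-8):
    """Bisection for the largest theta in [lo, hi] with accept(theta)
    True, assuming accept is monotone (True up to some threshold).
    Warm-starting the bracket [lo, hi] near the previously accepted
    value is the kind of heuristic that can sharply cut search cost."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if accept(mid):
            lo = mid
        else:
            hi = mid
    return lo

def restarted_minimize(inner_step, x0, stages=10, budget=20):
    """Generic restart scheme (hypothetical sketch): run an inner
    (accelerated) method for a fixed iteration budget, restart it from
    the last iterate, and repeat; under uniform convexity, restarting
    is what upgrades a sublinear inner rate to a faster overall rate."""
    x = x0
    for _ in range(stages):
        for _ in range(budget):
            x = inner_step(x)
    return x

if __name__ == "__main__":
    # Toy usage on a strongly convex quadratic (uniform convexity, q = 2).
    A = np.diag([1.0, 10.0])
    f = lambda z: 0.5 * z @ A @ z
    x0 = np.array([1.0, 1.0])
    # Binary search for the largest non-increasing step size at x0, an
    # illustrative stand-in for the acceptance test a real method uses.
    g = A @ x0
    t = warm_started_bisection(lambda s: f(x0 - s * g) <= f(x0), 0.0, 1.0)
    # Restart a plain gradient-descent inner loop with half that step.
    x = restarted_minimize(lambda z: z - 0.5 * t * (A @ z), x0)
    print(f"step = {t:.4f}, f(x) after restarts = {f(x):.3e}")
```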
