A general system of differential equations to model first order adaptive algorithms

10/31/2018
by André Belotto da Silva, et al.

First-order optimization algorithms play a major role in large-scale machine learning. A new class of methods, called adaptive algorithms, was recently introduced to iteratively adjust the learning rate for each coordinate. Despite great practical success in deep learning, their behavior and performance on more general loss functions are not well understood. In this paper, we derive a non-autonomous system of differential equations that is the continuous-time limit of adaptive optimization methods. We prove global well-posedness of the system and investigate the numerical time convergence of its forward Euler approximation. Furthermore, we study the convergence of its trajectories and give conditions under which the differential system underlying all adaptive algorithms is suitable for optimization. We discuss convergence to a critical point in the non-convex case and give conditions for the dynamics to avoid saddle points and local maxima. For a convex and deterministic loss function, we introduce a suitable Lyapunov functional which allows us to study its rate of convergence. Several other properties of both the continuous and discrete systems are briefly discussed. The differential system studied in this paper is general enough to encompass many other classical algorithms (such as the heavy-ball method and Nesterov's accelerated method) and allows us to recover several known results for these algorithms.
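The abstract's central idea — that discrete optimization algorithms arise as forward Euler discretizations of a continuous-time differential system — can be illustrated with a minimal sketch. This is not the paper's general non-autonomous system; it only shows the textbook special case that plain gradient descent is forward Euler applied to the gradient flow ODE x'(t) = -∇f(x(t)), with an RMSProp-style per-coordinate rescaling added as a stand-in for the adaptive case:

```python
import numpy as np

def grad_f(x):
    # Gradient of the convex quadratic f(x) = 0.5 * ||x||^2,
    # whose unique minimizer is the origin.
    return x

def forward_euler_gradient_flow(x0, h=0.1, steps=200):
    # Forward Euler on x' = -grad f(x):  x_{k+1} = x_k - h * grad f(x_k),
    # which is exactly gradient descent with learning rate h.
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x - h * grad_f(x)
    return x

def adaptive_step(x0, h=0.1, beta=0.9, eps=1e-8, steps=200):
    # RMSProp-style variant (illustrative, not the paper's system):
    # each coordinate's step is rescaled by a running average of
    # squared gradients, i.e. the learning rate is adjusted per coordinate.
    x = np.array(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(steps):
        g = grad_f(x)
        v = beta * v + (1.0 - beta) * g**2
        x = x - h * g / (np.sqrt(v) + eps)
    return x

x_gd = forward_euler_gradient_flow([1.0, -2.0])
x_ad = adaptive_step([1.0, -2.0])
print(np.linalg.norm(x_gd), np.linalg.norm(x_ad))
```

Shrinking the step size h and increasing the number of steps makes the discrete iterates track the continuous trajectory more closely, which is the sense in which the ODE is the "continuous time limit" of the algorithm.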


research
09/01/2020

The connections between Lyapunov functions for some optimization algorithms and differential equations

In this manuscript we study the properties of a family of a second order...
research
10/21/2018

Understanding the Acceleration Phenomenon via High-Resolution Differential Equations

Gradient-based optimization algorithms can be studied from the perspecti...
research
05/15/2023

On the connections between optimization algorithms, Lyapunov functions, and differential equations: theory and insights

We study connections between differential equations and optimization alg...
research
02/14/2017

Hybrid System Modelling and Simulation with Dirac Deltas

For a wide variety of problems, creating detailed continuous models of (...
research
10/04/2019

Discrete Processes and their Continuous Limits

The possibility that a discrete process can be fruitfully approximated b...
research
08/30/2010

Fixed-point and coordinate descent algorithms for regularized kernel methods

In this paper, we study two general classes of optimization algorithms f...
research
05/28/2021

Polygonal Unadjusted Langevin Algorithms: Creating stable and efficient adaptive algorithms for neural networks

We present a new class of adaptive stochastic optimization algorithms, w...
