A Control Theoretic Framework for Adaptive Gradient Optimizers in Machine Learning

06/04/2022
by Kushal Chakrabarti, et al.

Adaptive gradient methods have become popular for optimizing deep neural networks; prominent examples include AdaGrad and Adam. Although Adam usually converges faster, variants of Adam, such as the AdaBelief algorithm, have been proposed to address Adam's poor generalization compared with the classical stochastic gradient method. This paper develops a generic framework for adaptive gradient methods that solve non-convex optimization problems. We first model adaptive gradient methods in a state-space framework, which yields simpler convergence proofs for adaptive optimizers such as AdaGrad, Adam, and AdaBelief. We then use the transfer-function paradigm from classical control theory to propose a new variant of Adam, coined AdamSSM, obtained by adding an appropriate pole-zero pair to the transfer function from the squared gradients to the second-moment estimate. We prove convergence of the proposed AdamSSM algorithm. Experiments on benchmark machine learning tasks, image classification with CNN architectures and language modeling with an LSTM architecture, demonstrate that AdamSSM improves the trade-off between generalization accuracy and convergence speed relative to recent adaptive gradient methods.
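To make the filtering viewpoint concrete, the sketch below contrasts Adam's second-moment update, which is a first-order low-pass filter of the squared gradients, with an AdamSSM-style update in which the squared-gradient channel is routed through one additional filter state, adding a pole-zero pair to the transfer function from the squared gradient to the second-moment estimate. This is a minimal illustrative sketch: the coefficient `alpha`, the 0.5 mixing weight, and the function name `adam_ssm_step` are assumptions for exposition, not the exact design or parameter values from the paper.

```python
import numpy as np


def adam_ssm_step(param, grad, state, lr=1e-3, beta1=0.9, beta2=0.999,
                  alpha=0.9, eps=1e-8):
    """One AdamSSM-style update (illustrative sketch, not the paper's exact rule).

    Adam filters squared gradients with the first-order filter
    v_t = beta2 * v_{t-1} + (1 - beta2) * g_t^2.  Here the squared gradient
    additionally passes through the internal state `w`, so the transfer
    function from g_t^2 to v_t gains an extra pole (at `alpha`) and a zero.
    """
    m, v, w, t = state
    t += 1
    m = beta1 * m + (1.0 - beta1) * grad            # first-moment estimate, as in Adam
    w = alpha * w + (1.0 - alpha) * grad ** 2       # extra filter state (assumed form)
    # The 0.5 keeps the DC gain of the squared-gradient channel at 1, like Adam's filter.
    v = beta2 * v + (1.0 - beta2) * 0.5 * (grad ** 2 + w)
    m_hat = m / (1.0 - beta1 ** t)                  # Adam-style bias correction
    v_hat = v / (1.0 - beta2 ** t)                  # (approximate for the modified filter)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, (m, v, w, t)


# Usage on a toy quadratic f(x) = ||x||^2 / 2, whose gradient is x itself.
x = np.array([1.0, -2.0])
state = (np.zeros_like(x), np.zeros_like(x), np.zeros_like(x), 0)
for _ in range(200):
    x, state = adam_ssm_step(x, x, state, lr=0.05)
print(x)  # approaches the minimizer at the origin
```

Setting `alpha = 0` and dropping the extra state recovers Adam's usual second-moment filter, which is how the sketch relates the state-space view back to the baseline method.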

