Unified analysis of SGD-type methods

03/29/2023
by Eduard Gorbunov, et al.

This note focuses on a simple approach to the unified analysis of SGD-type methods from (Gorbunov et al., 2020) for strongly convex, smooth optimization problems. It discusses the similarities in the analyses of different stochastic first-order methods, along with existing extensions of the framework, and also mentions the limitations of the analysis and several alternative approaches.
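As a point of reference for what such analyses guarantee: for strongly convex, smooth problems, SGD-type methods with a constant step size typically converge linearly up to a neighborhood whose size is governed by the gradient noise. The toy sketch below (not code from the note or from (Gorbunov et al., 2020)) runs vanilla SGD on a strongly convex quadratic with additive Gaussian gradient noise; the matrix, step size, and noise level are illustrative assumptions chosen only to make that behavior visible.

```python
import numpy as np

# Toy illustration, not the paper's framework: vanilla SGD with a constant
# step size on a strongly convex quadratic f(x) = 0.5 * x^T A x, with
# additive Gaussian gradient noise.
rng = np.random.default_rng(0)

d = 10
eigs = np.linspace(1.0, 10.0, d)            # spectrum in [mu, L], mu = 1, L = 10
A = np.diag(eigs)
x_star = np.zeros(d)                        # minimizer of f
gamma = 0.05                                # constant step size (<= 1/L)
sigma = 0.5                                 # illustrative gradient-noise level
x = rng.normal(size=d)                      # starting point

for k in range(2001):
    g = A @ x + sigma * rng.normal(size=d)  # unbiased stochastic gradient g^k
    x = x - gamma * g                       # SGD step: x^{k+1} = x^k - gamma * g^k
    if k % 500 == 0:
        print(k, np.linalg.norm(x - x_star) ** 2)

# The squared distance to the minimizer first shrinks geometrically and then
# stagnates at a level of order gamma * sigma^2 / mu: linear convergence up to
# a noise-dominated neighborhood, as in standard strongly convex SGD analyses.
```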

Related research

- 06/20/2020: Unified Analysis of Stochastic Gradient Methods for Composite Convex and Smooth Optimization. "We present a unified theorem for the convergence analysis of stochastic ..."
- 06/08/2022: A Unified Convergence Theorem for Stochastic Optimization Methods. "In this work, we provide a fundamental unified convergence theorem used ..."
- 12/08/2012: Stochastic Gradient Descent for Non-smooth Optimization: Convergence Results and Optimal Averaging Schemes. "Stochastic Gradient Descent (SGD) is one of the simplest and most popula..."
- 10/27/2019: Improved Zeroth-Order Variance Reduced Algorithms and Analysis for Nonconvex Optimization. "Two types of zeroth-order stochastic algorithms have recently been desig..."
- 10/25/2020: Local SGD for Saddle-Point Problems. "GAN is one of the most popular and commonly used neural network models. ..."
- 11/03/2020: Local SGD: Unified Theory and New Efficient Methods. "We present a unified framework for analyzing local SGD methods in the co..."
- 07/06/2021: Unifying Width-Reduced Methods for Quasi-Self-Concordant Optimization. "We provide several algorithms for constrained optimization of a large cl..."
