General Proximal Incremental Aggregated Gradient Algorithms: Better and Novel Results under General Scheme

10/11/2019
by Tao Sun, et al.

The incremental aggregated gradient algorithm is popular in network optimization and machine learning research. However, existing convergence results require the objective function to be strongly convex, and the established rates are limited to linear convergence. Moreover, the mathematical techniques used tie the stepsize to the strong convexity constant, which may be small and thus may force a very small stepsize. In this paper, we propose a general proximal incremental aggregated gradient algorithm that covers various existing algorithms, including the basic incremental aggregated gradient method. Better and new convergence results are proved even under this general scheme. The novel results presented in this paper, which have not appeared in previous literature, include: a general scheme, nonconvex analysis, sublinear convergence rates of the function values, much larger stepsizes that still guarantee convergence, convergence in the presence of noise, and a line search strategy for the proximal incremental aggregated gradient algorithm together with its convergence.
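As a rough illustration of the proximal incremental aggregated gradient iteration discussed in the abstract, the Python sketch below takes a proximal step along an aggregate of stored (possibly stale) component gradients. The least-squares-plus-l1 test problem, the cyclic component selection, the fixed stepsize alpha, and the helper names grad_i and prox_g are all assumptions made for illustration; the paper's general scheme additionally covers nonconvex objectives, inexact gradients (noise), and a line search, none of which are shown here.

```python
import numpy as np

# Minimal sketch of a proximal incremental aggregated gradient (PIAG) loop for
# F(x) = sum_i f_i(x) + g(x), with smooth components f_i and a nonsmooth g.
# The data, stepsize, and soft-thresholding prox are illustrative assumptions,
# not the paper's general scheme.

rng = np.random.default_rng(0)
n, d, lam, alpha = 20, 5, 0.1, 0.05       # components, dimension, l1 weight, stepsize
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

def grad_i(i, x):
    # gradient of f_i(x) = 0.5 * (a_i^T x - b_i)^2
    return A[i] * (A[i] @ x - b[i])

def prox_g(v, t):
    # proximal map of t * lam * ||.||_1 (soft thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)

x = np.zeros(d)
grad_table = np.array([grad_i(i, x) for i in range(n)])  # stored (possibly delayed) gradients
agg = grad_table.sum(axis=0)                             # aggregated gradient

for k in range(2000):
    i = k % n                              # cyclic component selection
    new_g = grad_i(i, x)                   # refresh only component i
    agg += new_g - grad_table[i]
    grad_table[i] = new_g
    x = prox_g(x - alpha * agg, alpha)     # proximal step with the aggregate

print("objective:", 0.5 * np.sum((A @ x - b) ** 2) + lam * np.abs(x).sum())
```

Only one stored gradient is refreshed per iteration, so the aggregate generally mixes gradients evaluated at different past iterates; the paper's analysis concerns how large a stepsize such delayed aggregates still tolerate.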

research 01/23/2018
On the complexity of convex inertial proximal algorithms
The inertial proximal gradient algorithm is efficient for the composite ...

research 07/01/2014
SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives
In this work we introduce a new optimisation method called SAGA in the s...

research 09/10/2023
Linear Speedup of Incremental Aggregated Gradient Methods on Streaming Data
This paper considers a type of incremental aggregated gradient (IAG) met...

research 01/17/2018
On the Proximal Gradient Algorithm with Alternated Inertia
In this paper, we investigate the attractive properties of the proximal ...

research 05/31/2018
On Curvature-aided Incremental Aggregated Gradient Methods
This paper studies an acceleration technique for incremental aggregated ...

research 10/18/2016
Analysis and Implementation of an Asynchronous Optimization Algorithm for the Parameter Server
This paper presents an asynchronous incremental aggregated gradient algo...

research 06/28/2023
Ordering for Non-Replacement SGD
One approach for reducing run time and improving efficiency of machine l...
