Push–Pull with Device Sampling

06/08/2022
by Yu-Guan Hsieh et al.

We consider decentralized optimization problems in which a number of agents collaborate to minimize the average of their local functions by exchanging information over an underlying communication graph. Specifically, we place ourselves in an asynchronous model where only a random portion of nodes perform computation at each iteration, while information exchange may take place between all nodes and in an asymmetric fashion. For this setting, we propose an algorithm that combines gradient tracking with variance reduction over the entire network. This enables each node to track the average of the gradients of the objective functions. Our theoretical analysis shows that, when the local objective functions are strongly convex, the algorithm converges linearly under mild connectivity conditions on the expected mixing matrices. In particular, our result does not require the mixing matrices to be doubly stochastic. In the experiments, we investigate a broadcast mechanism that transmits information from computing nodes to their neighbors, and confirm the linear convergence of our method on both synthetic and real-world datasets.
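Below is a minimal, self-contained sketch (in NumPy) of the kind of push-pull gradient-tracking update with random device sampling that the abstract describes, run on a toy least-squares problem. It is not the authors' reference implementation: the graph, the mixing matrices R (row-stochastic) and C (column-stochastic), the step size alpha, and the sampling probability p are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 3                            # number of agents, problem dimension

# Local strongly convex objectives f_i(x) = 0.5 * ||A_i x - b_i||^2.
A = [rng.standard_normal((8, d)) for _ in range(n)]
b = [rng.standard_normal(8) for _ in range(n)]

def grad(i, x):
    return A[i].T @ (A[i] @ x - b[i])

# Directed ring with self-loops plus one extra edge, so the normalized
# mixing matrices are not doubly stochastic.
W = np.eye(n) + np.eye(n, k=1)
W[-1, 0] = 1.0
W[0, 2] = 1.0
R = W / W.sum(axis=1, keepdims=True)   # row-stochastic: "pull" on decision variables
C = W / W.sum(axis=0, keepdims=True)   # column-stochastic: "push" on gradient trackers

alpha, p = 0.01, 0.5                   # illustrative step size and sampling probability
x = rng.standard_normal((n, d))        # one decision variable per agent (rows)
g = np.array([grad(i, x[i]) for i in range(n)])  # last gradient computed by each agent
y = g.copy()                           # gradient trackers

for _ in range(5000):
    x = R @ (x - alpha * y)            # pull step: mix neighbors' tentative iterates
    active = rng.random(n) < p         # only a random subset of devices computes
    g_new = g.copy()
    for i in np.where(active)[0]:
        g_new[i] = grad(i, x[i])       # sampled devices refresh their local gradient
    y = C @ y + (g_new - g)            # push step: keep tracking the average gradient
    g = g_new

# Compare against the minimizer of the average objective.
x_star = np.linalg.solve(sum(Ai.T @ Ai for Ai in A),
                         sum(Ai.T @ bi for Ai, bi in zip(A, b)))
print("max deviation from the optimum:", np.abs(x - x_star).max())
```

The column-stochastic push step preserves the sum of the trackers, so the trackers keep following the running average of the most recently computed local gradients even when only a random subset of devices is active in a given round.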
