Compression for Distributed Optimization and Timely Updates

01/11/2023
by Prathamesh Mayekar, et al.

The goal of this thesis is to systematically study the compression problems that arise in distributed computing.

In the first part of the thesis, we study gradient compression for distributed first-order optimization. We begin by establishing information-theoretic lower bounds on optimization accuracy when only finite-precision gradients are available. We then develop fast quantizers for gradient compression which, when used with standard first-order optimization algorithms, match these lower bounds.

In the second part, we study distributed mean estimation, an important primitive for distributed optimization algorithms. We develop efficient estimators that improve over the state of the art by exploiting the side information available at the center. We also revisit the Gaussian rate-distortion problem and develop efficient quantizers for it, both with and without side information.

Finally, we study entropic compression of the symbols transmitted by edge devices to the center, a setting that commonly arises in cyber-physical systems. Our goal is to design entropic compression schemes that deliver information in a 'timely' manner, so that the center always has access to the latest information for computation. We characterize the structure of the optimal entropic compression scheme and, using this structure, develop efficient algorithms to compute it.
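As a rough illustration of the kind of gradient compression discussed above (not the thesis's specific quantizers), the sketch below applies unbiased stochastic uniform quantization to a gradient vector before transmission; the function name, level count, and grid construction are illustrative assumptions.

```python
import numpy as np

def stochastic_quantize(g, levels=16, rng=None):
    """Quantize each coordinate of g to a uniform grid of `levels` points
    spanning [g.min(), g.max()], rounding up or down at random so that
    the quantized vector is an unbiased estimate of g.
    (Illustrative sketch, not the thesis's quantizer.)"""
    rng = rng or np.random.default_rng()
    lo, hi = g.min(), g.max()
    if hi == lo:                       # constant vector: nothing to quantize
        return g.copy()
    step = (hi - lo) / (levels - 1)
    scaled = (g - lo) / step           # real-valued position on the grid
    floor = np.floor(scaled)
    prob_up = scaled - floor           # round up with this probability
    rounded = floor + (rng.random(g.shape) < prob_up)
    return lo + rounded * step

rng = np.random.default_rng(0)
g = rng.normal(size=1000)
# Averaging many independent quantizations approximately recovers g,
# reflecting the unbiasedness of stochastic rounding.
avg = np.mean([stochastic_quantize(g, rng=rng) for _ in range(2000)], axis=0)
```

Each quantized coordinate can be encoded with only log2(levels) bits once the range (lo, hi) is shared, which is the basic communication saving that gradient-compression schemes exploit.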

