Distributed Linearly Separable Computation

07/01/2020
by Kai Wan, et al.

This paper formulates a distributed computation problem in which a master asks N distributed workers to compute a linearly separable function. The task function can be expressed as K_c linear combinations of K messages, where each message is a function of one dataset. Our objective is to find the optimal tradeoff between the computation cost (the number of datasets assigned to each worker) and the communication cost (the number of symbols the master must download), such that the master can recover the task function from the answers of any N_r out of the N workers. The formulated problem can be seen as a generalization of several existing problems, such as distributed gradient descent and distributed linear transform. In this paper, we consider the specific case where the computation cost is minimum, and propose novel converse and achievable bounds on the optimal communication cost. The proposed bounds coincide for some system parameters; when they do not match, we prove that the achievable distributed computing scheme is optimal under the constraint of a widely used 'cyclic assignment' of the datasets. Our results also show that when K = N, with the same communication cost as the optimal distributed gradient descent coding scheme proposed by Tandon et al., from which the master recovers one linear combination of the K messages, our proposed scheme lets the master recover any additional N_r - 1 independent linear combinations of the messages with high probability.
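As a concrete illustration of the minimum-computation-cost regime discussed above, the Python sketch below builds the cyclic assignment for the case K = N and checks the coverage property that makes recovery from any N_r answers possible: every dataset is replicated on N - N_r + 1 consecutive workers, so any N_r workers jointly hold all K datasets. The function names and the parameter choice N = 6, N_r = 4 are illustrative assumptions, not details taken from the paper.

```python
# Sketch of the cyclic dataset assignment (an assumption-laden
# illustration; the paper's scheme also specifies what each worker
# computes and sends, which is omitted here).
from itertools import combinations

def cyclic_assignment(N, N_r):
    """Assign K = N datasets to N workers cyclically.

    Worker w stores the N - N_r + 1 consecutive datasets
    w, w+1, ..., w + N - N_r (mod N); this is the minimum
    computation cost per worker for recovery threshold N_r.
    """
    width = N - N_r + 1
    return [{(w + j) % N for j in range(width)} for w in range(N)]

def covers_all(assignment, workers, K):
    """True if the given workers jointly hold all K datasets."""
    held = set().union(*(assignment[w] for w in workers))
    return len(held) == K

N, N_r = 6, 4  # hypothetical parameters for the demo
assignment = cyclic_assignment(N, N_r)

# Every dataset appears on N - N_r + 1 workers, so dropping any
# N - N_r workers (stragglers) still leaves each dataset covered:
assert all(covers_all(assignment, subset, N)
           for subset in combinations(range(N), N_r))
print(assignment)
```

The replication factor N - N_r + 1 is exactly the straggler tolerance required: with N - N_r non-responding workers, each dataset still reaches the master through at least one surviving worker, which is what allows the gradient coding scheme of Tandon et al. to tolerate N - N_r stragglers.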


Related research:

On the Tradeoff Between Computation and Communication Costs for Distributed Linearly Separable Computation (10/04/2020)
This paper studies the distributed linearly separable computation proble...

On Gradient Coding with Partial Recovery (02/19/2021)
We consider a generalization of the recently proposed gradient coding fr...

Heterogeneity-aware Gradient Coding for Straggler Tolerance (01/27/2019)
Gradient descent algorithms are widely used in machine learning. In orde...

Hierarchical Online Convex Optimization (06/25/2021)
We consider online convex optimization (OCO) over a heterogeneous networ...

Optimal Communication-Computation Trade-Off in Heterogeneous Gradient Coding (03/02/2021)
Gradient coding allows a master node to derive the aggregate of the part...

On Secure Distributed Linearly Separable Computation (02/01/2021)
Distributed linearly separable computation, where a user asks some distr...

Coded Distributed Computing over Packet Erasure Channels (01/11/2019)
Coded computation is a framework which provides redundancy in distribute...
