Coded Distributed Computing for Hierarchical Multi-task Learning

12/16/2022
by Haoyang Hu, et al.

In this paper, we consider a hierarchical distributed multi-task learning (MTL) system in which distributed users wish to jointly learn different models, orchestrated by a central server with the help of a layer of multiple relays. Since the users need to download different learning models in the downlink transmission, distributed MTL suffers from a more severe communication bottleneck than a single-task learning system. To address this issue, we propose a coded hierarchical MTL scheme that exploits the connection topology and introduces coding techniques to reduce communication loads. It is shown that the proposed scheme can significantly reduce the communication loads in both the uplink and downlink transmissions between relays and the server. Moreover, we provide information-theoretic lower bounds on the optimal uplink and downlink communication loads, and prove that the gaps between the achievable upper bounds and the lower bounds are within the minimum number of connected users among all relays. In particular, when the network connection topology can be carefully designed, the proposed scheme achieves the information-theoretic optimal communication loads. Experiments on real datasets show that our proposed scheme can reduce the overall training time by 17% compared to the conventional uncoded scheme.
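The abstract does not spell out the coding construction, but the downlink saving it describes is in the spirit of coded multicasting with side information. Below is a minimal toy sketch of that general principle, not the paper's actual scheme: it assumes (hypothetically) that each of three relays needs a distinct model update and already holds the other relays' updates as side information, so a single XOR multicast replaces three unicasts.

```python
import numpy as np

def xor_bytes(blocks):
    """Bitwise XOR of equal-length byte blocks."""
    out = np.zeros_like(blocks[0])
    for b in blocks:
        out = np.bitwise_xor(out, b)
    return out

# Toy setup (assumed for illustration): 3 relays, each needing a distinct
# model update, and each already holding the other two updates as side
# information (e.g. gathered during uplink aggregation).
rng = np.random.default_rng(0)
updates = [rng.integers(0, 256, size=8, dtype=np.uint8) for _ in range(3)]

# Uncoded downlink: 3 separate unicast transmissions (3 blocks sent).
# Coded downlink: one multicast of the XOR of all 3 updates (1 block sent).
coded = xor_bytes(updates)

# Each relay cancels its side information to recover its own update.
for k in range(3):
    side_info = [updates[j] for j in range(3) if j != k]
    recovered = xor_bytes([coded] + side_info)
    assert np.array_equal(recovered, updates[k])

print("Downlink load: uncoded = 3 blocks, coded multicast = 1 block")
```

In this toy example the downlink load drops from three transmissions to one; the paper's scheme obtains its gains by exploiting the relay-user connection topology rather than the idealized side-information pattern assumed here.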

