RSA: Byzantine-Robust Stochastic Aggregation Methods for Distributed Learning from Heterogeneous Datasets

11/09/2018
by Liping Li, et al.

In this paper, we propose a class of robust stochastic subgradient methods for distributed learning from heterogeneous datasets in the presence of an unknown number of Byzantine workers. During the learning process, Byzantine workers may send arbitrary incorrect messages to the master due to data corruption, communication failures, or malicious attacks, and consequently bias the learned model. The key to the proposed methods is a regularization term incorporated into the objective function so as to robustify the learning task and mitigate the negative effects of Byzantine attacks. The resultant subgradient-based algorithms are termed Byzantine-Robust Stochastic Aggregation methods, hence the acronym RSA used henceforth. In contrast to most existing algorithms, RSA does not rely on the assumption that the data are independent and identically distributed (i.i.d.) across the workers, and hence suits a wider class of applications. Theoretically, we show that: i) RSA converges to a near-optimal solution, with a learning error that depends on the number of Byzantine workers; ii) the convergence rate of RSA under Byzantine attacks matches that of stochastic gradient descent in the attack-free setting. Numerically, experiments on a real dataset corroborate the competitive performance of RSA and its reduced complexity compared to state-of-the-art alternatives.
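To make the regularization idea concrete, the sketch below implements an l1-regularized, RSA-style update on a synthetic least-squares problem: each regular worker takes a stochastic subgradient step on its local loss plus a penalty lambda*||x_i - x_0||_1, while the master descends the same penalty summed over all received models, so that the sign() nonlinearity bounds the influence of any single (possibly Byzantine) message. This is only an illustrative sketch under assumed names and a toy loss (local_grad, lambda_reg, the Gaussian Byzantine attack), not the authors' reference implementation.

```python
# Hedged sketch of an l1-regularized, RSA-style aggregation scheme.
# The toy least-squares loss, the Gaussian Byzantine attack, and all
# names here are illustrative assumptions, not the paper's exact code.
import numpy as np

rng = np.random.default_rng(0)
d, n_workers, n_byz = 5, 10, 2          # dimension, total workers, Byzantine workers
lambda_reg, lr = 0.1, 0.01              # penalty weight and step size

# Heterogeneous (non-i.i.d.) local data: each worker holds its own (A_i, b_i).
A = [rng.normal(size=(20, d)) for _ in range(n_workers)]
x_true = rng.normal(size=d)
b = [A_i @ x_true + 0.1 * rng.normal(size=20) for A_i in A]

def local_grad(i, x):
    """Stochastic gradient of worker i's least-squares loss on one random sample."""
    j = rng.integers(len(b[i]))
    a_j = A[i][j]
    return (a_j @ x - b[i][j]) * a_j

x_master = np.zeros(d)
x_workers = [np.zeros(d) for _ in range(n_workers)]

for t in range(2000):
    msgs = []
    for i in range(n_workers):
        if i < n_byz:
            # Byzantine worker: sends an arbitrary vector instead of its model.
            msgs.append(rng.normal(scale=10.0, size=d))
        else:
            # Regular worker: subgradient step on local loss + lambda*||x_i - x_0||_1.
            g = local_grad(i, x_workers[i]) + lambda_reg * np.sign(x_workers[i] - x_master)
            x_workers[i] = x_workers[i] - lr * g
            msgs.append(x_workers[i])
    # Master: subgradient step on lambda * sum_i ||x_0 - x_i||_1; the sign()
    # caps each worker's contribution, limiting the damage a Byzantine message can do.
    g0 = lambda_reg * sum(np.sign(x_master - m) for m in msgs)
    x_master = x_master - lr * g0

print("distance to x_true:", np.linalg.norm(x_master - x_true))
```

Despite two workers sending large random vectors at every step, the master's model stays close to the regular workers' consensus because each message enters the master update only through its sign.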


Related research

09/17/2020
Byzantine-Robust Variance-Reduced Federated Learning over Distributed Non-i.i.d. Data
We propose a Byzantine-robust variance-reduced stochastic gradient desce...

06/16/2020
Byzantine-Robust Learning on Heterogeneous Datasets via Resampling
In Byzantine robust distributed optimization, a central server wants to ...

06/13/2021
Stochastic Alternating Direction Method of Multipliers for Byzantine-Robust Distributed Learning
This paper aims to solve a distributed learning problem under Byzantine ...

06/09/2022
Byzantine-Resilient Decentralized Stochastic Optimization with Robust Aggregation Rules
This work focuses on decentralized stochastic optimization in the presen...

10/14/2019
Election Coding for Distributed Learning: Protecting SignSGD against Byzantine Attacks
Recent advances in large-scale distributed learning algorithms have enab...

05/23/2023
On the Optimal Batch Size for Byzantine-Robust Distributed Learning
Byzantine-robust distributed learning (BRDL), in which computing devices...

03/08/2023
Byzantine-Robust Loopless Stochastic Variance-Reduced Gradient
Distributed optimization with open collaboration is a popular field sinc...
