Bristle: Decentralized Federated Learning in Byzantine, Non-i.i.d. Environments

10/21/2021
by Joost Verbraeken, et al.

Federated learning (FL) is a privacy-friendly type of machine learning where devices locally train a model on their private data and typically communicate model updates with a server. In decentralized FL (DFL), peers communicate model updates with each other instead. However, DFL is challenging since (1) the training data possessed by different peers is often non-i.i.d. (i.e., distributed differently between the peers) and (2) malicious, or Byzantine, attackers can share arbitrary model updates with other peers to subvert the training process. We address these two challenges and present Bristle, middleware between the learning application and the decentralized network layer. Bristle leverages transfer learning to predetermine and freeze the non-output layers of a neural network, significantly speeding up model training and lowering communication costs. To securely update the output layer with model updates from other peers, we design a fast distance-based prioritizer and a novel performance-based integrator. Their combined effect results in high resilience to Byzantine attackers and the ability to handle non-i.i.d. classes. We empirically show that Bristle converges to a consistent 95% accuracy in Byzantine environments, outperforming all evaluated baselines. In non-Byzantine environments, Bristle requires 83% fewer iterations compared to state-of-the-art methods. We show that when the training classes are non-i.i.d., Bristle significantly outperforms the accuracy of the most Byzantine-resilient baselines by 2.3x while reducing communication costs by 90%.
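The two output-layer defenses described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names `prioritize_by_distance` and `integrate`, the choice of Euclidean distance, and the `evaluate` callback (a stand-in for accuracy on a local held-out set) are all assumptions for the sake of the example.

```python
import numpy as np

def prioritize_by_distance(own_weights, peer_updates, k):
    """Distance-based prioritizer sketch: rank incoming output-layer
    updates by Euclidean distance to our own weights and keep the k
    closest, so wildly divergent (potentially Byzantine) updates are
    considered last or dropped."""
    distances = [np.linalg.norm(u - own_weights) for u in peer_updates]
    order = np.argsort(distances)
    return [peer_updates[i] for i in order[:k]]

def integrate(own_weights, selected_updates, evaluate):
    """Performance-based integrator sketch: accept an update only if it
    scores at least as well as our current weights under `evaluate`
    (e.g. accuracy on local held-out data), then average the survivors
    with our own weights."""
    accepted = [u for u in selected_updates
                if evaluate(u) >= evaluate(own_weights)]
    if not accepted:
        return own_weights
    return np.mean(np.vstack([own_weights] + accepted), axis=0)
```

A peer would first call `prioritize_by_distance` on the updates it received, then pass the survivors to `integrate`; updates that are far from the local model or that degrade held-out performance never reach the averaging step.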
