Byzantine-Robust Loopless Stochastic Variance-Reduced Gradient

03/08/2023
by Nikita Fedin, et al.

Distributed optimization with open collaboration is a popular research area, since it allows small groups, companies, universities, and individuals to jointly solve huge-scale problems. However, standard optimization algorithms are fragile in such settings due to the possible presence of so-called Byzantine workers: participants that can send (intentionally or not) incorrect information instead of what the protocol prescribes (e.g., an anti-gradient instead of a stochastic gradient). The problem of designing distributed methods with provable robustness to Byzantine workers has therefore received a lot of attention recently. In particular, several works consider a promising way to achieve Byzantine tolerance by combining variance reduction with robust aggregation. Existing approaches use SAGA- and SARAH-type variance-reduced estimators, while another popular estimator, SVRG, has not been studied in the context of Byzantine robustness. In this work, we close this gap in the literature and propose a new method: Byzantine-Robust Loopless Stochastic Variance-Reduced Gradient (BR-LSVRG). We derive non-asymptotic convergence guarantees for the new method in the strongly convex case and compare its performance with existing approaches in numerical experiments.
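The abstract describes the ingredients of BR-LSVRG only at a high level. The sketch below is a minimal illustration, not the paper's exact algorithm: it combines a loopless-SVRG gradient estimator (reference point refreshed with probability p instead of on a fixed inner-loop schedule) with coordinate-wise-median aggregation on a synthetic least-squares problem. The attack model (scaled anti-gradients), the step size gamma, the refresh probability p, and all names are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 5, 20                              # model dimension, samples per worker
n_workers, n_byz = 10, 3                  # total workers, Byzantine workers (honest majority)
A = rng.standard_normal((n_workers, m, d))
b = rng.standard_normal((n_workers, m))

def stoch_grad(w, x, j):
    """Gradient of the j-th summand of worker w's least-squares loss."""
    a = A[w, j]
    return a * (a @ x - b[w, j])

def full_grad(w, x):
    """Full local gradient of worker w's loss (1/2m) * ||A_w x - b_w||^2."""
    return A[w].T @ (A[w] @ x - b[w]) / m

x = np.zeros(d)
ref = x.copy()                            # shared reference point of the loopless SVRG estimator
ref_grads = np.stack([full_grad(w, ref) for w in range(n_workers)])
gamma, p = 0.05, 0.1                      # step size and refresh probability (assumed values)

for step in range(500):
    msgs = np.empty((n_workers, d))
    for w in range(n_workers):
        j = rng.integers(m)
        # L-SVRG estimator: unbiased, with variance vanishing as x approaches the optimum
        g = stoch_grad(w, x, j) - stoch_grad(w, ref, j) + ref_grads[w]
        msgs[w] = -10.0 * g if w < n_byz else g   # Byzantine workers send scaled anti-gradients
    x = x - gamma * np.median(msgs, axis=0)       # coordinate-wise median as a robust aggregator
    if rng.random() < p:                          # loopless: refresh reference point randomly,
        ref = x.copy()                            # with no fixed-length inner loop
        ref_grads = np.stack([full_grad(w, ref) for w in range(n_workers)])
```

Note that a coordinate-wise median is robust only while honest workers form a majority; the aggregator and assumptions used in the paper itself may differ.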

