Zeno++: robust asynchronous SGD with arbitrary number of Byzantine workers

03/17/2019
by Cong Xie, et al.

We propose Zeno++, a new robust asynchronous Stochastic Gradient Descent (SGD) procedure under a general Byzantine failure model with an unbounded number of Byzantine workers.
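The abstract describes filtering asynchronously received gradients so that arbitrarily many Byzantine workers cannot corrupt the model. A minimal sketch of one such validation-based filter is below; the score function, its parameters (`rho`, `threshold`), and the toy server loop are illustrative assumptions, not the paper's exact rule: a candidate gradient is scored by its alignment with a gradient computed on a small trusted validation set, penalized by its magnitude, and applied only if the score clears a threshold.

```python
import numpy as np

def score(g, v, rho=0.01):
    """Hypothetical descent score for a candidate gradient g against a
    validation gradient v: reward alignment with v, penalize magnitude.
    Byzantine (adversarial or garbage) gradients tend to score low."""
    return float(np.dot(v, g) - rho * np.dot(g, g))

def accept(g, v, rho=0.01, threshold=0.0):
    """Accept an asynchronously received worker gradient only if its
    score clears the threshold; rejected updates are simply dropped."""
    return score(g, v, rho) >= threshold

def server_step(w, g, v, lr=0.1):
    """Apply a worker's gradient to the model w only when it passes the
    filter; otherwise leave w unchanged (toy synchronization-free loop)."""
    if accept(g, v):
        return w - lr * g
    return w

# Toy example: validation gradient points along the first axis.
v = np.array([1.0, 0.0])
honest = np.array([1.0, 0.1])   # roughly aligned with v -> accepted
byzantine = np.array([-5.0, 0.0])  # opposes v -> rejected
```

With this filter the server never needs to trust a majority of workers: each update is vetted individually, which is why the failure model can tolerate an arbitrary number of Byzantine workers.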


Related research

05/25/2018 - Zeno: Byzantine-suspicious stochastic gradient descent
We propose Zeno, a new robust aggregation rule, for distributed synchron...

01/16/2023 - A Robust Classification Framework for Byzantine-Resilient Stochastic Gradient Descent
This paper proposes a Robust Gradient Classification Framework (RGCF) fo...

02/22/2018 - Asynchronous Byzantine Machine Learning
Asynchronous distributed machine learning solutions have proven very eff...

03/08/2017 - Byzantine-Tolerant Machine Learning
The growth of data, the need for scalability and the complexity of model...

06/08/2019 - Making Asynchronous Stochastic Gradient Descent Work for Transformers
Asynchronous stochastic gradient descent (SGD) is attractive from a spee...

03/02/2020 - BASGD: Buffered Asynchronous SGD for Byzantine Learning
Distributed learning has become a hot research topic, due to its wide ap...

02/27/2019 - Distributed Byzantine Tolerant Stochastic Gradient Descent in the Era of Big Data
The recent advances in sensor technologies and smart devices enable the ...
