Online Multi-Agent Decentralized Byzantine-robust Gradient Estimation

In this paper, we propose an iterative scheme for distributed Byzantine-resilient estimation of the gradient of a black-box model. Our algorithm combines simultaneous perturbation, secure state estimation, and two-timescale stochastic approximation. We demonstrate the performance of the algorithm through numerical experiments.
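The simultaneous-perturbation building block mentioned in the abstract can be illustrated with a minimal sketch. The snippet below is not the paper's algorithm (which adds secure state estimation and a two-timescale scheme across agents); it only shows the classical SPSA-style estimator that each agent could apply locally: all coordinates of the query point are perturbed at once by a random Rademacher direction, so the gradient estimate costs two black-box evaluations regardless of dimension. The function name and parameters are illustrative assumptions.

```python
import numpy as np

def spsa_gradient(f, x, c=1e-2, rng=None):
    """One simultaneous-perturbation estimate of grad f at x.

    f   : black-box scalar function (illustrative assumption)
    x   : query point, 1-D numpy array
    c   : perturbation magnitude
    rng : optional numpy Generator for reproducibility
    """
    rng = np.random.default_rng() if rng is None else rng
    # Rademacher (+/-1) perturbation of every coordinate at once.
    delta = rng.choice([-1.0, 1.0], size=x.shape)
    # Two function evaluations suffice, independent of the dimension;
    # dividing by delta (entrywise) recovers a per-coordinate estimate.
    return (f(x + c * delta) - f(x - c * delta)) / (2 * c) / delta

# Usage: estimate the gradient of f(x) = ||x||^2 at x = (1, 2);
# a single estimate is noisy, but its expectation is the true gradient (2, 4).
x = np.array([1.0, 2.0])
g = spsa_gradient(lambda v: float(v @ v), x)
```

Averaging many such estimates (as a stochastic-approximation iteration does implicitly) drives the error down, which is what makes the two-evaluation cost attractive for high-dimensional black-box models.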


Related research

- 03/17/2019 · DSPG: Decentralized Simultaneous Perturbations Gradient Descent Scheme — "In this paper, we present an asynchronous approximate gradient method th..."
- 05/19/2021 · Distributionally Constrained Black-Box Stochastic Gradient Estimation and Optimization — "We consider stochastic gradient estimation using only black-box function..."
- 11/27/2015 · Gradient Estimation with Simultaneous Perturbation and Compressive Sensing — "This paper aims at achieving a "good" estimator for the gradient of a fu..."
- 08/10/2023 · Byzantine-Robust Decentralized Stochastic Optimization with Stochastic Gradient Noise-Independent Learning Error — "This paper studies Byzantine-robust stochastic optimization over a decen..."
- 03/10/2023 · Gaussian Max-Value Entropy Search for Multi-Agent Bayesian Optimization — "We study the multi-agent Bayesian optimization (BO) problem, where multi..."
- 03/08/2023 · Byzantine-Robust Loopless Stochastic Variance-Reduced Gradient — "Distributed optimization with open collaboration is a popular field sinc..."
- 07/27/2022 · INTERACT: Achieving Low Sample and Communication Complexities in Decentralized Bilevel Learning over Networks — "In recent years, decentralized bilevel optimization problems have receiv..."
