Near Optimal Private and Robust Linear Regression

01/30/2023
by   Xiyang Liu, et al.

We study the canonical statistical estimation problem of linear regression from n i.i.d. examples under (ε,δ)-differential privacy when some response variables are adversarially corrupted. We propose a variant of the popular differentially private stochastic gradient descent (DP-SGD) algorithm with two innovations: full-batch gradient descent to improve sample complexity, and a novel adaptive clipping to guarantee robustness. When there is no adversarial corruption, this algorithm improves upon the existing state of the art and achieves near-optimal sample complexity. Under label corruption, this is the first efficient linear regression algorithm that guarantees both (ε,δ)-DP and robustness. Experiments on synthetic data confirm that our approach outperforms existing methods.
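For concreteness, here is a minimal Python sketch of the ingredients the abstract names: full-batch gradient descent over all n examples, per-example gradient clipping, and Gaussian noise for (ε,δ)-DP. The decaying clipping schedule, the rough composition-based noise calibration, and the function name dp_full_batch_gd are illustrative assumptions, not the paper's exact adaptive rule or privacy accounting.

```python
import numpy as np

def dp_full_batch_gd(X, y, epsilon, delta, T=50, lr=0.5, clip0=1.0):
    """Sketch of full-batch DP gradient descent for linear regression:
    per-example gradient clipping followed by Gaussian noise.

    ASSUMPTIONS: the decaying clipping schedule and the naive
    composition-style noise calibration below are stand-ins; the paper
    chooses the clipping threshold adaptively and uses tighter accounting.
    """
    n, d = X.shape
    theta = np.zeros(d)
    # Rough Gaussian-mechanism calibration over T releases; a real
    # implementation would use a proper privacy accountant.
    noise_mult = np.sqrt(2.0 * T * np.log(1.25 / delta)) / epsilon
    for t in range(T):
        # Per-example gradients of 0.5 * (x_i . theta - y_i)^2, shape (n, d).
        grads = (X @ theta - y)[:, None] * X
        clip = clip0 / np.sqrt(t + 1)  # assumed decaying clip threshold
        norms = np.linalg.norm(grads, axis=1, keepdims=True)
        clipped = grads * np.minimum(1.0, clip / np.maximum(norms, 1e-12))
        # The averaged clipped gradient has L2 sensitivity clip / n, so
        # per-coordinate Gaussian noise scales as noise_mult * clip / n.
        noisy_grad = clipped.mean(axis=0) + np.random.normal(
            0.0, noise_mult * clip / n, size=d)
        theta -= lr * noisy_grad
    return theta
```

Clipping caps each example's contribution to the averaged gradient at clip/n, which is the same lever that bounds the DP sensitivity and limits the damage any single corrupted label can do.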


