Deciding Differential Privacy of Online Algorithms with Multiple Variables

by   Rohit Chadha, et al.

We consider the problem of checking the differential privacy of online randomized algorithms that process a stream of inputs and produce an output for each input. This paper generalizes an automaton model called DiP automata (see arXiv:2104.14519) to describe such algorithms by allowing multiple real-valued storage variables. A DiP automaton is a parametric automaton whose behavior depends on the privacy budget ϵ. An automaton A is said to be differentially private if, for some 𝔇, it is 𝔇ϵ-differentially private for all values of ϵ>0. We give a precise characterization of the class of all differentially private DiP automata and show that the problem of deciding whether a given DiP automaton belongs to this class is PSPACE-complete. Our PSPACE algorithm also computes a value for 𝔇 when the given automaton is differentially private. The algorithm has been implemented, and we present experiments demonstrating its effectiveness.
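The kind of online algorithm the abstract refers to can be illustrated with a minimal sketch (not the paper's automaton model itself): a mechanism that reads a stream of real-valued inputs and releases each one perturbed with Laplace noise calibrated to the privacy budget ϵ. The function names and parameters below are hypothetical, chosen only to illustrate the setting; per-element release is ϵ-DP for that element, and the scaling of the overall guarantee with ϵ is what the paper's 𝔇ϵ condition captures.

```python
import math
import random

def laplace(scale, rng):
    # Inverse-CDF sampling of a Laplace(0, scale) random variable.
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def noisy_stream(inputs, epsilon, sensitivity=1.0, rng=None):
    """Release each stream element with Laplace noise of scale sensitivity/epsilon.

    Each released value is epsilon-DP with respect to its own input;
    over the whole stream the guarantee degrades by sequential composition.
    """
    rng = rng or random.Random()
    scale = sensitivity / epsilon
    return [x + laplace(scale, rng) for x in inputs]

# One output per input, as in the online model considered in the paper.
outputs = noisy_stream([1.0, 3.0, 2.0], epsilon=0.5, rng=random.Random(7))
print(outputs)
```

A DiP automaton generalizes this picture by allowing the algorithm's control flow and several real-valued storage variables to influence which noised quantities are released.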

