The Privacy-Utility Tradeoff of Robust Local Differential Privacy
We consider data release protocols for data X=(S,U), where S is sensitive: the goal is to release data Y that retains as much information about X as possible, measured by the mutual information I(X;Y), while limiting the leakage about S. To quantify privacy we introduce the Robust Local Differential Privacy (RLDP) framework. This framework relies on the underlying distribution of the data, which must be estimated from the available data. A robust privacy guarantee ensures privacy for every distribution in a given set ℱ, for which we study two cases: ℱ is the set of all distributions, and ℱ is a confidence set arising from a χ² test on a publicly available dataset. In the former case we introduce a new release protocol, which we prove to be optimal in the low-privacy regime. In the latter case we present four algorithms that construct RLDP protocols from a given dataset. One of these approximates ℱ by a polytope and uses results from robust optimisation to yield high-utility release protocols; however, it relies on vertex enumeration and becomes computationally intractable for large input spaces. The other three algorithms are low-complexity and build on randomised response. Experiments verify that all four algorithms offer significantly improved utility over regular LDP.
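As a concrete illustration of the objects described above, the sketch below computes the utility I(X;Y) of a release protocol from an input distribution and a channel matrix, and instantiates the channel of k-ary randomised response, the baseline that three of the algorithms build on. This is a minimal sketch, not the paper's RLDP construction: the function names and parameters (k, eps, p_x) are illustrative, and it assumes finite alphabets with the protocol given as a row-stochastic matrix Q[x, y].

```python
import numpy as np

def mutual_information(p_x, Q):
    """Utility I(X;Y) in nats: p_x is the input distribution over x,
    Q[x, y] is the release channel (rows sum to 1)."""
    p_xy = p_x[:, None] * Q              # joint distribution p(x, y)
    p_y = p_xy.sum(axis=0)               # output marginal p(y)
    mask = p_xy > 0                      # skip impossible pairs to avoid log(0)
    ratio = p_xy[mask] / (p_x[:, None] * p_y[None, :])[mask]
    return float((p_xy[mask] * np.log(ratio)).sum())

def randomised_response_channel(k, eps):
    """Channel of k-ary randomised response under eps-LDP: report the
    true symbol with probability e^eps / (e^eps + k - 1), otherwise a
    uniformly chosen other symbol."""
    denom = np.exp(eps) + k - 1
    Q = np.full((k, k), 1.0 / denom)     # off-diagonal: flip to another symbol
    np.fill_diagonal(Q, np.exp(eps) / denom)
    return Q

# Illustrative evaluation with a uniform input distribution
k, eps = 4, 1.0
Q = randomised_response_channel(k, eps)
p_x = np.full(k, 1.0 / k)
print(f"I(X;Y) = {mutual_information(p_x, Q):.4f} nats")
```

A fixed p_x suffices for an ordinary LDP analysis; under the robust guarantee, the protocol must instead satisfy the privacy constraint for every distribution in ℱ, which is what the four algorithms construct from data.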