Differentially Private Obfuscation Mechanisms for Hiding Probability Distributions

12/03/2018
by Yusuke Kawamoto, et al.

We propose a formal model for the privacy of user attributes in terms of differential privacy. In particular, we introduce a notion, called distribution privacy, as differential privacy for probability distributions. Roughly speaking, a local obfuscation mechanism with distribution privacy perturbs each individual input so that an attacker cannot gain significant information about the probability distribution of the inputs by observing an output of the mechanism. We then show that existing local obfuscation mechanisms have only a limited effect on distribution privacy. For instance, we prove that, to provide distribution privacy w.r.t. the approximate max-divergence (resp. f-divergence), the amount of noise added by the Laplace mechanism must be proportional to the ∞-Wasserstein (resp. Earth mover's) distance between the two distributions we want to make indistinguishable. To provide a stronger level of distribution privacy, we introduce an obfuscation mechanism, called the tupling mechanism, that perturbs a given input and adds random dummy data. We then apply the tupling mechanism to the protection of user attributes in location-based services, and demonstrate by experiments that it outperforms popular local (extended) differentially private mechanisms in terms of distribution privacy and utility. Finally, we discuss the relationships among utility, privacy, and the cost of adding dummy data.
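To make the idea concrete, here is a minimal sketch of a tupling-style mechanism as described in the abstract: perturb the real input, add random dummy values, and output the shuffled tuple. This is an illustrative assumption, not the paper's exact construction; the parameter names, the Laplace perturbation, and the uniform choice of dummies are all assumptions made for the sketch.

```python
import math
import random

def laplace_noise(scale, rng):
    # Inverse-CDF sampling of Laplace(0, scale) noise.
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def tupling_mechanism(x, k, domain, scale, rng):
    """Sketch: perturb the real input x with Laplace noise, draw k dummy
    values uniformly from the domain, and output the shuffled (k+1)-tuple
    so an observer cannot tell which entry came from the real input."""
    perturbed = x + laplace_noise(scale, rng)
    dummies = [rng.uniform(*domain) for _ in range(k)]
    out = dummies + [perturbed]
    rng.shuffle(out)
    return tuple(out)

# Example: obfuscate a 1-D location on [0, 100] with 3 dummies.
rng = random.Random(0)
obfuscated = tupling_mechanism(42.0, k=3, domain=(0.0, 100.0), scale=5.0, rng=rng)
print(len(obfuscated))  # 4 entries; one is the noisy input, three are dummies
```

Intuitively, the dummies hide which observed value reflects the true input, so the empirical distribution of outputs reveals less about the distribution of inputs than a single perturbed value would.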

Related research

12/03/2018 · Local Obfuscation Mechanisms for Hiding Probability Distributions
We introduce a formal model for the information leakage of probability d...

07/13/2019 · Local Distribution Obfuscation via Probability Coupling
We introduce a general model for the local obfuscation of probability di...

01/28/2019 · Utility Preserving Secure Private Data Release
Differential privacy mechanisms that also make reconstruction of the dat...

07/27/2020 · Learning discrete distributions: user vs item-level privacy
Much of the literature on differential privacy focuses on item-level pri...

07/23/2018 · On the Anonymization of Differentially Private Location Obfuscation
Obfuscation techniques in location-based services (LBSs) have been shown...

01/14/2020 · Differentially Private and Fair Classification via Calibrated Functional Mechanism
Machine learning is increasingly becoming a powerful tool to make decisi...

09/26/2018 · Optimal Noise-Adding Mechanism in Additive Differential Privacy
We derive the optimal (0, δ)-differentially private query-output indepen...
