ProGAP: Progressive Graph Neural Networks with Differential Privacy Guarantees

by Sina Sajadmanesh et al.

Graph Neural Networks (GNNs) have become a popular tool for learning on graphs, but their widespread use raises privacy concerns, as graph data can contain personal or sensitive information. Differentially private GNN models have recently been proposed to preserve privacy while still allowing effective learning over graph-structured datasets. However, achieving an ideal balance between accuracy and privacy in GNNs remains challenging due to the intrinsic structural connectivity of graphs. In this paper, we propose a new differentially private GNN called ProGAP that uses a progressive training scheme to improve the accuracy-privacy trade-off. Combined with the aggregation perturbation technique to ensure differential privacy, ProGAP splits a GNN into a sequence of overlapping submodels that are trained progressively, expanding from the first submodel to the complete model. Specifically, each submodel is trained over the privately aggregated node embeddings learned and cached by the previous submodels, leading to increased expressive power compared to previous approaches while limiting the incurred privacy costs. We formally prove that ProGAP ensures edge-level and node-level privacy guarantees for both training and inference stages, and evaluate its performance on benchmark graph datasets. Experimental results demonstrate that ProGAP can achieve up to 5-10% higher accuracy than existing state-of-the-art differentially private GNNs.
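The progressive scheme described above can be sketched in a few lines. The snippet below is a minimal toy illustration, not the paper's implementation: it assumes NumPy, a tiny ring graph, a simple linear layer as a stand-in for each submodel, and an arbitrary noise scale `sigma` (in the actual method, the noise is calibrated to the privacy budget). It shows the two key ideas: each stage computes a noise-perturbed neighborhood aggregate of the previous stage's cached embeddings (so the private aggregation is paid for only once per stage), and the next submodel is trained on that cached result.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_aggregate(adj, x, sigma):
    """Aggregation perturbation (sketch): row-normalize embeddings so each
    node contributes at most unit norm, sum over neighbors, then add
    Gaussian noise."""
    x_norm = x / np.maximum(np.linalg.norm(x, axis=1, keepdims=True), 1e-12)
    agg = adj @ x_norm                      # neighborhood sum
    return agg + rng.normal(0.0, sigma, agg.shape)

def train_submodel(features, labels, steps=200, lr=0.1):
    """Hypothetical stand-in for one submodel: a single linear layer fit
    by gradient descent on squared error (softmax omitted for brevity)."""
    w = np.zeros((features.shape[1], labels.shape[1]))
    for _ in range(steps):
        grad = features.T @ (features @ w - labels) / len(features)
        w -= lr * grad
    return w

# Toy graph: 6 nodes on a ring, random features, 2 classes.
n, d, c = 6, 4, 2
adj = np.roll(np.eye(n), 1, axis=1) + np.roll(np.eye(n), -1, axis=1)
x = rng.normal(size=(n, d))
y = np.eye(c)[rng.integers(0, c, n)]

# Progressive training: each stage trains on the noisy aggregate of the
# previous stage's cached embeddings, expanding the model one stage at a
# time instead of re-querying the graph at every step.
cached = x
sigma = 0.5          # noise scale (assumed; set by the privacy budget)
for stage in range(3):
    agg = noisy_aggregate(adj, cached, sigma)    # private, then cached
    inp = np.concatenate([cached, agg], axis=1)  # skip connection
    w = train_submodel(inp, y)
    cached = inp @ w                             # embeddings for next stage

print(cached.shape)  # final node embeddings, one row per node
```

Caching the noisy aggregates is what limits the privacy cost: the graph structure is queried once per stage rather than once per training step, so the total privacy budget scales with the number of stages, not the number of gradient updates.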




Towards Training Graph Neural Networks with Node-Level Differential Privacy

Graph Neural Networks (GNNs) have achieved great success in mining graph...

Differentially Private Graph Classification with GNNs

Graph Neural Networks (GNNs) have established themselves as the state-of...

Releasing Graph Neural Networks with Differential Privacy Guarantees

With the increasing popularity of Graph Neural Networks (GNNs) in severa...

Degree-Preserving Randomized Response for Graph Neural Networks under Local Differential Privacy

Differentially private GNNs (Graph Neural Networks) have been recently s...

Privacy-Utility Trade-offs in Neural Networks for Medical Population Graphs: Insights from Differential Privacy and Graph Structure

We initiate an empirical investigation into differentially private graph...

Training Differentially Private Graph Neural Networks with Random Walk Sampling

Deep learning models are known to put the privacy of their training data...

SoK: Differential Privacy on Graph-Structured Data

In this work, we study the applications of differential privacy (DP) in ...
