When Differential Privacy Meets Graph Neural Networks
Type of publication: Idiap-RR
Citation: Sajadmanesh_Idiap-RR-06-2023
Number: Idiap-RR-06-2023
Year: 2023
Month: 7
Institution: Idiap
Abstract: Graph Neural Networks have demonstrated superior performance in learning graph representations for downstream inference tasks. However, learning over graph data can raise privacy concerns when nodes represent people or human-related variables involving personal information. Previous work has presented various techniques for privacy-preserving deep learning over non-relational data such as images, audio, video, and text, but much less work has addressed the privacy issues of applying deep learning algorithms to graphs. To fill this gap, in this paper we develop, for the first time, a privacy-preserving learning algorithm with formal privacy guarantees for Graph Convolutional Networks (GCNs) based on Local Differential Privacy (LDP). It tackles the problem of node-level privacy, where graph nodes have potentially sensitive features that must be kept private but could be beneficial for learning rich node representations in a centralized learning setting. Specifically, we propose an LDP algorithm in which a central server communicates with graph nodes to privately collect their data and estimate the graph convolution layer of a GCN. We then analyze the theoretical properties of the method and compare it with state-of-the-art mechanisms. Experimental results on real-world graph datasets demonstrate the effectiveness of the proposed method for both privacy-preserving node classification and link prediction, and verify our theoretical findings.
Keywords: deep learning, Differential Privacy, Graph Convolutional Networks, Graph Neural Networks, Graph Representation Learning
Projects: DUSK2DAWN
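As a rough illustration of the kind of LDP-based feature collection described in the abstract, the sketch below perturbs each node's clipped feature vector locally with the Laplace mechanism and lets the server average the noisy reports over neighborhoods to estimate a mean-aggregation graph convolution layer. This is a minimal sketch under assumed choices, not the report's actual mechanism; the Laplace mechanism and the names `laplace_perturb` and `private_graph_convolution` are illustrative assumptions.

```python
import numpy as np

def laplace_perturb(x, eps, x_min=0.0, x_max=1.0):
    """Report a feature vector under eps-LDP via the Laplace mechanism.

    Assumption: each feature is clipped to [x_min, x_max], so the L1
    sensitivity of the whole d-dimensional vector is d * (x_max - x_min).
    """
    x = np.clip(x, x_min, x_max)
    sensitivity = x.size * (x_max - x_min)
    noise = np.random.laplace(loc=0.0, scale=sensitivity / eps, size=x.shape)
    return x + noise  # zero-mean noise, so the report is unbiased

def private_graph_convolution(perturbed_features, adjacency):
    """Server-side estimate of one mean-aggregation graph convolution layer.

    `perturbed_features` is an (n, d) array of LDP reports collected from
    the nodes; `adjacency` is an (n, n) 0/1 matrix known to the server.
    Averaging the unbiased reports over each node's neighbors gives an
    unbiased estimate of the clean aggregation.
    """
    degrees = adjacency.sum(axis=1, keepdims=True).clip(min=1)
    return (adjacency @ perturbed_features) / degrees

# Toy usage: 4 nodes with 3-dimensional features on a small ring graph.
rng = np.random.default_rng(0)
features = rng.random((4, 3))
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)
reports = np.stack([laplace_perturb(x, eps=1.0) for x in features])
estimate = private_graph_convolution(reports, adj)
```

Because the per-node noise has zero mean, the neighborhood average is unbiased and its variance shrinks as node degree grows, which is the intuition behind estimating the convolution layer from private reports rather than raw features.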
Added by: | [ADM] |
Total mark: | 0 |
Attachments
|
|
Notes
|
|
|