Indexed by:
Abstract:
With the growing popularity of graph-structured data and the promulgation of various data privacy protection laws, machine unlearning in Graph Convolutional Networks (GCNs) has attracted increasing attention. However, machine unlearning in GCN scenarios faces multiple challenges: many unlearning algorithms demand large computational resources and storage space, or cannot be applied to graph-structured data at all. In this paper, we design a novel, lightweight unlearning method that uses knowledge distillation to solve the class unlearning problem in GCN scenarios. Unlike other methods that use knowledge distillation to unlearn Euclidean-space data, we use a single retrained deep Graph Convolutional Network via Initial residual and Identity mapping (GCNII) model as the teacher network and a shallow GCN model as the student network. During the training stage, the teacher network transfers knowledge of the retained set to the student network, enabling the student network to forget one or more categories of information. Compared with the baseline methods, Graph Unlearning using Knowledge Distillation (GUKD) shows state-of-the-art model performance and unlearning quality on five real datasets. Specifically, our method outperforms all baseline methods by 33.77% on average in the multi-class experiments on the Citeseer dataset. © 2023, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
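The abstract describes distilling a retrained GCNII teacher's knowledge of the retained classes into a shallow GCN student. Below is a minimal sketch of that distillation step, assuming PyTorch Geometric; the names StudentGCN, distill_step, and retain_mask, the two-layer student, and the temperature T are illustrative assumptions, not the paper's released code, and the teacher is taken as any module already retrained on the retained set (a GCNII in the paper).

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv


class StudentGCN(torch.nn.Module):
    """Shallow two-layer GCN student, per the abstract's description."""

    def __init__(self, in_dim, hid_dim, num_classes):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hid_dim)
        self.conv2 = GCNConv(hid_dim, num_classes)

    def forward(self, x, edge_index):
        h = F.relu(self.conv1(x, edge_index))
        return self.conv2(h, edge_index)


def distill_step(teacher, student, data, retain_mask, optimizer, T=2.0):
    """One distillation step: the teacher (retrained on the retained set)
    supplies soft labels only over retained nodes, so information about
    the forgotten classes never reaches the student."""
    teacher.eval()
    student.train()
    optimizer.zero_grad()
    with torch.no_grad():
        t_logits = teacher(data.x, data.edge_index)[retain_mask]
    s_logits = student(data.x, data.edge_index)[retain_mask]
    # Temperature-scaled KL divergence, the standard distillation loss.
    loss = F.kl_div(
        F.log_softmax(s_logits / T, dim=-1),
        F.softmax(t_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Under these assumptions, distill_step would be called once per epoch on a node-classification graph (e.g. a Planetoid Citeseer split), with retain_mask selecting the nodes whose classes are to be kept.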
Keyword:
Reprint's Address:
Email:
Source:
ISSN: 0302-9743
Year: 2023
Volume: 14252 LNCS
Page: 485-501
Language: English
Impact Factor: 0.402 (JCR@2005)
Cited Count:
SCOPUS Cited Count:
ESI Highly Cited Papers on the List: 0
WanFang Cited Count:
Chinese Cited Count:
Affiliated Colleges: