Federated Unlearning via Active Forgetting

07/07/2023
by Yuyuan Li, et al.

Increasing concerns about the privacy of machine learning models have catalyzed the exploration of machine unlearning, i.e., the process of removing the influence of specific training data from a trained model. The same concern arises in federated learning, prompting researchers to address the federated unlearning problem. However, federated unlearning remains challenging. Existing unlearning methods can be broadly categorized into two approaches: exact unlearning and approximate unlearning. First, exact unlearning, which typically relies on a partition-aggregation framework, offers no theoretical improvement in time efficiency when implemented in a distributed manner. Second, existing federated (approximate) unlearning methods suffer from imprecise estimation of data influence, a significant computational burden, or both. To this end, we propose a novel federated unlearning framework based on incremental learning that is independent of specific model architectures and federated settings. Unlike existing federated unlearning methods that rely on approximate retraining or data influence estimation, our framework leverages new memories to overwrite old ones, imitating the process of active forgetting in neuroscience. Specifically, the model intended to unlearn serves as a student that continually learns from randomly initialized teacher models. To mitigate catastrophic forgetting of knowledge on non-target data, we use elastic weight consolidation (EWC) to elastically constrain weight changes. Extensive experiments on three benchmark datasets demonstrate the efficiency and effectiveness of the proposed method, and results of backdoor attacks show that it achieves satisfactory unlearning completeness.
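The abstract describes the method only at a high level. As an illustration, below is a minimal sketch (not the authors' implementation) of what one client-side unlearning step of this kind could look like in PyTorch: the unlearning model acts as a student that mimics a randomly initialized teacher on the target (to-be-forgotten) data, while an EWC penalty restrains drift on weights deemed important for the retained data. The function names, the diagonal Fisher estimate, and the hyperparameter values are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def ewc_penalty(model, ref_params, fisher):
    """Quadratic EWC penalty: sum_i F_i * (theta_i - theta_i_ref)^2."""
    penalty = 0.0
    for name, param in model.named_parameters():
        penalty = penalty + (fisher[name] * (param - ref_params[name]) ** 2).sum()
    return penalty

def unlearn_step(student, teacher, target_batch, ref_params, fisher,
                 optimizer, lam=10.0):
    """One active-forgetting step on a batch of to-be-forgotten data.

    student    -- model that should unlearn (trainable copy of the global model)
    teacher    -- randomly initialized model of the same architecture; its
                  outputs carry no information about the target data
    ref_params -- snapshot of the pre-unlearning weights
    fisher     -- diagonal Fisher information estimated on retained data
                  (e.g., mean squared gradients of the log-likelihood)
    lam        -- strength of the elastic constraint (illustrative value)
    """
    x, _ = target_batch
    student.train()
    with torch.no_grad():
        teacher_logits = teacher(x)  # "new memories" used to overwrite old ones

    student_logits = student(x)
    # Distillation toward the random teacher erases the target data's influence.
    distill = F.kl_div(F.log_softmax(student_logits, dim=-1),
                       F.softmax(teacher_logits, dim=-1),
                       reduction="batchmean")
    # EWC term elastically constrains changes to weights important for non-target data.
    loss = distill + lam * ewc_penalty(student, ref_params, fisher)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In a federated setting, the client holding the data to be forgotten would presumably run such steps locally and return the updated weights for aggregation; the random teacher mirrors the abstract's "randomly initialized teacher models," and the EWC term corresponds to the elastic constraint on weight change mentioned above.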

Related research

- FedET: A Communication-Efficient Federated Class-Incremental Learning Framework Based on Enhanced Transformer (06/27/2023)
- Overcoming Forgetting in Federated Learning on Non-IID Data (10/17/2019)
- Don't Memorize; Mimic The Past: Federated Class Incremental Learning Without Episodic Memory (07/02/2023)
- Federated Orthogonal Training: Mitigating Global Catastrophic Forgetting in Continual Federated Learning (09/03/2023)
- On Knowledge Editing in Federated Learning: Perspectives, Challenges, and Future Directions (06/02/2023)
- An Efficient and Robust System for Vertically Federated Random Forest (01/26/2022)
- An Efficient Learning Framework For Federated XGBoost Using Secret Sharing And Distributed Optimization (05/12/2021)
