ConDistFL: Conditional Distillation for Federated Learning from Partially Annotated Data

08/08/2023
by Pochuan Wang, et al.

Developing a generalized segmentation model capable of simultaneously delineating multiple organs and diseases is highly desirable. Federated learning (FL) is a key technology for collaboratively developing such a model without exchanging training data. However, limited access to fully annotated training data poses a major challenge to training generalizable models. We propose "ConDistFL", a framework that addresses this problem by combining FL with knowledge distillation. Using an adequately designed conditional probability representation, local models can extract knowledge of unlabeled organs and tumors from the global model, even when trained on partially annotated data. We validate our framework on four distinct partially annotated abdominal CT datasets from the MSD and KiTS19 challenges. The experimental results show that the proposed framework significantly outperforms the FedAvg and FedOpt baselines. Moreover, the performance on an external test dataset demonstrates superior generalizability compared to models trained on each dataset separately. Our ablation study suggests that ConDistFL performs well without frequent aggregation, reducing the communication cost of FL. Our implementation will be available at https://github.com/NVIDIA/NVFlare/tree/dev/research/condist-fl.
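The conditional probability idea described in the abstract can be illustrated with a rough sketch (a minimal NumPy illustration, not the authors' implementation; the `condense` and `kl_distillation` helpers, the class layout, and the merging scheme are assumptions made here for exposition): a client that annotates only some foreground classes folds every locally unlabeled foreground class into the background channel, making the global (teacher) and local (student) outputs comparable so a soft distillation loss can be computed on partially annotated data.

```python
import numpy as np

def condense(probs, labeled_classes):
    """Merge locally unlabeled foreground classes into background.

    probs: (C, N) per-voxel softmax probabilities over C classes,
           with class 0 as background.
    labeled_classes: foreground class indices annotated at this client.
    Returns a (1 + len(labeled_classes), N) conditional representation.
    """
    C, _ = probs.shape
    unlabeled = [c for c in range(1, C) if c not in labeled_classes]
    background = probs[0] + probs[unlabeled].sum(axis=0)
    return np.vstack([background] + [probs[c] for c in labeled_classes])

def kl_distillation(teacher, student, eps=1e-8):
    """Mean per-voxel KL(teacher || student) as a distillation loss."""
    return float(np.mean(
        np.sum(teacher * (np.log(teacher + eps) - np.log(student + eps)), axis=0)
    ))

# Example: 4 classes (background, liver, kidney, tumor); this client
# labels only class 1, so classes 2 and 3 are merged into background.
logits = np.random.default_rng(0).normal(size=(4, 5))
probs = np.exp(logits) / np.exp(logits).sum(axis=0, keepdims=True)
cond = condense(probs, labeled_classes=[1])
loss = kl_distillation(cond, cond)  # identical distributions give ~0 loss
```

Because the merged distributions still sum to one per voxel, standard soft-label losses (KL divergence here; the paper's exact loss may differ) apply directly to the condensed outputs.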


Related research

- 07/31/2023: Federated Learning for Data and Model Heterogeneity in Medical Imaging
- 03/05/2021: Distributed Dynamic Map Fusion via Federated Learning for Intelligent Networked Vehicles
- 09/29/2022: Label driven Knowledge Distillation for Federated Learning with non-IID Data
- 08/14/2020: Distillation-Based Semi-Supervised Federated Learning for Communication-Efficient Collaborative Training with Non-IID Private Data
- 05/07/2023: MrTF: Model Refinery for Transductive Federated Learning
- 11/11/2022: From Competition to Collaboration: Making Toy Datasets on Kaggle Clinically Useful for Chest X-Ray Diagnosis Using Federated Learning
- 03/10/2023: Digital Twin-Assisted Knowledge Distillation Framework for Heterogeneous Federated Learning
