RoS-KD: A Robust Stochastic Knowledge Distillation Approach for Noisy Medical Imaging

10/15/2022
by Ajay Jaiswal, et al.

AI-powered medical imaging has recently attracted enormous attention due to its ability to provide fast-paced healthcare diagnoses. However, it usually suffers from a lack of high-quality datasets due to high annotation cost, inter-observer variability, human annotator error, and errors in computer-generated labels. Deep learning models trained on noisily labelled datasets are sensitive to the noise type and generalize poorly to unseen samples. To address this challenge, we propose a Robust Stochastic Knowledge Distillation (RoS-KD) framework, which mimics the notion of learning a topic from multiple sources to deter the learning of noisy information. More specifically, RoS-KD learns a smooth, well-informed, and robust student manifold by distilling knowledge from multiple teachers trained on overlapping subsets of the training data. Our extensive experiments on popular medical imaging classification tasks (cardiopulmonary disease and lesion classification) using real-world datasets show the performance benefit of RoS-KD, its ability to distill knowledge from several popular large networks (ResNet-50, DenseNet-121, MobileNet-V2) into a comparatively small network, and its robustness to adversarial attacks (PGD, FGSM). In particular, RoS-KD achieves a >2% improvement in F1-score on the lesion classification and cardiopulmonary disease classification tasks when the underlying student is ResNet-18, compared against a recent competitive knowledge distillation baseline. Additionally, on the cardiopulmonary disease classification task, RoS-KD outperforms most of the SOTA baselines by ~1%.
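To make the multi-teacher idea concrete, the sketch below shows one plausible way to distill a student from an ensemble of teachers trained on overlapping subsets of the data: each teacher's temperature-softened output is averaged into a single ensemble target, and the student minimizes a mix of KL divergence to that target and cross-entropy on the (possibly noisy) hard labels. This is a minimal PyTorch illustration under assumed hyperparameters (temperature, alpha) and uniform teacher averaging, not the paper's exact RoS-KD formulation.

import torch
import torch.nn.functional as F

def multi_teacher_kd_loss(student_logits, teacher_logits_list, targets,
                          temperature=4.0, alpha=0.7):
    """Distill from several teachers by averaging their softened outputs.

    student_logits:      (B, C) logits from the student network
    teacher_logits_list: list of (B, C) logit tensors, one per teacher
                         trained on an overlapping subset of the data
    targets:             (B,) ground-truth (possibly noisy) labels
    temperature, alpha:  illustrative hyperparameters, not the paper's values
    """
    # Average the teachers' temperature-softened distributions; averaging
    # across teachers trained on different subsets smooths out label noise
    # that any single teacher may have memorized.
    teacher_probs = torch.stack(
        [F.softmax(t / temperature, dim=1) for t in teacher_logits_list]
    ).mean(dim=0)

    # KL divergence between the student's softened prediction and the
    # ensemble target, scaled by T^2 as is standard in distillation.
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        teacher_probs,
        reduction="batchmean",
    ) * temperature ** 2

    # Ordinary cross-entropy on the hard labels.
    ce = F.cross_entropy(student_logits, targets)

    return alpha * kd + (1 - alpha) * ce

In this setting, each teacher would be trained beforehand on its own (hypothetical) overlapping slice of the training indices, so the ensemble target reflects multiple "sources" of the same topic rather than a single, possibly noise-fitted, model.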

Related research

08/06/2020
MED-TEX: Transferring and Explaining Knowledge with Less Data from Pretrained Medical Imaging Models
Deep neural network based image classification methods usually require a...

09/12/2021
On the Efficiency of Subclass Knowledge Distillation in Classification Tasks
This work introduces a novel knowledge distillation framework for classi...

07/17/2022
Subclass Knowledge Distillation with Known Subclass Labels
This work introduces a novel knowledge distillation framework for classi...

02/13/2022
AI can evolve without labels: self-evolving vision transformer for chest X-ray diagnosis through knowledge distillation
Although deep learning-based computer-aided diagnosis systems have recen...

10/11/2019
Improving Generalization and Robustness with Noisy Collaboration in Knowledge Distillation
Inspired by trial-to-trial variability in the brain that can result from...

09/22/2020
Improving Medical Annotation Quality to Decrease Labeling Burden Using Stratified Noisy Cross-Validation
As machine learning has become increasingly applied to medical imaging d...

11/08/2022
Hilbert Distillation for Cross-Dimensionality Networks
3D convolutional neural networks have revealed superior performance in p...
