Self-Knowledge Distillation based Self-Supervised Learning for Covid-19 Detection from Chest X-Ray Images

06/07/2022
by Guang Li, et al.

The global outbreak of Coronavirus Disease 2019 (COVID-19) has overloaded healthcare systems worldwide, so computer-aided diagnosis for fast COVID-19 detection and patient triage is becoming critical. This paper proposes a novel self-knowledge distillation based self-supervised learning method for COVID-19 detection from chest X-ray images. Our method uses the self-knowledge of images, derived from similarities between their visual features, as the supervisory signal for self-supervised learning. Experimental results show that our method achieved an HM score of 0.988, an AUC of 0.999, and an accuracy of 0.957 on the largest open COVID-19 chest X-ray dataset.
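
As a rough illustration of the idea in the abstract, the sketch below shows one way similarity-based self-knowledge distillation can be set up for self-supervised pretraining in PyTorch. The ResNet-18 backbone, the 128-dimensional output, the temperature, the EMA momentum, and the similarity_distribution/training_step helpers are all illustrative assumptions, not the authors' released implementation.

```python
import copy
import torch
import torch.nn.functional as F
from torchvision import models

def similarity_distribution(features, temperature=0.1):
    """Softmax distribution over pairwise cosine similarities within a batch."""
    z = F.normalize(features, dim=1)                       # unit-norm embeddings
    sim = z @ z.t() / temperature                          # batch x batch similarities
    mask = torch.eye(z.size(0), dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, -1e9)                      # exclude self-similarity
    return F.softmax(sim, dim=1)

# Hypothetical backbone: ResNet-18 whose 128-d output stands in for a projection head.
student = models.resnet18(num_classes=128)
teacher = copy.deepcopy(student)                           # teacher is an EMA copy of the student
for p in teacher.parameters():
    p.requires_grad_(False)

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

def training_step(view1, view2, ema_momentum=0.99):
    """One self-distillation step on two augmented views of the same chest X-rays."""
    with torch.no_grad():
        target = similarity_distribution(teacher(view1))   # soft "self-knowledge" targets
    pred = similarity_distribution(student(view2))
    # The student is trained to match the teacher's similarity distribution.
    loss = F.kl_div(pred.clamp_min(1e-8).log(), target, reduction="batchmean")
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    with torch.no_grad():                                  # EMA update of the teacher
        for pt, ps in zip(teacher.parameters(), student.parameters()):
            pt.mul_(ema_momentum).add_(ps, alpha=1.0 - ema_momentum)
    return loss.item()
```

Under these assumptions the only supervisory signal is the model's own pairwise similarity structure over unlabeled images, with the slowly updated teacher supplying soft targets, which is one reading of the "self-knowledge distillation" in the title.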

Related research

12/19/2022 | COVID-19 Detection Based on Self-Supervised Transfer Learning Using Chest X-Ray Images
Purpose: Considering several patients screened due to COVID-19 pandemic,...

12/19/2022 | Boosting Automatic COVID-19 Detection Performance with Self-Supervised Learning and Batch Knowledge Ensembling
Background and objective: COVID-19 and its variants have caused signific...

02/13/2022 | AI can evolve without labels: self-evolving vision transformer for chest X-ray diagnosis through knowledge distillation
Although deep learning-based computer-aided diagnosis systems have recen...

11/01/2022 | RGMIM: Region-Guided Masked Image Modeling for COVID-19 Detection
Self-supervised learning has developed rapidly and also advances compute...

08/07/2021 | A distillation based approach for the diagnosis of diseases
Presently, Covid-19 is a serious threat to the world at large. Efforts a...

12/14/2021 | Performance or Trust? Why Not Both. Deep AUC Maximization with Self-Supervised Learning for COVID-19 Chest X-ray Classifications
Effective representation learning is the key in improving model performa...

04/13/2021 | Unifying domain adaptation and self-supervised learning for CXR segmentation via AdaIN-based knowledge distillation
As the segmentation labels are scarce, extensive researches have been co...
