Frozen Language Model Helps ECG Zero-Shot Learning

03/22/2023
by Jun Li, et al.

The electrocardiogram (ECG) is one of the most commonly used non-invasive and convenient medical monitoring tools, assisting in the clinical diagnosis of heart disease. Recently, deep learning (DL) techniques, particularly self-supervised learning (SSL), have demonstrated great potential in ECG classification. SSL pre-training achieves competitive performance after fine-tuning on only a small amount of annotated data. However, current SSL methods still rely on the availability of annotated data and cannot predict labels that do not exist in the fine-tuning dataset. To address this challenge, we propose Multimodal ECG-Text Self-supervised pre-training (METS), the first work to use auto-generated clinical reports to guide ECG SSL pre-training. We use a trainable ECG encoder and a frozen language model to separately embed paired ECGs and their machine-generated clinical reports. The SSL objective maximizes the similarity between each ECG and its paired report while minimizing the similarity between that ECG and all other reports. In downstream classification tasks, METS achieves around a 10% improvement in performance without using any annotated data via zero-shot classification, compared to supervised and SSL baselines that rely on annotated data. Furthermore, METS achieves the highest recall and F1 scores on the MIT-BIH dataset, even though MIT-BIH contains different ECG classes than the pre-training dataset. Extensive experiments demonstrate the advantages of ECG-text multimodal self-supervised learning in terms of generalizability, effectiveness, and efficiency.
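To make the pre-training objective concrete, below is a minimal sketch of the alignment loss the abstract describes: a trainable ECG encoder and a frozen language model embed paired signals and reports, and a symmetric InfoNCE (CLIP-style) contrastive loss pulls matched pairs together and pushes mismatched pairs apart. The class and argument names (`METSContrastive`, `ecg_encoder`, `text_encoder`, the projection heads, and the temperature value) are illustrative assumptions, not the paper's actual code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class METSContrastive(nn.Module):
    """Sketch of a METS-style objective: a trainable ECG encoder is aligned
    with report embeddings from a frozen language model via a symmetric
    InfoNCE loss. All names and hyperparameters here are illustrative."""

    def __init__(self, ecg_encoder, text_encoder, embed_dim=256, temperature=0.07):
        super().__init__()
        self.ecg_encoder = ecg_encoder            # trainable, e.g. a 1-D CNN/Transformer
        self.text_encoder = text_encoder          # frozen pre-trained language model
        for p in self.text_encoder.parameters():  # keep the language model frozen
            p.requires_grad = False
        self.ecg_proj = nn.LazyLinear(embed_dim)  # trainable projections map both
        self.txt_proj = nn.LazyLinear(embed_dim)  # modalities into a shared space
        self.temperature = temperature

    def forward(self, ecg, report_tokens):
        e = F.normalize(self.ecg_proj(self.ecg_encoder(ecg)), dim=-1)
        with torch.no_grad():                     # no gradients through the frozen LM
            t_feat = self.text_encoder(report_tokens)
        t = F.normalize(self.txt_proj(t_feat), dim=-1)
        logits = e @ t.T / self.temperature       # (batch, batch) similarity matrix
        labels = torch.arange(len(e), device=e.device)
        # maximize similarity of each paired (ECG, report) on the diagonal,
        # minimize similarity to every other report/ECG in the batch
        return 0.5 * (F.cross_entropy(logits, labels) +
                      F.cross_entropy(logits.T, labels))
```

Under this setup, zero-shot classification follows the usual contrastive recipe: embed a text prompt for each candidate label with the frozen language model and assign an ECG to the label whose prompt embedding is most similar, requiring no annotated ECG data.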


