Domain-specific Language Pre-training for Dialogue Comprehension on Clinical Inquiry-Answering Conversations

06/06/2022
by Zhengyuan Liu, et al.

There is growing interest in the automated extraction of relevant information from clinical dialogues. However, it is difficult to collect and construct large annotated resources for clinical dialogue tasks. Recent developments in natural language processing suggest that large-scale pre-trained language backbones can be leveraged for such machine comprehension and information extraction tasks. Yet, due to the gap between the pre-training corpora and downstream clinical domains, it remains challenging to exploit generic backbones for domain-specific applications. Therefore, in this work, we propose a domain-specific language pre-training approach to improve performance on downstream tasks such as dialogue comprehension. In addition to the common token-level masking strategy, and motivated by the nature of human conversations and the interactive flow of multi-topic inquiry-answering dialogues, we further propose sample generation strategies based on speaker and utterance manipulation. This conversational pre-training guides the language backbone to reconstruct utterances coherently from the remaining context, thereby bridging the gap between the general and specific domains. Experiments are conducted on a clinical conversation dataset for symptom checking, where nurses inquire about and discuss symptom information with patients. We empirically show that a neural model trained with our proposed approach improves on the dialogue comprehension task and achieves favorable results in low-resource training scenarios.
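The abstract names three ingredients for generating pre-training samples: token-level masking plus two dialogue-aware corruptions, speaker manipulation and utterance manipulation. Below is a minimal sketch of how such corruption functions could produce reconstruction-style input/target pairs; the function names, speaker labels, 15% masking rate, and flattened dialogue format are illustrative assumptions, not the paper's actual implementation.

```python
import random

MASK = "[MASK]"

def mask_tokens(tokens, mask_prob=0.15, rng=random):
    """Token-level masking: hide a fraction of tokens for the model to recover."""
    return [MASK if rng.random() < mask_prob else t for t in tokens]

def swap_speaker(turns, rng=random):
    """Speaker manipulation: flip the speaker label of one randomly chosen turn."""
    turns = [list(t) for t in turns]
    i = rng.randrange(len(turns))
    turns[i][0] = "PATIENT" if turns[i][0] == "NURSE" else "NURSE"
    return [tuple(t) for t in turns]

def shuffle_utterances(turns, rng=random):
    """Utterance manipulation: swap two utterances, breaking dialogue coherence."""
    turns = list(turns)
    if len(turns) >= 2:
        i, j = rng.sample(range(len(turns)), 2)
        turns[i], turns[j] = turns[j], turns[i]
    return turns

def make_pretraining_sample(turns, rng=random):
    """Corrupt a dialogue with one randomly chosen strategy; the backbone is
    trained to reconstruct the original utterances from the remaining context."""
    corrupt = rng.choice([
        lambda ts: [(s, " ".join(mask_tokens(u.split(), rng=rng))) for s, u in ts],
        lambda ts: swap_speaker(ts, rng=rng),
        lambda ts: shuffle_utterances(ts, rng=rng),
    ])
    corrupted = corrupt(turns)
    flatten = lambda ts: " ".join(f"[{s}] {u}" for s, u in ts)
    return {"input": flatten(corrupted), "target": flatten(turns)}

if __name__ == "__main__":
    dialogue = [
        ("NURSE", "Do you feel any shortness of breath today ?"),
        ("PATIENT", "Yes , especially when I climb the stairs ."),
        ("NURSE", "How about chest pain ?"),
        ("PATIENT", "No chest pain , just a mild cough ."),
    ]
    print(make_pretraining_sample(dialogue, rng=random.Random(0)))
```

Each corrupted dialogue is paired with its original form, so a denoising or sequence-to-sequence objective can train the backbone to restore coherent utterances from the surrounding context, which matches the intuition the abstract describes.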

Related research

09/14/2021
Different Strokes for Different Folks: Investigating Appropriate Further Pre-training Approaches for Diverse Dialogue Tasks
Loading models pre-trained on the large-scale corpus in the general doma...

02/14/2023
SwitchPrompt: Learning Domain-Specific Gated Soft Prompts for Classification in Low-Resource Domains
Prompting pre-trained language models leads to promising results across ...

12/16/2021
CLIN-X: pre-trained language models and a study on cross-task transfer for concept extraction in the clinical domain
The field of natural language processing (NLP) has recently seen a large...

03/08/2019
Fast Prototyping a Dialogue Comprehension System for Nurse-Patient Conversations on Symptom Monitoring
Data for human-human spoken dialogues for research and development are c...

06/01/2021
Dialogue-oriented Pre-training
Pre-trained language models (PrLM) has been shown powerful in enhancing ...

10/12/2020
Multi-Stage Pre-training for Low-Resource Domain Adaptation
Transfer learning techniques are particularly useful in NLP tasks where ...

09/20/2021
PLATO-XL: Exploring the Large-scale Pre-training of Dialogue Generation
To explore the limit of dialogue generation pre-training, we present the...
