Meta-Context Transformers for Domain-Specific Response Generation

by Debanjana Kar, et al.

Despite the tremendous success of neural dialogue models in recent years, they still suffer from a lack of relevance, diversity, and sometimes coherence in the generated responses. Lately, transformer-based models such as GPT-2 have revolutionized the landscape of dialogue generation by capturing long-range structure through language modeling. Though these models exhibit excellent language coherence, they often lack relevance and domain-specific terminology when used for domain-specific response generation. In this paper, we present DSRNet (Domain Specific Response Network), a transformer-based model for dialogue response generation that reinforces domain-specific attributes. In particular, we extract meta attributes from the context and infuse them with the context utterances for better attention over domain-specific key terms and improved relevance. We study the use of DSRNet for domain-specific response generation in a multi-turn, multi-interlocutor environment. In our experiments, we evaluate DSRNet on the Ubuntu dialogue datasets, which are mainly composed of technical conversations about IT-domain issue resolution, and on the CamRest676 dataset, which contains restaurant-domain conversations. Trained with a maximum likelihood objective, our model shows significant improvement over the state of the art for multi-turn dialogue systems, supported by better BLEU and semantic similarity (BertScore) scores. Moreover, we observe that the responses produced by our model carry higher relevance due to the presence of domain-specific key attributes that exhibit better overlap with the attributes of the context. Our analysis shows that the performance improvement is mostly due to the infusion of key terms along with the dialogues, which results in better attention over domain-relevant terms. Other contributing factors include joint modeling of the dialogue context with domain-specific meta attributes and topics.
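The meta-attribute infusion described in the abstract can be sketched roughly as follows. This is an illustrative simplification, not the authors' implementation: the function names (`extract_meta_attributes`, `build_model_input`), the frequency-based key-term heuristic, and the `<|sep|>` separator token are all assumptions made here for the sake of a runnable example. The idea it demonstrates is that extracted domain key terms are placed alongside the context utterances in the model input, so a transformer language model can attend to them directly when generating a response.

```python
import re
from collections import Counter

# Minimal stopword list for the toy example below (an assumption, not
# the paper's preprocessing pipeline).
STOPWORDS = {"the", "a", "an", "i", "to", "is", "it", "you", "my",
             "and", "on", "with", "did", "whether", "another"}

def extract_meta_attributes(context_utterances, top_k=3):
    """Pick the most frequent non-stopword tokens as domain key terms.

    A stand-in for the paper's meta-attribute extraction: here we simply
    take the top-k frequent content tokens across the dialogue context.
    """
    tokens = []
    for utt in context_utterances:
        tokens += [t for t in re.findall(r"[a-z0-9.\-]+", utt.lower())
                   if t not in STOPWORDS]
    return [term for term, _ in Counter(tokens).most_common(top_k)]

def build_model_input(context_utterances, sep="<|sep|>"):
    """Infuse the extracted meta attributes with the context utterances.

    The key terms are prepended to the dialogue turns, so the language
    model conditions on them jointly with the context.
    """
    meta = extract_meta_attributes(context_utterances)
    return sep.join([" ".join(meta)] + context_utterances)

# Toy IT-support context in the spirit of the Ubuntu dialogue data.
context = [
    "my apt-get upgrade fails with a dpkg lock error",
    "did you check whether another apt-get process holds the dpkg lock",
]
model_input = build_model_input(context)
print(model_input.split("<|sep|>")[0])  # the infused key terms
```

In this sketch the terms `apt-get`, `dpkg`, and `lock` surface as meta attributes and lead the model input; in the actual DSRNet training setup the infused sequence would then be fed to a GPT-2-style model under the maximum likelihood objective described above.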

