RefGPT: Reference -> Truthful Customized Dialogues Generation by GPTs and for GPTs

by Dongjie Yang, et al.

General chat models, like ChatGPT, have attained impressive capabilities across a wide range of NLP tasks by tuning Large Language Models (LLMs) on high-quality instruction data. However, collecting human-written high-quality data, especially multi-turn dialogues, is expensive and unattainable for most people. Though previous studies have used powerful LLMs to generate dialogues automatically, they all suffer from producing untruthful dialogues because of LLM hallucination. Therefore, we propose a method called RefGPT to generate truthful and customized dialogues at scale without worrying about factual errors caused by model hallucination. RefGPT mitigates hallucination in dialogue generation by restricting the LLM to the given reference instead of reciting its own knowledge. Additionally, RefGPT adds detailed controls on every utterance to enable a high degree of customization, a capability previous studies have ignored. On the basis of RefGPT, we also propose two high-quality dialogue datasets generated by GPT-4, namely RefGPT-Fact and RefGPT-Code. RefGPT-Fact is a dataset of 100k multi-turn dialogues grounded in factual knowledge, and RefGPT-Code is a dataset of 76k multi-turn dialogues covering a wide range of coding scenarios. Our code and datasets are released in
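The core idea above, constraining the generator to a supplied reference rather than its parametric knowledge, can be sketched as a prompt-construction step. The wording and the `build_prompt` helper below are illustrative assumptions, not the authors' actual RefGPT template:

```python
def build_prompt(reference: str, num_turns: int = 3) -> str:
    """Build a dialogue-generation prompt that restricts the model
    to facts contained in the given reference text (hypothetical
    template, not the paper's exact prompt)."""
    return (
        "Generate a multi-turn dialogue between a user and an assistant.\n"
        "Use ONLY facts stated in the reference below; "
        "do not add outside knowledge.\n"
        f"Produce {num_turns} question-answer turns.\n\n"
        f"Reference:\n{reference}\n"
    )

# Example: ground the dialogue in a short factual passage.
reference = "The Eiffel Tower is 330 metres tall and was completed in 1889."
prompt = build_prompt(reference, num_turns=2)
print(prompt)
```

The resulting prompt would then be sent to an LLM (GPT-4 in the paper); because every answer must be supported by the reference, factual errors are limited to errors already present in the reference itself.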


ChatPLUG: Open-Domain Generative Dialogue System with Internet-Augmented Instruction Tuning for Digital Human

In this paper, we present ChatPLUG, a Chinese open-domain dialogue syste...

Enhancing Chat Language Models by Scaling High-quality Instructional Conversations

Fine-tuning on instruction data has been widely validated as an effectiv...

The Gutenberg Dialogue Dataset

Large datasets are essential for many NLP tasks. Current publicly availa...

Baize: An Open-Source Chat Model with Parameter-Efficient Tuning on Self-Chat Data

Chat models, such as ChatGPT, have shown impressive capabilities and hav...

Back to the Future: Bidirectional Information Decoupling Network for Multi-turn Dialogue Modeling

Multi-turn dialogue modeling as a challenging branch of natural language...

An Effective Data Creation Pipeline to Generate High-quality Financial Instruction Data for Large Language Model

At the beginning era of large language model, it is quite critical to ge...

DailyDialog: A Manually Labelled Multi-turn Dialogue Dataset

We develop a high-quality multi-turn dialog dataset, DailyDialog, which ...
