
Chain of Knowledge: A Framework for Grounding Large Language Models with Structured Knowledge Bases

05/22/2023
by Xingxuan Li et al.

We introduce Chain of Knowledge (CoK), a framework that augments large language models with structured knowledge bases to improve factual correctness and reduce hallucination. Compared to previous works, which retrieve only unstructured text, CoK leverages structured knowledge bases that support complex queries and offer more direct factual statements. To help large language models query knowledge bases effectively, we propose a query generator model trained with contrastive instruction-tuning. Because the query generator is separate from the frozen large language model, our framework is modular and thus easily adapted to various knowledge sources and models. Experiments show that our framework significantly enhances the factual correctness of large language models on knowledge-intensive tasks.
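
As a rough illustration of the modular design described above, the Python sketch below shows one way the pieces could fit together: a query generator turns a question into a structured query, the query is executed against a knowledge base, and the retrieved facts are prepended to the prompt of a frozen LLM. The specifics are assumptions for illustration, not the authors' released code; the hand-written SPARQL query stands in for the instruction-tuned query generator, Wikidata stands in for the structured knowledge base, and call_llm is a hypothetical wrapper around any black-box LLM API.

# Minimal sketch of a CoK-style pipeline (illustrative only).
from SPARQLWrapper import SPARQLWrapper, JSON

def generate_structured_query(question: str) -> str:
    # Placeholder for the contrastively instruction-tuned query generator.
    # In the paper this is a separate trainable model; here we return a
    # hand-written SPARQL query for the example question below.
    return """
    SELECT ?capitalLabel WHERE {
      wd:Q183 wdt:P36 ?capital .                  # Germany -> its capital
      ?capital rdfs:label ?capitalLabel .
      FILTER (lang(?capitalLabel) = "en")
    }"""

def query_knowledge_base(sparql_query: str) -> list[str]:
    # Run the structured query against a public knowledge base (Wikidata).
    endpoint = SPARQLWrapper("https://query.wikidata.org/sparql")
    endpoint.setQuery(sparql_query)
    endpoint.setReturnFormat(JSON)
    results = endpoint.query().convert()
    return [b["capitalLabel"]["value"] for b in results["results"]["bindings"]]

def answer_with_frozen_llm(question: str, facts: list[str]) -> str:
    # Ground the frozen LLM by prepending the retrieved facts to its prompt.
    prompt = ("Facts:\n" + "\n".join(f"- {f}" for f in facts)
              + f"\n\nQuestion: {question}\nAnswer:")
    return call_llm(prompt)  # hypothetical call to any frozen LLM endpoint

question = "What is the capital of Germany?"
facts = query_knowledge_base(generate_structured_query(question))
print(answer_with_frozen_llm(question, facts))

Because only the query generator needs training, swapping in a different knowledge source or a different frozen LLM amounts to changing the query generator and the knowledge-base client, which is the modularity the abstract refers to.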

Related research

04/25/2023
Unstructured and structured data: Can we have the best of both worlds with large language models?
This paper presents an opinion on the potential of using large language ...

08/25/2023
LLM2KB: Constructing Knowledge Bases using instruction tuned context aware Large Language Models
The advent of Large Language Models (LLM) has revolutionized the field o...

10/10/2022
Knowledge Prompts: Injecting World Knowledge into Language Models through Soft Prompts
Soft prompts have been recently proposed as a tool for adapting large fr...

07/26/2023
Utilizing Large Language Models for Natural Interface to Pharmacology Databases
The drug development process necessitates that pharmacologists undertake...

08/17/2023
KnowledGPT: Enhancing Large Language Models with Retrieval and Storage Access on Knowledge Bases
Large language models (LLMs) have demonstrated impressive impact in the ...

04/12/2022
A Review on Language Models as Knowledge Bases
Recently, there has been a surge of interest in the NLP community on the...

03/09/2021
BERTese: Learning to Speak to BERT
Large pre-trained language models have been shown to encode large amount...