Task-specific Pre-training and Prompt Decomposition for Knowledge Graph Population with Language Models

08/26/2022
by Tianyi Li, et al.

We present a system for knowledge graph population with Language Models, evaluated on the Knowledge Base Construction from Pre-trained Language Models (LM-KBC) challenge at ISWC 2022. Our system involves task-specific pre-training to improve the LM's representation of the masked object tokens, prompt decomposition for progressive generation of candidate objects, and other methods for higher-quality retrieval. Our system, based on a BERT LM, is the winner of track 1 of the LM-KBC challenge; it achieves 55.0% F1-score on the hidden test set of the challenge.
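The paper's full method is not shown here, but the idea of prompt decomposition can be illustrated with a minimal sketch: rather than asking the LM for all objects of a relation at once, the system re-prompts progressively, conditioning each follow-up prompt on the objects already found. In the sketch below, `query_lm` is a hypothetical stand-in for a real masked-LM call (e.g. BERT fill-mask); here it is a toy lookup table, and the prompt templates and threshold are illustrative assumptions, not the paper's exact ones.

```python
def query_lm(prompt):
    # Hypothetical LM stand-in: returns (candidate, score) pairs for the
    # [MASK] slot. A real system would call a BERT fill-mask head here.
    toy_kb = {
        "Germany shares a border with [MASK].":
            [("France", 0.9), ("Poland", 0.8)],
        "Besides France, Germany shares a border with [MASK].":
            [("Poland", 0.85), ("Austria", 0.6)],
        "Besides Poland, Germany shares a border with [MASK].":
            [("France", 0.8), ("Austria", 0.5)],
    }
    return toy_kb.get(prompt, [])

def decompose_and_collect(base_prompt, threshold=0.5, max_rounds=3):
    """Progressively re-prompt, excluding objects already found,
    until no new candidates clear the score threshold."""
    found = {}                  # object -> best score seen
    prompts = [base_prompt]
    for _ in range(max_rounds):
        new_prompts = []
        for p in prompts:
            for obj, score in query_lm(p):
                if obj not in found and score >= threshold:
                    found[obj] = score
                    # Decomposed follow-up prompt conditioned on the new object.
                    new_prompts.append(f"Besides {obj}, {base_prompt}")
        if not new_prompts:
            break
        prompts = new_prompts
    # Return candidate objects, highest-scoring first.
    return sorted(found, key=found.get, reverse=True)

print(decompose_and_collect("Germany shares a border with [MASK]."))
# → ['France', 'Poland', 'Austria']
```

The progressive re-prompting lets a single-mask LM like BERT surface multi-valued objects that the initial prompt alone would rank too low.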


