Self-Attention Enhanced Selective Gate with Entity-Aware Embedding for Distantly Supervised Relation Extraction

by Yang Li, et al.

Distantly supervised relation extraction intrinsically suffers from noisy labels due to the strong assumption of distant supervision. Most prior works adopt a selective attention mechanism over the sentences in a bag to denoise wrongly labeled data, which, however, can be ineffective when a bag contains only one sentence. In this paper, we propose a new lightweight neural framework for distantly supervised relation extraction that alleviates the defects of the previous selective attention framework. Specifically, in the proposed framework, 1) we use an entity-aware word embedding method that integrates both relative position information and head/tail entity embeddings, aiming to highlight the central role of entities in this task; 2) we develop a self-attention mechanism to capture rich contextual dependencies as a complement to the local dependencies captured by the piecewise CNN; and 3) instead of selective attention, we design a pooling-equipped gate, based on the rich contextual representations, as an aggregator that generates the bag-level representation for final relation classification. Compared to selective attention, a major advantage of the proposed gating mechanism is that it performs stably and promisingly even when only one sentence appears in a bag, thus keeping the behavior consistent across all training examples. Experiments on the NYT dataset demonstrate that our approach achieves new state-of-the-art performance in terms of both AUC and top-n precision metrics.
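The key idea behind the pooling-equipped gate can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the parameterization (a single sigmoid gate layer `W_g`, `b_g` applied to each sentence representation, followed by mean-pooling over the bag) is a simplifying assumption made for illustration; it only shows why a gate, unlike attention weights normalized over a bag, behaves identically whether a bag holds one sentence or many.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_bag_aggregation(sentence_reprs, W_g, b_g):
    """Aggregate a bag of sentence representations via a pooling-equipped gate.

    A per-sentence sigmoid gate (hypothetical parameterization) scales each
    sentence vector element-wise; the gated vectors are then mean-pooled into
    one bag-level representation. Because the gate is not normalized across
    the bag (unlike selective attention's softmax), a single-sentence bag is
    handled by exactly the same computation as a multi-sentence bag.
    """
    S = np.asarray(sentence_reprs, dtype=float)  # (num_sentences, d)
    gates = sigmoid(S @ W_g + b_g)               # (num_sentences, d) gate values in (0, 1)
    return (gates * S).mean(axis=0)              # (d,) bag-level representation

# Usage: a 3-sentence bag and a 1-sentence bag go through the same path.
rng = np.random.default_rng(0)
d = 4
W_g, b_g = np.eye(d), np.zeros(d)                # toy gate parameters
multi_bag = gated_bag_aggregation(rng.standard_normal((3, d)), W_g, b_g)
single_bag = gated_bag_aggregation(rng.standard_normal((1, d)), W_g, b_g)
```

A softmax-based selective attention over a one-sentence bag degenerates to a weight of exactly 1 regardless of sentence quality, whereas the gate above can still down-weight a noisy lone sentence.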

