Relation-aware Graph Attention Model With Adaptive Self-adversarial Training

02/14/2021
by Xiao Qin, et al.

This paper describes an end-to-end solution for the relationship prediction task in heterogeneous, multi-relational graphs. We particularly address two building blocks in the pipeline, namely heterogeneous graph representation learning and negative sampling. Existing message-passing-based graph neural networks use edges either for graph traversal or for selecting message encoding functions, or both. Ignoring edge semantics can severely degrade embedding quality, especially when two nodes are connected by multiple relations. Furthermore, the expressivity of the learned representation depends on the quality of the negative samples used during training. Although existing hard negative sampling techniques can identify challenging negative relationships for optimization, new techniques are needed to control false negatives during training, as false negatives can corrupt the learning process. To address these issues, first, we propose RelGNN – a message-passing-based heterogeneous graph attention model. RelGNN generates states for the different relations and leverages them, along with the node states, to weight the messages. RelGNN also adopts a self-attention mechanism to balance the importance of attribute features and topological features when generating the final entity embeddings. Second, we introduce a parameter-free negative sampling technique – adaptive self-adversarial (ASA) negative sampling. ASA reduces the false-negative rate by leveraging positive relationships to effectively guide the identification of true negative samples. Our experimental evaluation demonstrates that RelGNN optimized by ASA for relationship prediction improves state-of-the-art performance across established benchmarks as well as on a real industrial dataset.
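To make the relation-aware attention idea concrete, the sketch below shows one plausible reading of the abstract: each relation type carries a learned state, and the attention score for a message depends on the source node state, the relation state, and the destination node state. All names (RelAttentionLayer, rel_emb, msg, att) and the specific layer layout are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of relation-aware attention message passing (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F

class RelAttentionLayer(nn.Module):
    def __init__(self, node_dim, rel_dim, num_relations):
        super().__init__()
        self.rel_emb = nn.Embedding(num_relations, rel_dim)   # one learned state per relation
        self.msg = nn.Linear(node_dim + rel_dim, node_dim)    # relation-conditioned message
        self.att = nn.Linear(2 * node_dim + rel_dim, 1)       # score from (src, relation, dst)

    def forward(self, h, edge_index, edge_type):
        # h: [N, node_dim]; edge_index: [2, E] with (src, dst); edge_type: [E]
        src, dst = edge_index
        r = self.rel_emb(edge_type)                            # [E, rel_dim]
        m = self.msg(torch.cat([h[src], r], dim=-1))           # messages carry relation semantics
        score = self.att(torch.cat([h[src], r, h[dst]], dim=-1)).squeeze(-1)

        # normalize scores over the incoming edges of each destination node
        alpha = torch.exp(score - score.max())
        denom = torch.zeros(h.size(0), device=h.device).index_add_(0, dst, alpha) + 1e-16
        alpha = alpha / denom[dst]

        # aggregate attention-weighted messages per destination node
        out = torch.zeros_like(h).index_add_(0, dst, alpha.unsqueeze(-1) * m)
        return F.relu(out)
```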
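The ASA idea can likewise be sketched: among several corrupted candidates, pick a negative that scores high (hard) but stays a margin below its paired positive, so that candidates scoring as well as the positive, which are likely false negatives, are avoided. The scoring function, the margin mu, and the candidate count k below are placeholder assumptions rather than the paper's exact formulation.

```python
# Hedged sketch of adaptive self-adversarial (ASA) negative sampling (illustrative only).
import torch

def asa_negative_sample(score_fn, pos_src, pos_rel, pos_dst, num_nodes, k=32, mu=1.0):
    # pos_src / pos_rel / pos_dst: [B] index tensors for a batch of positive triples
    B = pos_src.size(0)
    cand_dst = torch.randint(0, num_nodes, (B, k), device=pos_src.device)  # corrupt the tail
    pos_score = score_fn(pos_src, pos_rel, pos_dst).unsqueeze(-1)          # [B, 1]
    cand_score = score_fn(
        pos_src.unsqueeze(-1).expand(-1, k),
        pos_rel.unsqueeze(-1).expand(-1, k),
        cand_dst,
    )                                                                      # [B, k]

    # choose the candidate whose score is closest to (positive score - mu):
    # challenging, yet kept a margin below the positive to limit false negatives
    target = pos_score - mu
    best = (cand_score - target).abs().argmin(dim=-1)                      # [B]
    return cand_dst.gather(1, best.unsqueeze(-1)).squeeze(-1)              # chosen negative tails
```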
