Graph Attention Auto-Encoders

05/26/2019
by   Amin Salehi, et al.

Auto-encoders have emerged as a successful framework for unsupervised learning. However, conventional auto-encoders are incapable of utilizing explicit relations in structured data. To take advantage of relations in graph-structured data, several graph auto-encoders have recently been proposed, but each of them reconstructs either the graph structure or the node attributes, not both. In this paper, we present the graph attention auto-encoder (GATE), a neural network architecture for unsupervised representation learning on graph-structured data. Our architecture is able to reconstruct graph-structured inputs, including both node attributes and the graph structure, through stacked encoder/decoder layers equipped with self-attention mechanisms. The encoder treats node attributes as the initial node representations, and each layer generates new node representations by attending over the representations of each node's neighbors. The decoder reverses the encoding process to reconstruct the node attributes. Moreover, node representations are regularized to reconstruct the graph structure. Our proposed architecture does not require the graph structure to be known upfront, so it can be applied to inductive learning. Our experiments demonstrate competitive performance on several node classification benchmark datasets for both transductive and inductive tasks, even exceeding the performance of supervised learning baselines in most cases.
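
The encoder/decoder scheme described in the abstract can be illustrated with a short sketch. Below is a minimal, hypothetical PyTorch implementation, not the authors' code: a single attention-based encoder layer and a mirrored decoder layer, trained with an attribute-reconstruction loss plus a structure-reconstruction regularizer on the node embeddings. The class and variable names, the dense-adjacency assumption, and the GAT-style additive attention scoring are illustrative choices, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionLayer(nn.Module):
    """One self-attention layer: each node's new representation is a
    weighted sum of its neighbors' transformed representations."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.attn_src = nn.Linear(out_dim, 1, bias=False)
        self.attn_dst = nn.Linear(out_dim, 1, bias=False)

    def forward(self, h, adj):
        # h: [N, in_dim] node representations; adj: [N, N] float adjacency
        # matrix with self-loops (dense, for simplicity).
        z = self.W(h)                                    # [N, out_dim]
        scores = self.attn_src(z) + self.attn_dst(z).T   # [N, N] additive scores
        scores = scores.masked_fill(adj == 0, float("-inf"))
        alpha = F.softmax(scores, dim=1)                 # attend over neighbors only
        return torch.relu(alpha @ z)


class GraphAttentionAutoEncoder(nn.Module):
    def __init__(self, feat_dim, hidden_dim):
        super().__init__()
        self.encoder = AttentionLayer(feat_dim, hidden_dim)
        self.decoder = AttentionLayer(hidden_dim, feat_dim)  # mirrors the encoder

    def forward(self, x, adj):
        h = self.encoder(x, adj)       # node embeddings
        x_rec = self.decoder(h, adj)   # reconstructed node attributes
        return h, x_rec

    def loss(self, x, adj):
        h, x_rec = self(x, adj)
        attr_loss = F.mse_loss(x_rec, x)   # attribute reconstruction
        # Structure regularizer: embeddings of linked nodes should produce
        # high inner-product scores (link reconstruction).
        struct_loss = F.binary_cross_entropy_with_logits(h @ h.T, adj)
        return attr_loss + struct_loss


# Toy usage with random data; adjacency is symmetric with self-loops.
x = torch.randn(100, 32)
adj = (torch.rand(100, 100) < 0.05).float()
adj = ((adj + adj.T + torch.eye(100)) > 0).float()

model = GraphAttentionAutoEncoder(feat_dim=32, hidden_dim=16)
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
for _ in range(5):
    optimizer.zero_grad()
    loss = model.loss(x, adj)
    loss.backward()
    optimizer.step()
```

The sketch sums the two reconstruction terms with equal weight; in practice the structure regularizer would typically be scaled by a hyperparameter, and the dense adjacency matrix would be replaced with a sparse neighbor list for large graphs.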
