Covariant Compositional Networks For Learning Graphs

01/07/2018
by   Risi Kondor, et al.

Most existing neural networks for learning graphs address permutation invariance by conceiving of the network as a message passing scheme, where each node sums the feature vectors coming from its neighbors. We argue that this imposes a limitation on their representation power, and instead propose a new general architecture for representing objects consisting of a hierarchy of parts, which we call Covariant Compositional Networks (CCNs). Here, covariance means that the activation of each neuron must transform in a specific way under permutations, similarly to steerability in CNNs. We achieve covariance by making each activation transform according to a tensor representation of the permutation group, and derive the corresponding tensor aggregation rules that each neuron must implement. Experiments show that CCNs can outperform competing methods on standard graph learning benchmarks.
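To make the covariance requirement concrete, below is a minimal NumPy sketch (not the authors' implementation; the toy graph, array names, and functions are hypothetical). It contrasts a permutation-invariant sum aggregation, as used in standard message passing, with a second-order activation that transforms as A -> P A P^T when the nodes are relabeled by a permutation matrix P, which is the kind of tensorial transformation rule the abstract refers to.

# Minimal sketch (not the authors' code): contrasts permutation-invariant sum
# aggregation with a second-order activation that transforms covariantly,
# A -> P A P^T, under a relabeling of the nodes. The toy graph, feature
# array, and function names below are hypothetical.

import numpy as np

rng = np.random.default_rng(0)

n = 5                                        # number of nodes
adj = rng.integers(0, 2, size=(n, n))
adj = np.triu(adj, 1)
adj = adj + adj.T                            # symmetric adjacency, zero diagonal
feats = rng.normal(size=(n, 3))              # first-order node features

def sum_aggregate(adj, feats):
    # message-passing style update: each node sums its neighbours' features
    return adj @ feats

def second_order_activation(adj, feats):
    # pairwise (edge-indexed) activation: entry (i, j) carries feats[i] * feats[j]
    # on the edges, so both node indices remain exposed instead of being summed away
    return adj[:, :, None] * (feats[:, None, :] * feats[None, :, :])

perm = rng.permutation(n)
P = np.eye(n)[perm]                          # permutation matrix for the relabeling

adj_p = P @ adj @ P.T
feats_p = P @ feats

# first-order aggregation: the output rows are simply permuted
assert np.allclose(sum_aggregate(adj_p, feats_p), P @ sum_aggregate(adj, feats))

# second-order activation: both node indices are permuted together (A -> P A P^T)
A = second_order_activation(adj, feats)
A_p = second_order_activation(adj_p, feats_p)
assert np.allclose(A_p, A[perm][:, perm])

The point of the second check is that the pairwise activation keeps track of which pair of nodes each entry refers to rather than summing that information away, which is the extra structure the paper's tensor aggregation rules are designed to preserve.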


Related Research

06/26/2020 - Building powerful and equivariant graph neural networks with message-passing
Message-passing has proved to be an effective way to design graph neural...

03/17/2022 - On the expressive power of message-passing neural networks as global feature map transformers
We investigate the power of message-passing neural networks (MPNNs) in t...

05/08/2019 - PiNet: A Permutation Invariant Graph Neural Network for Graph Classification
We propose an end-to-end deep learning model for graph classifi...

11/23/2021 - Local Permutation Equivariance For Graph Neural Networks
In this work we develop a new method, named locally permutation-equivari...

07/16/2020 - Natural Graph Networks
Conventional neural message passing algorithms are invariant under permu...

02/02/2023 - Double Permutation Equivariance for Knowledge Graph Completion
This work provides a formalization of Knowledge Graphs (KGs) as a new cl...
