Characterizing Graph Datasets for Node Classification: Beyond Homophily-Heterophily Dichotomy

09/13/2022
by Oleg Platonov, et al.

Homophily is a graph property describing the tendency of edges to connect similar nodes; the opposite is called heterophily. While homophily is natural for many real-world networks, there are also networks without this property. It is often believed that standard message-passing graph neural networks (GNNs) do not perform well on non-homophilous graphs, and thus such datasets need special attention. While a lot of effort has been put into developing graph representation learning methods for heterophilous graphs, there is no universally agreed-upon measure of homophily. Several metrics for measuring homophily have been used in the literature; however, we show that all of them have critical drawbacks that prevent comparing homophily levels across different datasets. We formalize desirable properties for a proper homophily measure and show how existing literature on the properties of classification performance metrics can be linked to our problem. In doing so, we find a measure, which we call adjusted homophily, that satisfies more desirable properties than existing homophily measures. Interestingly, this measure is related to two classification performance metrics: Cohen's Kappa and the Matthews correlation coefficient. Then, we go beyond the homophily-heterophily dichotomy and propose a new property, which we call label informativeness (LI), that characterizes how much information a neighbor's label provides about a node's label. We theoretically show that LI is comparable across datasets with different numbers of classes and different class size balance. Through a series of experiments, we show that LI is a better predictor of the performance of GNNs on a dataset than homophily. Finally, we show that LI explains why GNNs can sometimes perform well on heterophilous datasets, a phenomenon recently observed in the literature.
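
As an illustration of the quantities discussed above, below is a minimal Python sketch of edge homophily, a chance-corrected (Cohen's-Kappa-style) adjustment of it, and a label-informativeness score defined as the mutual information between the labels of an edge's endpoints normalized by label entropy. The exact normalizations, the degree-weighted class distribution, and all function names are illustrative assumptions rather than the paper's verbatim definitions.

```python
# Hypothetical sketch of the measures described in the abstract, assuming:
#  - edge homophily = fraction of edges joining same-class nodes;
#  - adjusted homophily = a chance-corrected version of edge homophily,
#    analogous to Cohen's Kappa;
#  - label informativeness (LI) = mutual information between the labels of
#    an edge's endpoints, normalized by label entropy.
from collections import Counter
from math import log


def edge_homophily(edges, labels):
    """Fraction of (undirected) edges whose endpoints share a label."""
    same = sum(labels[u] == labels[v] for u, v in edges)
    return same / len(edges)


def adjusted_homophily(edges, labels):
    """Chance-corrected edge homophily: (h_edge - E[h]) / (1 - E[h]),
    where E[h] is the homophily expected when edge endpoints hit classes
    proportionally to each class's share of total degree."""
    h_edge = edge_homophily(edges, labels)
    deg_share = Counter()  # summed degree per class
    for u, v in edges:
        deg_share[labels[u]] += 1
        deg_share[labels[v]] += 1
    total = 2 * len(edges)
    expected = sum((d / total) ** 2 for d in deg_share.values())
    return (h_edge - expected) / (1 - expected)


def label_informativeness(edges, labels):
    """LI = I(y_u; y_v) / H(y), with distributions taken over a uniformly
    sampled edge (u, v), counted in both directions."""
    pair_counts, class_counts = Counter(), Counter()
    for u, v in edges:
        for a, b in ((labels[u], labels[v]), (labels[v], labels[u])):
            pair_counts[(a, b)] += 1
            class_counts[a] += 1
    n_pairs = sum(pair_counts.values())
    n_cls = sum(class_counts.values())
    p_cls = {c: k / n_cls for c, k in class_counts.items()}
    entropy = -sum(p * log(p) for p in p_cls.values())
    mi = 0.0
    for (a, b), k in pair_counts.items():
        p_ab = k / n_pairs
        mi += p_ab * log(p_ab / (p_cls[a] * p_cls[b]))
    return mi / entropy


# Toy usage: a 4-node path graph with two classes.
edges = [(0, 1), (1, 2), (2, 3)]
labels = {0: "A", 1: "A", 2: "B", 3: "B"}
print(edge_homophily(edges, labels))        # 2/3
print(adjusted_homophily(edges, labels))
print(label_informativeness(edges, labels))
```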

Related research:

02/22/2023 · A critical look at the evaluation of GNNs under heterophily: are we really making progress?
01/22/2022 · Good Classification Measures and How to Find Them
09/05/2020 · A Simple and General Graph Neural Network with Stochastic Message Passing
10/06/2022 · Expander Graph Propagation
03/02/2021 · Graph Information Vanishing Phenomenon in Implicit Graph Neural Networks
06/06/2023 · How does over-squashing affect the power of GNNs?
04/20/2022 · Simplicial Attention Networks
