CrysGNN: Distilling pre-trained knowledge to enhance property prediction for crystalline materials

by Kishalay Das, et al.

In recent years, graph neural network (GNN) based approaches have emerged as a powerful technique for encoding the complex topological structure of crystalline materials in an enriched representation space. These models are typically supervised and, using property-specific training data, learn the relationship between crystal structure and properties such as formation energy, bandgap, and bulk modulus. Most of these methods require a large amount of property-tagged data to train, which may not be available for every property. However, a huge amount of crystal data is available with chemical composition and structural bonds but without property labels. To leverage this untapped data, this paper presents CrysGNN, a new pre-trained GNN framework for crystalline materials that captures both node-level and graph-level structural information of crystal graphs from a large corpus of unlabelled material data. Further, we extract distilled knowledge from CrysGNN and inject it into different state-of-the-art property predictors to enhance their prediction accuracy. We conduct extensive experiments showing that, with distilled knowledge from the pre-trained model, all the SOTA algorithms outperform their vanilla versions by good margins. We also observe that the distillation process provides a significant improvement over the conventional approach of finetuning the pre-trained model. We have released the pre-trained model along with a carefully curated dataset of 800K crystal graphs, so that the pre-trained model can be plugged into any existing or upcoming model to enhance its prediction accuracy.
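The distillation idea described above can be sketched as a combined training objective: a supervised property-prediction loss plus a term that pulls the student predictor's embeddings toward those of the pre-trained teacher. The following is a minimal illustrative sketch, not the authors' code; the MSE-based distillation term, the `alpha` weighting, and all function names are assumptions for illustration.

```python
import numpy as np

def mse(a, b):
    """Mean squared error between two arrays."""
    return float(np.mean((np.asarray(a) - np.asarray(b)) ** 2))

def distillation_loss(pred, target, student_emb, teacher_emb, alpha=0.5):
    """Hypothetical combined objective for training a property predictor
    with distilled knowledge from a pre-trained teacher GNN:

        loss = (1 - alpha) * property_loss + alpha * embedding_loss

    `alpha` trades off fitting the labelled property data against
    matching the (frozen) teacher's crystal-graph embeddings.
    """
    supervised = mse(pred, target)          # fit the property labels
    distill = mse(student_emb, teacher_emb) # mimic the teacher's representation
    return (1 - alpha) * supervised + alpha * distill
```

In practice the teacher embeddings would come from a frozen pre-trained model's forward pass, while the student's predictions and embeddings are differentiable and trained jointly.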




Related articles:

- Gode – Integrating Biochemical Knowledge Graph into Pre-training Molecule Graph Neural Network
- Prompt Tuning for Graph Neural Networks
- Crystal Twins: Self-supervised Learning for Crystalline Material Property Prediction
- MolCPT: Molecule Continuous Prompt Tuning to Generalize Molecular Representation Learning
- K-Adapter: Infusing Knowledge into Pre-Trained Models with Adapters
- Learning To Rank Resources with GNN
- Using Rule-Based Labels for Weak Supervised Learning: A ChemNet for Transferable Chemical Property Prediction
