Scalable deeper graph neural networks for high-performance materials property prediction

Bibliographic Details
Main Authors: Sadman Sadeed Omee, Steph-Yves Louis, Nihang Fu, Lai Wei, Sourin Dey, Rongzhi Dong, Qinyang Li, Jianjun Hu
Format: Article
Language: English
Published: Elsevier 2022-05-01
Series: Patterns
Subjects:
Online Access: http://www.sciencedirect.com/science/article/pii/S2666389922000769
Description
Summary: Machine-learning-based materials property prediction models have emerged as a promising approach for new materials discovery, among which graph neural networks (GNNs) have shown the best performance due to their capability to learn high-level features from crystal structures. However, existing GNN models suffer from limited scalability, high hyperparameter tuning complexity, and constrained performance due to over-smoothing. We propose DeeperGATGNN, a scalable global graph attention neural network with differentiable group normalization (DGN) and skip connections, for high-performance materials property prediction. Our systematic benchmark studies show that the model achieves state-of-the-art prediction results on five out of six datasets, outperforming five existing GNN models by up to 10%. It is also the most scalable model in terms of graph convolution layers, which allows us to train very deep networks (e.g., >30 layers) without significant performance degradation. Our implementation is available at https://github.com/usccolumbia/deeperGATGNN.

The bigger picture: Modern deep-learning-based generative models have made it possible to computationally design millions of hypothetical materials, but fast and accurate property prediction models are needed to screen these candidates at scale for new materials discovery. Graph neural networks (GNNs) have emerged as the most competitive models for materials property prediction, yet their performance and scalability remain constrained by the over-smoothing issue. We present DeeperGATGNN, a global attention-based GNN with differentiable group normalization and residual connections that achieves not only state-of-the-art performance on five out of six datasets but also high scalability. Our technique allows us to build very deep GNNs without the significant performance degradation that other GNNs exhibit, and it can be applied generally to build scalable GNNs in any domain where large deep learning models are needed.
ISSN: 2666-3899
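
Illustration (not part of the published record): the description above names the two ingredients that make the network trainable at depth, differentiable group normalization (DGN) and skip connections. Below is a minimal sketch of how one such convolution block could be assembled, assuming PyTorch and PyTorch Geometric. PyG's GATConv is used here as a stand-in for the authors' global attention layer, and DeepGATBlock, DifferentiableGroupNorm, and all hyperparameters are hypothetical choices for illustration, not the reference implementation (see the GitHub link above for that).

import torch
import torch.nn as nn
import torch.nn.functional as F
from torch_geometric.nn import GATConv  # stand-in for the paper's global attention layer


class DifferentiableGroupNorm(nn.Module):
    """Differentiable group normalization (Zhou et al., 2020): softly assigns
    nodes to groups and normalizes each group separately, which helps keep
    node embeddings distinguishable in very deep GNNs (less over-smoothing)."""

    def __init__(self, dim: int, num_groups: int = 10, lam: float = 0.01):
        super().__init__()
        self.assign = nn.Linear(dim, num_groups)  # learnable soft cluster assignment
        self.norms = nn.ModuleList(nn.BatchNorm1d(dim) for _ in range(num_groups))
        self.lam = lam  # weight of the group-normalized term

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        s = F.softmax(self.assign(x), dim=1)  # [N, G] soft group memberships
        out = x
        for g, norm in enumerate(self.norms):
            # scale each node embedding by its membership in group g, then normalize
            out = out + self.lam * norm(s[:, g:g + 1] * x)
        return out


class DeepGATBlock(nn.Module):
    """One convolution block: graph attention conv -> DGN -> skip connection.
    The additive skip connection lets gradients flow through deep stacks."""

    def __init__(self, dim: int = 64, heads: int = 4):
        super().__init__()
        self.conv = GATConv(dim, dim // heads, heads=heads)  # output dim == dim
        self.dgn = DifferentiableGroupNorm(dim)

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        h = F.softplus(self.conv(x, edge_index))
        return x + self.dgn(h)  # skip connection: input added back to the block output


# Toy usage: 100 nodes with 64-dim features and a random edge list.
blocks = nn.ModuleList(DeepGATBlock(64) for _ in range(30))  # a deep (30-layer) stack
x = torch.randn(100, 64)
edge_index = torch.randint(0, 100, (2, 400))
for block in blocks:
    x = block(x, edge_index)  # shape preserved: [100, 64]

The point of the sketch is the composition order: because each block's output is its input plus a group-normalized update, stacking 30 or more of them does not force all node embeddings toward a common value, which is the failure mode (over-smoothing) the abstract describes.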