Accelerating defect predictions in semiconductors using graph neural networks
First-principles computations reliably predict the energetics of point defects in semiconductors but are constrained by the expense of using large supercells and advanced levels of theory. Machine learning models trained on computational data, especially ones that sufficiently encode defect coordination environments, can be used to accelerate defect predictions.
Main Authors: | Md Habibur Rahman, Prince Gollapalli, Panayotis Manganaris, Satyesh Kumar Yadav, Ghanshyam Pilania, Brian DeCost, Kamal Choudhary, Arun Mannodi-Kanakkithodi |
---|---|
Format: | Article |
Language: | English |
Published: | AIP Publishing LLC, 2024-03-01 |
Series: | APL Machine Learning |
Online Access: | http://dx.doi.org/10.1063/5.0176333 |
_version_ | 1827297344325943296 |
author | Md Habibur Rahman, Prince Gollapalli, Panayotis Manganaris, Satyesh Kumar Yadav, Ghanshyam Pilania, Brian DeCost, Kamal Choudhary, Arun Mannodi-Kanakkithodi |
author_facet | Md Habibur Rahman, Prince Gollapalli, Panayotis Manganaris, Satyesh Kumar Yadav, Ghanshyam Pilania, Brian DeCost, Kamal Choudhary, Arun Mannodi-Kanakkithodi |
author_sort | Md Habibur Rahman |
collection | DOAJ |
description | First-principles computations reliably predict the energetics of point defects in semiconductors but are constrained by the expense of using large supercells and advanced levels of theory. Machine learning models trained on computational data, especially ones that sufficiently encode defect coordination environments, can be used to accelerate defect predictions. Here, we develop a framework for the prediction and screening of native defects and functional impurities in a chemical space of group IV, III–V, and II–VI zinc blende semiconductors, powered by crystal Graph-based Neural Networks (GNNs) trained on high-throughput density functional theory (DFT) data. Using an innovative approach of sampling partially optimized defect configurations from DFT calculations, we generate one of the largest computational defect datasets to date, containing many types of vacancies, self-interstitials, anti-site substitutions, impurity interstitials and substitutions, as well as some defect complexes. We applied three types of established GNN techniques, namely crystal graph convolutional neural network, materials graph network, and Atomistic Line Graph Neural Network (ALIGNN), to rigorously train models for predicting defect formation energy (DFE) in multiple charge states and chemical potential conditions. We find that ALIGNN yields the best DFE predictions with root mean square errors around 0.3 eV, which represents a prediction accuracy of 98% given the range of values within the dataset, improving significantly on the state-of-the-art. We further show that GNN-based defective structure optimization can take us close to DFT-optimized geometries at a fraction of the cost of full DFT. 
The current models are based on the semi-local generalized gradient approximation-Perdew–Burke–Ernzerhof (PBE) functional but are highly promising because of the correlation of computed energetics and defect levels with higher levels of theory and experimental data, the accuracy and necessity of discovering novel metastable and low energy defect structures at the PBE level of theory before advanced methods could be applied, and the ability to train multi-fidelity models in the future with new data from non-local functionals. The DFT-GNN models enable prediction and screening across thousands of hypothetical defects based on both unoptimized and partially optimized defective structures, helping identify electronically active defects in technologically important semiconductors. |
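The description's "defect formation energy (DFE) in multiple charge states and chemical potential conditions" refers to the standard supercell convention used throughout the defect-computation literature. A minimal sketch of that convention follows, assuming the usual charged-defect form; the function name and every numerical value below are illustrative placeholders, not quantities taken from the paper.

```python
def defect_formation_energy(e_defect, e_bulk, delta_n_mu, charge,
                            e_fermi, e_vbm, e_corr=0.0):
    """Standard charged-defect formation energy:

        E_f[X^q] = E[X^q] - E[bulk] - sum_i n_i * mu_i
                   + q * (E_F + E_VBM) + E_corr

    e_defect:   total energy of the defective supercell (eV)
    e_bulk:     total energy of the pristine supercell (eV)
    delta_n_mu: sum over species of n_i * mu_i, where n_i is the number
                of atoms of species i added (+) or removed (-); the
                chemical potentials mu_i encode growth conditions (eV)
    charge:     defect charge state q
    e_fermi:    Fermi level referenced to the VBM (eV)
    e_vbm:      valence band maximum of the bulk (eV)
    e_corr:     finite-size / electrostatic correction term (eV)
    """
    return (e_defect - e_bulk - delta_n_mu
            + charge * (e_fermi + e_vbm) + e_corr)


# Illustrative call for a hypothetical -1 charged vacancy: removing one
# atom contributes delta_n_mu = (-1) * mu_atom.
mu_atom = -4.5
dfe = defect_formation_energy(e_defect=-520.1, e_bulk=-526.0,
                              delta_n_mu=-1 * mu_atom, charge=-1,
                              e_fermi=0.5, e_vbm=3.0, e_corr=0.1)
print(round(dfe, 2))  # -2.0
```

Because the chemical-potential and Fermi-level terms enter linearly, a model that predicts the raw supercell energetics (as the GNNs described above do) can report the DFE under any chemical potential condition or charge state without retraining.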
first_indexed | 2024-04-24T14:53:52Z |
format | Article |
id | doaj.art-d69c2a05bee64b1d8b4edec5e91a559c |
institution | Directory Open Access Journal |
issn | 2770-9019 |
language | English |
last_indexed | 2024-04-24T14:53:52Z |
publishDate | 2024-03-01 |
publisher | AIP Publishing LLC |
record_format | Article |
series | APL Machine Learning |
spelling | doaj.art-d69c2a05bee64b1d8b4edec5e91a559c | 2024-04-02T19:46:06Z | eng | AIP Publishing LLC | APL Machine Learning | 2770-9019 | 2024-03-01 | Vol. 2, No. 1, 016122, pp. 016122–016122-18 | 10.1063/5.0176333 | Accelerating defect predictions in semiconductors using graph neural networks |
Authors and affiliations:
- Md Habibur Rahman, School of Materials Engineering, Purdue University, West Lafayette, Indiana 47907, USA
- Prince Gollapalli, Department of Metallurgical and Materials Engineering, Indian Institute of Technology (IIT) Madras, Chennai 600036, India
- Panayotis Manganaris, School of Materials Engineering, Purdue University, West Lafayette, Indiana 47907, USA
- Satyesh Kumar Yadav, Department of Metallurgical and Materials Engineering, Indian Institute of Technology (IIT) Madras, Chennai 600036, India
- Ghanshyam Pilania, GE Research, Schenectady, New York 12309, USA
- Brian DeCost, Materials Measurement Laboratory, National Institute of Standards and Technology, Gaithersburg, Maryland 20899, USA
- Kamal Choudhary, Materials Measurement Laboratory, National Institute of Standards and Technology, Gaithersburg, Maryland 20899, USA
- Arun Mannodi-Kanakkithodi, School of Materials Engineering, Purdue University, West Lafayette, Indiana 47907, USA
Online access: http://dx.doi.org/10.1063/5.0176333 |
spellingShingle | Md Habibur Rahman, Prince Gollapalli, Panayotis Manganaris, Satyesh Kumar Yadav, Ghanshyam Pilania, Brian DeCost, Kamal Choudhary, Arun Mannodi-Kanakkithodi | Accelerating defect predictions in semiconductors using graph neural networks | APL Machine Learning |
title | Accelerating defect predictions in semiconductors using graph neural networks |
title_full | Accelerating defect predictions in semiconductors using graph neural networks |
title_fullStr | Accelerating defect predictions in semiconductors using graph neural networks |
title_full_unstemmed | Accelerating defect predictions in semiconductors using graph neural networks |
title_short | Accelerating defect predictions in semiconductors using graph neural networks |
title_sort | accelerating defect predictions in semiconductors using graph neural networks |
url | http://dx.doi.org/10.1063/5.0176333 |
work_keys_str_mv | AT mdhabiburrahman acceleratingdefectpredictionsinsemiconductorsusinggraphneuralnetworks AT princegollapalli acceleratingdefectpredictionsinsemiconductorsusinggraphneuralnetworks AT panayotismanganaris acceleratingdefectpredictionsinsemiconductorsusinggraphneuralnetworks AT satyeshkumaryadav acceleratingdefectpredictionsinsemiconductorsusinggraphneuralnetworks AT ghanshyampilania acceleratingdefectpredictionsinsemiconductorsusinggraphneuralnetworks AT briandecost acceleratingdefectpredictionsinsemiconductorsusinggraphneuralnetworks AT kamalchoudhary acceleratingdefectpredictionsinsemiconductorsusinggraphneuralnetworks AT arunmannodikanakkithodi acceleratingdefectpredictionsinsemiconductorsusinggraphneuralnetworks |