BLASTNet: A call for community-involved big data in combustion machine learning
Many state-of-the-art machine learning (ML) fields rely on large datasets and massive deep learning models (with O(109) trainable parameters) to predict target variables accurately without overfitting. Within combustion, a wealth of data exists in the form of high-fidelity simulation data and detailed measurements...
Main Authors: | Wai Tong Chung, Ki Sung Jung, Jacqueline H. Chen, Matthias Ihme |
---|---|
Format: | Article |
Language: | English |
Published: | Elsevier, 2022-12-01 |
Series: | Applications in Energy and Combustion Science |
Subjects: | Big data; Deep learning; Direct numerical simulation; BLASTNet |
Online Access: | http://www.sciencedirect.com/science/article/pii/S2666352X22000309 |
_version_ | 1811315722567024640 |
---|---|
author | Wai Tong Chung; Ki Sung Jung; Jacqueline H. Chen; Matthias Ihme |
author_facet | Wai Tong Chung; Ki Sung Jung; Jacqueline H. Chen; Matthias Ihme |
author_sort | Wai Tong Chung |
collection | DOAJ |
description | Many state-of-the-art machine learning (ML) fields rely on large datasets and massive deep learning models (with O(109) trainable parameters) to predict target variables accurately without overfitting. Within combustion, a wealth of data exists in the form of high-fidelity simulation data and detailed measurements that have been accumulating over the past decade. Yet, this data remains distributed and can be difficult to access. In this work, we present a realistic and feasible framework that combines (i) community involvement, (ii) public data repositories, and (iii) lossy compression algorithms for enabling broad access to high-fidelity data via a network-of-datasets approach. This Bearable Large Accessible Scientific Training Network-of-Datasets (BLASTNet) is consolidated on a community-hosted web-platform (at https://blastnet.github.io/), and is targeted towards improving accessibility to diverse scientific data for deep learning algorithms. For datasets that exceed the storage limitations of public ML repositories, we propose employing lossy compression algorithms on high-fidelity data, at the cost of introducing controllable amounts of error to the data. This framework leverages the well-known robustness of modern deep learning methods to noisy data, which we demonstrate is also applicable in combustion by training deep learning models on lossy direct numerical simulation (DNS) data in two completely different ML problems — one in combustion regime classification and the other in filtered reaction rate regression. Our results show that combustion DNS data can be compressed by at least 10-fold without affecting deep learning models, and that the resulting lossy errors can even improve their training. We thus call on the research community to help contribute to opening a bearable pathway towards accessible big data in combustion. |
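The abstract's core idea — trading a controllable amount of error for a large storage reduction on simulation data — can be illustrated with a minimal sketch. The snippet below is not the compressor used by BLASTNet (the paper evaluates dedicated lossy compression algorithms on DNS fields); it merely demonstrates the trade-off on a synthetic scalar field by downcasting float64 values to float16, which yields a fixed 4x size reduction with a bounded relative error.

```python
import numpy as np

# Synthetic stand-in for a DNS scalar field (e.g., a normalized temperature
# field); the real datasets are far larger and use dedicated compressors.
rng = np.random.default_rng(0)
field = rng.normal(loc=1.0, scale=0.1, size=(32, 32, 32))

# Lossy "compression": store at reduced precision, then restore for training.
compressed = field.astype(np.float16)     # 8 bytes/value -> 2 bytes/value
restored = compressed.astype(np.float64)  # decompression step

ratio = field.nbytes / compressed.nbytes
max_rel_err = np.max(np.abs(restored - field) / np.abs(field))

print(f"compression ratio: {ratio:.0f}x, max relative error: {max_rel_err:.1e}")
```

Because float16 carries roughly three significant decimal digits, the introduced error is small relative to typical modeling and measurement uncertainties — which is the kind of controlled degradation the paper shows deep learning models tolerate (and sometimes benefit from) during training.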
first_indexed | 2024-04-13T11:35:30Z |
format | Article |
id | doaj.art-abb66e0b52ed4056a8616fe487e2a0c2 |
institution | Directory Open Access Journal |
issn | 2666-352X |
language | English |
last_indexed | 2024-04-13T11:35:30Z |
publishDate | 2022-12-01 |
publisher | Elsevier |
record_format | Article |
series | Applications in Energy and Combustion Science |
spelling | doaj.art-abb66e0b52ed4056a8616fe487e2a0c2 2022-12-22T02:48:27Z eng Elsevier Applications in Energy and Combustion Science 2666-352X 2022-12-01 12 100087 BLASTNet: A call for community-involved big data in combustion machine learning. Wai Tong Chung (Department of Mechanical Engineering, Stanford University, Stanford, CA 94305, USA; Corresponding author); Ki Sung Jung (Combustion Research Facility, Sandia National Laboratories, Livermore, CA 94550, USA); Jacqueline H. Chen (Combustion Research Facility, Sandia National Laboratories, Livermore, CA 94550, USA); Matthias Ihme (Department of Mechanical Engineering, Stanford University, Stanford, CA 94305, USA; Department of Photon Science, SLAC National Accelerator Laboratory, Menlo Park, CA 94025, USA). http://www.sciencedirect.com/science/article/pii/S2666352X22000309 Big data; Deep learning; Direct numerical simulation; BLASTNet |
spellingShingle | Wai Tong Chung Ki Sung Jung Jacqueline H. Chen Matthias Ihme BLASTNet: A call for community-involved big data in combustion machine learning Applications in Energy and Combustion Science Big data Deep learning Direct numerical simulation BLASTNet |
title | BLASTNet: A call for community-involved big data in combustion machine learning |
title_full | BLASTNet: A call for community-involved big data in combustion machine learning |
title_fullStr | BLASTNet: A call for community-involved big data in combustion machine learning |
title_full_unstemmed | BLASTNet: A call for community-involved big data in combustion machine learning |
title_short | BLASTNet: A call for community-involved big data in combustion machine learning |
title_sort | blastnet a call for community involved big data in combustion machine learning |
topic | Big data; Deep learning; Direct numerical simulation; BLASTNet |
url | http://www.sciencedirect.com/science/article/pii/S2666352X22000309 |
work_keys_str_mv | AT waitongchung blastnetacallforcommunityinvolvedbigdataincombustionmachinelearning AT kisungjung blastnetacallforcommunityinvolvedbigdataincombustionmachinelearning AT jacquelinehchen blastnetacallforcommunityinvolvedbigdataincombustionmachinelearning AT matthiasihme blastnetacallforcommunityinvolvedbigdataincombustionmachinelearning |