LiCROM: Linear-Subspace Continuous Reduced Order Modeling with Neural Fields
Linear reduced-order modeling (ROM) simplifies complex simulations by approximating the behavior of a system using a simplified kinematic representation. Typically, ROM is trained on input simulations created with a specific spatial discretization, and then serves to accelerate simulations with the same discretization.
Main Authors: | Chang, Yue; Chen, Peter Yichen; Wang, Zhecheng; Chiaramonte, Maurizio M.; Carlberg, Kevin; Grinspun, Eitan |
---|---|
Other Authors: | Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory |
Format: | Article |
Language: | English |
Published: | ACM|SIGGRAPH Asia 2023 Conference Papers, 2024 |
Online Access: | https://hdl.handle.net/1721.1/153261 |
author | Chang, Yue; Chen, Peter Yichen; Wang, Zhecheng; Chiaramonte, Maurizio M.; Carlberg, Kevin; Grinspun, Eitan |
author2 | Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory |
collection | MIT |
description | Linear reduced-order modeling (ROM) simplifies complex simulations by approximating the behavior of a system using a simplified kinematic representation. Typically, ROM is trained on input simulations created with a specific spatial discretization, and then serves to accelerate simulations with the same discretization. This discretization-dependence is restrictive.
Becoming independent of a specific discretization would provide flexibility to mix and match mesh resolutions, connectivity, and type (tetrahedral, hexahedral) in training data; to accelerate simulations with novel discretizations unseen during training; and to accelerate adaptive simulations that temporally or parametrically change the discretization.
We present a flexible, discretization-independent approach to reduced-order modeling. Like traditional ROM, we represent the configuration as a linear combination of displacement fields. Unlike traditional ROM, our displacement fields are continuous maps from every point on the reference domain to a corresponding displacement vector; these maps are represented as implicit neural fields.
With linear continuous ROM (LiCROM), our training set can include multiple geometries undergoing multiple loading conditions, independent of their discretization. This opens the door to novel applications of reduced-order modeling. We can now accelerate simulations that modify the geometry at runtime, for instance via cutting, hole punching, and even swapping the entire mesh. We can also accelerate simulations of geometries unseen during training. We demonstrate one-shot generalization: training on a single geometry and subsequently simulating various unseen geometries. |
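The abstract's kinematic ansatz (a configuration expressed as a linear combination of continuous displacement fields, each an implicit neural field mapping reference points to displacement vectors) can be illustrated compactly. The sketch below is not the authors' implementation: the network architecture, the reduced dimension r, and all identifiers are assumptions chosen for clarity.

```python
# Minimal sketch (assumed architecture, not the paper's code) of
# LiCROM-style kinematics: u(X) = sum_i q_i * g_i(X), where each basis
# field g_i is a continuous map represented by a shared neural field.
import torch
import torch.nn as nn

class LinearSubspaceNeuralField(nn.Module):
    """Maps a reference point X in R^dim to r basis displacement vectors."""
    def __init__(self, dim=3, r=10, width=128):
        super().__init__()
        self.r, self.dim = r, dim
        self.net = nn.Sequential(
            nn.Linear(dim, width), nn.ELU(),
            nn.Linear(width, width), nn.ELU(),
            nn.Linear(width, r * dim),  # r displacement vectors, stacked
        )

    def forward(self, X):
        # X: (n, dim) batch of reference-domain points, sampled from any
        # discretization (or none at all). Returns g(X): (n, r, dim).
        return self.net(X).view(-1, self.r, self.dim)

def displacement(field, X, q):
    """u(X) = sum_i q_i * g_i(X): linear in the reduced coordinates q."""
    g = field(X)                           # (n, r, dim)
    return torch.einsum("nrd,r->nd", g, q)

# Usage: evaluate the reduced kinematics at arbitrary query points.
field = LinearSubspaceNeuralField()
X = torch.rand(1000, 3)    # sample points; no mesh connectivity required
q = torch.zeros(10)        # reduced coordinates (generalized DOFs)
u = displacement(field, X, q)              # (1000, 3) displacements
```

Because u(X) is linear in the reduced coordinates q, the usual linear-ROM machinery (subspace projection of the equations of motion) carries over unchanged, while the evaluation points X may come from any mesh resolution, connectivity, or element type; this is the source of the discretization independence described above.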
format | Article |
id | mit-1721.1/153261 |
institution | Massachusetts Institute of Technology |
language | English |
publishDate | 2024 |
publisher | ACM|SIGGRAPH Asia 2023 Conference Papers |
record_format | dspace |
date issued | 2023-12-10 |
type | Conference Paper (http://purl.org/eprint/type/ConferencePaper) |
isbn | 979-8-4007-0315-7 |
doi | https://doi.org/10.1145/3610548.3618158 |
citation | Chang, Yue, Chen, Peter Yichen, Wang, Zhecheng, Chiaramonte, Maurizio M., Carlberg, Kevin et al. 2023. "LiCROM: Linear-Subspace Continuous Reduced Order Modeling with Neural Fields." ACM|SIGGRAPH Asia 2023 Conference Papers. Association for Computing Machinery. |
rights | Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use. The author(s). |
file format | application/pdf |
title | LiCROM: Linear-Subspace Continuous Reduced Order Modeling with Neural Fields |
url | https://hdl.handle.net/1721.1/153261 |