An Improved Method for Physics-Informed Neural Networks That Accelerates Convergence
Physics-Informed Neural Networks (PINNs) have proven highly effective for solving high-dimensional Partial Differential Equations (PDEs), having demonstrated tremendous potential in a variety of challenging scenarios. However, traditional PINNs (vanilla PINNs), typically based on fully connected neu...
Main Authors: | Liangliang Yan, You Zhou, Huan Liu, Lingqi Liu |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2024-01-01 |
Series: | IEEE Access |
Subjects: | Physics-informed neural network; partial differential equations; multi-input residual network; convergence speed; unsupervised learning |
Online Access: | https://ieeexplore.ieee.org/document/10399482/ |
_version_ | 1797302656954269696 |
---|---|
author | Liangliang Yan; You Zhou; Huan Liu; Lingqi Liu |
author_facet | Liangliang Yan; You Zhou; Huan Liu; Lingqi Liu |
author_sort | Liangliang Yan |
collection | DOAJ |
description | Physics-Informed Neural Networks (PINNs) have proven highly effective for solving high-dimensional Partial Differential Equations (PDEs) and have demonstrated tremendous potential in a variety of challenging scenarios. However, traditional PINNs (vanilla PINNs), typically based on fully connected neural networks (FCNN), often face issues with convergence and parameter redundancy. This paper proposes a novel approach that uses a multi-input residual network together with a multi-step training paradigm to enable unsupervised training. The improved method, named MultiInNet PINNs, enhances both the convergence speed and the stability of traditional PINNs. Our experiments demonstrate that MultiInNet PINNs achieve better convergence with fewer parameters than other networks such as FCNN, ResNet, and UNet. Specifically, the multi-step training increases convergence speed by approximately 45%, while the MultiInNet enhancement contributes an additional 50%, for a total improvement of about 70%. This faster convergence allows PINNs to be trained at lower computational cost. Moreover, MultiInNet PINNs provide a potential way to handle initial and boundary conditions (I/BCs) separately within PINNs. |
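The abstract above contrasts MultiInNet PINNs with vanilla PINNs, i.e. fully connected networks trained unsupervised against a PDE residual. The record does not describe the MultiInNet architecture itself, so the following is only a minimal sketch of the vanilla PINN baseline the abstract refers to, assuming PyTorch, the 1D Burgers' equation as an illustrative PDE, and arbitrary network and sampling choices; it is not the authors' implementation.

```python
# Hedged sketch of a "vanilla" PINN (fully connected network + PDE-residual
# loss), the baseline the abstract contrasts against. The PDE, network size,
# and sampling below are illustrative assumptions, not taken from the paper.
import torch
import torch.nn as nn

class VanillaPINN(nn.Module):
    """Fully connected network mapping (x, t) -> u(x, t)."""
    def __init__(self, width=32, depth=4):
        super().__init__()
        layers = [nn.Linear(2, width), nn.Tanh()]
        for _ in range(depth - 1):
            layers += [nn.Linear(width, width), nn.Tanh()]
        layers.append(nn.Linear(width, 1))
        self.net = nn.Sequential(*layers)

    def forward(self, x, t):
        return self.net(torch.cat([x, t], dim=1))

def grad(out, inp):
    """First derivative of `out` w.r.t. `inp`, kept differentiable."""
    return torch.autograd.grad(out, inp, grad_outputs=torch.ones_like(out),
                               create_graph=True)[0]

def pde_residual(model, x, t, nu=0.01):
    """Burgers' residual u_t + u*u_x - nu*u_xx, assumed here as the PDE."""
    x = x.requires_grad_(True)
    t = t.requires_grad_(True)
    u = model(x, t)
    u_t, u_x = grad(u, t), grad(u, x)
    u_xx = grad(u_x, x)
    return u_t + u * u_x - nu * u_xx

# Unsupervised training: the loss uses only the PDE residual plus an
# initial-condition penalty (u(x, 0) = -sin(pi x)), no labeled solution data.
model = VanillaPINN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(2000):
    x_f = torch.rand(256, 1) * 2 - 1      # collocation points in [-1, 1]
    t_f = torch.rand(256, 1)              # times in [0, 1]
    x_0 = torch.rand(64, 1) * 2 - 1
    t_0 = torch.zeros_like(x_0)
    loss_pde = pde_residual(model, x_f, t_f).pow(2).mean()
    loss_ic = (model(x_0, t_0) + torch.sin(torch.pi * x_0)).pow(2).mean()
    loss = loss_pde + loss_ic
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The sketch shows the unsupervised setup the abstract mentions: only the PDE residual and initial/boundary penalties are minimized. The paper's MultiInNet variant replaces the fully connected backbone with a multi-input residual network and splits training into multiple steps, details of which are in the linked article.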
first_indexed | 2024-03-07T23:41:00Z |
format | Article |
id | doaj.art-0a6fcf311883490296853018af08c4d2 |
institution | Directory Open Access Journal |
issn | 2169-3536 |
language | English |
last_indexed | 2024-03-07T23:41:00Z |
publishDate | 2024-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
spelling | doaj.art-0a6fcf311883490296853018af08c4d2, updated 2024-02-20T00:00:53Z. English, IEEE, IEEE Access, ISSN 2169-3536, 2024-01-01, vol. 12, pp. 23943-23953, DOI 10.1109/ACCESS.2024.3354058, IEEE document 10399482. "An Improved Method for Physics-Informed Neural Networks That Accelerates Convergence." Liangliang Yan (https://orcid.org/0009-0000-4256-8807), You Zhou (https://orcid.org/0000-0001-6527-2208), Huan Liu, Lingqi Liu (https://orcid.org/0009-0005-9075-8345). Affiliations: Planetary Science Research Center and School of Computer and Security, Chengdu University of Technology, Chengdu, China (L. Yan, Y. Zhou, L. Liu); College of Electronic and Information Engineering, Jinggangshan University, Ji’an, China (H. Liu). Abstract as in the description field above. Online access: https://ieeexplore.ieee.org/document/10399482/. Keywords: Physics-informed neural network; partial differential equations; multi-input residual network; convergence speed; unsupervised learning. |
spellingShingle | Liangliang Yan You Zhou Huan Liu Lingqi Liu An Improved Method for Physics-Informed Neural Networks That Accelerates Convergence IEEE Access Physics-informed neural network partial differential equations multi-input residual network convergence speed unsupervised learning |
title | An Improved Method for Physics-Informed Neural Networks That Accelerates Convergence |
title_full | An Improved Method for Physics-Informed Neural Networks That Accelerates Convergence |
title_fullStr | An Improved Method for Physics-Informed Neural Networks That Accelerates Convergence |
title_full_unstemmed | An Improved Method for Physics-Informed Neural Networks That Accelerates Convergence |
title_short | An Improved Method for Physics-Informed Neural Networks That Accelerates Convergence |
title_sort | improved method for physics informed neural networks that accelerates convergence |
topic | Physics-informed neural network partial differential equations multi-input residual network convergence speed unsupervised learning |
url | https://ieeexplore.ieee.org/document/10399482/ |
work_keys_str_mv | AT liangliangyan animprovedmethodforphysicsinformedneuralnetworksthatacceleratesconvergence AT youzhou animprovedmethodforphysicsinformedneuralnetworksthatacceleratesconvergence AT huanliu animprovedmethodforphysicsinformedneuralnetworksthatacceleratesconvergence AT lingqiliu animprovedmethodforphysicsinformedneuralnetworksthatacceleratesconvergence AT liangliangyan improvedmethodforphysicsinformedneuralnetworksthatacceleratesconvergence AT youzhou improvedmethodforphysicsinformedneuralnetworksthatacceleratesconvergence AT huanliu improvedmethodforphysicsinformedneuralnetworksthatacceleratesconvergence AT lingqiliu improvedmethodforphysicsinformedneuralnetworksthatacceleratesconvergence |