Using neural networks for approximating functions and equations

In this report, we develop approximation rates of ReLU neural networks for solutions to elliptic two-scale problems, stochastic parabolic initial boundary value problems, and parametric elliptic problems. We obtain bounds on the network complexity, in terms of the depth and the number of non-zero weights, of the ReLU neural network approximations to the problem solutions. In Chapter 2, we review recent results in neural network approximation theory and the operations used to construct neural networks. In Chapter 3, we employ the sparse tensor product interpolation method to construct ReLU neural networks approximating solutions of the two-scale homogenized elliptic equations, with essentially optimal network size for a prescribed accuracy. Numerical experiments illustrate the theoretical results for the elliptic problems of Chapters 2 and 3. In Chapter 4, we assume that the random coefficients of the stochastic parabolic problem admit an infinite affine representation, which reduces the problem to an infinite-dimensional parametric problem. We express the parametric solution as a Taylor generalized polynomial chaos (gpc) expansion and perform an adaptive discretization of both the spatio-temporal and the parameter domains. Using this optimized discretization, we show that for a prescribed accuracy there is a ReLU neural network approximating the parametric solution with essentially optimal network complexity. Lastly, in Chapter 5, we consider parametric elliptic problems in which the random coefficients depend on the parameters in a Lipschitz manner (a weaker assumption than in Chapter 4). We employ the hierarchical finite element method to construct ReLU neural networks approximating the solutions of the parametric problem. Our work illustrates the expressive power of deep neural networks for approximating functions and solutions of PDE problems.
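Throughout, network complexity is measured by the depth and the number of non-zero weights. As a rough orientation only, results of the kind described above typically take the following schematic form; the norm, the constant C, the exponents r and s, and the symbols L(Φ), M(Φ), R(Φ) are placeholder notation and are not taken from the report:

% Schematic form of a ReLU approximation result; all symbols are placeholders,
% not the report's actual statements or bounds.
For a target solution $u$ and every accuracy $\varepsilon \in (0,1)$, there exists a ReLU network $\Phi_\varepsilon$ realizing a function $R(\Phi_\varepsilon)$ such that
\[
  \|u - R(\Phi_\varepsilon)\| \le \varepsilon,
  \qquad
  L(\Phi_\varepsilon) \le C \log(1/\varepsilon),
  \qquad
  M(\Phi_\varepsilon) \le C\, \varepsilon^{-r} \bigl(\log(1/\varepsilon)\bigr)^{s},
\]
where $L(\Phi)$ denotes the depth, $M(\Phi)$ the number of non-zero weights, and the norm, the constant $C > 0$, and the exponents $r, s \ge 0$ depend on the problem class.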


Bibliographic Details
Main Author: Li, Yongming
Other Authors: Hoang Viet Ha
Format: Final Year Project (FYP)
Language: English
Published: Nanyang Technological University, 2019
School: School of Physical and Mathematical Sciences
Degree: Bachelor of Science in Mathematical Sciences
Subjects: Science::Mathematics::Analysis; Science::Mathematics::Applied mathematics::Numerical analysis
Online Access: https://hdl.handle.net/10356/136490