Commun. Comput. Phys., 28 (2020), pp. 2042-2074.
Published online: 2020-11
Physics-informed neural networks (PINNs) are deep-learning-based techniques for solving partial differential equations (PDEs) encountered in computational science and engineering. Guided by data and physical laws, PINNs find a neural network that approximates the solution to a system of PDEs. Such a network is obtained by minimizing a loss function that encodes both the data and any prior knowledge of the PDEs. Despite their remarkable empirical success on one-, two-, and three-dimensional problems, PINNs have little theoretical justification. As the amount of data grows, PINNs generate a sequence of minimizers, which correspond to a sequence of neural networks. We want to answer the question: Does the sequence of minimizers converge to the solution of the PDE? We consider two classes of PDEs: linear second-order elliptic and parabolic. By adapting the Schauder approach and the maximum principle, we show that the sequence of minimizers converges strongly to the PDE solution in $C^0$. Furthermore, we show that if each minimizer satisfies the initial/boundary conditions, the mode of convergence improves to $H^1$. Computational examples are provided to illustrate our theoretical findings. To the best of our knowledge, this is the first theoretical work showing the consistency of PINNs.
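To make the loss described in the abstract concrete, here is a minimal, hypothetical PyTorch sketch (not code from the paper) for the 1D Poisson problem $-u''(x) = \pi^2 \sin(\pi x)$ on $[0,1]$ with $u(0) = u(1) = 0$, whose exact solution is $\sin(\pi x)$. The loss is the sum of a PDE-residual term at interior collocation points and a boundary term enforcing the prescribed data; the network architecture, optimizer, and point counts are illustrative choices, not the paper's.

```python
import torch

# Small fully connected network u_theta(x) approximating the PDE solution.
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def pinn_loss(x_interior, x_boundary, u_boundary):
    # PDE residual term: enforce -u''(x) = pi^2 sin(pi x) at interior points.
    x = x_interior.requires_grad_(True)
    u = net(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    f = torch.pi**2 * torch.sin(torch.pi * x)
    residual = (-d2u - f).pow(2).mean()
    # Boundary/data term: penalize mismatch with the boundary conditions.
    boundary = (net(x_boundary) - u_boundary).pow(2).mean()
    return residual + boundary

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
x_int = torch.rand(128, 1)            # interior collocation points in (0, 1)
x_bdy = torch.tensor([[0.0], [1.0]])  # boundary points
u_bdy = torch.zeros(2, 1)             # u(0) = u(1) = 0
for step in range(5000):
    opt.zero_grad()
    loss = pinn_loss(x_int, x_bdy, u_bdy)
    loss.backward()
    opt.step()
```

In the setting analyzed in the paper, enlarging the set of collocation and data points yields the sequence of minimizers whose convergence to the PDE solution is studied.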