Volume 31, Issue 4
A Rate of Convergence of Physics Informed Neural Networks for the Linear Second Order Elliptic PDEs

Yuling Jiao, Yanming Lai, Dingwei Li, Xiliang Lu, Fengru Wang, Yang Wang & Jerry Zhijian Yang

Commun. Comput. Phys., 31 (2022), pp. 1272-1295.

Published online: 2022-03

  • Abstract

In recent years, physics-informed neural networks (PINNs) have empirically been shown to be a powerful tool for solving PDEs. However, a numerical analysis of PINNs is still largely missing. In this paper, we prove a convergence rate for PINNs applied to second order elliptic equations with Dirichlet boundary conditions, by establishing upper bounds on the number of training samples and on the depth and width of the deep neural networks needed to achieve a desired accuracy. The error of PINNs is decomposed into an approximation error and a statistical error: the approximation error is bounded in the $C^2$ norm for ReLU$^3$ networks (deep networks with activation function $\max\{0,x^3\}$), and the statistical error is estimated via Rademacher complexity. In particular, we derive a bound on the Rademacher complexity of the non-Lipschitz composition of the gradient norm with a ReLU$^3$ network, which is of independent interest.
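
For orientation, the following sketch (in our own notation, and for a simple model problem rather than the general linear elliptic operator treated in the paper) illustrates the PINN least-squares loss for a Dirichlet problem and the error decomposition referred to above:
$$-\Delta u + w\,u = f \ \text{ in } \Omega, \qquad u = g \ \text{ on } \partial\Omega,$$
$$\mathcal{L}(u_\theta) = \big\|{-\Delta} u_\theta + w\,u_\theta - f\big\|^2_{L^2(\Omega)} + \big\|u_\theta - g\big\|^2_{L^2(\partial\Omega)},$$
with an empirical counterpart obtained by Monte Carlo sampling of $N$ interior points $X_i$ and $M$ boundary points $Y_j$,
$$\widehat{\mathcal{L}}(u_\theta) = \frac{|\Omega|}{N}\sum_{i=1}^{N}\big({-\Delta} u_\theta(X_i) + w(X_i)\,u_\theta(X_i) - f(X_i)\big)^2 + \frac{|\partial\Omega|}{M}\sum_{j=1}^{M}\big(u_\theta(Y_j) - g(Y_j)\big)^2.$$
Schematically, the error of the empirical minimizer then splits as
$$\underbrace{\inf_{u_\theta\in\mathcal{F}}\|u_\theta - u^*\|_{C^2}}_{\text{approximation error (ReLU}^3\text{ networks)}} \;+\; \underbrace{\sup_{u_\theta\in\mathcal{F}}\big|\mathcal{L}(u_\theta) - \widehat{\mathcal{L}}(u_\theta)\big|}_{\text{statistical error (Rademacher complexity)}},$$
where $\mathcal{F}$ denotes the ReLU$^3$ network class and $u^*$ the exact solution; the paper bounds both terms in terms of the sample sizes and the depth and width of the network.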

  • AMS Subject Headings

62G05, 65N12, 65N15, 68T07

  • Copyright

© Global Science Press

  • BibTeX
  • RIS
  • TXT
@Article{CiCP-31-1272,
  author  = {Jiao, Yuling and Lai, Yanming and Li, Dingwei and Lu, Xiliang and Wang, Fengru and Wang, Yang and Yang, Jerry Zhijian},
  title   = {A Rate of Convergence of Physics Informed Neural Networks for the Linear Second Order Elliptic PDEs},
  journal = {Communications in Computational Physics},
  year    = {2022},
  volume  = {31},
  number  = {4},
  pages   = {1272--1295},
  issn    = {1991-7120},
  doi     = {10.4208/cicp.OA-2021-0186},
  url     = {http://global-sci.org/intro/article_detail/cicp/20384.html}
}
TY  - JOUR
T1  - A Rate of Convergence of Physics Informed Neural Networks for the Linear Second Order Elliptic PDEs
AU  - Jiao, Yuling
AU  - Lai, Yanming
AU  - Li, Dingwei
AU  - Lu, Xiliang
AU  - Wang, Fengru
AU  - Wang, Yang
AU  - Yang, Jerry Zhijian
JO  - Communications in Computational Physics
VL  - 31
IS  - 4
SP  - 1272
EP  - 1295
PY  - 2022
DA  - 2022/03
SN  - 1991-7120
DO  - 10.4208/cicp.OA-2021-0186
UR  - https://global-sci.org/intro/article_detail/cicp/20384.html
KW  - PINNs, ReLU$^3$ neural network, B-splines, Rademacher complexity
ER  -

Jiao, Yuling, Lai, Yanming, Li, Dingwei, Lu, Xiliang, Wang, Fengru, Wang, Yang and Yang, Jerry Zhijian. (2022). A Rate of Convergence of Physics Informed Neural Networks for the Linear Second Order Elliptic PDEs. Communications in Computational Physics. 31 (4). 1272-1295. doi:10.4208/cicp.OA-2021-0186