Volume 38, Issue 3
ReLU Deep Neural Networks and Linear Finite Elements

Juncai He, Lin Li, Jinchao Xu & Chunyue Zheng

J. Comp. Math., 38 (2020), pp. 502-527.

Published online: 2020-03

  • Abstract

In this paper, we investigate the relationship between deep neural networks (DNNs) with the rectified linear unit (ReLU) activation function and continuous piecewise linear (CPWL) functions, especially CPWL functions arising from the simplicial linear finite element method (FEM). We first consider the special case of FEM. By exploring the DNN representation of its nodal basis functions, we present a ReLU DNN representation of CPWL functions in FEM. We theoretically establish that at least $2$ hidden layers are needed in a ReLU DNN to represent any linear finite element function on $\Omega \subseteq \mathbb{R}^d$ when $d\ge2$. Consequently, for $d=2,3$, which are often encountered in scientific and engineering computing, a minimum of two hidden layers is necessary and sufficient for any CPWL function to be represented by a ReLU DNN. We then give a detailed account of how a general CPWL function in $\mathbb{R}^d$ can be represented by a ReLU DNN with at most $\lceil\log_2(d+1)\rceil$ hidden layers, together with an estimate of the number of neurons needed in such a representation. Furthermore, using the relationship between DNNs and FEM, we theoretically argue that a special class of DNN models with low bit-width can still be expected to have adequate representation power in applications. Finally, as a proof of concept, we present numerical results for using ReLU DNNs to solve a two-point boundary value problem, demonstrating the potential of applying DNNs to the numerical solution of partial differential equations.
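The one-dimensional case already illustrates the two constructions the abstract mentions: a linear finite element nodal (hat) basis function is an exact one-hidden-layer ReLU network, and the pairwise maximum (the building block behind depth bounds of the $\lceil\log_2(d+1)\rceil$ kind) is likewise an exact ReLU expression. A minimal numerical sketch of these standard identities (function names are illustrative, not from the paper):

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x, xi, h):
    """Hat basis function centered at node xi on a uniform mesh with
    spacing h, written as a one-hidden-layer ReLU network:
        phi(x) = (ReLU(x - (xi - h)) - 2 ReLU(x - xi) + ReLU(x - (xi + h))) / h
    It rises linearly from 0 at xi - h to 1 at xi, then falls back to 0
    at xi + h, and vanishes outside (xi - h, xi + h)."""
    return (relu(x - (xi - h)) - 2.0 * relu(x - xi) + relu(x - (xi + h))) / h

def max2(a, b):
    """Pairwise maximum as a single hidden ReLU layer:
        max(a, b) = ReLU(a - b) + ReLU(b) - ReLU(-b),
    valid for all real a, b. Maxima of d + 1 values can then be taken
    pairwise in a binary tree of depth ceil(log2(d + 1))."""
    return relu(a - b) + relu(b) - relu(-b)

x = np.linspace(-1.0, 3.0, 9)
print(hat(x, 1.0, 1.0))   # peaks at 1.0 at x = 1, zero outside (0, 2)
print(max2(0.7, -2.5))    # the larger argument, 0.7
```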


  • AMS Subject Headings

26B40, 65N30, 65N99

  • Copyright

COPYRIGHT: © Global Science Press

  • Email address

juncaihe@pku.edu.cn (Juncai He)

lilin1993@pku.edu.cn (Lin Li)

xu@math.psu.edu (Jinchao Xu)

cmz5199@psu.edu (Chunyue Zheng)

  • BibTeX
  • RIS
  • TXT
@Article{JCM-38-502,
  author  = {He, Juncai and Li, Lin and Xu, Jinchao and Zheng, Chunyue},
  title   = {ReLU Deep Neural Networks and Linear Finite Elements},
  journal = {Journal of Computational Mathematics},
  year    = {2020},
  volume  = {38},
  number  = {3},
  pages   = {502--527},
  issn    = {1991-7139},
  doi     = {10.4208/jcm.1901-m2018-0160},
  url     = {http://global-sci.org/intro/article_detail/jcm/15798.html}
}
TY  - JOUR
T1  - ReLU Deep Neural Networks and Linear Finite Elements
AU  - He, Juncai
AU  - Li, Lin
AU  - Xu, Jinchao
AU  - Zheng, Chunyue
JO  - Journal of Computational Mathematics
VL  - 38
IS  - 3
SP  - 502
EP  - 527
PY  - 2020
DA  - 2020/03
SN  - 1991-7139
DO  - 10.4208/jcm.1901-m2018-0160
UR  - https://global-sci.org/intro/article_detail/jcm/15798.html
KW  - Finite element method, deep neural network, piecewise linear function
ER  -


He, Juncai, Li, Lin, Xu, Jinchao and Zheng, Chunyue. (2020). ReLU Deep Neural Networks and Linear Finite Elements. Journal of Computational Mathematics. 38 (3). 502-527. doi:10.4208/jcm.1901-m2018-0160