Int. J. Numer. Anal. Mod., 21 (2024), pp. 609-628.
Published online: 2024-10
The least-squares ReLU neural network (LSNN) method was introduced and studied in [4, 5] for solving the linear advection-reaction equation with discontinuous solutions. The method is based on an equivalent least-squares formulation, and [5] employs ReLU neural network (NN) functions with $\lceil \log_2(d+1) \rceil + 1$-layer representations to approximate the solution. In this paper, we show theoretically that the method is also capable of accurately approximating non-constant jumps along discontinuous interfaces that are not necessarily straight lines. The theoretical results are confirmed through multiple numerical examples with $d = 2, 3$ and various non-constant jumps and interface shapes, showing that the LSNN method with $\lceil \log_2(d+1) \rceil + 1$ layers approximates the solution accurately with fewer degrees of freedom than mesh-based methods and without the Gibbs phenomenon along discontinuous interfaces with non-constant jumps.
DOI: https://doi.org/10.4208/ijnam2024-1024
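The abstract does not reproduce the discrete functional of [4, 5]. As a rough, non-authoritative illustration of the general idea only, the sketch below minimizes a generic least-squares loss of the form $\|\boldsymbol{\beta}\cdot\nabla u_{NN} + \gamma u_{NN} - f\|^2_{0,\Omega} + \|u_{NN} - g\|^2_{0,\Gamma_-}$ over a 3-layer ReLU network for $d = 2$ (so that $\lceil \log_2(d+1) \rceil + 1 = 3$). Everything concrete here is an assumption: the PyTorch framework, the choices of $\boldsymbol{\beta}$, $\gamma$, $f$, $g$, the Monte Carlo sampling of collocation points, the equal penalty weighting, and the Adam optimizer; the actual discretization, quadrature, and training procedure of the LSNN method in [4, 5] may differ.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical advection-reaction problem on (0,1)^2 (all data below are assumptions,
# not the test problems of the paper):  beta . grad(u) + gamma * u = f in Omega,
# u = g on the inflow boundary Gamma_- = {x1 = 0} U {x2 = 0} for beta = (1, 1).
beta = torch.tensor([1.0, 1.0])
gamma = 1.0

def f_rhs(x):
    # assumed right-hand side
    return torch.zeros(x.shape[0])

def g_inflow(x):
    # assumed inflow data with a non-constant jump across x2 = 0.5
    return torch.where(x[:, 1] > 0.5, 1.0 + x[:, 1], torch.zeros_like(x[:, 1]))

# A 3-layer ReLU network (two hidden ReLU layers + linear output) for d = 2,
# matching ceil(log2(d + 1)) + 1 = 3; the width 32 is an arbitrary choice.
model = nn.Sequential(
    nn.Linear(2, 32), nn.ReLU(),
    nn.Linear(32, 32), nn.ReLU(),
    nn.Linear(32, 1),
)

def ls_loss(x_int, x_bdy):
    # Discrete least-squares functional: interior PDE residual + inflow boundary penalty.
    x_int = x_int.requires_grad_(True)
    u = model(x_int).squeeze(-1)
    grad_u = torch.autograd.grad(u.sum(), x_int, create_graph=True)[0]
    residual = grad_u @ beta + gamma * u - f_rhs(x_int)
    u_bdy = model(x_bdy).squeeze(-1)
    return residual.pow(2).mean() + (u_bdy - g_inflow(x_bdy)).pow(2).mean()

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(2000):
    x_int = torch.rand(1024, 2)                                      # interior collocation points
    t = torch.rand(256, 1)
    x_bdy = torch.cat([torch.cat([torch.zeros_like(t), t], dim=1),   # edge x1 = 0
                       torch.cat([t, torch.zeros_like(t)], dim=1)])  # edge x2 = 0
    optimizer.zero_grad()
    loss = ls_loss(x_int, x_bdy)
    loss.backward()
    optimizer.step()
```

In this sketch the boundary term carries the same weight as the interior residual and the collocation points are resampled uniformly at each step; the LSNN papers use a properly weighted inflow-boundary norm and their own discretization, so the loss above should be read only as a schematic of the least-squares idea, not as the authors' scheme.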