Volume 13, Issue 3
Approximation Analysis of Convolutional Neural Networks

Chenglong Bao, Qianxiao Li, Zuowei Shen, Cheng Tai, Lei Wu & Xueshuang Xiang

East Asian J. Appl. Math., 13 (2023), pp. 524-549.

Published online: 2023-05

[An open-access article; the PDF is free to any online user.]

  • Abstract

In its simplest form, a convolutional neural network (CNN) consists of a fully connected two-layer network $g$ composed with a sequence of convolution layers $T$. Although $g$ is known to have the universal approximation property, it is not known whether CNNs, which have the form $g \circ T$, inherit this property, especially when the kernel size in $T$ is small. In this paper, we show that under suitable conditions, CNNs do inherit the universal approximation property, and their sample complexity can be characterized. In addition, we discuss concretely how the nonlinearity of $T$ can improve the approximation power. Finally, we show that when the target function class has a certain compositional form, convolutional networks are far more advantageous than fully connected networks in terms of the number of parameters needed to achieve the desired accuracy.
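The architecture studied in the abstract can be sketched concretely. The following is a minimal NumPy illustration of the $g \circ T$ form, not the paper's exact construction: the layer count, kernel size, hidden width, and ReLU activation are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_layer(x, kernel, bias):
    """One convolution layer with a small kernel ('valid' padding) and ReLU."""
    y = np.convolve(x, kernel, mode="valid") + bias
    return np.maximum(y, 0.0)

def T(x, kernels, biases):
    """Sequence of convolution layers: T = t_L ∘ ... ∘ t_1."""
    for k, b in zip(kernels, biases):
        x = conv1d_layer(x, k, b)
    return x

def g(z, W1, b1, W2, b2):
    """Fully connected two-layer network applied to the conv features."""
    h = np.maximum(W1 @ z + b1, 0.0)
    return W2 @ h + b2

# Input of length d = 32; L = 3 convolution layers with kernel size 3
# (a "small kernel" relative to the input dimension, as in the abstract).
d, ksize, L = 32, 3, 3
kernels = [rng.standard_normal(ksize) for _ in range(L)]
biases = [rng.standard_normal() for _ in range(L)]

m = d - L * (ksize - 1)  # feature length after L 'valid' convolutions
W1, b1 = rng.standard_normal((16, m)), rng.standard_normal(16)
W2, b2 = rng.standard_normal((1, 16)), rng.standard_normal(1)

x = rng.standard_normal(d)
out = g(T(x, kernels, biases), W1, b1, W2, b2)  # the CNN g ∘ T
print(out.shape)  # scalar output, shape (1,)
```

Note how few parameters $T$ uses: each convolution layer has only `ksize + 1` parameters regardless of the input length, which is the regime in which the approximation question above is nontrivial.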

  • AMS Subject Headings

41A63, 68T01

  • Copyright

COPYRIGHT: © Global Science Press

  • BibTeX
  • RIS
  • TXT
@Article{EAJAM-13-524,
  author  = {Bao, Chenglong and Li, Qianxiao and Shen, Zuowei and Tai, Cheng and Wu, Lei and Xiang, Xueshuang},
  title   = {Approximation Analysis of Convolutional Neural Networks},
  journal = {East Asian Journal on Applied Mathematics},
  year    = {2023},
  volume  = {13},
  number  = {3},
  pages   = {524--549},
  issn    = {2079-7370},
  doi     = {10.4208/eajam.2022-270.070123},
  url     = {http://global-sci.org/intro/article_detail/eajam/21721.html}
}
TY  - JOUR
T1  - Approximation Analysis of Convolutional Neural Networks
AU  - Bao, Chenglong
AU  - Li, Qianxiao
AU  - Shen, Zuowei
AU  - Tai, Cheng
AU  - Wu, Lei
AU  - Xiang, Xueshuang
JO  - East Asian Journal on Applied Mathematics
VL  - 13
IS  - 3
SP  - 524
EP  - 549
PY  - 2023
DA  - 2023/05
SN  - 2079-7370
DO  - 10.4208/eajam.2022-270.070123
UR  - https://global-sci.org/intro/article_detail/eajam/21721.html
KW  - Convolutional networks, approximation, scaling analysis, compositional functions
ER  -

Bao, Chenglong, Li, Qianxiao, Shen, Zuowei, Tai, Cheng, Wu, Lei and Xiang, Xueshuang (2023). Approximation Analysis of Convolutional Neural Networks. East Asian Journal on Applied Mathematics. 13 (3). 524-549. doi:10.4208/eajam.2022-270.070123