Volume 39, Issue 1
On Approximation by Neural Networks with Optimized Activation Functions and Fixed Weights

Dansheng Yu, Yunyou Qian & Fengjun Li

Anal. Theory Appl., 39 (2023), pp. 93-104.

Published online: 2023-03

  • Abstract

Recently, Li [16] introduced three kinds of single-hidden layer feed-forward neural networks (FNNs) with optimized piecewise linear activation functions and fixed weights, and obtained upper and lower bound estimates for the approximation accuracy of these FNNs for continuous functions defined on bounded intervals. In the present paper, we point out that there are errors both in the definitions of the FNNs and in the proofs of the upper estimates in [16]. Using new methods, we also establish correct approximation rate estimates for the approximation by Li's neural networks.
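
For orientation, the networks in question have a single hidden layer and fixed (non-trainable) weights. The following is a generic sketch of such an operator on $[a,b]$, with equidistant nodes $x_k$ and hat-function activations $\sigma_k$ that are illustrative assumptions here, not the exact definitions from [16]:

$$N_n(f;x)=\sum_{k=0}^{n} f(x_k)\,\sigma_k(x),\qquad x_k=a+\frac{k(b-a)}{n},$$

where each $\sigma_k$ is a fixed piecewise linear activation supported near $x_k$. For operators of this type, a Jackson-type upper bound

$$\|f-N_n(f)\|_{C[a,b]}\le C\,\omega\!\left(f,\frac{b-a}{n}\right)$$

in terms of the modulus of continuity $\omega(f,\cdot)$ is standard; the paper's keywords indicate that its corrected estimates are likewise stated in terms of moduli of continuity and smoothness.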

  • AMS Subject Headings

41A35, 41A25, 41A20

  • Copyright

© Global Science Press

  • BibTex

@Article{ATA-39-93,
  author  = {Yu, Dansheng and Qian, Yunyou and Li, Fengjun},
  title   = {On Approximation by Neural Networks with Optimized Activation Functions and Fixed Weights},
  journal = {Analysis in Theory and Applications},
  year    = {2023},
  volume  = {39},
  number  = {1},
  pages   = {93--104},
  issn    = {1573-8175},
  doi     = {10.4208/ata.OA-2021-0006},
  url     = {http://global-sci.org/intro/article_detail/ata/21464.html}
}

  • RIS

TY - JOUR
T1 - On Approximation by Neural Networks with Optimized Activation Functions and Fixed Weights
AU - Yu, Dansheng
AU - Qian, Yunyou
AU - Li, Fengjun
JO - Analysis in Theory and Applications
VL - 39
IS - 1
SP - 93
EP - 104
PY - 2023
DA - 2023/03
SN - 1573-8175
DO - 10.4208/ata.OA-2021-0006
UR - https://global-sci.org/intro/article_detail/ata/21464.html
KW - Approximation rate
KW - modulus of continuity
KW - modulus of smoothness
KW - neural network operators
ER -

  • TXT

Yu, Dansheng, Qian, Yunyou and Li, Fengjun. (2023). On Approximation by Neural Networks with Optimized Activation Functions and Fixed Weights. Analysis in Theory and Applications. 39 (1). 93-104. doi:10.4208/ata.OA-2021-0006