Volume 36, Issue 1
Convergence Analysis for Over-Parameterized Deep Learning

Yuling Jiao, Xiliang Lu, Peiying Wu & Jerry Zhijian Yang

Commun. Comput. Phys., 36 (2024), pp. 71-103.

Published online: 2024-07

  • Abstract

The success of deep learning in various applications has generated growing interest in understanding its theoretical foundations. This paper presents a theoretical framework that explains why over-parameterized neural networks can perform well. Our analysis begins from the perspective of approximation theory and argues that over-parameterized deep neural networks with bounded norms can effectively approximate the target function. We also demonstrate that the metric entropy of such networks is independent of the number of network parameters. We use these findings to derive consistency results for over-parameterized deep regression and for the deep Ritz method. Furthermore, we prove convergence rates when the target has higher regularity; to our knowledge, these are the first convergence rates for over-parameterized deep learning.
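As a point of reference for the two settings named in the abstract, the display below sketches, in standard notation, an empirical least-squares deep regression estimator and a deep Ritz estimator over a norm-constrained network class $\mathcal{F}_B$. The class $\mathcal{F}_B$, the model elliptic problem, and all symbols here are generic illustrations, not necessarily the exact formulation analyzed in the paper.

% Illustrative formulations only; \mathcal{F}_B denotes a class of deep ReLU
% networks with norm bound B, and the elliptic model problem is a generic
% choice, not necessarily the one treated in the paper.

% Over-parameterized deep regression: empirical least squares over \mathcal{F}_B,
% given samples (X_1, Y_1), ..., (X_n, Y_n):
\hat{f}_n \in \operatorname*{arg\,min}_{f \in \mathcal{F}_B} \ \frac{1}{n} \sum_{i=1}^{n} \big( f(X_i) - Y_i \big)^2 .

% Deep Ritz method for -\Delta u + u = g on \Omega (natural boundary conditions):
% minimize the Ritz energy, with the integral replaced by a Monte Carlo average
% over points X_i drawn uniformly from \Omega (constant volume factor |\Omega| omitted):
\hat{u}_n \in \operatorname*{arg\,min}_{u \in \mathcal{F}_B} \ \frac{1}{n} \sum_{i=1}^{n} \Big( \tfrac{1}{2} |\nabla u(X_i)|^2 + \tfrac{1}{2} u(X_i)^2 - g(X_i)\, u(X_i) \Big) .

In this picture, the approximation result controls how well the best element of $\mathcal{F}_B$ matches the target, while the metric-entropy bound, being independent of the parameter count, controls the gap between the empirical and population objectives uniformly over $\mathcal{F}_B$; together these ingredients yield the consistency and convergence-rate statements above.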

  • AMS Subject Headings

65M15, 65N15, 65Y20

  • Copyright

COPYRIGHT: © Global Science Press

  • BibTex

@Article{CiCP-36-71,
  author  = {Jiao, Yuling and Lu, Xiliang and Wu, Peiying and Yang, Jerry Zhijian},
  title   = {Convergence Analysis for Over-Parameterized Deep Learning},
  journal = {Communications in Computational Physics},
  year    = {2024},
  volume  = {36},
  number  = {1},
  pages   = {71--103},
  issn    = {1991-7120},
  doi     = {10.4208/cicp.OA-2023-0264},
  url     = {http://global-sci.org/intro/article_detail/cicp/23297.html}
}

  • RIS

TY - JOUR
T1 - Convergence Analysis for Over-Parameterized Deep Learning
AU - Jiao, Yuling
AU - Lu, Xiliang
AU - Wu, Peiying
AU - Yang, Jerry Zhijian
JO - Communications in Computational Physics
VL - 36
IS - 1
SP - 71
EP - 103
PY - 2024
DA - 2024/07
SN - 1991-7120
DO - 10.4208/cicp.OA-2023-0264
UR - https://global-sci.org/intro/article_detail/cicp/23297.html
KW - Over-parameterization, convergence rate, approximation, generalization
ER -

  • TXT

Jiao, Yuling, Lu, Xiliang, Wu, Peiying and Yang, Jerry Zhijian. (2024). Convergence Analysis for Over-Parameterized Deep Learning. Communications in Computational Physics. 36 (1). 71-103. doi:10.4208/cicp.OA-2023-0264