Volume 3, Issue 1
On the Existence of Optimal Shallow Feedforward Networks with ReLU Activation

Steffen Dereich & Sebastian Kassing

J. Mach. Learn., 3 (2024), pp. 1-22.

Published online: 2024-03

[An open-access article; the PDF is free to any online user.]

  • Abstract

We prove the existence of global minima in the loss landscape for the approximation of continuous target functions using shallow feedforward artificial neural networks with ReLU activation. This property is one of the fundamental features separating ReLU from other commonly used activation functions. We propose a kind of closure of the search space so that minimizers exist in the extended space. In a second step, we show under mild assumptions that the newly added functions in the extension perform worse than appropriate representable ReLU networks. This then implies that the optimal response in the extended target space is indeed the response of a ReLU network.
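
To fix ideas, here is a minimal sketch, in notation chosen for illustration here (it is not taken from the paper), of the kind of optimization problem the abstract refers to: a shallow feedforward ReLU network with $n$ hidden neurons and, for instance, a squared-error loss whose global minimizers are in question.

\[
  \mathcal{N}_\theta(x) \;=\; c + \sum_{i=1}^{n} w_i \,\max\bigl(a_i^{\top} x + b_i,\, 0\bigr),
  \qquad \theta = (a_1, b_1, w_1, \dots, a_n, b_n, w_n, c),
\]
\[
  \mathcal{L}(\theta) \;=\; \int_{K} \bigl(f(x) - \mathcal{N}_\theta(x)\bigr)^{2} \, \mu(\mathrm{d}x),
\]

where $f$ is a continuous target function on a compact set $K$ and $\mu$ is a finite measure. Since the set of functions realizable by such networks need not be closed, the infimum of $\mathcal{L}$ may a priori fail to be attained; the closure of the search space mentioned in the abstract addresses exactly this gap.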

  • Keywords

Neural Networks, Shallow Networks, Best Approximation, ReLU Activation, Approximatively Compact.

  • Copyright

COPYRIGHT: © Global Science Press

Dereich, Steffen and Kassing, Sebastian. (2024). On the Existence of Optimal Shallow Feedforward Networks with ReLU Activation. Journal of Machine Learning. 3 (1). 1-22. doi:10.4208/jml.230903