Volume 3, Issue 3
Convergence of Stochastic Gradient Descent Schemes for Łojasiewicz-Landscapes

Steffen Dereich & Sebastian Kassing

J. Mach. Learn., 3 (2024), pp. 245-281.

Published online: 2024-09

[An open-access article; the PDF is free to any online user.]

Abstract

In this article, we consider the convergence of stochastic gradient descent (SGD) schemes, including momentum stochastic gradient descent (MSGD), under weak assumptions on the underlying landscape. More explicitly, we show that, on the event that SGD stays bounded, SGD converges if there are only countably many critical points or if the objective function satisfies Łojasiewicz inequalities around all critical levels, as all analytic functions do. In particular, we show that for neural networks with an analytic activation function, such as softplus, the sigmoid, or the hyperbolic tangent, SGD converges on the event of staying bounded, provided the random variables modelling the signal and response in the training are compactly supported.
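
For orientation, the following display sketches the objects referred to in the abstract; the notation is ours and not necessarily that of the paper. It shows one standard formulation of SGD with Robbins-Monro step sizes $(\gamma_n)$ and martingale noise $(D_n)$, a heavy-ball-type MSGD recursion with momentum parameter $\alpha \in [0,1)$, and a Łojasiewicz inequality for the objective $f$ around a critical point $x^*$:

\[
\text{SGD:}\qquad X_{n+1} = X_n - \gamma_{n+1}\bigl(\nabla f(X_n) + D_{n+1}\bigr),
\qquad \sum_{n} \gamma_n = \infty, \quad \sum_{n} \gamma_n^2 < \infty,
\]
\[
\text{MSGD:}\qquad V_{n+1} = \alpha V_n - \gamma_{n+1}\bigl(\nabla f(X_n) + D_{n+1}\bigr),
\qquad X_{n+1} = X_n + V_{n+1},
\]
\[
\text{Łojasiewicz inequality:}\qquad |f(x) - f(x^*)|^{\beta} \le C\,\|\nabla f(x)\|
\quad \text{for all } x \text{ with } \|x - x^*\| < \varepsilon,
\]

for some constants $C > 0$, $\varepsilon > 0$ and an exponent $\beta \in [\tfrac12, 1)$. Analytic functions satisfy such an inequality locally around every critical point, which is why networks with analytic activations (softplus, sigmoid, tanh) fall within the scope of the result.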


Copyright: © Global Science Press

Keywords: Stochastic gradient descent, stochastic approximation, Robbins-Monro, almost sure convergence, Łojasiewicz inequality.

ISSN: 2790-2048
DOI: https://doi.org/10.4208/jml.240109
URL: https://global-sci.org/intro/article_detail/jml/23416.html
Dereich, Steffen and Kassing, Sebastian (2024). Convergence of Stochastic Gradient Descent Schemes for Łojasiewicz-Landscapes. Journal of Machine Learning, 3(3), 245-281. doi:10.4208/jml.240109