A stochastic approximation (SA) algorithm with new adaptive step sizes for solving unconstrained minimization problems in a noisy environment is proposed. The new adaptive step size scheme uses order statistics of a fixed number of previous noisy function values as a criterion for accepting good steps and rejecting bad ones. The scheme allows the algorithm to take larger steps and avoid steps proportional to $1/k$ when larger steps are expected to improve performance. An algorithm with the new adaptive scheme is defined for a general descent direction, and its almost sure convergence is established. The performance of the new algorithm is tested on a set of standard test problems and compared with relevant algorithms. Numerical results support the theoretical expectations and verify the efficiency of the algorithm regardless of the chosen search direction and noise level. Numerical results on problems arising in machine learning are also presented: a linear regression problem is considered using a real data set. The results suggest that the proposed algorithm shows promise.
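The abstract describes the acceptance rule only at a high level, so the following is a minimal sketch of one way such an order-statistics-based step-size scheme could be organized, not the paper's actual algorithm. The oracle names, the window size, the "compare against the worst value in the window" rule, and the $a_0/k^{\gamma}$ fallback schedule are all assumptions introduced for illustration.

```python
import numpy as np

def sa_adaptive_step(grad_oracle, func_oracle, x0, n_iter=1000,
                     window=5, a0=1.0, gamma=0.602, seed=0):
    """Illustrative sketch of an SA iteration with an adaptive step size
    driven by order statistics of recent noisy function values.

    grad_oracle(x, rng) -> noisy gradient estimate at x (hypothetical interface)
    func_oracle(x, rng) -> noisy function value at x (hypothetical interface)
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    recent = [func_oracle(x, rng)]      # window of recent noisy f-values
    step = a0                           # current (possibly large) step size
    for k in range(1, n_iter + 1):
        d = -grad_oracle(x, rng)        # general descent direction; here: negative noisy gradient
        x_trial = x + step * d
        f_trial = func_oracle(x_trial, rng)
        # Accept the trial point if its noisy value does not exceed the worst
        # (maximum) of the last `window` noisy values; otherwise shrink the
        # step toward a classical 1/k-type schedule and take that step instead.
        if f_trial <= max(recent):
            x = x_trial
        else:
            step = a0 / k**gamma        # fall back to the standard SA schedule
            x = x + step * d
            f_trial = func_oracle(x, rng)
        recent.append(f_trial)
        recent = recent[-window:]       # keep only the fixed-size window
    return x
```

The comparison against the maximum of the window is only one choice of order statistic; a scheme of this kind could equally use a different quantile of the stored noisy values as the acceptance threshold.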