A new adaptive subspace minimization three-term conjugate gradient algorithm with nonmonotone line search is introduced and analyzed in this paper. The search directions are computed by minimizing a quadratic approximation of the objective function on special subspaces, and an adaptive rule is proposed for choosing among the different search directions at each iteration. We show that each choice of the search direction satisfies the sufficient descent condition. Under the nonmonotone line search, we prove that the new algorithm is globally convergent for general nonlinear functions under mild assumptions. Numerical experiments show that the proposed algorithm is promising on the given set of test problems.
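To make the overall structure concrete, the following is a minimal sketch of a generic three-term conjugate gradient iteration combined with a Zhang-Hager style nonmonotone line search. It is not the adaptive subspace minimization method analyzed in the paper: the Hestenes-Stiefel-type direction formula, the backtracking acceptance rule, and all parameter values (`delta`, `eta`, `rho`) are illustrative assumptions chosen only to show how a three-term direction and a nonmonotone acceptance test fit together.

```python
# Minimal sketch: three-term CG with a Zhang-Hager style nonmonotone line
# search. NOT the paper's adaptive subspace minimization algorithm; the
# direction formula and all parameters are illustrative assumptions.
import numpy as np

def nonmonotone_ttcg(f, grad, x0, tol=1e-6, max_iter=1000,
                     delta=1e-4, eta=0.85, rho=0.5):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                        # first direction: steepest descent
    fx = f(x)
    C, Q = fx, 1.0                # nonmonotone reference value and weight

    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break

        # Nonmonotone backtracking: accept alpha once
        # f(x + alpha d) <= C + delta * alpha * g^T d
        gd = g @ d
        alpha = 1.0
        while f(x + alpha * d) > C + delta * alpha * gd and alpha > 1e-12:
            alpha *= rho

        x_new = x + alpha * d
        g_new = grad(x_new)
        f_new = f(x_new)

        # Three-term direction (illustrative Hestenes-Stiefel-type choice):
        # d_{k+1} = -g_{k+1} + beta * d_k - theta * y_k,  y_k = g_{k+1} - g_k.
        # This choice gives d_{k+1}^T g_{k+1} = -||g_{k+1}||^2 exactly,
        # i.e. the sufficient descent condition holds by construction.
        y = g_new - g
        dy = d @ y
        if abs(dy) > 1e-12:
            beta = (g_new @ y) / dy
            theta = (g_new @ d) / dy
            d_new = -g_new + beta * d - theta * y
        else:
            d_new = -g_new        # restart with steepest descent

        # Update the nonmonotone reference value C (Zhang-Hager averaging):
        # Q_{k+1} = eta * Q_k + 1,  C_{k+1} = (eta * Q_k * C_k + f_{k+1}) / Q_{k+1}
        Q_new = eta * Q + 1.0
        C = (eta * Q * C + f_new) / Q_new
        Q = Q_new

        x, g, d, fx = x_new, g_new, d_new, f_new

    return x, fx
```

As a usage illustration, calling `nonmonotone_ttcg(lambda x: (x**2).sum(), lambda x: 2*x, np.ones(5))` drives the gradient norm below the tolerance in a few iterations; the averaging parameter `eta` controls how strongly the acceptance test may exceed the latest function value, with `eta = 0` recovering a monotone Armijo-type search.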