Volume 20, Issue 6
A Note on the Nonlinear Conjugate Gradient Method

Yu-Hong Dai & Ya-Xiang Yuan

J. Comp. Math., 20 (2002), pp. 575-582.

Published online: 2002-12

  • Abstract

The nonlinear conjugate gradient method for unconstrained optimization varies with the choice of a scalar parameter. In this note, a general condition on this scalar is given that ensures the global convergence of the method under strong Wolfe line searches. It is also shown how the result can be used to obtain the convergence of the well-known Fletcher-Reeves and Polak-Ribière-Polyak conjugate gradient methods, and it is noted that the condition cannot, in a certain sense, be relaxed.
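The method the abstract refers to iterates $x_{k+1} = x_k + \alpha_k d_k$ with $d_{k+1} = -g_{k+1} + \beta_k d_k$, where the scalar $\beta_k$ distinguishes the variants. A minimal Python sketch follows; it is an illustration only, assuming a simple quadratic test function and an Armijo backtracking search in place of the strong Wolfe search analyzed in the note, with a steepest-descent restart added as a safeguard:

```python
# Hypothetical test function f(x, y) = 2x^2 + y^2 and its gradient
# (not from the paper; chosen only so the sketch is runnable).
def f(x):
    return 2.0 * x[0] ** 2 + x[1] ** 2

def grad(x):
    return [4.0 * x[0], 2.0 * x[1]]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def nonlinear_cg(x, beta_rule="FR", iters=50):
    g = grad(x)
    d = [-gi for gi in g]  # first direction: steepest descent
    for _ in range(iters):
        # Backtracking line search (Armijo condition only, for brevity;
        # the paper's analysis assumes a strong Wolfe line search).
        alpha = 1.0
        while f([xi + alpha * di for xi, di in zip(x, d)]) > \
                f(x) + 1e-4 * alpha * dot(g, d):
            alpha *= 0.5
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        if dot(g_new, g_new) < 1e-20:  # gradient vanished: converged
            break
        if beta_rule == "FR":   # Fletcher-Reeves scalar
            beta = dot(g_new, g_new) / dot(g, g)
        else:                   # Polak-Ribiere-Polyak scalar
            beta = dot(g_new, [a - b for a, b in zip(g_new, g)]) / dot(g, g)
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        if dot(g_new, d) >= 0.0:  # not a descent direction: restart
            d = [-gi for gi in g_new]
        g = g_new
    return x

x_fr = nonlinear_cg([3.0, -2.0], beta_rule="FR")
x_prp = nonlinear_cg([3.0, -2.0], beta_rule="PRP")
```

Both variants drive the iterate toward the minimizer at the origin on this problem; the two `beta` formulas are the only difference between them, which is exactly the scalar the note's convergence condition constrains.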

  • AMS Subject Headings

  • Copyright

COPYRIGHT: © Global Science Press

  • BibTex
  • RIS
  • TXT
@Article{JCM-20-575,
  author   = {Dai, Yu-Hong and Yuan, Ya-Xiang},
  title    = {A Note on the Nonlinear Conjugate Gradient Method},
  journal  = {Journal of Computational Mathematics},
  year     = {2002},
  volume   = {20},
  number   = {6},
  pages    = {575--582},
  abstract = {The nonlinear conjugate gradient method for unconstrained optimization varies with the choice of a scalar parameter. In this note, a general condition on this scalar is given that ensures the global convergence of the method under strong Wolfe line searches. It is also shown how the result can be used to obtain the convergence of the well-known Fletcher-Reeves and Polak-Ribière-Polyak conjugate gradient methods, and it is noted that the condition cannot, in a certain sense, be relaxed.},
  issn     = {1991-7139},
  url      = {http://global-sci.org/intro/article_detail/jcm/8942.html}
}
TY  - JOUR
T1  - A Note on the Nonlinear Conjugate Gradient Method
AU  - Dai, Yu-Hong
AU  - Yuan, Ya-Xiang
JO  - Journal of Computational Mathematics
VL  - 20
IS  - 6
SP  - 575
EP  - 582
PY  - 2002
DA  - 2002/12
SN  - 1991-7139
UR  - https://global-sci.org/intro/article_detail/jcm/8942.html
KW  - Unconstrained optimization
KW  - Conjugate gradient
KW  - Line search
KW  - Global convergence
AB  - The nonlinear conjugate gradient method for unconstrained optimization varies with the choice of a scalar parameter. In this note, a general condition on this scalar is given that ensures the global convergence of the method under strong Wolfe line searches. It is also shown how the result can be used to obtain the convergence of the well-known Fletcher-Reeves and Polak-Ribière-Polyak conjugate gradient methods, and it is noted that the condition cannot, in a certain sense, be relaxed.
ER  - 

Dai, Yu-Hong and Yuan, Ya-Xiang. (2002). A Note on the Nonlinear Conjugate Gradient Method. Journal of Computational Mathematics. 20 (6). 575-582.