Volume 39, Issue 3
Two Novel Gradient Methods with Optimal Step Sizes

Harry Oviedo, Oscar Dalmau & Rafael Herrera

J. Comp. Math., 39 (2021), pp. 375-391.

Published online: 2021-04

  • Abstract

In this work we introduce two new Barzilai-Borwein-like step sizes for the classical gradient method applied to strictly convex quadratic optimization problems. The proposed step sizes employ second-order information to obtain faster gradient-type methods. Both step sizes are derived from two unconstrained optimization models that involve approximate information about the Hessian of the objective function. A convergence analysis of the proposed algorithm is provided. Numerical experiments are performed to compare the efficiency and effectiveness of the proposed methods with similar methods in the literature. Experimentally, we observe that our proposals accelerate the gradient method at nearly no extra computational cost, which makes them a good alternative for solving large-scale problems.
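To make the setting concrete, the following sketch shows the *classical* Barzilai-Borwein gradient method on a strictly convex quadratic — the baseline this paper builds on, not the paper's novel step sizes, which require the article itself. All names and parameter choices here (the initial step, the test matrix) are illustrative assumptions.

```python
import numpy as np

def bb_gradient_method(A, b, x0, tol=1e-8, max_iter=1000):
    """Gradient method with the classical BB1 step size for
    f(x) = 0.5 x'Ax - b'x, where A is symmetric positive definite,
    so the gradient is g(x) = Ax - b. Illustrative sketch only."""
    x = x0.copy()
    g = A @ x - b
    # Conservative first step (illustrative choice): 1 / ||A||_2.
    alpha = 1.0 / np.linalg.norm(A, 2)
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s = x_new - x          # iterate difference
        y = g_new - g          # gradient difference (= A s here)
        # BB1 step: alpha = s's / s'y; s'y > 0 since A is SPD.
        alpha = (s @ s) / (s @ y)
        x, g = x_new, g_new
    return x, k

# Usage on a random SPD system (hypothetical test problem).
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50 * np.eye(50)   # SPD Hessian -> strictly convex quadratic
b = rng.standard_normal(50)
x_star, iters = bb_gradient_method(A, b, np.zeros(50))
print(iters, np.linalg.norm(A @ x_star - b))  # residual should be tiny
```

The appeal of BB-type steps, which this paper's proposals share, is that each iteration still costs only one matrix-vector product: the curvature information enters solely through the differences `s` and `y`.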

  • AMS Subject Headings

90C20, 90C25, 90C52, 65F10.

  • Copyright

COPYRIGHT: © Global Science Press

  • Email address

harry.oviedo@cimat.mx (Harry Oviedo)

dalmau@cimat.mx (Oscar Dalmau)

rherrera@cimat.mx (Rafael Herrera)

  • BibTex
  • RIS
  • TXT
@Article{JCM-39-375,
  author  = {Oviedo, Harry and Dalmau, Oscar and Herrera, Rafael},
  title   = {Two Novel Gradient Methods with Optimal Step Sizes},
  journal = {Journal of Computational Mathematics},
  year    = {2021},
  volume  = {39},
  number  = {3},
  pages   = {375--391},
  issn    = {1991-7139},
  doi     = {10.4208/jcm.2001-m2018-0205},
  url     = {http://global-sci.org/intro/article_detail/jcm/18746.html}
}
TY  - JOUR
T1  - Two Novel Gradient Methods with Optimal Step Sizes
AU  - Oviedo, Harry
AU  - Dalmau, Oscar
AU  - Herrera, Rafael
JO  - Journal of Computational Mathematics
VL  - 39
IS  - 3
SP  - 375
EP  - 391
PY  - 2021
DA  - 2021/04
SN  - 1991-7139
DO  - 10.4208/jcm.2001-m2018-0205
UR  - https://global-sci.org/intro/article_detail/jcm/18746.html
KW  - Gradient methods
KW  - Convex quadratic optimization
KW  - Hessian spectral properties
KW  - Steplength selection
ER  -
Oviedo, Harry, Dalmau, Oscar and Herrera, Rafael. (2021). Two Novel Gradient Methods with Optimal Step Sizes. Journal of Computational Mathematics. 39 (3). 375-391. doi:10.4208/jcm.2001-m2018-0205