Anal. Theory Appl., 36 (2020), pp. 262-282.
Published online: 2020-09
[An open-access article; the PDF is free to any online user.]
The convenience of converting linearly constrained convex optimization problems into monotone variational inequalities is well recognized. Recently, we proposed a unified algorithmic framework that guides the construction of solution methods for such monotone variational inequalities. In this work, we revisit two full Jacobian decompositions of the augmented Lagrangian method for separable convex programming that we studied a few years ago. In particular, exploiting this framework, we give a clear and elementary proof of the convergence of these methods.
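As a sketch of the reformulation mentioned above (the symbols $\theta$, $\mathcal{X}$, $A$, $b$, and $\lambda$ are generic placeholders, not necessarily the notation of the paper): a linearly constrained convex problem $\min\{\theta(x) \mid Ax = b,\ x \in \mathcal{X}\}$ can be written as the variational inequality of finding $w^* = (x^*, \lambda^*) \in \Omega := \mathcal{X} \times \mathbb{R}^m$ such that
\[
\theta(x) - \theta(x^*) + (w - w^*)^{\top} F(w^*) \ge 0 \quad \forall\, w \in \Omega,
\qquad
w = \begin{pmatrix} x \\ \lambda \end{pmatrix},
\quad
F(w) = \begin{pmatrix} -A^{\top}\lambda \\ Ax - b \end{pmatrix},
\]
where $\lambda$ is the Lagrange multiplier. The mapping $F$ is monotone because it is affine with a skew-symmetric coefficient matrix, so $(w_1 - w_2)^{\top}\big(F(w_1) - F(w_2)\big) = 0$ for all $w_1, w_2$.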