Numer. Math. Theor. Meth. Appl., 13 (2020), pp. 200-219.
Published online: 2019-12
In recent years, the alternating direction method of multipliers (ADMM) and its variants have become popular owing to their extensive use in image processing and statistical learning. One variant, the symmetric ADMM, which updates the Lagrange multiplier twice in each iteration, is typically faster whenever it converges. In this paper, we combine the symmetric ADMM with Nesterov's acceleration strategy and propose an accelerated symmetric ADMM. We prove its $\mathcal{O}(\frac{1}{k^2})$ convergence rate under a strong convexity assumption. For the general case, we propose an accelerated method with a restart rule. Preliminary numerical experiments demonstrate the efficiency of the proposed algorithms.
DOI: https://doi.org/10.4208/nmtma.OA-2018-0108
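To make the ingredients named in the abstract concrete, the following is a minimal sketch of a symmetric ADMM (two multiplier updates per iteration) with Nesterov-style extrapolation and a residual-based restart, applied to a toy lasso problem. The problem instance, the step factors `r` and `s`, the penalty `beta`, and the restart criterion are illustrative assumptions, not the authors' exact scheme.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding: prox of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def accelerated_symmetric_admm(D, b, tau, beta=1.0, r=0.9, s=0.9,
                               max_iter=500, tol=1e-8):
    """Toy lasso  min 0.5||Dx - b||^2 + tau*||z||_1  s.t.  x - z = 0,
    solved by symmetric ADMM (multiplier updated after both the x- and
    z-steps) with Nesterov-style extrapolation and a restart rule.
    Illustrative sketch only; parameters are assumptions."""
    n = D.shape[1]
    x = np.zeros(n); z = np.zeros(n); lam = np.zeros(n)
    z_hat, lam_hat = z.copy(), lam.copy()      # extrapolated points
    t = 1.0                                    # momentum parameter
    res_prev = np.inf
    M = D.T @ D + beta * np.eye(n)             # x-subproblem matrix
    Dtb = D.T @ b
    for k in range(max_iter):
        # x-update at the extrapolated point (z_hat, lam_hat)
        x = np.linalg.solve(M, Dtb - lam_hat + beta * z_hat)
        # first multiplier update, step factor r
        lam_half = lam_hat + r * beta * (x - z_hat)
        # z-update: prox of tau*||.||_1
        z_new = soft_threshold(x + lam_half / beta, tau / beta)
        # second multiplier update, step factor s
        lam_new = lam_half + s * beta * (x - z_new)
        # restart rule: if the primal residual grows, reset the momentum
        res = np.linalg.norm(x - z_new)
        if res > res_prev:
            t, z_hat, lam_hat = 1.0, z_new.copy(), lam_new.copy()
        else:
            t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            w = (t - 1.0) / t_new
            z_hat = z_new + w * (z_new - z)
            lam_hat = lam_new + w * (lam_new - lam)
            t = t_new
        z, lam, res_prev = z_new, lam_new, res
        if res < tol:
            break
    return x, z

# Small usage example with random data
rng = np.random.default_rng(0)
D = rng.standard_normal((40, 20))
b = rng.standard_normal(40)
x, z = accelerated_symmetric_admm(D, b, tau=0.5)
print("primal residual:", np.linalg.norm(x - z))
```

In this sketch the x-subproblem is a linear solve, the z-subproblem is a soft-thresholding step, and the momentum is reset to one whenever the primal residual increases, which is one common way to realize a restart rule for Nesterov-type acceleration.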