Volume 17, Issue 3
Convergence Analysis of Split-Douglas-Rachford Algorithm and a Novel Preconditioned ADMM with an Improved Condition

Maoran Wang, Xingju Cai & Yongxin Chen

Numer. Math. Theor. Meth. Appl., 17 (2024), pp. 658-696.

Published online: 2024-08

  • Abstract

For the primal-dual monotone inclusion problem, the split-Douglas-Rachford (SDR) algorithm is a well-known first-order splitting method. As with other first-order algorithms, the efficiency of SDR is strongly influenced by its stepsize parameters, so expanding the admissible range of stepsizes may lead to improved numerical performance. In this paper, we prove that the stepsize range of SDR can be expanded based on a series of properties of the product Hilbert space. Moreover, we present a concise counterexample showing that the newly proposed stepsize range cannot be enlarged further. Furthermore, to bridge the theoretical gap in the convergence rate of SDR, we analyze the descent of SDR's fixed-point residuals and provide the first proof of a sublinear convergence rate for these residuals. As an application, we use SDR to solve separable convex optimization problems with linear equality constraints and develop a novel preconditioned alternating direction method of multipliers (NP-ADMM), which can handle the case where the two linear operators are not identity operators. We establish convergence conditions and convergence rates for the preconditioned primal-dual split (P-PDS) method and NP-ADMM by exhibiting the relationship between SDR, P-PDS, and NP-ADMM. Finally, we conduct four numerical experiments to verify the computational efficiency of these algorithms and demonstrate that their performance improves significantly under the improved conditions.
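To fix ideas about the kind of method the abstract discusses, here is a minimal sketch of the classical scaled-form ADMM on a toy one-dimensional lasso problem. This is the standard textbook scheme, not the paper's NP-ADMM (which additionally handles preconditioners and non-identity linear operators); the values of `lam`, `rho`, and the iteration count are illustrative choices.

```python
# Scaled-form ADMM on a toy 1-D problem:
#   minimize f(x) + g(z)  subject to  x - z = 0,
# with f(x) = 0.5*(x - 1)^2 and g(z) = lam*|z|.
# Generic textbook ADMM, NOT the paper's NP-ADMM.

def soft_threshold(v, t):
    """Proximal operator of t*|.| (soft-thresholding)."""
    return max(v - t, 0.0) - max(-v - t, 0.0)

def admm_lasso_1d(lam=0.3, rho=1.0, iters=100):
    x = z = u = 0.0
    for _ in range(iters):
        # x-update: argmin_x 0.5*(x - 1)^2 + (rho/2)*(x - z + u)^2
        x = (1.0 + rho * (z - u)) / (1.0 + rho)
        # z-update: prox of (lam/rho)*|.| evaluated at x + u
        z = soft_threshold(x + u, lam / rho)
        # scaled dual update (running sum of constraint residuals)
        u += x - z
    return x, z

x, z = admm_lasso_1d()
# Closed-form solution of min 0.5*(x - 1)^2 + 0.3*|x| is x* = 1 - 0.3 = 0.7.
print(round(x, 6), round(z, 6))  # → 0.7 0.7
```

The constraint residual `x - z` is the quantity whose decay mirrors the fixed-point residual analyzed in the paper; monitoring it is the usual practical stopping test for ADMM-type methods.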

  • AMS Subject Headings

90C25, 90C30, 90C33, 90C47

  • Copyright

COPYRIGHT: © Global Science Press

  • BibTeX
  • RIS
  • TXT
@Article{NMTMA-17-658,
  author  = {Wang, Maoran and Cai, Xingju and Chen, Yongxin},
  title   = {Convergence Analysis of Split-Douglas-Rachford Algorithm and a Novel Preconditioned ADMM with an Improved Condition},
  journal = {Numerical Mathematics: Theory, Methods and Applications},
  year    = {2024},
  volume  = {17},
  number  = {3},
  pages   = {658--696},
  issn    = {2079-7338},
  doi     = {https://doi.org/10.4208/nmtma.OA-2023-0113},
  url     = {http://global-sci.org/intro/article_detail/nmtma/23370.html}
}
TY  - JOUR
T1  - Convergence Analysis of Split-Douglas-Rachford Algorithm and a Novel Preconditioned ADMM with an Improved Condition
AU  - Wang, Maoran
AU  - Cai, Xingju
AU  - Chen, Yongxin
JO  - Numerical Mathematics: Theory, Methods and Applications
VL  - 17
IS  - 3
SP  - 658
EP  - 696
PY  - 2024
DA  - 2024/08
SN  - 2079-7338
DO  - http://doi.org/10.4208/nmtma.OA-2023-0113
UR  - https://global-sci.org/intro/article_detail/nmtma/23370.html
KW  - Primal-dual monotone inclusion problem, split-Douglas-Rachford algorithm, preconditioned ADMM, improved convergence condition, sublinear convergence rate
ER  - 
Wang, Maoran, Cai, Xingju and Chen, Yongxin. (2024). Convergence Analysis of Split-Douglas-Rachford Algorithm and a Novel Preconditioned ADMM with an Improved Condition. Numerical Mathematics: Theory, Methods and Applications. 17 (3). 658-696. doi:10.4208/nmtma.OA-2023-0113