Numer. Math. Theor. Meth. Appl., 17 (2024), pp. 658-696.
Published online: 2024-08
For the primal-dual monotone inclusion problem, the split-Douglas-Rachford (SDR) algorithm is a well-known first-order splitting method. As with other first-order algorithms, the efficiency of SDR is strongly influenced by its stepsize parameters, so expanding the admissible stepsize range may improve numerical performance. In this paper, we prove that the stepsize range of SDR can be expanded using a series of properties of the product Hilbert space. Moreover, we present a concise counterexample showing that the newly proposed stepsize range cannot be enlarged further. Furthermore, to close the theoretical gap in the convergence analysis of SDR, we analyze the descent of SDR's fixed-point residuals and provide the first proof of a sublinear convergence rate for these residuals. As an application, we use SDR to solve separable convex optimization problems with linear equality constraints and develop a novel preconditioned alternating direction method of multipliers (NP-ADMM) that can handle the case where the two linear operators are not identity operators. By establishing the relationships among SDR, the preconditioned primal-dual splitting (P-PDS) method, and NP-ADMM, we derive rigorous convergence conditions and convergence rates for P-PDS and NP-ADMM. Finally, we conduct four numerical experiments to verify the computational efficiency of these algorithms and demonstrate that their performance improves significantly under the expanded stepsize conditions.
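To make the splitting framework concrete, the following is a minimal sketch of the classical Douglas-Rachford iteration (the scheme that SDR generalizes to the primal-dual setting), not the paper's SDR algorithm itself. The specific choices of f, g, stepsize t, and problem data (lam, b) are illustrative assumptions: f(x) = lam*|x| has the soft-thresholding operator as its prox, and g(x) = 0.5*(x - b)^2 has a closed-form prox.

```python
# Classical Douglas-Rachford splitting for min_x f(x) + g(x).
# Illustrative instance (not the paper's SDR): f(x) = lam*|x|, g(x) = 0.5*(x - b)^2.
import numpy as np

LAM = 0.5  # weight of the |x| term (assumed for this example)
B = 2.0    # center of the quadratic term (assumed for this example)

def prox_f(v, t):
    # prox of t*lam*|.| is soft-thresholding at level t*lam
    return np.sign(v) * np.maximum(np.abs(v) - t * LAM, 0.0)

def prox_g(v, t):
    # prox of t*0.5*(. - B)^2 has the closed form (v + t*B) / (1 + t)
    return (v + t * B) / (1.0 + t)

def douglas_rachford(z0=0.0, t=1.0, iters=200):
    z = float(z0)
    for _ in range(iters):
        x = prox_f(z, t)
        y = prox_g(2.0 * x - z, t)
        z = z + y - x  # fixed-point update; |z_new - z| is the fixed-point residual
    return prox_f(z, t)  # the x-iterate converges to the minimizer

x_star = douglas_rachford()
# Minimizer of 0.5*|x| + 0.5*(x - 2)^2 is the soft-threshold of 2 at 0.5, i.e. 1.5.
```

The quantity monitored in the paper's rate analysis corresponds to the fixed-point residual of this update map, i.e. the change |z_new - z| per iteration, which decays sublinearly for such firmly nonexpansive iterations.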
issn: 2079-7338
doi: https://doi.org/10.4208/nmtma.OA-2023-0113
url: http://global-sci.org/intro/article_detail/nmtma/23370.html