Volume 36, Issue 1
AutoAMG($θ$): An Auto-Tuned AMG Method Based on Deep Learning for Strong Threshold

Haifeng Zou, Xiaowen Xu, Chen-Song Zhang & Zeyao Mo

Commun. Comput. Phys., 36 (2024), pp. 200-220.

Published online: 2024-07

  • Abstract

Algebraic Multigrid (AMG) is one of the most widely used iterative algorithms for solving large sparse systems of linear equations $Ax=b$. In AMG, the coarse grid is a key component that affects the efficiency of the algorithm, and its construction relies on the strong threshold parameter $θ$. This parameter is generally chosen empirically; in many current AMG solvers the default value is 0.25 for 2D problems and 0.5 for 3D problems. However, for many practical problems the quality of the coarse grid and the efficiency of the AMG algorithm are sensitive to $θ$: the default value is rarely optimal and is sometimes far from it. How to choose a better $θ$ is therefore an important question. In this paper, we propose a deep learning-based auto-tuning method, AutoAMG($θ$), for multiscale sparse linear equations, which are common in practical problems. The method uses a Graph Neural Network (GNN) to extract matrix features and a Multilayer Perceptron (MLP) to learn the mapping from those features to the optimal $θ$, so that $θ$ can be predicted adaptively for different matrices. Numerical experiments show that AutoAMG($θ$) achieves significant speedups compared with the default $θ$ value.
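To make the pipeline described in the abstract concrete, the following is a minimal sketch (not the authors' implementation) of an AutoAMG($θ$)-style predictor in PyTorch: the sparse matrix is viewed as a graph, a small message-passing GNN extracts features, and an MLP maps the pooled graph feature to a value of $θ$ in $(0,1)$. The specific node features, layer sizes, and mean pooling used here are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of a GNN + MLP strong-threshold predictor (illustrative only;
# node features, layer sizes, and pooling are assumptions, not the paper's design).
import numpy as np
import scipy.sparse as sp
import torch
import torch.nn as nn


def matrix_to_graph(A):
    """Treat the sparse matrix A as a graph: one node per row/column and one
    edge per nonzero entry; build simple per-node input features."""
    A = A.tocsr()
    n = A.shape[0]
    diag = A.diagonal()
    row_abs_sum = np.asarray(abs(A).sum(axis=1)).ravel()
    nnz_per_row = np.diff(A.indptr)
    # Assumed node features: diagonal entry, off-diagonal mass, row sparsity.
    x = np.stack([diag, row_abs_sum - np.abs(diag), nnz_per_row], axis=1)
    coo = A.tocoo()
    edge_index = torch.tensor(np.vstack([coo.row, coo.col]), dtype=torch.long)
    return torch.tensor(x, dtype=torch.float32), edge_index, n


class GraphConv(nn.Module):
    """Plain mean-aggregation message passing over the matrix graph."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(2 * in_dim, out_dim)

    def forward(self, x, edge_index, n):
        src, dst = edge_index
        agg = torch.zeros_like(x)
        agg.index_add_(0, dst, x[src])                        # sum neighbor features
        deg = torch.zeros(n).index_add_(0, dst, torch.ones(dst.shape[0]))
        agg = agg / deg.clamp(min=1).unsqueeze(1)             # mean over neighbors
        return torch.relu(self.lin(torch.cat([x, agg], dim=1)))


class AutoAMGTheta(nn.Module):
    """GNN feature extractor followed by an MLP that predicts θ in (0, 1)."""
    def __init__(self, in_dim=3, hidden=32):
        super().__init__()
        self.conv1 = GraphConv(in_dim, hidden)
        self.conv2 = GraphConv(hidden, hidden)
        self.mlp = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 1))

    def forward(self, x, edge_index, n):
        h = self.conv1(x, edge_index, n)
        h = self.conv2(h, edge_index, n)
        g = h.mean(dim=0)                                     # graph-level pooling
        return torch.sigmoid(self.mlp(g))                     # predicted θ


if __name__ == "__main__":
    A = (sp.random(200, 200, density=0.05, format="csr") + sp.eye(200)).tocsr()
    x, edge_index, n = matrix_to_graph(A)
    theta = AutoAMGTheta()(x, edge_index, n)
    print(f"predicted strong threshold θ ≈ {theta.item():.3f}")
```

In practice, a model of this kind would be trained on pairs of matrices and their best-performing $θ$ values (found by sweeping $θ$ on a training set of multiscale problems); the sketch above shows only the forward prediction path.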

  • AMS Subject Headings

65F08, 65F10, 65N55, 68T05

  • Copyright

© Global Science Press

  • Keywords

AMG, strong threshold, graph neural network, auto-tuning, multiscale matrix

  • Cite this article

Zou, Haifeng, Xu, Xiaowen, Zhang, Chen-Song and Mo, Zeyao (2024). AutoAMG($θ$): An Auto-Tuned AMG Method Based on Deep Learning for Strong Threshold. Communications in Computational Physics, 36 (1), 200-220. doi:10.4208/cicp.OA-2023-0072