In this paper, a reduced basis (RB) method is proposed for time-dependent nonlocal problems with a special parameterized fractional Laplace kernel function. Because discretized nonlocal systems lack the sparsity of their local partial differential equation (PDE) counterparts, model reduction is even more critical for nonlocal systems. A method of snapshots and greedy (MOS-greedy) algorithm for the RB method is developed for nonlocal problems with random inputs, providing an efficient and reliable approximation of the solution. A major challenge lies in the excessive influence of the time domain on the model reduction process. To address this, the Fourier transform is applied to convert the original time-dependent parabolic equation into a frequency-dependent elliptic equation in which the frequency variables are mutually independent; this enables parallel computation of the solution approximation in the frequency domain. Finally, the proposed MOS-greedy algorithm is applied to nonlocal diffusion problems. Numerical results demonstrate that it accurately approximates the full-order problems and significantly improves computational efficiency.
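To make the frequency-domain step concrete, here is a schematic derivation under an assumed model problem (the abstract does not state the paper's exact formulation, so the kernel and operator below are illustrative). Consider a nonlocal parabolic equation with a parameterized nonlocal operator $\mathcal{L}_\delta$:

$$\partial_t u(x,t) + \mathcal{L}_\delta u(x,t) = f(x,t), \qquad \mathcal{L}_\delta u(x) = \int_{B_\delta(x)} \gamma(x,y)\,\big(u(x)-u(y)\big)\,dy.$$

Applying the Fourier transform in time, $\hat{u}(x,\omega) = \int_{\mathbb{R}} u(x,t)\,e^{-i\omega t}\,dt$, turns the time derivative into multiplication by $i\omega$ and yields, for each frequency $\omega$, the elliptic problem

$$i\omega\,\hat{u}(x,\omega) + \mathcal{L}_\delta \hat{u}(x,\omega) = \hat{f}(x,\omega).$$

Because the problems at different frequencies are decoupled, they can be solved independently and in parallel, and the time-domain solution is recovered by the inverse transform.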
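The sketch below illustrates the greedy half of a snapshot-plus-greedy RB construction. It is a generic residual-driven greedy loop, not the paper's exact MOS-greedy algorithm: the parameter `mu` stands in for a frequency sample or a random-input realization, the assembly helper `A_assemble` is hypothetical, the right-hand side is taken as parameter-independent for simplicity, and the plain algebraic residual is used where a certified error estimator would normally appear.

```python
import numpy as np

def solve_full_order(mu, A_assemble, f):
    """Solve the (dense) full-order nonlocal system at parameter mu.
    A_assemble(mu) returns the discretized nonlocal operator (hypothetical helper)."""
    return np.linalg.solve(A_assemble(mu), f)

def greedy_rb(train_params, A_assemble, f, tol=1e-6, max_basis=50):
    """Schematic greedy reduced-basis loop: at each step, pick the training
    parameter with the largest residual indicator, solve the full-order
    problem there, and enrich the orthonormal basis V."""
    # initialize the basis with a snapshot at the first training parameter
    V = solve_full_order(train_params[0], A_assemble, f)[:, None]
    V, _ = np.linalg.qr(V)  # orthonormalize
    for _ in range(max_basis):
        errors = []
        for mu in train_params:
            A = A_assemble(mu)
            # Galerkin-projected reduced problem of size dim(V)
            u_rb = V @ np.linalg.solve(V.T @ A @ V, V.T @ f)
            # residual norm as a cheap surrogate for the true error
            errors.append(np.linalg.norm(A @ u_rb - f))
        worst = int(np.argmax(errors))
        if errors[worst] < tol:
            break
        # enrich with the full-order snapshot at the worst parameter
        snap = solve_full_order(train_params[worst], A_assemble, f)
        V, _ = np.linalg.qr(np.hstack([V, snap[:, None]]))
    return V
```

Once `V` is built offline, each online query at a new parameter reduces to a small dense solve, `u_rb = V @ solve(V.T @ A(mu) @ V, V.T @ f)`, which is where the computational savings over the non-sparse full-order nonlocal system come from.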