Numer. Math. Theor. Meth. Appl., 10 (2017), pp. 775-797.
Published online: 2017-11
In this paper we consider an algorithm for recovering sparse orthogonal polynomial expansions using stochastic collocation via ℓq minimization (0 < q < 1). The main results are twofold: 1) using the norm inequality between ℓq and ℓ2 together with the square-root lifting inequality, we derive several theoretical estimates of recoverability under ℓq minimization for both sparse and non-sparse signals; 2) we then combine this method with stochastic collocation to identify the coefficients of sparse orthogonal polynomial expansions arising in uncertainty quantification, and obtain recoverability results for both sparse polynomial functions and general non-sparse functions. We also present numerical experiments illustrating the performance of the ℓq algorithm: first, benchmark tests demonstrating that ℓq minimization recovers exactly sparse signals; then three classical analytic functions showing the advantage of this method over standard ℓ1 and reweighted ℓ1 minimization. All numerical results indicate that the ℓq method outperforms both standard ℓ1 and reweighted ℓ1 minimization.
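To make the ℓq recovery problem concrete, the sketch below recovers an exactly sparse vector x from underdetermined measurements y = Ax by iteratively reweighted least squares (IRLS), a common surrogate for the nonconvex ℓq objective. This is an illustrative toy implementation under assumed settings (Gaussian measurement matrix, geometric decrease of the smoothing parameter ε), not necessarily the solver used in the paper.

```python
import numpy as np

def irls_lq(A, y, q=0.5, n_iter=50, eps0=1.0):
    """Approximately solve min ||x||_q^q subject to A x = y (0 < q < 1)
    via IRLS: at each step solve a weighted least-norm problem with
    weights w_i = (x_i^2 + eps)^(q/2 - 1), shrinking eps gradually."""
    x = np.linalg.lstsq(A, y, rcond=None)[0]  # minimum-norm start
    eps = eps0
    for _ in range(n_iter):
        w_inv = (x**2 + eps) ** (1.0 - q / 2.0)   # diagonal of W^{-1}
        AW = A * w_inv                            # A W^{-1} (column scaling)
        # Closed-form weighted least-norm solution:
        # x = W^{-1} A^T (A W^{-1} A^T)^{-1} y
        x = w_inv * (A.T @ np.linalg.solve(AW @ A.T, y))
        eps = max(eps / 10.0, 1e-10)              # sharpen the surrogate
    return x

# Toy benchmark: exactly s-sparse signal, m < n Gaussian measurements
rng = np.random.default_rng(0)
m, n, s = 40, 100, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s)
y = A @ x_true

x_hat = irls_lq(A, y, q=0.5)
print("relative recovery error:",
      np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

In the collocation setting described above, A would instead contain orthogonal polynomial basis functions evaluated at random sample points, and x the sought expansion coefficients.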
ISSN: 2079-7338. DOI: 10.4208/nmtma.2017.0001. URL: http://global-sci.org/intro/article_detail/nmtma/10456.html