Commun. Comput. Phys., 24 (2018), pp. 286-308.
Published online: 2018-03
DOI: https://doi.org/10.4208/cicp.OA-2018-0006
In this paper, we discuss a gradient-enhanced $ℓ_1$ approach for the recovery of sparse Fourier expansions. By *gradient-enhanced* approaches we mean that directional derivatives along given vectors are utilized to improve the sparse approximations. We first consider the case where both the function values and the directional derivatives at the sampling points are known. We show that, under some mild conditions, including the derivative information can indeed decrease the coherence of the measurement matrix and thus improves the sparse recovery conditions of the $ℓ_1$ minimization. We also consider the case where either the function values or the directional derivatives are known at the sampling points, for which we present a sufficient condition under which the measurement matrix satisfies the restricted isometry property (RIP), provided that the samples are distributed according to the uniform measure. This result shows that the derivative information plays a role similar to that of the function values. Several numerical examples are presented to support the theoretical statements. Potential applications to (Hermite-type) function interpolation and uncertainty quantification are also discussed.
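
As a rough sketch of the setup (the abstract does not spell out the formulation, so the index set and the normalization of the derivative rows below are assumptions), suppose $f$ admits a sparse expansion $f(x)\approx\sum_{k\in\Lambda} c_k e^{i k\cdot x}$ and one observes, at sampling points $x_1,\dots,x_m$, both the function values and the directional derivatives along a vector $v$. A gradient-enhanced recovery of this type would solve an $ℓ_1$ minimization whose constraints stack both kinds of data:

$$
\min_{c}\ \|c\|_1 \quad \text{subject to} \quad
\sum_{k\in\Lambda} c_k\, e^{i k\cdot x_j} = f(x_j), \qquad
\sum_{k\in\Lambda} (i\,k\cdot v)\, c_k\, e^{i k\cdot x_j} = v\cdot\nabla f(x_j), \qquad j=1,\dots,m.
$$

The derivative data contribute additional rows to the measurement matrix; the coherence and RIP statements above concern this enlarged matrix, with the precise scaling of the derivative rows left to the paper.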