Stable signal recovery from incomplete and inaccurate measurements
Citations
An Introduction To Compressive Sampling
Robust Face Recognition via Sparse Representation
Sparse MRI: The application of compressed sensing for rapid MR imaging
Enhancing Sparsity by Reweighted ℓ1 Minimization
CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
References
Convex Optimization
Compressed sensing
Nonlinear total variation based noise removal algorithms
Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
Decoding by linear programming
Frequently Asked Questions (14)
Q2. What is the problem of minimizing l1 under linear equality constraints?
The problem of recovering a sparse vector by minimizing ℓ1 under linear equality constraints has recently received much attention, mostly in the context of Basis Pursuit, where the goal is to uncover sparse signal decompositions in overcomplete dictionaries.
Q3. What is the recovery condition for the Fourier transform?
The recovery condition then depends on the mutual coherence µ(Φ, Ψ) between the measurement basis Φ and the sparsity basis Ψ, which measures the similarity between Φ and Ψ: µ(Φ, Ψ) = √m · max_{k,j} |⟨φ_k, ψ_j⟩|, with φ_k ∈ Φ, ψ_j ∈ Ψ.
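The coherence formula above is straightforward to evaluate numerically. The following is a minimal sketch (all names and the Hadamard example are illustrative, not from the paper): it computes µ(Φ, Ψ) for two orthonormal bases stored as matrix columns, using the classic spike/Hadamard pair, which is maximally incoherent.

```python
import numpy as np

def mutual_coherence(Phi, Psi):
    # mu(Phi, Psi) = sqrt(m) * max_{k,j} |<phi_k, psi_j>|,
    # where the columns of Phi and Psi are the basis vectors.
    m = Phi.shape[0]
    return np.sqrt(m) * np.abs(Phi.T @ Psi).max()

def hadamard(m):
    # Sylvester construction; m must be a power of 2.
    H = np.ones((1, 1))
    while H.shape[0] < m:
        H = np.block([[H, H], [H, -H]])
    return H

m = 8
spike = np.eye(m)                 # sparsity basis: spikes
H = hadamard(m) / np.sqrt(m)      # measurement basis: normalized Hadamard
print(mutual_coherence(spike, H))  # -> 1.0 (up to rounding), the minimum possible
```

Coherence ranges from 1 (maximally incoherent, the favorable case for compressed sensing) up to √m (when the two bases share a vector, as in µ(Φ, Φ) = √m).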
Q4. What are the obvious extensions of this work?
Obvious extensions include looking for signals that are sparse in overcomplete wavelet or curvelet bases, or for images that have certain geometrical structure.
Q5. How do the authors get the wavelet coefficients of the image from the scrambled real?
The authors make 25000 measurements of the image using a scrambled real Fourier ensemble; that is, the test functions ak(t) are real-valued sines and cosines (with randomly selected frequencies) which are temporally scrambled by randomly permuting the m time points.
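The construction described above can be sketched in a few lines. The dimensions, seed, and variable names below are illustrative placeholders, not the paper's actual experiment (which used 25000 measurements of an image): each test function is a sine or cosine at a random frequency, evaluated on a randomly permuted time grid.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 64, 16                   # signal length m, number of measurements n (toy sizes)
perm = rng.permutation(m)       # random temporal scrambling of the m time points
freqs = rng.integers(1, m // 2, size=n)
t = np.arange(m)

# Rows of A are the test functions a_k: real sines/cosines at random
# frequencies, with the time samples scrambled by the permutation.
A = np.stack([
    np.cos(2 * np.pi * f * t / m) if k % 2 == 0 else np.sin(2 * np.pi * f * t / m)
    for k, f in enumerate(freqs)
])[:, perm]

x0 = np.zeros(m)
x0[[3, 20, 41]] = [1.0, -2.0, 0.5]   # a sparse test signal
y = A @ x0                           # n measurements y_k = <a_k, x0>
print(A.shape, y.shape)              # (16, 64) (16,)
```

In the paper's setting, x0 would instead be the wavelet-coefficient representation of the image, so the effective measurement matrix is the scrambled Fourier ensemble composed with the wavelet synthesis operator.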
Q6. What is the significance of the singular values of AT0?
Now of course, x̂ − x0 = 0 on the complement of T0, while on T0, x̂ − x0 = (A∗_{T0} A_{T0})⁻¹ A∗_{T0} e; since by hypothesis the eigenvalues of A∗_{T0} A_{T0} are well-behaved, ‖x̂ − x0‖ℓ₂ ≈ ‖A∗_{T0} e‖ℓ₂ ≈ ε, at least for perturbations concentrated in the row space of A_{T0}.
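This least-squares stability argument is easy to check numerically. The sketch below (dimensions, seed, and support chosen arbitrarily for illustration) forms the error term (A∗_{T0} A_{T0})⁻¹ A∗_{T0} e for a Gaussian matrix whose restricted Gram matrix is well-conditioned, and compares its size to the noise level.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, S = 40, 100, 5
A = rng.standard_normal((n, m)) / np.sqrt(n)   # columns have roughly unit norm
T0 = np.array([2, 17, 33, 58, 90])             # support of the sparse signal
AT0 = A[:, T0]

e = 0.01 * rng.standard_normal(n)              # measurement error, ||e|| plays the role of eps
# Least-squares error restricted to T0: (A*_{T0} A_{T0})^{-1} A*_{T0} e
err = np.linalg.solve(AT0.T @ AT0, AT0.T @ e)

# With well-behaved eigenvalues of A*_{T0} A_{T0}, the recovery error
# stays within a constant factor of the noise level.
print(np.linalg.norm(err), np.linalg.norm(e))
```

The point of the passage, and of this check, is that knowing the support T0 reduces recovery to a well-conditioned least-squares problem, so the error cannot blow up beyond the scale of the perturbation e.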
Q7. What is the general rule for the recovery procedure?
To be broadly applicable, their recovery procedure must be stable: small changes in the observations should result in small changes in the recovery.
Q8. What is the sparsity constraint on the underlying signal?
In [12], the sparsity constraint on the underlying signal x0 depends on the magnitude of the maximum entry of the Gram matrix, M(A) = max_{i,j: i≠j} |(A∗A)_{i,j}|.
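Computing M(A) is a one-liner; the following sketch (function name and example are my own, not from [12]) takes the largest off-diagonal entry of the Gram matrix A∗A.

```python
import numpy as np

def gram_coherence(A):
    # M(A) = max_{i != j} |(A* A)_{i,j}|: largest off-diagonal Gram entry.
    G = np.abs(A.conj().T @ A)
    np.fill_diagonal(G, 0.0)
    return G.max()

A = np.eye(4)                 # orthonormal columns: all off-diagonal entries vanish
print(gram_coherence(A))      # -> 0.0
```

Unlike the restricted isometry constants discussed later, M(A) only looks at pairs of columns, which is why conditions based on it are easy to verify but yield more pessimistic sparsity bounds.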
Q9. How many terms are used to calculate the recovery error?
As a reference, the 50-term nonlinear approximation error of these compressible signals is around 0.47; at low signal-to-noise ratios, their recovery error is about 1.5 times this quantity.
Q10. What is the sparsity constraint for the underlying signal?
For the measurement ensembles listed in the previous section, however, the sparsity required is still on the order of √n in the situation where n is comparable to m.
Q11. What do tools from random matrix theory show?
Using tools from random matrix theory, [3,5,10] give several examples of matrices such that (3) holds for S on the order of n to within log factors.
Q12. What is the important part of the paper?
It was shown (also in [4]) that if S verifies δS + δ2S + δ3S < 1, (3) then solving (P1) recovers any sparse signal x0 with support size obeying |T0| ≤ S. This paper develops results for the "imperfect" (and far more realistic) scenarios where the measurements are noisy and the signal is not exactly sparse.
Q13. What is the S-restricted isometry constant of A?
Then [4] defines the S-restricted isometry constant δS of A, which is the smallest quantity such that (1 − δS)‖c‖²ℓ₂ ≤ ‖A_T c‖²ℓ₂ ≤ (1 + δS)‖c‖²ℓ₂ (2) for all subsets T with |T| ≤ S and coefficient sequences (cj)j∈T.
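Definition (2) can be turned into code directly by enumerating column subsets. This brute-force sketch (my own illustration; computing δS exactly is intractable for realistic sizes, which is why the literature relies on probabilistic bounds instead) finds the smallest δ satisfying the two-sided bound via singular values of each submatrix A_T.

```python
import numpy as np
from itertools import combinations

def restricted_isometry_constant(A, S):
    # Exact delta_S by enumerating all column subsets T with |T| <= S.
    # Feasible only for tiny matrices.
    n, m = A.shape
    delta = 0.0
    for size in range(1, S + 1):
        for T in combinations(range(m), size):
            sv = np.linalg.svd(A[:, list(T)], compute_uv=False)
            # (1 - delta_S)||c||^2 <= ||A_T c||^2 <= (1 + delta_S)||c||^2
            # holds iff all squared singular values lie in [1-delta, 1+delta].
            delta = max(delta, abs(sv[0] ** 2 - 1), abs(sv[-1] ** 2 - 1))
    return delta

A = np.eye(5)[:, :4]          # orthonormal columns, so every A_T is an isometry
print(restricted_isometry_constant(A, 2))   # -> 0.0
```

Small δS means every S-column submatrix of A acts as a near-isometry, which is exactly the condition δS + δ2S + δ3S < 1 in (3) exploits.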
Q14. What is the way to recover a vector of arbitrary size?
Roughly speaking, the theorem says that minimizing ℓ1 stably recovers the S-largest entries of an m-dimensional unknown vector x from n measurements only.
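The ℓ1 minimization the theorem refers to, (P1) in the noiseless case, is a linear program. The following sketch (problem sizes, seed, and names are illustrative; the paper's noisy version would instead constrain ‖Ax − y‖ℓ₂ ≤ ε) uses the standard positive/negative split x = u − v to recover a sparse vector exactly from underdetermined measurements.

```python
import numpy as np
from scipy.optimize import linprog

def l1_min(A, y):
    # min ||x||_1 subject to Ax = y, via the LP split x = u - v with u, v >= 0:
    # minimize 1'u + 1'v subject to [A, -A][u; v] = y.
    n, m = A.shape
    c = np.ones(2 * m)
    res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=y,
                  bounds=[(0, None)] * (2 * m))
    return res.x[:m] - res.x[m:]

rng = np.random.default_rng(2)
n, m = 30, 60                                   # n measurements of an m-dim vector
A = rng.standard_normal((n, m)) / np.sqrt(n)
x0 = np.zeros(m)
x0[[5, 22, 47]] = [1.5, -1.0, 2.0]              # 3-sparse signal
x_hat = l1_min(A, A @ x0)
print(np.linalg.norm(x_hat - x0))               # close to 0: exact recovery here
```

For this Gaussian ensemble with n = 30 measurements and sparsity 3, the noiseless program recovers x0 exactly with overwhelming probability, illustrating the clean-data baseline that the paper's stability theorem extends to noisy, approximately sparse settings.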