Fast sparse recovery via non-convex optimization

2015 
Recovering sparse signals from noisy underdetermined linear measurements is a long-standing problem in the signal processing community. The Lasso was proposed to address it, yet recent research shows that replacing the l1 norm with certain non-convex functions yields better recovery performance. In this paper, building on majorization-minimization and the proximal operator, we propose a fast algorithm for the non-convex-regularized least squares problem. Theoretical analysis shows that any limit point of the iterative sequence is a stationary point of the problem, and that if the non-convexity is below a threshold, the sequence converges to a neighborhood of the sparse signal at a superlinear rate. Simulation results verify this theoretical convergence rate and demonstrate that the algorithm outperforms its convex counterpart in several respects: it tolerates more nonzero entries, requires less running time, and exhibits better denoising performance.
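The abstract does not specify the algorithm's details; as an illustrative sketch only, the snippet below applies the majorization-minimization/proximal-operator recipe it describes to one concrete non-convex penalty, the l0 pseudo-norm (an assumption — the paper may use a different non-convex function such as SCAD or MCP). The names `mm_prox_sparse` and `hard_threshold` are hypothetical.

```python
import numpy as np

def hard_threshold(x, thresh):
    """Proximal operator of t*lam*||x||_0: zero out entries at or below the threshold."""
    out = x.copy()
    out[np.abs(out) <= thresh] = 0.0
    return out

def mm_prox_sparse(A, y, lam, iters=300):
    """Sketch of MM for min_x 0.5*||Ax - y||^2 + lam*||x||_0 (assumed penalty).

    Each iteration majorizes the smooth term by a quadratic with curvature
    L = ||A||_2^2, which reduces the surrogate minimization to a gradient
    step followed by the proximal map of the non-convex penalty — here,
    hard thresholding.
    """
    n = A.shape[1]
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    step = 1.0 / L
    thresh = np.sqrt(2.0 * lam * step)     # closed-form prox threshold for the l0 penalty
    x = np.zeros(n)
    for _ in range(iters):
        grad = A.T @ (A @ x - y)           # gradient of the least-squares term
        x = hard_threshold(x - step * grad, thresh)
    return x

# Toy usage: noiseless recovery of a 5-sparse signal from 100 random measurements.
rng = np.random.default_rng(0)
m, n, k = 100, 200, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
support = rng.choice(n, k, replace=False)
x_true[support] = rng.choice([-1.0, 1.0], k)
y = A @ x_true
x_hat = mm_prox_sparse(A, y, lam=0.03)
```

The `lam` value here is chosen only so that the induced threshold separates the unit-magnitude signal entries from the off-support gradient noise in this toy setup; in practice it would be tuned to the noise level.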