One-Bit Compressive Sensing with Projected Subgradient Method under Sparsity Constraints

2019 
One-bit compressive sensing theory shows that sparse signals can be almost exactly reconstructed from a small number of one-bit quantized linear measurements. This paper presents a convergence analysis of the binary iterative hard thresholding (BIHT) algorithm, a state-of-the-art recovery algorithm in one-bit compressive sensing. The basic idea of the analysis is to view BIHT as a projected subgradient method under sparsity constraints. To the best of our knowledge, this is the first convergence analysis of BIHT. We first consider a general convex function subject to sparsity constraints and connect it with the non-convex model in the one-bit compressive sensing literature. A projected subgradient method is proposed to solve the general model, and several convergence results are established. A stronger convergence theorem for $\alpha$-strongly convex functions, without any differentiability assumption, is also established. Furthermore, the corresponding stochastic projected subgradient method is provided with a convergence guarantee. In our setting, BIHT is a special case of the projected subgradient method, so the convergence analysis applies to BIHT naturally. We then apply the projected subgradient method to related non-convex optimization models arising in compressive sensing with an $\ell_{1}$-constraint, sparse support vector machines, and rectified linear unit (ReLU) regression. Finally, numerical examples are presented to validate our convergence analysis. The experiments also show that the proposed projected subgradient method is simple to implement, robust to sparse noise, and effective for sparse recovery problems.
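To make the abstract's central idea concrete, the following is a minimal sketch of one common BIHT variant, written as a projected subgradient iteration: a subgradient step on the one-sided sign-consistency objective followed by hard thresholding (projection onto the set of s-sparse vectors). The step size `tau`, the `1/m` scaling, and the final normalization are illustrative choices, not details specified by the paper.

```python
import numpy as np

def hard_threshold(v, s):
    # Projection onto the set of s-sparse vectors:
    # keep the s largest-magnitude entries, zero out the rest.
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-s:]
    out[idx] = v[idx]
    return out

def biht(A, y, s, tau=1.0, n_iter=200):
    """One BIHT variant, viewed as a projected subgradient method.

    A : (m, n) measurement matrix
    y : one-bit measurements, y = sign(A @ x_true)
    s : assumed sparsity level of the signal
    """
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(n_iter):
        # Subgradient of the one-sided consistency loss at the current iterate.
        g = A.T @ (np.sign(A @ x) - y)
        # Subgradient step, then project back onto the sparsity constraint set.
        x = hard_threshold(x - (tau / m) * g, s)
    # One-bit measurements lose amplitude information, so only the
    # direction of x is recoverable; return a unit-norm estimate.
    nrm = np.linalg.norm(x)
    return x / nrm if nrm > 0 else x
```

On a synthetic problem (e.g. a Gaussian matrix with m well above n), the returned unit vector typically aligns closely with the direction of the true sparse signal, which is the best one can hope for from sign-only measurements.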
58 References · 3 Citations