Recovery of Time-Varying Graph Signals via Distributed Algorithms on Regularized Problems

2020 
The recovery of missing samples from available noisy measurements is a fundamental problem in signal processing. This process is also known as graph signal inpainting, reconstruction, forecasting, or inference. Many existing algorithms do not scale well with the size of the graph and/or cannot be implemented efficiently in a distributed manner. In this paper, we develop efficient distributed algorithms for the recovery of time-varying graph signals. The a priori assumptions are that the signal is smooth with respect to the graph topology and correlated across time. These assumptions are incorporated into the optimization formulation via Tikhonov regularization terms. Our formulation is tailored to yield algorithms that can be efficiently implemented in a distributed manner. Two distributed algorithms, arising from two different formulations, are proposed to solve the optimization problems. The first involves the $\ell_2$-norm, and a distributed least squares recovery algorithm (DLSRA) is proposed that leverages the graph topology and the sparsity of the corresponding Hessian matrix; updates of the Hessian inverse are not required. The second involves the $\ell_1$-norm, and the alternating direction method of multipliers (ADMM) is used to develop the algorithm. An inexact Newton method is incorporated into the conventional ADMM to give a distributed ADMM recovery algorithm (DAMRA). The two distributed algorithms require only data exchanges between vertices in localized neighbourhood subgraphs. Experiments on a variety of synthetic and real-world datasets demonstrate that the proposed algorithms are superior to existing methods in terms of computational complexity and convergence rate.
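
As a point of reference for the $\ell_2$ formulation, the sketch below shows one common way a Tikhonov-regularized recovery of a time-varying graph signal can be written and solved centrally. The objective (sampling mask, graph-Laplacian smoothness term, temporal difference term), the weights `alpha`/`beta`, and the plain gradient-descent solver are illustrative assumptions for exposition; they are not the paper's DLSRA or DAMRA, which are the distributed solvers described above.

```python
import numpy as np

def recover_time_varying_signal(Y, mask, L, alpha=1.0, beta=1.0,
                                n_iters=500, step=None):
    """Centralized sketch of Tikhonov-regularized recovery (assumed objective).

    Minimizes, by plain gradient descent,
        || mask * (X - Y) ||_F^2                      (data fidelity)
        + alpha * sum_t x_t^T L x_t                   (graph smoothness)
        + beta  * || X[:, 1:] - X[:, :-1] ||_F^2      (temporal correlation)

    Y    : N x T noisy measurements (zeros where unobserved)
    mask : N x T binary sampling mask
    L    : N x N graph Laplacian (symmetric)
    """
    N, T = Y.shape
    X = mask * Y  # initialize with the observed samples
    if step is None:
        # Conservative step size from a bound on the gradient's Lipschitz
        # constant: 2 (data term) + 2*alpha*||L|| + 8*beta (temporal term).
        step = 1.0 / (2.0 + 2.0 * alpha * np.linalg.norm(L, 2) + 8.0 * beta)
    for _ in range(n_iters):
        grad = 2.0 * mask * (X - Y) + 2.0 * alpha * (L @ X)
        diff = X[:, 1:] - X[:, :-1]
        grad[:, 1:] += 2.0 * beta * diff
        grad[:, :-1] -= 2.0 * beta * diff
        X = X - step * grad
    return X
```

Because the gradient couples each vertex only to its graph neighbours (through $L$) and to adjacent time instants, such an objective naturally admits distributed implementations in which vertices exchange data only within localized neighbourhood subgraphs, which is the setting the proposed algorithms exploit.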