Convolutional Dictionary Learning with Grid Refinement

2020 
Given a continuous-domain signal that can be modeled as the superposition of localized events from multiple sources, the goal of Convolutional Dictionary Learning (CDL) is to identify the locations of the events, via Convolutional Sparse Coding (CSC), and to learn the template for each source, through a Convolutional Dictionary Update (CDU) step. In practice, because we observe samples of the continuous-domain signal on a discrete grid, classical CSC methods can only estimate the locations of the events on this grid, which degrades the performance of the CDU step. We introduce a CDL framework that significantly reduces the errors arising from performing the estimation on the grid. Specifically, we construct an expanded dictionary that comprises not only discrete shifts of the templates but also variants shifted by non-integer amounts and smoothly interpolated; this enables CSC and CDU to operate at a finer resolution than that of the original sampling grid. We term this approach CDL with grid refinement. For CSC, we develop a novel, computationally efficient algorithm, termed Convolutional Orthogonal Matching Pursuit with an interpolated dictionary (COMP-INTERP). Using simulated data, we compare COMP-INTERP to state-of-the-art CSC algorithms for estimating off-the-grid events and demonstrate that it 1) achieves a competitive level of accuracy and 2) is an order of magnitude faster. For CDU, we derive a novel procedure to update the templates given sparse codes that can occur both on and off the sampling grid. We also show that 3) dictionary update with the overcomplete dictionary yields more accurate templates. Finally, we demonstrate the competitive performance of the algorithms in two applications, namely spike sorting and super-resolution microscopy.
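To make the grid-refinement idea concrete, the following is a minimal sketch (not the authors' implementation) of how an expanded dictionary could be built: each template is duplicated at several sub-sample shifts, with linear interpolation standing in for whatever smooth interpolator the method actually uses. The function names, the choice of linear interpolation, and the parameter `n_subgrid` are illustrative assumptions.

```python
import numpy as np

def fractional_shift(template, delta):
    # Shift a 1-D template by a non-integer offset `delta` (in samples)
    # using linear interpolation; values shifted in from outside are zero.
    # (Linear interpolation is an illustrative stand-in for the paper's
    # smooth interpolation scheme.)
    n = np.arange(len(template))
    return np.interp(n - delta, n, template, left=0.0, right=0.0)

def interpolated_dictionary(templates, n_subgrid):
    # Expand each template into `n_subgrid` variants shifted by multiples
    # of 1/n_subgrid of a sample, yielding atoms at a finer resolution
    # than the original sampling grid.
    atoms = []
    for d in templates:
        for k in range(n_subgrid):
            atoms.append(fractional_shift(d, k / n_subgrid))
    return np.array(atoms)

# Example: one Gaussian template refined to quarter-sample resolution.
t = np.exp(-0.5 * ((np.arange(21) - 10) / 2.0) ** 2)
D = interpolated_dictionary([t], n_subgrid=4)
print(D.shape)  # (4, 21): four sub-grid shifts of the single template
```

A sparse coder such as OMP can then run unchanged over this expanded dictionary: selecting an atom now pins an event to a sub-grid location, which is the mechanism COMP-INTERP exploits.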