Online Deep Transferable Dictionary Learning

2021 
Abstract

In real-world applications, large-scale unlabeled data usually becomes available gradually over time. Online learning is important for updating models while preserving their historical knowledge. However, a time-varying distribution shift exists in the incoming sequential data in online learning, resulting in a data cluster discrepancy between the incoming unlabeled data and the older labeled data, which is a challenging situation for online learning. To address this issue, we propose an online deep transferable dictionary learning (ODTDL) method that simultaneously mitigates the data cluster discrepancy for incoming unlabeled data and preserves the historical knowledge of older data in the dictionary. By forming a locally linear representation and association of incoming unlabeled data over a small amount of labeled data in a deep feature space, the proposed ODTDL method can reveal data cluster discrepancies. To implement this approach, we propose a two-level affiliation regularizer that comprehensively reveals both the local instance-level and global cluster-level affiliations and enables an off-the-shelf dictionary reconstruction error method to establish a knowledge transfer pipeline between the labeled and unlabeled data. For online learning, this approach further decomposes the knowledge transfer pipeline into batchwise transfer pipelines between the labeled and unlabeled data. Experiments confirm the feasibility of the proposed method in online semi-supervised learning (SSL) and online unsupervised domain adaptation (UDA) scenarios and demonstrate its superiority in the online setting.
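To make the dictionary-based reconstruction idea concrete, the following is a minimal NumPy sketch of the generic quantity a dictionary reconstruction error method operates on: given a dictionary of atoms and coded samples, compute each sample's residual norm. All names, sizes, and the least-squares coding step are illustrative assumptions, not the paper's actual ODTDL algorithm (which uses deep features, a two-level affiliation regularizer, and batchwise online updates).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for samples in a deep feature space: 20 samples, 10 dims
# (dimensions are illustrative, not taken from the paper).
X = rng.normal(size=(20, 10))

# A dictionary with 5 atoms (one per row), randomly initialized here;
# the paper's actual dictionary learning/update is not reproduced.
D = rng.normal(size=(5, 10))

# Codes A minimizing ||X - A D||_F^2 via closed-form least squares,
# standing in for a proper sparse-coding step.
# lstsq solves D.T @ A.T ~= X.T for A.T.
A = np.linalg.lstsq(D.T, X.T, rcond=None)[0].T  # shape (20, 5)

# Per-sample reconstruction error: the quantity a dictionary
# reconstruction error method would minimize or threshold.
errors = np.linalg.norm(X - A @ D, axis=1)
print(errors.shape)  # (20,)
```

In an online setting, the same residual computation would be repeated per incoming batch against the current dictionary, which is roughly what decomposing the transfer pipeline into batchwise pipelines amounts to.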