Learning with Hilbert–Schmidt independence criterion: A review and new perspectives

2021 
Abstract The Hilbert–Schmidt independence criterion (HSIC) was originally proposed to measure statistical dependence between random variables through their distribution embeddings in reproducing kernel Hilbert spaces. In recent years, this criterion has been applied to a wide range of learning problems owing to its effectiveness and high efficiency. In this article, we provide an in-depth survey of learning methods that use the HSIC for various learning problems, such as feature selection, dimensionality reduction, clustering, and kernel learning and optimization. Specifically, after introducing the basic idea of the HSIC, we systematically review the typical learning models based on the HSIC, ranging from supervised to unsupervised learning, and from traditional machine learning to transfer learning and deep learning, followed by remaining challenges and future directions. The relationships between learning methods using the HSIC and other relevant learning algorithms are also discussed. By elucidating the similarities and differences among these learning models, we aim to provide practitioners with valuable guidelines for their specific domains.
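
For readers unfamiliar with the criterion, the following is a minimal sketch of the standard biased empirical HSIC estimator, HSIC = (n - 1)^{-2} tr(K H L H) with centering matrix H = I - (1/n) 1 1^T (Gretton et al., 2005). The Gaussian kernel, the bandwidth sigma, and the function names (rbf_kernel, hsic_biased) are illustrative choices for this sketch, not part of the survey itself.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared Euclidean distances.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic_biased(X, Y, sigma=1.0):
    # Biased empirical HSIC: (n - 1)^{-2} * tr(K H L H),
    # where K, L are kernel matrices on X and Y and H centers them.
    n = X.shape[0]
    K = rbf_kernel(X, sigma)
    L = rbf_kernel(Y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Usage: dependent samples should yield a clearly larger value than independent ones.
rng = np.random.default_rng(0)
x = rng.normal(size=(200, 1))
print(hsic_biased(x, x + 0.1 * rng.normal(size=(200, 1))))  # dependent: larger
print(hsic_biased(x, rng.normal(size=(200, 1))))            # independent: near zero
```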