Multi-kernel Covariance Terms in Multi-output Support Vector Machines

2020 
This paper proposes a novel way to learn multi-task kernel machines by combining the structure of the classical Support Vector Machine (SVM) optimization problem with multi-task covariance functions developed in the Gaussian process (GP) literature. Specifically, we propose a multi-task Support Vector Machine that can be trained on data with multiple target variables simultaneously while taking the correlation structure between the different outputs into account. In the proposed framework, this correlation structure is captured by covariance functions constructed via a Fourier transform, which makes it possible to represent both the auto- and cross-correlation structure of the outputs. We present a mathematical model and validate it experimentally on a rescaled version of the Jura dataset, a collection of samples recording the concentrations of seven chemical elements at several locations. The results demonstrate the utility of our modeling framework.
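As a rough illustration of the kind of Fourier-based multi-output covariance the abstract refers to, the sketch below builds a block covariance matrix for two outputs from a single spectral component: a shared Gaussian envelope multiplied by a cosine term whose per-output complex weights encode amplitude and phase, so the real part of `w_i * conj(w_j) * exp(1j*mu*tau)` produces both auto-covariance (diagonal blocks) and cross-covariance (off-diagonal blocks). All names and parameters (`mu`, `sigma`, `w`) are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def cross_spectral_kernel(x1, x2, w, mu=1.0, sigma=0.5):
    """Block covariance matrix over P outputs for 1-D inputs x1, x2.

    Hypothetical single-component spectral construction: the implied
    cross-spectral density is a rank-one outer product w @ w^H times a
    shared Gaussian density, which guarantees positive semidefiniteness
    of the resulting block matrix (Schur product of PSD factors).
    """
    tau = x1[:, None] - x2[None, :]               # pairwise input differences
    envelope = np.exp(-0.5 * (sigma * tau) ** 2)  # shared Gaussian envelope
    phase = np.exp(1j * mu * tau)                 # oscillatory spectral term
    blocks = [[np.real(wi * np.conj(wj) * phase) * envelope for wj in w]
              for wi in w]
    return np.block(blocks)                       # shape (P*N1, P*N2)

x = np.linspace(0.0, 4.0, 20)
w = np.array([1.0 + 0.0j, 0.7 * np.exp(1j * 0.8)])  # phase offset between outputs
K = cross_spectral_kernel(x, x, w)
print(K.shape)                                  # (40, 40)
print(np.allclose(K, K.T))                      # symmetric
print(np.linalg.eigvalsh(K).min() >= -1e-9)     # positive semidefinite
```

A Gram matrix of this form could in principle be dropped into any kernel machine that accepts a precomputed kernel, which is the sense in which GP-style multi-output covariances transfer to the SVM setting described above.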