Learning Kullback-Leibler Divergence-based Gaussian Model for Multivariate Time Series Classification

2019 
Multivariate time series (MTS) classification is an important classification problem in which the data has a temporal attribute. Because the relationships among the many variables of an MTS are complex and time-varying, existing methods perform poorly on MTS classification tasks with many attribute variables. In this paper, we therefore propose a novel model-based classification method, called Kullback-Leibler Divergence-based Gaussian Model Classification (KLD-GMC), which converts the original MTS data into two parameters of a multivariate Gaussian model: the mean vector and the inverse covariance matrix. The inverse covariance is the key parameter, as it captures the relationships between variables; the more variables there are, the more information the inverse covariance can capture, so KLD-GMC handles the inter-variable relationships in an MTS well. The sparse inverse covariance of each subsequence is then estimated by Graphical Lasso. Furthermore, the Kullback-Leibler divergence, which effectively measures the similarity between distributions, is used as the similarity measure to classify unlabeled subsequences. Experimental results on classical MTS datasets demonstrate that our method improves the performance of multivariate time series classification and outperforms state-of-the-art methods.
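The pipeline the abstract describes (fit a sparse Gaussian model per subsequence with Graphical Lasso, then classify by Kullback-Leibler divergence between the fitted Gaussians) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses scikit-learn's `GraphicalLasso` estimator and the closed-form KL divergence between two multivariate Gaussians, and the nearest-model (1-NN over class models) decision rule is an assumption about how the unlabeled subsequence is assigned.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

def fit_sparse_gaussian(X, alpha=0.1):
    """Fit a Gaussian model with a sparse inverse covariance (Graphical Lasso).

    X: (n_samples, n_variables) subsequence of the MTS.
    Returns the mean vector, covariance, and sparse precision (inverse covariance).
    """
    gl = GraphicalLasso(alpha=alpha).fit(X)
    return gl.location_, gl.covariance_, gl.precision_

def gaussian_kl(mu0, cov0, mu1, prec1):
    """Closed-form KL(N0 || N1) between multivariate Gaussians.

    prec1 is N1's inverse covariance, so no matrix inversion is needed here.
    KL = 0.5 * [tr(S1^-1 S0) + (m1-m0)^T S1^-1 (m1-m0) - k + ln(det S1 / det S0)]
    """
    k = mu0.shape[0]
    diff = mu1 - mu0
    _, logdet_prec1 = np.linalg.slogdet(prec1)  # ln det S1 = -ln det prec1
    _, logdet_cov0 = np.linalg.slogdet(cov0)
    return 0.5 * (np.trace(prec1 @ cov0) + diff @ prec1 @ diff
                  - k - logdet_prec1 - logdet_cov0)

def classify(X_query, class_models, alpha=0.1):
    """Assign a query subsequence to the class whose Gaussian model is closest in KL.

    class_models maps label -> (mean, covariance, precision) as returned
    by fit_sparse_gaussian.
    """
    mu_q, cov_q, _ = fit_sparse_gaussian(X_query, alpha)
    kls = {label: gaussian_kl(mu_q, cov_q, mu_c, prec_c)
           for label, (mu_c, _, prec_c) in class_models.items()}
    return min(kls, key=kls.get)
```

A usage sketch on synthetic data: fit one model per class from labeled subsequences, then call `classify` on an unlabeled subsequence; the class whose distribution has the smallest KL divergence from the query's fitted Gaussian is returned.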