A Linear Time Approach to Computing Time Series Similarity based on Deep Metric Learning

2020 
Time series similarity computation is a fundamental primitive that underpins many time series data analysis tasks. However, many existing time series similarity measures have a high computation cost. While there has been much research effort to reduce this cost, such efforts are usually specific to a single similarity measure. We propose NEUTS (Neural metric learning for Time Series) to accelerate time series similarity computation in a generic fashion. NEUTS computes the similarity of a given time series pair in linear time and is generic enough to handle any existing similarity measure. NEUTS samples a number of seed time series from the given database and then uses their pairwise similarities as guidance to approximate the similarity function with a neural metric learning framework. NEUTS features two novel modules to achieve an accurate approximation of the similarity function: (1) a local attention memory module that augments existing recurrent neural networks for time series encoding; and (2) a distance-weighted ranking loss that effectively transcribes information from the seed-based guidance. With these two modules, NEUTS yields high accuracy and fast convergence even when the training data is small. Our experiments with five real-life datasets and four similarity measures (Fréchet, Hausdorff, ERP, and DTW) show that NEUTS outperforms baselines consistently and significantly. Specifically, it achieves over 80% accuracy in most settings, while obtaining a 50x-1000x speedup over brute-force methods and a 3x-350x speedup over approximate algorithms for top-k similarity search.
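To make the metric-learning idea in the abstract concrete, below is a minimal sketch of how a seed-guided approach of this kind could be set up. It is not the paper's implementation: a plain PyTorch LSTM encoder stands in for the attention-augmented encoder, the `distance_weighted_ranking_loss` shown here is an illustrative triplet-style loss whose margin scales with the gap between ground-truth distances (the paper's exact loss formulation may differ), and the ground-truth distances (e.g., DTW between sampled seed series) are assumed to be precomputed offline.

```python
# Hypothetical sketch: seed-guided neural metric learning for time series.
# Assumptions: PyTorch; LSTM encoder in place of the paper's attention-augmented
# encoder; illustrative distance-weighted triplet loss; precomputed DTW targets.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SeriesEncoder(nn.Module):
    """Encode a univariate time series into a fixed-length embedding (placeholder
    for the paper's recurrent encoder with a local attention memory module)."""

    def __init__(self, input_dim: int = 1, hidden_dim: int = 64, embed_dim: int = 32):
        super().__init__()
        self.rnn = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.proj = nn.Linear(hidden_dim, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, length, input_dim) -> (batch, embed_dim)
        _, (h_n, _) = self.rnn(x)
        return self.proj(h_n[-1])


def distance_weighted_ranking_loss(anchor, pos, neg, d_pos, d_neg):
    """Triplet-style ranking loss whose margin grows with the gap between the
    ground-truth distances d(anchor, pos) and d(anchor, neg). Illustrative
    stand-in for the paper's distance-weighted ranking loss."""
    margin = (d_neg - d_pos).clamp(min=0.0)   # larger true gap -> larger margin
    emb_pos = (anchor - pos).norm(dim=1)       # embedding-space distances
    emb_neg = (anchor - neg).norm(dim=1)
    return F.relu(emb_pos - emb_neg + margin).mean()


if __name__ == "__main__":
    # Toy training step: seed series and their pairwise ground-truth distances
    # (e.g., DTW) are assumed to have been sampled and computed beforehand.
    torch.manual_seed(0)
    encoder = SeriesEncoder()
    optim = torch.optim.Adam(encoder.parameters(), lr=1e-3)

    batch, length = 16, 100
    anchor_ts = torch.randn(batch, length, 1)
    pos_ts = anchor_ts + 0.1 * torch.randn(batch, length, 1)  # near neighbours
    neg_ts = torch.randn(batch, length, 1)                     # distant series
    d_pos = torch.rand(batch) * 0.2        # placeholder ground-truth distances
    d_neg = 0.5 + torch.rand(batch)

    loss = distance_weighted_ranking_loss(
        encoder(anchor_ts), encoder(pos_ts), encoder(neg_ts), d_pos, d_neg
    )
    optim.zero_grad()
    loss.backward()
    optim.step()
    print(f"ranking loss: {loss.item():.4f}")
```

Once trained, similarity between any pair is approximated by a single encoder pass per series plus a vector distance, which is where the linear-time behaviour claimed in the abstract comes from.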