CDA-LSTM: an evolutionary convolution-based dual-attention LSTM for univariate time series prediction

2021 
Univariate time series forecasting remains an important but challenging task. Given the wide application of temporal data, adaptive predictors are needed to study historical behavior and forecast future states in various scenarios. In this paper, inspired by the human attention mechanism and the decomposition-and-reconstruction framework, we propose a convolution-based dual-stage attention (CDA) architecture combined with Long Short-Term Memory networks (LSTM) for univariate time series forecasting. Specifically, we first use a decomposition algorithm to generate derived variables from the target series. The input variables are then fed into the CDA-LSTM model for forecasting. In the Encoder–Decoder phase, the first stage combines an attention operation with an LSTM encoder, which adaptively learns which derived series are relevant to the target. In the second stage, a temporal attention mechanism is integrated with the decoder to automatically select the relevant encoder hidden states across all time steps. A convolution phase runs in parallel with the Encoder–Decoder phase to reuse the historical information of the target and extract mutation features. The experimental results demonstrate that the proposed method can be adopted as an expert system for forecasting in multiple scenarios; its superiority is verified by comparison with twelve baseline models on ten datasets. The practicability of different decomposition algorithms and convolution architectures is also examined through extensive experiments. Overall, our work carries significant value not only for adaptive deep learning modeling of time series problems, but also for the broader field of univariate data processing and prediction.
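To make the described architecture concrete, the following is a minimal PyTorch sketch, not the authors' implementation, of a dual-stage attention encoder–decoder with a parallel 1-D convolution branch. The module names, layer sizes, attention parameterization, and fusion head are all illustrative assumptions; the decomposition step is represented only by pre-computed derived series passed as input.

```python
# Minimal sketch (not the paper's code) of a CDA-LSTM-style model in PyTorch.
# Stage 1: input attention over derived series + LSTM encoder.
# Stage 2: temporal attention over encoder states + LSTM decoder.
# A parallel Conv1d branch processes the raw target history; layer sizes,
# the softmax-based attention forms, and the fusion head are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CDALSTMSketch(nn.Module):
    def __init__(self, n_derived, hidden=64, conv_channels=32, horizon=1):
        super().__init__()
        self.input_attn = nn.Linear(n_derived, n_derived)   # stage-1 attention weights
        self.encoder = nn.LSTM(n_derived, hidden, batch_first=True)
        self.temporal_attn = nn.Linear(hidden, 1)            # stage-2 attention scores
        self.decoder = nn.LSTM(hidden, hidden, batch_first=True)
        self.conv = nn.Conv1d(1, conv_channels, kernel_size=3, padding=1)
        self.head = nn.Linear(hidden + conv_channels, horizon)

    def forward(self, derived, target_hist):
        # derived:     (batch, time, n_derived) series from the decomposition step
        # target_hist: (batch, time) raw univariate target history
        weights = torch.softmax(self.input_attn(derived), dim=-1)    # stage-1 attention
        enc_out, _ = self.encoder(derived * weights)                 # (batch, time, hidden)
        scores = torch.softmax(self.temporal_attn(enc_out), dim=1)   # stage-2 attention
        context = (scores * enc_out).sum(dim=1, keepdim=True)        # (batch, 1, hidden)
        dec_out, _ = self.decoder(context)                           # (batch, 1, hidden)
        conv_feat = F.relu(self.conv(target_hist.unsqueeze(1)))      # (batch, C, time)
        conv_feat = conv_feat.mean(dim=-1)                           # global average pool
        fused = torch.cat([dec_out.squeeze(1), conv_feat], dim=-1)
        return self.head(fused)                                      # forecast(s)


# Example usage with random data: 8 samples, 24 time steps, 4 derived series.
model = CDALSTMSketch(n_derived=4)
y_hat = model(torch.randn(8, 24, 4), torch.randn(8, 24))
print(y_hat.shape)  # torch.Size([8, 1])
```

The parallel convolution branch here simply pools features from the raw target history and concatenates them with the decoder output before the forecasting head, mirroring the paper's idea of reusing historical information alongside the attention-based Encoder–Decoder path.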