Transformation-gated LSTM: efficient capture of short-term mutation dependencies for multivariate time series prediction tasks

2019 
Most multivariate time series exhibit complex long-term and short-term dependencies that change over time. Several recurrent neural network (RNN) variants for sequence tasks improve the learning of long-term dependencies in time series data, but there is a lack of RNN architectures designed to capture short-term mutation information in multivariate time series. In the present work, we propose a transformation-gated LSTM (TG-LSTM) to enhance the ability to capture short-term mutation information. First, the transformation gate applies a hyperbolic tangent function to the memory cell state of the previous time step and the input gate information of the current time step, without losing the memory cell state information. Second, the value range of the partial derivative associated with the transformation gate during backpropagation fully reflects the gradient change, yielding a better error gradient flow. We further extend the model to a multi-layer TG-LSTM network and compare its stability and robustness with all baseline models. The multi-layer TG-LSTM network is superior to all baseline models in terms of prediction accuracy and performance stability on two different multivariate time series tasks.
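The abstract does not give the exact TG-LSTM equations, so the following PyTorch sketch shows one plausible way such a transformation gate could be wired into a standard LSTM cell. The class name `TGLSTMCell` and the specific gating formula (a tanh over the previous cell state plus the current input-gate contribution, modulating the cell update) are assumptions for illustration, not the authors' published equations.

```python
import torch
import torch.nn as nn


class TGLSTMCell(nn.Module):
    """Hypothetical sketch of a transformation-gated LSTM cell.

    Assumption: the transformation gate applies tanh to the sum of the
    previous cell state and the current input-gate information, so the
    previous cell state is kept in the computation rather than discarded.
    """

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.hidden_size = hidden_size
        # One linear map produces all four gate pre-activations at once.
        self.gates = nn.Linear(input_size + hidden_size, 4 * hidden_size)

    def forward(self, x, state):
        h_prev, c_prev = state
        z = self.gates(torch.cat([x, h_prev], dim=-1))
        i, f, g, o = z.chunk(4, dim=-1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        g = torch.tanh(g)
        # Assumed transformation gate: tanh over the previous cell state
        # plus the current input-gate information.
        t = torch.tanh(c_prev + i * g)
        # Assumption: the transformation gate modulates the cell update,
        # letting short-term mutations reshape what enters the memory cell.
        c = f * c_prev + i * g * t
        h = o * torch.tanh(c)
        return h, (h, c)


# Usage: one step over a batch of 8 examples with 5 input features.
cell = TGLSTMCell(input_size=5, hidden_size=16)
x = torch.randn(8, 5)
h0 = c0 = torch.zeros(8, 16)
h, (h1, c1) = cell(x, (h0, c0))
```

Because `c_prev` appears inside the tanh, the partial derivative flowing back through the transformation gate depends on the previous cell state, which is consistent with the abstract's claim about the gate shaping the error gradient flow.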