Recurrent Attention Unit: A Simple and Effective Method for Traffic Prediction

2021 
Recurrent neural networks are widely used in sequential data modeling. For example, the long short-term memory network (LSTM) and the gated recurrent unit (GRU) are two methods commonly applied to traffic prediction. The gate structure is the key component of LSTM and GRU, but it increases the number of training parameters compared with the traditional RNN. In this paper, inspired by the ability of the attention mechanism to regulate information flow, we propose a simple yet effective method for traffic prediction that embeds the attention mechanism within the recurrent module so that it focuses on the important internal features. The proposed structure is named the recurrent attention unit (RAU). We evaluate the proposed method on five real-world datasets. Extensive experiments show that the proposed method achieves prediction performance comparable to LSTM and GRU while reducing the number of parameters by more than 30% and 50%, respectively.
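The abstract does not give the cell equations, so the following is only a minimal sketch of the general idea: a single attention-style gate inside the recurrent cell regulates the update of the hidden state in place of the multiple gates of LSTM/GRU. The class name RAUCell and the specific gating form are assumptions for illustration, not the authors' exact formulation.

```python
import torch
import torch.nn as nn


class RAUCell(nn.Module):
    """Illustrative recurrent cell with an internal attention gate.

    Hypothetical sketch: an attention weight over the hidden dimensions
    decides how much of the candidate state replaces the previous hidden
    state, using one gate instead of the three (LSTM) or two (GRU) gates,
    which is where the parameter savings come from.
    """

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        # Candidate state, as in a vanilla RNN.
        self.candidate = nn.Linear(input_size + hidden_size, hidden_size)
        # Single attention gate over the hidden dimensions.
        self.attention = nn.Linear(input_size + hidden_size, hidden_size)

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        combined = torch.cat([x, h], dim=-1)
        h_tilde = torch.tanh(self.candidate(combined))        # candidate state
        a = torch.softmax(self.attention(combined), dim=-1)   # attention weights
        # The attention weights regulate the information flow into the
        # new hidden state.
        return (1.0 - a) * h + a * h_tilde


# Usage on a toy traffic sequence: (batch, time, features)
cell = RAUCell(input_size=8, hidden_size=32)
x = torch.randn(16, 12, 8)
h = torch.zeros(16, 32)
for t in range(x.size(1)):
    h = cell(x[:, t], h)
```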