A survey on long short-term memory networks for time series prediction

2021 
Abstract Recurrent neural networks, and especially long short-term memory (LSTM) networks, have been investigated intensively in recent years due to their ability to model and predict nonlinear, time-variant system dynamics. The present paper delivers a comprehensive overview of existing LSTM cell derivatives and network architectures for time series prediction. A categorization into LSTMs with optimized cell state representations and LSTMs with interacting cell states is proposed. The investigated approaches are evaluated against defined requirements relevant for accurate time series prediction. These include short-term and long-term memory behavior, the ability to perform multimodal and multi-step-ahead predictions, and the associated error propagation. Sequence-to-sequence networks with partial conditioning outperform the other approaches, such as bidirectional or associative networks, and are best suited to fulfill these requirements.
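
To illustrate the sequence-to-sequence idea highlighted in the abstract, below is a minimal sketch (not the authors' code) of an LSTM encoder-decoder for multi-step-ahead time series prediction, assuming PyTorch. The encoder summarizes the input window into its final hidden and cell states; the decoder is conditioned on those states and unrolled for a chosen horizon, feeding each prediction back in as the next input. Class and parameter names (Seq2SeqLSTM, horizon, n_features) are illustrative choices, not taken from the survey.

import torch
import torch.nn as nn


class Seq2SeqLSTM(nn.Module):
    """Encoder-decoder LSTM for multi-step-ahead forecasting (illustrative sketch)."""

    def __init__(self, n_features: int = 1, hidden_size: int = 64, horizon: int = 12):
        super().__init__()
        self.horizon = horizon
        self.encoder = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.decoder = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, n_features)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, input_window, n_features)
        _, (h, c) = self.encoder(x)       # encoder state conditions the decoder
        step = x[:, -1:, :]               # start decoding from the last observed value
        outputs = []
        for _ in range(self.horizon):
            out, (h, c) = self.decoder(step, (h, c))
            step = self.head(out)         # prediction is fed back as the next input
            outputs.append(step)
        return torch.cat(outputs, dim=1)  # (batch, horizon, n_features)


if __name__ == "__main__":
    model = Seq2SeqLSTM(n_features=1, hidden_size=64, horizon=12)
    window = torch.randn(8, 48, 1)        # 8 series, 48 past steps, 1 feature
    forecast = model(window)
    print(forecast.shape)                 # torch.Size([8, 12, 1])

Feeding predictions back into the decoder is one simple way to condition the output sequence on the encoder state; the survey compares such conditioned sequence-to-sequence variants against bidirectional and associative alternatives.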