Echo State Network with a Global Reversible Autoencoder for Time Series Classification

2021 
Abstract An echo state network (ESN) can provide an efficient dynamic solution for time series prediction problems. However, ESN models are typically applied to prediction rather than classification, and their application to time series classification (TSC) problems has yet to be fully studied. Moreover, a conventional ESN with randomly generated input and reservoir weights is unlikely to be optimal: a purely random layer may destroy useful features. To overcome this disadvantage, this study provides a new input-weight establishment framework for ESNs based on autoencoder (AE) theory for TSC tasks. A global reversible AE (GRAE) algorithm is proposed to re-establish the randomly initialized input weights of the ESN. In existing ESN-AEs, the output weights obtained in the encoding process are directly reused as the initial input weights. By contrast, in GRAE, the reservoir layer with a reversible activation function is computed by pulling the decoding-layer output back and injecting it into the reservoir layer; feature learning is thus enriched by additional information, which improves performance. The current weights of the encoding layer are iteratively replaced by those of the decoding layer to ensure that the outputs of the GRAE remain strongly correlated with the input data. Visualization analyses and experiments on a large set of UCR time series datasets indicate that the proposed GRAE method considerably improves the original two-layer ESN-based classifiers, and that the proposed GRAE-ESN classifier outperforms traditional state-of-the-art TSC classifiers. Furthermore, the proposed method provides comparable performance and considerably faster training than three deep learning classifiers.
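To make the baseline idea concrete, the sketch below illustrates the conventional ESN-AE re-initialization scheme that the abstract contrasts GRAE against: drive a reservoir with the input series, fit decoder weights by ridge regression to reconstruct the input from the reservoir states, then reuse those decoder weights as the new input weights. This is a minimal illustration under assumed conventions; the function names (`esn_states`, `esn_ae_reinit`), shapes, and the ridge-regression decoder are assumptions for exposition, not the paper's GRAE algorithm (which additionally uses a reversible activation and a global feedback of the decoded output into the reservoir).

```python
import numpy as np

def esn_states(W_in, W_res, U):
    """Run an ESN over input series U (T, d_in); return reservoir states (T, n_res)."""
    T, n = U.shape[0], W_res.shape[0]
    X = np.zeros((T, n))
    x = np.zeros(n)
    for t in range(T):
        # standard leaky-free ESN update with tanh activation
        x = np.tanh(W_in @ U[t] + W_res @ x)
        X[t] = x
    return X

def esn_ae_reinit(W_in, W_res, U, ridge=1e-6, iters=3):
    """Hypothetical ESN-AE style re-initialization: repeatedly fit a linear
    decoder U ~ X @ W_dec by ridge regression and reuse W_dec as the new
    input weights W_in (both have shape (n_res, d_in))."""
    for _ in range(iters):
        X = esn_states(W_in, W_res, U)
        n = X.shape[1]
        # ridge solution for the decoding (output) weights
        W_dec = np.linalg.solve(X.T @ X + ridge * np.eye(n), X.T @ U)
        W_in = W_dec  # reuse decoding weights as input weights
    return W_in

rng = np.random.default_rng(0)
U = rng.standard_normal((50, 3))          # toy multivariate time series
W_res = 0.1 * rng.standard_normal((20, 20))  # small spectral radius for stability
W_in0 = rng.standard_normal((20, 3))       # random initial input weights
W_in_new = esn_ae_reinit(W_in0, W_res, U)
```

The key design point the abstract makes is that this baseline only ever sees the encoder's reconstruction objective, whereas GRAE feeds the decoded output back through the reservoir, so the learned input weights are shaped by additional global information.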