Interval prediction for time series based on LSTM and mixed Gaussian distribution

2020 
A prediction interval (PI), as a form of probabilistic forecasting, outputs a prediction range at a given confidence level and therefore gives users more information than a point prediction. The noise in PI methods is usually assumed to follow a single distribution such as a Gaussian or Laplace distribution. However, these assumptions are not suitable for all applications. To address this problem, a mixed approach is proposed for interval forecasting of time series, combining a Long Short-Term Memory network with bootstrapping (LSTM-bootstrapping) and a mixed Gaussian distribution (MGD) estimated with the Expectation-Maximization (EM) algorithm. LSTM is chosen for its effectiveness in time series prediction. First, LSTM-bootstrapping is employed to compute the point prediction and the model uncertainty. The noise is then assumed to follow a mixed Gaussian distribution, and the EM algorithm is applied to estimate the noise uncertainty. The PI is obtained from the combined variances of the model and noise uncertainties. The proposed approach is evaluated on wind speed, heteroscedastic wind power and reg capacity price datasets. The results show that the method can handle noise of arbitrary distribution and achieves better performance.