Linear Multiple Low-Rank Kernel Based Stationary Gaussian Processes Regression for Time Series

2020 
Gaussian processes (GPs) for machine learning have been studied systematically over the past two decades. However, kernel design for GPs and the associated hyper-parameter optimization remain difficult and, to a large extent, open problems. We consider GP regression for time series modeling and analysis. The underlying stationary kernel is closely approximated by a new grid spectral mixture (GSM) kernel, which is a linear combination of low-rank sub-kernels. When a large number of sub-kernels are involved, either the Nyström or the random Fourier feature approximation can be adopted to reduce the required computer storage. The unknown GP hyper-parameters consist of the nonnegative weights of all sub-kernels as well as the noise variance, and they are determined through maximum-likelihood estimation. Two optimization methods for estimating the hyper-parameters are introduced: a sequential majorization-minimization (MM) method and a nonlinearly constrained alternating direction method of multipliers (ADMM). Experimental results on various time series datasets corroborate that the proposed GSM kernel-based GP regression model outperforms several benchmarks in terms of prediction accuracy and numerical stability.
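The construction described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's exact formulation: it assumes each sub-kernel is a stationary spectral-mixture component on a fixed grid of frequencies and length-scales (the names `gsm_kernel`, `gram_matrix`, `freqs`, and `scales` are hypothetical), with only the nonnegative weights and the noise variance left as free hyper-parameters.

```python
import numpy as np

def gsm_kernel(tau, weights, freqs, scales):
    """Grid spectral mixture kernel: a nonnegative-weighted linear
    combination of fixed spectral-mixture sub-kernels (illustrative
    parameterization; the paper's grid construction may differ)."""
    k = np.zeros_like(tau, dtype=float)
    for w, mu, s in zip(weights, freqs, scales):
        # Stationary SM component at grid point (mu, s):
        # exp(-2 * pi^2 * tau^2 * s^2) * cos(2 * pi * mu * tau)
        k += w * np.exp(-2.0 * np.pi**2 * tau**2 * s**2) * np.cos(2.0 * np.pi * mu * tau)
    return k

def gram_matrix(x, weights, freqs, scales, noise_var):
    """Noisy Gram matrix K + sigma^2 * I for 1-D time stamps x."""
    tau = x[:, None] - x[None, :]  # pairwise time lags
    K = gsm_kernel(tau, weights, freqs, scales)
    return K + noise_var * np.eye(len(x))

# Usage: a small fixed grid; the weights (and noise variance) are the
# quantities the MM or ADMM solver would estimate by maximum likelihood.
x = np.linspace(0.0, 1.0, 20)
freqs = np.array([0.5, 1.0, 2.0])
scales = np.array([0.3, 0.3, 0.3])
weights = np.array([0.5, 0.3, 0.2])  # nonnegative mixture weights
K = gram_matrix(x, weights, freqs, scales, noise_var=0.1)
```

Because each sub-kernel is a valid stationary covariance and the weights are nonnegative, the resulting Gram matrix is symmetric positive semidefinite, and the added noise variance makes it strictly positive definite.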