Kullback-Leibler and Rényi divergence rate for Gaussian stationary ARMA processes comparison

2021 
Abstract In signal processing, ARMA processes are widely used to model short-memory processes. In various applications, comparing or classifying ARMA processes is required. In this paper, our purpose is to provide analytical expressions for the divergence rates of the Kullback-Leibler divergence and the Rényi divergence (RD) of order α, as well as their symmetric versions, for two Gaussian ARMA processes, by taking advantage of results such as the Yule-Walker equations and tools such as inverse filtering. The divergence rates can be interpreted as the sum of different quantities: the power of one ARMA process filtered by the inverse filter associated with the second ARMA process, a cepstral term, etc. Finally, illustrations show that the ranges of values taken by the divergence rates of the RD are sensitive to α, especially when the latter is close to 1.
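The paper derives closed-form expressions; as a rough illustration of the quantity being compared, the classical spectral formula for the Kullback-Leibler divergence rate between two zero-mean Gaussian stationary processes can be evaluated numerically. This is a minimal sketch, not the paper's derivation: the function names `ar1_spectrum` and `kl_rate` are ours, and AR(1) processes are used as the simplest special case of ARMA.

```python
import numpy as np

def ar1_spectrum(a, sigma2, omega):
    """Power spectral density of the zero-mean Gaussian AR(1) process
    x[n] = a * x[n-1] + e[n], where e[n] is white noise with variance sigma2.
    (AR(1) is the simplest ARMA case; illustrative choice, not from the paper.)"""
    return sigma2 / np.abs(1.0 - a * np.exp(-1j * omega)) ** 2

def kl_rate(a1, sigma2_1, a2, sigma2_2, n=8192):
    """Approximate the KL divergence rate between two zero-mean Gaussian
    stationary processes with spectra S1, S2 via a Riemann sum over
    [-pi, pi) of the classical spectral formula:
        (1 / (4*pi)) * integral of [ S1/S2 - ln(S1/S2) - 1 ] d omega.
    The ratio S1/S2 reflects the power of the first process after inverse
    filtering by the second model, in line with the interpretation above."""
    omega = np.linspace(-np.pi, np.pi, n, endpoint=False)
    r = ar1_spectrum(a1, sigma2_1, omega) / ar1_spectrum(a2, sigma2_2, omega)
    # Mean over a uniform grid times 2*pi approximates the integral,
    # so the (1 / (4*pi)) prefactor reduces to a factor of 1/2.
    return 0.5 * np.mean(r - np.log(r) - 1.0)
```

As expected, the rate vanishes when the two models coincide and is asymmetric in its arguments, which is why the paper also considers symmetric versions of the divergences.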