Kullback–Leibler Divergence Between Multivariate Generalized Gaussian Distributions
2019
The Kullback–Leibler divergence (KLD) between two multivariate generalized Gaussian distributions (MGGDs) is a fundamental tool in many signal and image processing applications. Until now, no explicit form of the KLD between MGGDs has been known, and in practice it has been either estimated by expensive Monte-Carlo stochastic integration or approximated. The main contribution of this letter is a closed-form expression for the KLD between two zero-mean MGGDs. Based on the Lauricella series, a simple way of computing the KLD numerically is presented. Finally, we show that the Monte-Carlo estimate of the KLD converges to its theoretical value as the number of samples goes to infinity.
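The paper's closed form relies on the Lauricella series and is not reproduced here. As an illustration of the Monte-Carlo stochastic integration mentioned in the abstract, the sketch below estimates the KLD for the Gaussian special case of the MGGD (shape parameter β = 1), where an exact closed form is classical; the covariance matrices and sample size are arbitrary choices for the example, not values from the letter.

```python
import numpy as np
from numpy.linalg import inv, slogdet

rng = np.random.default_rng(0)
d = 2
# Example scatter matrices (illustrative, not from the paper)
S1 = np.array([[2.0, 0.5], [0.5, 1.0]])
S2 = np.array([[1.0, -0.3], [-0.3, 1.5]])

# Closed-form KLD for the Gaussian special case (MGGD with beta = 1):
# KL(N(0,S1) || N(0,S2)) = 0.5 * (tr(S2^{-1} S1) - d + ln det S2 - ln det S1)
kl_exact = 0.5 * (np.trace(inv(S2) @ S1) - d
                  + slogdet(S2)[1] - slogdet(S1)[1])

def log_density(x, S):
    """Log-density of the zero-mean multivariate normal N(0, S) at rows of x."""
    quad = np.einsum('ni,ij,nj->n', x, inv(S), x)
    return -0.5 * (d * np.log(2 * np.pi) + slogdet(S)[1] + quad)

# Monte-Carlo estimate: average log p(x) - log q(x) over samples x ~ p
x = rng.multivariate_normal(np.zeros(d), S1, size=200_000)
kl_mc = np.mean(log_density(x, S1) - log_density(x, S2))
```

By the law of large numbers, `kl_mc` converges to `kl_exact` as the sample size grows, which is the convergence behavior the letter verifies against its theoretical expression.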