Relative entropy minimization over Hilbert spaces via Robbins-Monro

2019 
One way of getting insight into non-Gaussian measures is to first obtain good Gaussian approximations. These best-fit Gaussians can provide a sense of the mean and variance of the distribution of interest, and they can also be used to accelerate sampling algorithms. This raises the questions of how one should measure optimality and how to obtain the optimal approximation. Here, we consider the problem of minimizing the distance between a family of Gaussians and the target measure with respect to relative entropy, or Kullback-Leibler divergence. Because we are interested in applications in the infinite-dimensional setting, it is desirable to have convergent algorithms that are well posed on abstract Hilbert spaces. We examine this minimization problem by seeking roots of the first variation of relative entropy, taken with respect to the mean of the Gaussian, with the covariance held fixed. We prove the convergence of Robbins-Monro type root-finding algorithms in this context, highlighting the assumptions necessary for convergence to relative entropy minimizers. Numerical examples are included to illustrate the algorithms.
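To make the kind of iteration described above concrete, the following is a minimal finite-dimensional sketch, not the paper's implementation. It assumes a target measure of the form dμ ∝ exp(−Φ) dμ0 with Gaussian reference μ0 = N(0, C0); under that assumption the first variation of KL(N(m, C) ‖ μ) with respect to the mean m vanishes when C0⁻¹ m + E_{ξ∼N(m,C)}[∇Φ(ξ)] = 0, and Robbins-Monro replaces the expectation by a single sample per step. The potential Φ, the step-size schedule, and all names below are illustrative assumptions, not the paper's example.

```python
import numpy as np

def grad_phi(x):
    # Gradient of an assumed potential Phi(x) = |x|^4 / 4 (componentwise);
    # purely illustrative, not the potential used in the paper.
    return x**3

def robbins_monro_mean(m0, C0_inv, C_chol, n_iters=50_000, a0=0.5, seed=None):
    """Robbins-Monro iteration m_{n+1} = m_n - a_n * (noisy evaluation of F(m_n)),
    where F(m) = C0^{-1} m + E_{xi ~ N(m, C)}[grad Phi(xi)].

    Step sizes a_n = a0 / (n + 1) satisfy sum a_n = inf and sum a_n^2 < inf,
    the classical Robbins-Monro conditions.
    """
    rng = np.random.default_rng(seed)
    m = m0.copy()
    for n in range(n_iters):
        a_n = a0 / (n + 1)
        xi = m + C_chol @ rng.standard_normal(m.shape)   # one draw from N(m, C)
        F_noisy = C0_inv @ m + grad_phi(xi)               # unbiased estimate of F(m)
        m = m - a_n * F_noisy
    return m

if __name__ == "__main__":
    d = 4
    C0_inv = np.eye(d)        # reference precision (assumed identity)
    C_chol = 0.5 * np.eye(d)  # Cholesky factor of the fixed covariance C
    m_star = robbins_monro_mean(np.ones(d), C0_inv, C_chol, seed=0)
    print("approximate KL-optimal mean:", m_star)
```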