Relative Entropy Minimization over Hilbert Spaces via Robbins-Monro

2015 
One way of gaining insight into non-Gaussian measures posed on infinite-dimensional Hilbert spaces is to first obtain good approximations in terms of Gaussians. These best-fit Gaussians then provide notions of mean and variance, and they can be used to accelerate sampling algorithms. This raises the question of how optimality should be measured. Here, as has been done previously in the literature, we consider the problem of minimizing the distance between a family of Gaussians and the target measure with respect to relative entropy, or Kullback-Leibler divergence. It is therefore desirable to have algorithms, well posed in the abstract Hilbert space setting, that converge to these minimizers. We examine this minimization problem by seeking roots of the first variation of relative entropy, taken with respect to the mean of the Gaussian, with the covariance held fixed. We prove the convergence of Robbins-Monro type root-finding algorithms, highlighting the assumptions necessary for them to converge to relative entropy minimizers.
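As a concrete illustration of the kind of iteration the abstract describes, below is a minimal finite-dimensional Python sketch, not the paper's exact scheme. It assumes the target measure has density proportional to exp(-Phi) relative to a Gaussian reference N(0, C0), and that the approximating family is N(m, C0) with the covariance held fixed; in that setting the first variation in m vanishes where C0^{-1} m + E[grad Phi(m + xi)] = 0 with xi ~ N(0, C0), and Robbins-Monro iterates on noisy evaluations of that expression. The dimension, the bounded toy potential, the preconditioning by C0, and the step schedule are all assumptions of the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10                                        # discretization dimension (assumption)
C0 = np.diag(1.0 / (1.0 + np.arange(d))**2)   # toy trace-class covariance (assumption)
C0_inv = np.linalg.inv(C0)
L = np.linalg.cholesky(C0)                    # so L @ z has covariance C0 for z ~ N(0, I)

def grad_phi(x):
    # Bounded toy gradient of Phi (assumption); boundedness keeps the
    # iteration stable, in the spirit of the growth conditions under
    # which convergence can be proved.
    return np.tanh(x - 1.0)

def robbins_monro(m0, n_iter=20000, a=1.0):
    m = m0.copy()
    for n in range(n_iter):
        a_n = a / (n + 1)                  # sum a_n = inf, sum a_n^2 < inf
        xi = L @ rng.standard_normal(d)    # xi ~ N(0, C0)
        g = C0_inv @ m + grad_phi(m + xi)  # unbiased sample of the first variation
        m = m - a_n * (C0 @ g)             # C0-preconditioned Robbins-Monro step
    return m

m_star = robbins_monro(np.zeros(d))
print(m_star)
```

With the decreasing steps a_n = a/(n+1) the iterates average out the sampling noise in xi, and the bounded gradient guarantees the drift pushes m back toward the root; replacing tanh with a rapidly growing potential can destabilize the early, large steps, which is one reason growth assumptions on the potential matter for convergence.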