Coupled Variational Bayes via Optimization Embedding

2018 
Variational inference plays a vital role in learning graphical models, especially on large-scale datasets. Much of its success depends on a proper choice of auxiliary distribution class for posterior approximation. However, how to pursue an auxiliary distribution class that achieves both good approximation ability and computational efficiency remains a core challenge. In this paper, we construct such a distribution class, termed optimization embedding, since it takes root in an optimization procedure. This flexible function class couples the variational distribution with the original parameters in the graphical model, allowing end-to-end learning of the graphical model by back-propagation through the variational distribution. Theoretically, we establish an interesting connection to gradient flow and demonstrate the extreme flexibility of this implicit distribution family in the limit. Practically, the proposed technique, coupled variational Bayes, significantly accelerates the learning procedure by greatly reducing the search space. We further demonstrate the significant superiority of the proposed method over state-of-the-art methods on multiple graphical models with either continuous or discrete latent variables.
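
To make the "optimization embedding" idea concrete, here is a minimal PyTorch sketch of the core mechanism described in the abstract: the variational samples are defined implicitly as the output of a few differentiable gradient steps on an objective, so the model parameters receive gradients through the unrolled updates. This is a simplified illustration, not the paper's exact construction (which derives the update from a gradient flow of the variational objective); the model, names, and step sizes (`decoder`, `T`, `eta`) are illustrative assumptions.

```python
# Sketch: variational samples as the output of T differentiable
# gradient-ascent steps, coupling them to the model parameters.
import torch

def log_joint(decoder, x, z):
    # log p(x, z) for a toy model: standard normal prior on z,
    # unit-variance Gaussian likelihood around decoder(z).
    log_prior = -0.5 * (z ** 2).sum(dim=-1)
    log_lik = -0.5 * ((x - decoder(z)) ** 2).sum(dim=-1)
    return log_prior + log_lik

def optimization_embedding(decoder, x, z0, T=10, eta=0.1):
    # Unroll T gradient-ascent steps on log p(x, z) starting from z0.
    # create_graph=True keeps each step differentiable, so the resulting
    # samples depend on (and back-propagate into) the decoder parameters.
    z = z0
    for _ in range(T):
        grad = torch.autograd.grad(log_joint(decoder, x, z).sum(), z,
                                   create_graph=True)[0]
        z = z + eta * grad
    return z

# Usage: end-to-end learning by back-propagating through the embedded
# optimization procedure.
decoder = torch.nn.Linear(2, 5)
x = torch.randn(64, 5)
z0 = torch.randn(64, 2, requires_grad=True)
zT = optimization_embedding(decoder, x, z0)
loss = -log_joint(decoder, x, zT).mean()
loss.backward()  # gradients reach the decoder through the unrolled steps
```

Because the variational "distribution" here is just the pushforward of the initial samples through the unrolled optimizer, its flexibility grows with the number of steps, which is the limiting behavior the abstract's gradient-flow connection formalizes.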