Accelerated Stochastic Variational Inference

2019 
Variational inference (VI) is a machine learning method that approximates probability densities through optimization. In this paper, we propose a stochastic optimization algorithm, called randomized stochastic accelerated natural gradient (RSANG), which at each iteration uses unbiased estimates of the natural gradient, thereby exploiting the Riemannian geometry of the approximation space, to solve variational inference problems. We prove theoretically that the proposed algorithm converges faster than stochastic gradient descent (SGD). Building on RSANG, we develop accelerated stochastic variational inference, a scalable algorithm for approximating posterior distributions in a general class of conjugate-exponential models, and we prove that it converges faster than stochastic variational inference. We also demonstrate on simulated examples that the proposed method improves substantially over its non-accelerated counterparts.
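The update below is a minimal Python sketch of the underlying idea: for conjugate-exponential models, the natural gradient of the ELBO with respect to the natural parameters λ reduces to λ̂ − λ, where λ̂ are the intermediate natural parameters computed from a minibatch (as in standard stochastic variational inference), and a Nesterov-style momentum term is layered on top of that step to illustrate acceleration. The function names, momentum constant, and step-size schedule are illustrative assumptions, not the paper's exact RSANG update.

```python
import numpy as np

def accelerated_natural_gradient_step(lam, velocity, lam_hat_fn, minibatch,
                                      t, momentum=0.9, kappa=0.51):
    """One accelerated natural-gradient step on natural parameters `lam`.

    For conjugate-exponential models the natural gradient of the ELBO equals
    lam_hat - lam, where lam_hat are the intermediate natural parameters
    computed from a minibatch by the model-specific `lam_hat_fn`. The
    Nesterov-style momentum used here is a generic sketch, not necessarily
    the paper's RSANG scheme.
    """
    rho = (t + 1.0) ** (-kappa)             # Robbins-Monro step size
    look_ahead = lam + momentum * velocity  # Nesterov look-ahead point
    nat_grad = lam_hat_fn(look_ahead, minibatch) - look_ahead
    velocity = momentum * velocity + rho * nat_grad
    return lam + velocity, velocity


# Toy usage: a real `lam_hat_fn` would map the look-ahead parameters and a
# minibatch to intermediate natural parameters; here it is stubbed with a
# hypothetical optimum lam* = [1.0, -0.5] so the iterates visibly converge.
lam, velocity = np.zeros(2), np.zeros(2)
lam_star = np.array([1.0, -0.5])
for t in range(200):
    lam, velocity = accelerated_natural_gradient_step(
        lam, velocity, lambda look, mb: lam_star, minibatch=None, t=t)
print(lam)  # ≈ [1.0, -0.5]
```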