Extended variational inference for gamma mixture model in positive vectors modeling

2021 
Abstract Bayesian estimation of the finite Gamma mixture model (GaMM) has attracted considerable attention recently due to its capability of modeling positive data. Under conventional variational inference (VI) frameworks, an analytically tractable solution for the variational posterior cannot be derived, because the expectation of the joint distribution of all the random variables cannot be computed in closed form. Numerical techniques are therefore commonly used to simulate the posterior distribution, but their optimization can be prohibitively slow for practical applications. To obtain closed-form solutions, lower-bound approximations are introduced into the evidence lower bound (ELBO), following the recently proposed extended variational inference (EVI) framework; this removes the need for numerical simulation. In this paper, we address the Bayesian estimation of the finite GaMM under the EVI framework in a flexible way. Moreover, the optimal number of mixture components is determined automatically from the observed data, overcoming the over-fitting problem associated with the conventional expectation–maximization (EM) algorithm. We demonstrate the excellent performance of the proposed method in evaluations on both synthesized and real data. In the real data evaluation, we compare the proposed method with reference methods on object detection and image categorization tasks and observe statistically significant improvements in accuracy and runtime.
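As a brief sketch of the setting the abstract describes (the symbols $\pi_m$, $\alpha_m$, $\beta_m$, and $\tilde{\mathcal{R}}$ below are illustrative notation, not taken from the paper itself), the finite Gamma mixture and the EVI surrogate objective can be written as:

```latex
% Finite Gamma mixture for a positive observation x > 0
% (illustrative notation: pi_m are mixing weights; alpha_m, beta_m are
% the shape and rate parameters of component m; for a D-dimensional
% positive vector, the component density is typically a product of
% such terms over the dimensions):
p(x \mid \boldsymbol{\pi}, \boldsymbol{\alpha}, \boldsymbol{\beta})
  = \sum_{m=1}^{M} \pi_m \,
    \frac{\beta_m^{\alpha_m}}{\Gamma(\alpha_m)} \,
    x^{\alpha_m - 1} e^{-\beta_m x},
  \qquad \pi_m \ge 0, \;\; \sum_{m=1}^{M} \pi_m = 1.

% Conventional VI maximizes the evidence lower bound (ELBO) over a
% variational posterior q(Theta), with Theta collecting all latent
% variables and parameters:
\mathcal{L}(q)
  = \mathbb{E}_q\!\bigl[\ln p(\mathbf{X}, \Theta)\bigr]
  - \mathbb{E}_q\!\bigl[\ln q(\Theta)\bigr]
  \;\le\; \ln p(\mathbf{X}).

% For the GaMM, E_q[ln p(X, Theta)] has no closed form (terms such as
% E_q[ln Gamma(alpha_m)] are intractable, since the Gamma distribution
% is not conjugate in its shape parameter). EVI replaces it with an
% analytically tractable lower bound R~(q) <= E_q[ln p(X, Theta)]:
\tilde{\mathcal{L}}(q)
  = \tilde{\mathcal{R}}(q)
  - \mathbb{E}_q\!\bigl[\ln q(\Theta)\bigr]
  \;\le\; \mathcal{L}(q).
```

Because $\tilde{\mathcal{L}}(q) \le \mathcal{L}(q) \le \ln p(\mathbf{X})$, maximizing the tractable surrogate still tightens a valid lower bound on the evidence, which is what makes closed-form coordinate updates possible under this kind of scheme.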