Preventing Posterior Collapse Induced by Oversmoothing in Gaussian VAE

2021 
Variational autoencoders (VAEs) often suffer from posterior collapse, a phenomenon in which the learned latent space becomes uninformative. Posterior collapse is closely tied to a hyperparameter that plays the role of the data variance in the Gaussian decoder. We show that an inappropriate choice of this parameter causes oversmoothing and leads to posterior collapse in the linearly approximated case, and we verify this empirically in the general case. We therefore propose AR-ELBO (Adaptively Regularized Evidence Lower BOund), which controls the smoothness of the model by adapting this variance parameter. In addition, we extend the VAE with alternative parameterizations of the variance parameter to handle non-uniform or conditional data variance. VAEs trained with the proposed AR-ELBO achieve improved Fréchet inception distance (FID) on images generated from the MNIST and CelebA datasets.
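
The sketch below is a minimal illustration (not the authors' implementation) of the mechanism the abstract describes: a Gaussian-decoder negative ELBO in which the variance sigma^2 is a learnable scalar optimized jointly with the network, rather than a fixed hyperparameter. It assumes flattened (batch, D) inputs, and all names (gaussian_neg_elbo, log_sigma2, and the tensor arguments) are illustrative assumptions, not taken from the paper.

    import math
    import torch

    def gaussian_neg_elbo(x, x_mean, z_mu, z_logvar, log_sigma2):
        # Negative ELBO for a Gaussian decoder p(x|z) = N(x; x_mean, sigma^2 I)
        # with a diagonal-Gaussian encoder q(z|x) = N(z; z_mu, diag(exp(z_logvar))).
        D = x.shape[1]
        # Reconstruction term: ||x - x_mean||^2 / (2 sigma^2) + (D/2) log(2 pi sigma^2).
        # A fixed, too-large sigma^2 downweights reconstruction, oversmooths the
        # decoder, and drives the KL term toward zero (posterior collapse).
        sq_err = ((x - x_mean) ** 2).sum(dim=1)
        recon = 0.5 * torch.exp(-log_sigma2) * sq_err \
              + 0.5 * D * (log_sigma2 + math.log(2 * math.pi))
        # KL(q(z|x) || N(0, I)) in closed form for diagonal Gaussians.
        kl = 0.5 * (z_mu ** 2 + z_logvar.exp() - 1.0 - z_logvar).sum(dim=1)
        return (recon + kl).mean()

    # Treating log_sigma2 as a parameter lets the optimizer adapt the variance
    # jointly with the encoder/decoder weights instead of fixing it by hand.
    log_sigma2 = torch.nn.Parameter(torch.zeros(()))

Registering log_sigma2 with the optimizer alongside the encoder and decoder parameters is one simple way to realize variance adaptation; the paper's AR-ELBO and its non-uniform or conditional variance parameterizations are more refined variants of this idea.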