Variational Resampling Based Assessment of Deep Neural Networks under Distribution Shift

2019 
A novel variational inference based resampling framework is proposed to evaluate the robustness and generalization capability of deep learning models under distribution shift. We use Auto-Encoding Variational Bayes to learn a latent representation of the data, on which a Variational Gaussian Mixture Model is applied to deliberately create distribution shift by dividing the dataset into different clusters. The Wasserstein distance is used to characterize the extent of distribution shift between the generated data splits. In experiments on the Fashion-MNIST data, we assess several popular image classification Convolutional Neural Network (CNN) architectures and Bayesian CNN models with respect to their robustness and generalization behavior under the deliberately created distribution shift, analyzed in contrast to random cross-validation. Our method of creating artificial domain splits of a single dataset may also be used to establish novel model selection criteria and assessment tools in machine learning, as well as to benchmark methods in the areas of domain adaptation and domain generalization.
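To make the distance used in the abstract concrete, here is a minimal sketch of the empirical one-dimensional Wasserstein-1 distance between two equal-size samples, computed via sorted order statistics. The function name `wasserstein_1d` and the toy samples are illustrative assumptions; the paper applies the distance to latent representations of the cluster-induced data splits, not to raw 1-D toy data.

```python
# Hedged sketch: empirical 1-D Wasserstein-1 distance between two
# equal-size samples. For equal sample sizes, W1 reduces to the mean
# absolute difference between the sorted values of the two samples.

def wasserstein_1d(xs, ys):
    """W1 between the empirical distributions of two equal-size samples."""
    assert len(xs) == len(ys), "equal sample sizes assumed in this sketch"
    xs_sorted, ys_sorted = sorted(xs), sorted(ys)
    return sum(abs(a - b) for a, b in zip(xs_sorted, ys_sorted)) / len(xs)

# Two toy "split" samples with identical shape, shifted by 2.0:
split_a = [0.0, 1.0, 2.0, 3.0]
split_b = [2.0, 3.0, 4.0, 5.0]
print(wasserstein_1d(split_a, split_b))  # → 2.0
```

A pure location shift of the whole sample yields exactly that shift as the W1 value, which is why the distance is a natural summary of how far apart two generated data splits are.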