Bayesian Neural Networks with Soft Evidence

2020 
Bayes's rule deals with hard evidence: it lets us calculate the probability of event $A$ occurring given that event $B$ has occurred. Soft evidence, on the other hand, involves a degree of uncertainty about whether event $B$ has actually occurred. Jeffrey's rule of conditioning provides a way to update beliefs in the presence of soft evidence. We provide a framework for learning a probability distribution over the weights of a neural network trained using soft evidence, by way of two simple algorithms for approximating Jeffrey conditionalization. We propose an experimental protocol for benchmarking these algorithms on empirical datasets, even when the data is deliberately corrupted.
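
For concreteness, Jeffrey's rule updates the probability of $A$ under soft evidence on a partition $\{B_i\}$ as $P'(A) = \sum_i P(A \mid B_i)\, P'(B_i)$, where $P'(B_i)$ encodes how strongly we now believe each $B_i$ occurred; hard evidence is the special case where some $P'(B_j) = 1$. Below is a minimal sketch of this update on a discrete joint distribution; the joint table and soft-evidence vector are illustrative placeholders, not the paper's algorithms or data.

```python
import numpy as np

# Hypothetical joint distribution P(A, B): rows index outcomes of A,
# columns index outcomes of B. Values are placeholders for illustration.
joint = np.array([[0.20, 0.10],
                  [0.30, 0.40]])

def jeffrey_update(joint, soft_evidence):
    """Update the marginal P(A) by Jeffrey's rule of conditioning.

    joint:         P(A, B), shape (|A|, |B|)
    soft_evidence: q(B), shape (|B|,), the revised belief over B
    returns:       P'(A) = sum_i P(A | B_i) * q(B_i)
    """
    p_b = joint.sum(axis=0)          # prior marginal P(B)
    cond = joint / p_b               # column-wise conditionals P(A | B_i)
    return cond @ np.asarray(soft_evidence)

# Hard evidence (certain that B = b_0) recovers ordinary Bayes conditioning:
print(jeffrey_update(joint, [1.0, 0.0]))   # -> P(A | B = b_0) = [0.4, 0.6]
# Soft evidence: only 80% sure that B = b_0 occurred.
print(jeffrey_update(joint, [0.8, 0.2]))   # -> [0.36, 0.64]
```

The same mixture-of-conditionals view suggests how soft evidence can enter training: each uncertain observation contributes its conditional update weighted by the belief assigned to it.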