A Bridge between Cross-validation Bayes Factors and Geometric Intrinsic Bayes Factors.

2020 
Model selection in Bayesian statistics is primarily carried out with statistics known as Bayes factors, which are directly related to the posterior probabilities of the models. Bayes factors require a careful assessment of prior distributions, as in the Intrinsic Priors of Berger and Pericchi (1996a), and integration over the parameter space, which may be high-dimensional. Recently, researchers have proposed alternatives to Bayes factors that require neither integration nor specification of priors. These developments are still at a very early stage and are known as Prior-free Bayes factors, Cross-Validation Bayes Factors (CVBFs), and Bayesian "Stacking." Both this kind of method and the Intrinsic Bayes Factor (IBF) avoid the specification of a prior. However, Prior-free Bayes factors may require a careful choice of training sample size. In this article, a way of choosing training sample sizes for Prior-free Bayes factors based on Geometric Intrinsic Bayes Factors (GIBFs) is proposed and studied. We present essential examples with different numbers of parameters and study their statistical behavior both numerically and theoretically to explain the ideas behind choosing a feasible training sample size for Prior-free Bayes factors. We put forward the "Bridge Rule" as an assignment of a training sample size for CVBFs that makes them close to Geometric IBFs. We conclude that even though tractable Geometric IBFs are preferable, CVBFs, using the Bridge Rule, are useful and economical approximations to Bayes factors.
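For reference, and using standard notation that the abstract itself does not spell out: for two models $M_1$ and $M_2$ with likelihoods $f_i(x \mid \theta_i)$ and priors $\pi_i(\theta_i)$, the Bayes factor is the ratio of marginal likelihoods,

$$ B_{12}(x) = \frac{m_1(x)}{m_2(x)}, \qquad m_i(x) = \int f_i(x \mid \theta_i)\,\pi_i(\theta_i)\,d\theta_i, $$

and it converts prior model odds into posterior model odds,

$$ \frac{P(M_1 \mid x)}{P(M_2 \mid x)} = B_{12}(x)\,\frac{P(M_1)}{P(M_2)}. $$

As a sketch of one common cross-validation construction (not necessarily the exact definition used in this paper), a CVBF avoids the prior-weighted integrals by fitting each model on a training subsample $x_T$ of size $m$ and evaluating the resulting plug-in densities on the held-out validation part $x_V$,

$$ \mathrm{CVBF}_{12}(m) = \frac{\prod_{i \in V} \hat f_1\big(x_i \mid \hat\theta_1(x_T)\big)}{\prod_{i \in V} \hat f_2\big(x_i \mid \hat\theta_2(x_T)\big)}, $$

which is why the choice of the training sample size $m$, the quantity addressed by the Bridge Rule, matters.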