Revisiting High Dimensional Bayesian Model Selection for Gaussian Regression

2019 
Model selection for regression problems with an increasing number of covariates remains an important problem, both theoretically and in applications. Model selection consistency and reconstruction of the mean structure depend on the interplay between the Bayes factor learning rate and the penalization on model complexity. In this work, we present results for the Zellner-Siow prior on regression coefficients paired with a Poisson prior on model complexity. We show that model selection consistency restricts the dimension of the true model from growing too quickly, and that the additional contribution to the mean structure from new covariates must be large enough to overcome the complexity penalty. The average Bayes factors for different sets of models involve random variables over the choices of columns from the design matrix. We show that a large class of these random variables has no moments asymptotically and must be analyzed using stable laws. We derive the domain of attraction for these random variables and obtain conditions on the design matrix that control false discoveries.
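To make the setup concrete, the following sketch illustrates the kind of procedure the abstract describes: scoring candidate Gaussian regression models by a Zellner-Siow Bayes factor (a Zellner g-prior on the coefficients mixed over g ~ InverseGamma(1/2, n/2)) combined with a Poisson prior on model size. This is an illustrative implementation under standard formulas (the g-prior Bayes factor as a function of R-squared), not the paper's own code; the function names, the rate `lam`, and the exhaustive search over small submodels are assumptions for the example.

```python
import itertools
from math import lgamma, log
import numpy as np
from scipy import integrate

def zs_log_bayes_factor(y, X, n):
    """Approximate log Bayes factor of the model spanned by the columns
    of X versus the intercept-only null model, under the Zellner-Siow
    prior: the g-prior Bayes factor integrated against the
    InverseGamma(1/2, n/2) mixing density on g."""
    p = X.shape[1]
    yc = y - y.mean()
    Xc = X - X.mean(axis=0)
    beta, *_ = np.linalg.lstsq(Xc, yc, rcond=None)
    resid = yc - Xc @ beta
    R2 = 1.0 - (resid @ resid) / (yc @ yc)  # R^2 of the centered fit

    def integrand(g):
        # g-prior Bayes factor times the InverseGamma(1/2, n/2) density
        log_bf = 0.5 * (n - p - 1) * np.log1p(g) \
                 - 0.5 * (n - 1) * np.log1p(g * (1.0 - R2))
        log_pi = 0.5 * np.log(n / 2.0) - 0.5 * np.log(np.pi) \
                 - 1.5 * np.log(g) - n / (2.0 * g)
        return np.exp(log_bf + log_pi)

    val, _ = integrate.quad(integrand, 0.0, np.inf, limit=200)
    return np.log(val)

def select_model(y, X, lam=1.0, max_size=3):
    """Score every submodel of size <= max_size by its log Bayes factor
    plus a Poisson(lam) log prior on model size (the complexity
    penalty), and return the highest-scoring set of column indices."""
    n, d = X.shape
    best, best_score = (), -np.inf
    for k in range(max_size + 1):
        log_prior = k * log(lam) - lam - lgamma(k + 1)  # Poisson penalty
        for cols in itertools.combinations(range(d), k):
            score = log_prior if k == 0 else \
                zs_log_bayes_factor(y, X[:, cols], n) + log_prior
            if score > best_score:
                best, best_score = cols, score
    return best
```

On simulated data with a strong signal in the first two columns, the Bayes factor rewards those columns while the Poisson penalty discourages spurious additions, so the search typically recovers the true support.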