Shrinkage with shrunken shoulders: inference via geometrically / uniformly ergodic Gibbs sampler.

2020 
Use of continuous shrinkage priors --- with a "spike" near zero and heavy tails towards infinity --- is an increasingly popular approach to induce sparsity in parameter estimates. When the parameters are only weakly identified by the likelihood, however, the posterior may end up with tails as heavy as the prior, jeopardizing the robustness of inference. A natural solution is to "shrink the shoulders" of a shrinkage prior by lightening up its tails beyond a reasonable parameter range, yielding a regularized version of the prior. We develop a regularization approach which, unlike previously proposed ones, preserves the computationally attractive structure of the original shrinkage priors. We study theoretical properties of the Gibbs sampler on the resulting posterior distributions, with emphasis on convergence rates of the P\'olya-Gamma Gibbs sampler for sparse logistic regression. Our analysis shows that the proposed regularization leads to geometric ergodicity under a broad range of global-local shrinkage priors. Essentially, the only requirement is for the prior $\pi_{\rm local}(\cdot)$ on the local scale $\lambda$ to satisfy $\pi_{\rm local}(0) < \infty$. If $\pi_{\rm local}(\cdot)$ further satisfies $\lim_{\lambda \to 0} \pi_{\rm local}(\lambda) / \lambda^a < \infty$ for some $a > 0$, as in Bayesian bridge priors, we show the sampler to be uniformly ergodic.
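The "shrunken shoulders" idea above can be illustrated with a toy one-dimensional density: multiply a heavy-tailed prior by a Gaussian factor so the density is essentially unchanged within a plausible parameter range but decays rapidly beyond it. The sketch below is a hypothetical illustration of this general tail-lightening idea (using a Cauchy base density and an assumed `slab_width` cutoff), not the paper's exact construction:

```python
import numpy as np

def cauchy_density(beta, scale=1.0):
    """Unnormalized heavy-tailed (Cauchy) prior density."""
    return 1.0 / (1.0 + (beta / scale) ** 2)

def shoulder_shrunken_density(beta, scale=1.0, slab_width=5.0):
    """Heavy-tailed density with its tails lightened ("shoulders shrunken")
    by a Gaussian factor beyond a plausible range of width ~ slab_width.
    Illustrative only; `slab_width` is an assumed tuning parameter."""
    return cauchy_density(beta, scale) * np.exp(-beta ** 2 / (2.0 * slab_width ** 2))

beta = np.array([0.0, 1.0, 10.0, 100.0])
print(cauchy_density(beta))            # unchanged heavy tails
print(shoulder_shrunken_density(beta)) # near-identical in the bulk, much lighter tails
```

Within the slab (|beta| well below `slab_width`) the two densities nearly coincide, while far in the tails the regularized density is exponentially smaller, which is the behavior that restores light posterior tails under weak identification.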