Deviance information criterion

The deviance information criterion (DIC) is a hierarchical modeling generalization of the Akaike information criterion (AIC). It is particularly useful in Bayesian model selection problems where the posterior distributions of the models have been obtained by Markov chain Monte Carlo (MCMC) simulation. Like AIC, DIC is an asymptotic approximation as the sample size becomes large. It is only valid when the posterior distribution is approximately multivariate normal.

Define the deviance as $D(\theta) = -2\log(p(y \mid \theta)) + C$, where $y$ are the data, $\theta$ are the unknown parameters of the model and $p(y \mid \theta)$ is the likelihood function. $C$ is a constant that cancels out in all calculations that compare different models, and therefore does not need to be known.

There are two calculations in common usage for the effective number of parameters of the model. The first, as described in Spiegelhalter et al. (2002, p. 587), is $p_D = \bar{D} - D(\bar{\theta})$, where $\bar{D} = \mathrm{E}_{\theta \mid y}[D(\theta)]$ is the posterior mean deviance and $\bar{\theta}$ is the posterior expectation of $\theta$. The second, as described in Gelman et al. (2004, p. 182), is $p_D = p_V = \tfrac{1}{2}\,\widehat{\operatorname{var}}\!\left(D(\theta)\right)$, half the posterior variance of the deviance. The larger the effective number of parameters, the easier it is for the model to fit the data, and so the deviance needs to be penalized.
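As a rough illustration, the following minimal Python sketch computes both effective-parameter estimates from MCMC output. It assumes you already have the per-draw log-likelihoods log p(y | θ) and the log-likelihood evaluated at the posterior mean of θ; the function name dic_summary and the normal-mean toy example are hypothetical, and the final combination DIC = D-bar + p_D follows the usual Spiegelhalter et al. definition, which is not spelled out in the excerpt above.

import numpy as np

def dic_summary(loglik_draws, loglik_at_posterior_mean):
    """Effective number of parameters and DIC from MCMC log-likelihood draws."""
    deviance_draws = -2.0 * np.asarray(loglik_draws)  # D(theta_s); constant C dropped
    d_bar = deviance_draws.mean()                     # posterior mean deviance, D-bar
    d_at_mean = -2.0 * loglik_at_posterior_mean       # D(theta-bar)
    p_d = d_bar - d_at_mean                           # Spiegelhalter et al. (2002)
    p_v = 0.5 * deviance_draws.var(ddof=1)            # Gelman et al. (2004)
    dic = d_bar + p_d                                 # standard DIC = D-bar + p_D
    return {"p_D": p_d, "p_V": p_v, "DIC": dic}

# Toy example: y_i ~ N(mu, 1) with a flat prior on mu, so the posterior of mu
# is N(y-bar, 1/n) and exact posterior draws can be simulated directly.
rng = np.random.default_rng(0)
y = rng.normal(1.0, 1.0, size=50)
mu_draws = rng.normal(y.mean(), 1.0 / np.sqrt(len(y)), size=4000)

def loglik(mu):
    return np.sum(-0.5 * np.log(2.0 * np.pi) - 0.5 * (y - mu) ** 2)

loglik_draws = np.array([loglik(mu) for mu in mu_draws])
print(dic_summary(loglik_draws, loglik_at_posterior_mean=loglik(mu_draws.mean())))

In this one-parameter example both p_D and p_V should come out close to 1, the true number of free parameters, which matches their intended interpretation as an effective parameter count.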

[ "Markov chain Monte Carlo", "Bayesian inference" ]
Parent Topic
Child Topic
    No Parent Topic