    Omitted Response Treatment Using a Modified Laplace Smoothing for Approximate Bayesian Inference in Item Response Theory
    Abstract:
This article applies the approach of adding artificially created data to observations in order to stabilize estimates, as a treatment for missing responses in cases where students choose to omit answers to questionnaire or achievement test items. This addition of manufactured data is known in the literature as Laplace smoothing or the method of data augmentation priors, and it can be understood as a penalty added to a parameter's likelihood function. The approach is used to stabilize results in National Assessment of Educational Progress (NAEP) analyses and is implemented in the MGROUP software program, which plays an essential role in generating results files for NAEP. The modified data augmentation approach presented here aims to replace common missing data treatments used in IRT. These treatments can be understood as special deterministic cases of data augmentation priors that add fixed information to the observed data, either by conceptualizing them as adding a term of fixed form to the likelihood function to represent constant prior information, or by understanding the augmentation as a conjugate prior that 'emulates' non-random observations.
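The idea of pseudo-observations acting as a conjugate prior can be illustrated with a minimal sketch. The counts and prior weights below are hypothetical, chosen only to show the mechanics; this is not the NAEP/MGROUP implementation.

```python
# A minimal sketch of Laplace smoothing as "manufactured data": fixed
# pseudo-observations are added to the observed counts, which is
# equivalent to a Beta conjugate prior on a proportion.

def smoothed_proportion(successes, trials, prior_successes=0.5, prior_trials=1.0):
    """Estimate a proportion after augmenting the observed data with
    fixed pseudo-observations that 'emulate' non-random data."""
    return (successes + prior_successes) / (trials + prior_trials)

# With no smoothing, 0 correct out of 0 attempts is undefined; the
# augmented estimate stays well defined and is pulled toward 0.5.
print(smoothed_proportion(0, 0))    # → 0.5
print(smoothed_proportion(3, 10))   # (3 + 0.5) / (10 + 1)
```

The pseudo-counts add a fixed-form penalty to the likelihood: estimates from sparse data are shrunk toward the prior value, while estimates from large samples are barely affected.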
    Keywords:
    Smoothing
    Laplace's method
    Conjugate prior
    Marginal likelihood
    Laplace's method
    Marginal likelihood
    Conjugate prior
    Approximations of π
    Monte Carlo integration
The marginal likelihood is a well-established model selection criterion in Bayesian statistics. It also allows efficient calculation of the marginal posterior model probabilities that can be used for Bayesian model averaging of quantities of interest. For many complex models, including latent modeling approaches, marginal likelihoods are, however, difficult to compute. One recent promising approach for approximating the marginal likelihood is Integrated Nested Laplace Approximation (INLA), designed for models with latent Gaussian structures. In this study we compare the approximations obtained with INLA to some alternative approaches on a number of examples of different complexity. In particular, we address a simple linear latent model, a Bayesian linear regression model, logistic Bayesian regression models with probit and logit links, and a Poisson longitudinal generalized linear mixed model.
    Laplace's method
    Marginal likelihood
    Bayesian information criterion
    Citations (15)
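The Laplace approximation underlying methods like INLA can be sketched on a toy conjugate model where the exact marginal likelihood is known. The model below (y_i ~ N(θ, 1) with prior θ ~ N(0, 1)) is a hypothetical example chosen because the log joint is quadratic in θ, so the approximation is exact and can be checked against the closed form.

```python
import numpy as np

def log_joint(theta, y):
    # log p(y | theta) + log p(theta): N(theta, 1) likelihood, N(0, 1) prior
    n = len(y)
    return (-0.5 * np.sum((y - theta) ** 2) - 0.5 * n * np.log(2 * np.pi)
            - 0.5 * theta ** 2 - 0.5 * np.log(2 * np.pi))

def laplace_log_marginal(y):
    """Laplace approximation: log p(y) ~ log p(y, theta_hat)
    + (d/2) log 2*pi - (1/2) log |H|, with d = 1 here."""
    n = len(y)
    theta_hat = y.sum() / (n + 1.0)   # posterior mode
    hessian = n + 1.0                 # -(d^2/dtheta^2) log joint
    return log_joint(theta_hat, y) + 0.5 * np.log(2 * np.pi) - 0.5 * np.log(hessian)

def exact_log_marginal(y):
    """Closed form: marginally y ~ N(0, I + 11^T)."""
    n = len(y)
    quad = y @ y - y.sum() ** 2 / (n + 1.0)
    return -0.5 * n * np.log(2 * np.pi) - 0.5 * np.log(n + 1.0) - 0.5 * quad

y = np.array([0.3, -1.2, 0.8, 1.5])
print(laplace_log_marginal(y), exact_log_marginal(y))  # agree: the joint is Gaussian in theta
```

For non-Gaussian latent structures the two quantities differ, which is exactly the regime the INLA comparison studies address.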
Summary: Likelihood methods of dealing with some multiparameter problems are introduced and exemplified. Specifically, methods of eliminating nuisance parameters from the likelihood function so that inferences can be made about the parameters of interest are considered. In this regard integrated likelihoods, maximum relative likelihoods, conditional likelihoods, marginal likelihoods and second-order likelihoods are introduced and their uses illustrated in examples. Marginal and conditional likelihoods are dependent upon factorings of the likelihood function. They are applied to the linear functional relationship and to related models and are found to give intuitively appealing results. These methods indicate that in many situations commonly encountered, objective methods of eliminating unwanted parameters from the likelihood function can be adopted. This gives an alternative method of interpreting multiparameter likelihoods to that offered by the Bayesian approach.
    Marginal likelihood
    Likelihood principle
    Restricted maximum likelihood
    Nuisance parameter
When using the R package tmbstan for Bayesian inference, the built-in Laplace approximation to the marginal likelihood, with random effects integrated out, can be switched on and off. No guideline exists on whether the Laplace approximation should be used to achieve better efficiency, especially when the statistical model for estimating selection is complicated. To answer this question, we conducted simulation studies under different scenarios with a state-space model employing a VAR(1) state equation. We found that turning on the Laplace approximation in tmbstan would probably lower the computational efficiency. Only when there is a good amount of data are both tmbstan with and without the Laplace approximation worth trying, since in that case the Laplace approximation is more likely to be accurate and may also lead to slightly higher computational efficiency. The transition parameters and scale parameters in a VAR(1) process are hard to estimate accurately, and increasing the sample size at each time point does not help the estimation; only additional time points in the data contain more information on these parameters and make the likelihood dominate the posterior, thus leading to accurate estimates for them.
    Laplace's method
    Marginal likelihood
    State-space representation
    Point estimation
    Statistical Inference
    Citations (0)
    Due to the intractable partition function, the exact likelihood function for a Markov random field (MRF), in many situations, can only be approximated. Major approximation approaches include pseudolikelihood and Laplace approximation. In this paper, we propose a novel way of approximating the likelihood function through first approximating the marginal likelihood functions of individual parameters and then reconstructing the joint likelihood function from these marginal likelihood functions. For approximating the marginal likelihood functions, we derive a particular likelihood function from a modified scenario of coin tossing which is useful for capturing how one parameter interacts with the remaining parameters in the likelihood function. For reconstructing the joint likelihood function, we use an appropriate copula to link up these marginal likelihood functions. Numerical investigation suggests the superior performance of our approach. Especially as the size of the MRF increases, both the numerical performance and the computational cost of our approach remain consistently satisfactory, whereas Laplace approximation deteriorates and pseudolikelihood becomes computationally unbearable.
    Marginal likelihood
    Laplace's method
    Markov random field
    Citations (0)
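The pseudolikelihood idea the abstract contrasts with Laplace approximation can be sketched for a small Ising model. The grid, coupling value, and free-boundary convention below are illustrative assumptions, not taken from the paper: each site contributes its conditional probability given its neighbours, which sidesteps the intractable partition function entirely.

```python
import numpy as np

def log_pseudolikelihood(spins, beta):
    """Log pseudolikelihood of an Ising configuration (spins in {-1, +1})
    on a 2-D grid with coupling beta: the sum of each site's conditional
    log-probability given its 4-neighbourhood (free boundaries)."""
    rows, cols = spins.shape
    total = 0.0
    for i in range(rows):
        for j in range(cols):
            # sum of the (up to 4) nearest neighbours
            nb = sum(spins[r, c]
                     for r, c in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                     if 0 <= r < rows and 0 <= c < cols)
            # conditional: P(s_ij = s | neighbours) = sigmoid(2 * beta * s * nb),
            # so its log is -log(1 + exp(-2 * beta * s * nb))
            total += -np.log1p(np.exp(-2.0 * beta * spins[i, j] * nb))
    return total

spins = np.ones((2, 2))
print(log_pseudolikelihood(spins, 0.0))  # 4 * log(1/2): independent fair coins at beta = 0
```

No normalizing constant over all 2^(rows*cols) configurations is ever computed, which is why pseudolikelihood scales to large fields where exact likelihood and Laplace approximation struggle.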
Abstract In Chapter 8 it was shown that, in some cases, inference about a parameter of interest ψ may be based on a marginal or conditional likelihood function. Those methods are only available, however, when the model has a particular structure. Furthermore, even when a marginal or conditional likelihood function exists, calculation of the likelihood function is often difficult. In this chapter, we consider the modified profile likelihood, a pseudo-likelihood function that is available for general models. The modified profile likelihood may be derived as an approximation to either a marginal or conditional likelihood when either of those likelihoods exists. Furthermore, the calculation of the modified profile likelihood function does not require the existence of a marginal or conditional likelihood and, hence, it has been adopted for general use.
    Marginal likelihood
    Likelihood principle
    Restricted maximum likelihood
    Score test
In Bayesian inference, the marginal likelihood function involves computing a high-dimensional, complex integrand, so computing the marginal likelihood exactly is often difficult. Approximate methods must therefore be chosen to estimate the marginal likelihood function.
    Marginal likelihood
    Likelihood principle
    Marginal model
    Citations (0)
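The simplest such approximation is plain Monte Carlo: draw parameters from the prior and average the likelihood. The toy model below (y_i ~ N(θ, 1), θ ~ N(0, 1)) is a hypothetical choice that has a closed-form marginal likelihood, so the estimate can be checked; the draw count and seed are likewise illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_log_marginal(y, draws=200_000):
    """Monte Carlo estimate of log p(y) = log E_prior[ p(y | theta) ]
    for y_i ~ N(theta, 1) with prior theta ~ N(0, 1)."""
    theta = rng.standard_normal(draws)                      # draws from the prior
    log_lik = (-0.5 * ((y[:, None] - theta[None, :]) ** 2).sum(axis=0)
               - 0.5 * len(y) * np.log(2 * np.pi))
    m = log_lik.max()                                       # log-sum-exp for stability
    return m + np.log(np.exp(log_lik - m).mean())

def exact_log_marginal(y):
    """Closed form: marginally y ~ N(0, I + 11^T)."""
    n = len(y)
    quad = y @ y - y.sum() ** 2 / (n + 1.0)
    return -0.5 * n * np.log(2 * np.pi) - 0.5 * np.log(n + 1.0) - 0.5 * quad

y = np.array([0.4, -0.9, 1.1])
print(mc_log_marginal(y), exact_log_marginal(y))
```

This estimator is unbiased on the likelihood scale but degrades quickly as the dimension or sample size grows, because most prior draws then land where the likelihood is negligible; that is precisely why the Laplace-type approximations discussed above are attractive.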
    Marginal likelihood
    Laplace's method
    Monte Carlo integration
    Nuisance parameter
    Citations (10)
Multiple item response profile (MIRP) models are models with crossed fixed and random effects. At least one between‐person factor is crossed with at least one within‐person factor, and the persons nested within the levels of the between‐person factor are crossed with the items within levels of the within‐person factor. Maximum likelihood estimation (MLE) of models for binary data with crossed random effects is challenging because the marginal likelihood does not have a closed form, so that MLE requires numerical or Monte Carlo integration. In addition, the multidimensional structure of MIRPs makes the estimation complex. In this paper, three different estimation methods to meet these challenges are described: the Laplace approximation to the integrand; hierarchical Bayesian analysis, a simulation‐based method; and an alternating imputation posterior with adaptive quadrature as the approximation to the integral. In addition, this paper discusses the advantages and disadvantages of these three estimation methods for MIRPs. The three algorithms are compared in a real data application, and a simulation study was also conducted to compare their behaviour.
    Laplace's method
    Marginal likelihood
    Bayes estimator
    Factor Analysis
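The quadrature step that such estimation methods rely on can be sketched for a single person under a Rasch-type model. The item difficulties, response pattern, and node count below are hypothetical; the point is only how Gauss-Hermite quadrature integrates a Bernoulli likelihood over a normal ability distribution when no closed form exists.

```python
import numpy as np

def person_marginal_likelihood(y, b, n_nodes=41):
    """Marginal likelihood of one person's binary responses under a
    Rasch-type model P(y_j = 1 | theta) = sigmoid(theta - b_j), with
    ability theta ~ N(0, 1) integrated out by Gauss-Hermite quadrature."""
    # probabilists' Hermite rule: weight exp(-x^2 / 2), weights sum to sqrt(2*pi)
    nodes, weights = np.polynomial.hermite_e.hermegauss(n_nodes)
    p = 1.0 / (1.0 + np.exp(-(nodes[:, None] - b[None, :])))      # node x item
    lik_per_node = np.prod(np.where(y[None, :] == 1, p, 1.0 - p), axis=1)
    return (weights * lik_per_node).sum() / np.sqrt(2.0 * np.pi)

y = np.array([1, 0, 1])          # hypothetical response pattern
b = np.array([-0.5, 0.0, 0.5])   # hypothetical item difficulties
print(person_marginal_likelihood(y, b))
```

With crossed random effects the integrand no longer factors person by person, which is why the paper turns to Laplace approximation, hierarchical Bayes, and alternating imputation with adaptive quadrature instead of this simple one-dimensional rule.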