Objective priors for sequential experiments are considered. Common priors, such as the Jeffreys prior and the reference prior, typically depend on the stopping rule used for the sequential experiment. New expressions for reference priors are obtained in various contexts, and computational issues involving such priors are discussed.
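A standard illustration of this stopping-rule dependence (a textbook example, not a result from the paper itself): Bernoulli($p$) data observed under binomial sampling (a fixed number $n$ of trials) versus negative binomial sampling (sampling until the $r$-th success) have different Fisher informations, so the Jeffreys priors differ even though the two designs can produce proportional likelihoods:
\[
I_{\mathrm{bin}}(p) = \frac{n}{p(1-p)} \;\Rightarrow\; \pi_J(p) \propto p^{-1/2}(1-p)^{-1/2},
\qquad
I_{\mathrm{nb}}(p) = \frac{r}{p^2(1-p)} \;\Rightarrow\; \pi_J(p) \propto p^{-1}(1-p)^{-1/2}.
\]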
Modern science, technology, and politics are all permeated by data that comes from people, measurements, or computational processes. While this data is often incomplete, corrupt, or lacking in sufficient accuracy and precision, explicit consideration of uncertainty is rarely part of the computational and decision-making pipeline. The CCC Workshop on Quantification, Communication, and Interpretation of Uncertainty in Simulation and Data Science explored this problem, identifying significant shortcomings in the ways we currently process, present, and interpret uncertain data. Specific recommendations for a future research agenda were made in four areas: uncertainty quantification in large-scale computational simulations, uncertainty quantification in data science, software support for uncertainty computation, and better integration of uncertainty quantification and communication to stakeholders.
The article "Ranking, Estimation and Hypothesis Testing in Unbalanced Two-Way Additive Models - A Bayesian Approach" was published on January 1, 1993, in the journal Statistics & Risk Modeling (volume 11, issue 1).
Effective volcanic hazard management in regions where populations live in close proximity to persistent volcanic activity involves understanding the dynamic nature of hazards and the associated risk. Emphasis until now has been placed on identifying and forecasting the escalation phase of activity, in order to provide adequate warning of what might be to come. However, understanding eruption hiatus and post-eruption unrest hazards, and knowing how to quantify residual hazard after the end of an eruption, is also important and often key to timely post-eruption recovery. Unfortunately, in many cases when the level of activity lessens, the hazards, although reduced, do not necessarily cease altogether. This is due both to the imprecise nature of determining the “end” of an eruptive phase and to the possibility that post-eruption hazardous processes may continue to occur. An example of the latter is continued dome collapse hazard from lava domes that have ceased to grow, or sector collapse of parts of volcanic edifices, including lava dome complexes. We present a new probabilistic model for forecasting pyroclastic density currents (PDCs) from lava dome collapse that takes into account the heavy-tailed distributions of the lengths of eruptive phases and of the periods of quiescence, as well as the forecast window of interest. In the hazard analysis, we also consider probabilistic scenario models describing the flow’s volume and initial direction. Further, with the use of statistical emulators, we combine these models with physics-based simulations of PDCs at Soufrière Hills Volcano to produce a series of probabilistic hazard maps for flow inundation over 5, 10, and 20 year periods. The development and application of this assessment approach is the first of its kind for quantifying hazard during periods of diminished volcanic activity. As such, it offers evidence-based guidance for dome collapse hazards that can be used to inform decision-making around provisions of access and reoccupation in areas around volcanoes that are becoming less active over time.
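A schematic sketch of the kind of forecast computation the abstract describes: alternate heavy-tailed eruptive and quiescent phases, generate dome collapses in each regime, and estimate the probability of at least one inundating PDC within a forecast window. All distributions, rates, and the inundation probability below are hypothetical placeholders, not the paper's calibrated model or emulator for Soufrière Hills Volcano.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters -- illustrative only, not the paper's calibrated values.
SHAPE_PHASE, SCALE_PHASE = 1.5, 0.5   # heavy-tailed eruptive-phase length (years)
SHAPE_QUIET, SCALE_QUIET = 1.2, 1.0   # heavy-tailed quiescence length (years)
RATE_ACTIVE = 2.0    # mean dome collapses per year during an active phase
RATE_QUIET = 0.2     # residual collapse rate per year during quiescence
P_INUNDATE = 0.05    # P(a given collapse inundates the site): crude stand-in for
                     # the emulator plus the volume/initial-direction scenario models

def pareto(shape, scale):
    """One classical (type I) Pareto draw with minimum `scale`."""
    return scale * (1.0 + rng.pareto(shape))

def prob_inundation(window_years, n_sims=20_000):
    """Monte Carlo estimate of P(at least one inundating PDC within the window)."""
    hits = 0
    for _ in range(n_sims):
        t, active, collapses = 0.0, False, 0   # start in post-eruption quiescence
        while t < window_years:
            dur = pareto(SHAPE_PHASE, SCALE_PHASE) if active else \
                  pareto(SHAPE_QUIET, SCALE_QUIET)
            dur = min(dur, window_years - t)
            rate = RATE_ACTIVE if active else RATE_QUIET
            collapses += rng.poisson(rate * dur)
            t += dur
            active = not active
        # each collapse independently inundates the site with prob P_INUNDATE
        if rng.random() < 1.0 - (1.0 - P_INUNDATE) ** collapses:
            hits += 1
    return hits / n_sims

for T in (5, 10, 20):
    print(f"P(inundation within {T:2d} yr) ~ {prob_inundation(T):.3f}")
```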
Let $\mathbf{X} = (X_1, \cdots, X_p)^t$ be an observation from a $p$-variate normal distribution with unknown mean $\boldsymbol{\theta} = (\theta_1, \cdots, \theta_p)^t$ and identity covariance matrix. We consider a control problem which, in canonical form, is the problem of estimating $\boldsymbol{\theta}$ under the loss $L(\boldsymbol{\theta}, \boldsymbol{\delta}) = (\boldsymbol{\theta}^t \boldsymbol{\delta} - 1)^2$, where $\boldsymbol{\delta}(\mathbf{x}) = (\delta_1(\mathbf{x}), \cdots, \delta_p(\mathbf{x}))^t$ is the estimate of $\boldsymbol{\theta}$ for a given $\mathbf{x}$. General theorems are given for establishing the admissibility or inadmissibility of estimators in this problem. As an application, it is shown that estimators of the form $\boldsymbol{\delta}(\mathbf{x}) = (|\mathbf{x}|^2 + c)^{-1}\mathbf{x} + |\mathbf{x}|^{-4}w(|\mathbf{x}|)\mathbf{x}$, where $w(|\mathbf{x}|)$ tends to zero as $|\mathbf{x}| \rightarrow \infty$, are inadmissible if $c > 5 - p$, but are admissible if $c \leq 5 - p$ and $\boldsymbol{\delta}$ is generalized Bayes with respect to an appropriate prior measure. Also, an approximation to generalized Bayes estimators for large $|\mathbf{x}|$ is developed.
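For readers who want to experiment with the estimator family and loss above, here is a minimal Monte Carlo sketch of the risk of the $w \equiv 0$ member, $\boldsymbol{\delta}(\mathbf{x}) = (|\mathbf{x}|^2 + c)^{-1}\mathbf{x}$; the dimension, mean vector, and values of $c$ are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def risk(theta, c, n_sims=200_000):
    """Monte Carlo risk E[(theta^t delta(X) - 1)^2] for the estimator
    delta(x) = x / (|x|^2 + c), i.e. the w == 0 member of the family,
    with X ~ N_p(theta, I)."""
    x = theta + rng.standard_normal((n_sims, len(theta)))
    sq = np.einsum("ij,ij->i", x, x)       # |x|^2 for each draw
    inner = (x @ theta) / (sq + c)         # theta^t delta(x)
    return np.mean((inner - 1.0) ** 2)

p = 4                                      # so the boundary value is c = 5 - p = 1
theta = np.full(p, 2.0)
for c in (1.0, 3.0, 10.0):                 # boundary c = 5 - p, then two larger c
    print(f"c = {c:4.1f}: estimated risk = {risk(theta, c):.4f}")
```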
Informally, "Information Inconsistency" is the property that has been observed in many Bayesian hypothesis testing and model selection procedures whereby the Bayesian conclusion does not become definitive when the data seems to become definitive. An example is that, when performing a t-test using standard conjugate priors, the Bayes factor of the alternative hypothesis to the null hypothesis remains bounded as the t statistic grows to infinity. This paper shows that information inconsistency is ubiquitous in Bayesian hypothesis testing under conjugate priors. Yet the title does not fully describe the paper, since we also show that theoretically recommended priors, including scale mixtures of conjugate priors and adaptive priors, are information consistent. Hence the paper is simply a forceful warning that use of conjugate priors in testing and model selection is highly problematical, and should be replaced by the information consistent alternatives.