    Uncertainty Quantification and Model Calibration
    Citations: 37 · References: 0 · Related Papers: 10
    Abstract:
    Uncertainty quantification may appear daunting for practitioners due to its inherent complexity but can be intriguing and rewarding for anyone with mathematical ambitions and genuine concern for modeling quality. Uncertainty quantification is what remains to be done when too much credibility has been invested in deterministic analyses and unwarranted assumptions. Model calibration describes the inverse operation targeting optimal prediction and refers to inference of best uncertain model estimates from experimental calibration data. The limited applicability of most state-of-the-art approaches to many of the large and complex calculations made today makes uncertainty quantification and model calibration major topics open for debate, with rapidly growing interest from both science and technology, addressing subtle questions such as credible predictions of climate heating.
    Keywords:
    Uncertainty Quantification
The paper addresses the evaluation of measurement uncertainty in the calibration of a volumetric standard installation used for the metrological verification of cold water meters. Evaluating the uncertainty of measurement is important simply because it is necessary for making good-quality measurements and for understanding and using the results. There is a particular reason for evaluating measurement uncertainty as part of calibration, where the uncertainty of measurement must be reported on the certificate. To evaluate the uncertainty of measurement, we must first identify the sources of uncertainty in the measurement, then estimate the contribution to the uncertainty from each source, and finally combine the individual uncertainties. The calibration of the installations used for the verification of water meters is performed by the direct comparison method using secondary volume standards.
    Standard uncertainty
    Certificate
    NIST
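As a concrete illustration of the workflow this abstract outlines (identify sources, estimate each contribution, combine the individual uncertainties), here is a minimal Python sketch of a GUM-style combination in quadrature. The source names, values, and sensitivity coefficients are invented for illustration and are not taken from the paper.

```python
# Hedged sketch of a GUM-style uncertainty budget: list the sources,
# estimate each standard uncertainty, then combine them in quadrature
# (root-sum-of-squares weighted by sensitivity coefficients).
# All names and numbers below are illustrative assumptions.
import math

# (source name, standard uncertainty u_i, sensitivity coefficient c_i)
sources = [
    ("reference volume standard", 0.020, 1.0),   # litres
    ("meniscus reading",          0.010, 1.0),
    ("water temperature",         0.050, 0.2),   # K, scaled by expansion coeff.
    ("repeatability",             0.015, 1.0),
]

# Combined standard uncertainty: u_c = sqrt(sum((c_i * u_i)**2))
u_c = math.sqrt(sum((c * u) ** 2 for _, u, c in sources))

# Expanded uncertainty for the certificate, coverage factor k = 2 (~95 %)
k = 2.0
print(f"combined standard uncertainty u_c = {u_c:.4f} L")
print(f"expanded uncertainty U = k*u_c   = {k * u_c:.4f} L")
```

The expanded uncertainty U, together with the coverage factor, is what would typically be reported on the calibration certificate.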
    The performance of a multidisciplinary system is inevitably affected by various sources of uncertainties, usually categorized as aleatory (e.g. input variability) or epistemic (e.g. model uncertainty) uncertainty. In the framework of design under uncertainty, all sources of uncertainties should be aggregated to assess the uncertainty of system quantities of interest (QOIs). In a multidisciplinary design system, uncertainty propagation refers to the analysis that quantifies the overall uncertainty of system QOIs resulting from all sources of aleatory and epistemic uncertainty originating in the individual disciplines. However, due to the complexity of multidisciplinary simulation, especially the coupling relationships between individual disciplines, many uncertainty propagation approaches in the existing literature only consider aleatory uncertainty and ignore the impact of epistemic uncertainty. In this paper, we address the issue of efficient uncertainty quantification of system QOIs considering both aleatory and epistemic uncertainties. We propose a spatial-random-process (SRP) based multidisciplinary uncertainty analysis (MUA) method that, subsequent to SRP-based disciplinary model uncertainty quantification, fully utilizes the structure of SRP emulators and leads to compact analytical formulas for assessing statistical moments of uncertain QOIs. The proposed method is applied to a benchmark electronics packaging problem. To demonstrate the effectiveness of the method, the estimated low-order statistical moments of the QOIs are compared to the results from Monte Carlo simulations.
    Uncertainty Quantification
    Propagation of uncertainty
    Sensitivity Analysis
Benchmark
    Citations (13)
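The paper's validation step, checking analytically propagated low-order moments against Monte Carlo, can be illustrated generically. The sketch below uses plain first-order (delta-method) moment propagation on an invented toy response; it does not attempt to reproduce the paper's SRP-emulator formulas.

```python
# Compare analytically propagated low-order moments of a QOI against
# Monte Carlo estimates. Uses first-order (delta-method) propagation on
# a toy coupled model, NOT the paper's SRP-based method.
import numpy as np

rng = np.random.default_rng(0)

def qoi(x1, x2):
    # Toy two-discipline response: discipline 1 feeds discipline 2.
    y1 = 2.0 * x1 + 0.5 * x2
    return y1 + 0.1 * y1 * x2

mu = np.array([1.0, 0.5])        # input means (aleatory uncertainty)
sd = np.array([0.1, 0.05])       # input standard deviations

# First-order analytical moments via finite-difference gradients at the mean.
eps = 1e-6
g = np.array([(qoi(mu[0] + eps, mu[1]) - qoi(mu[0] - eps, mu[1])) / (2 * eps),
              (qoi(mu[0], mu[1] + eps) - qoi(mu[0], mu[1] - eps)) / (2 * eps)])
mean_lin = qoi(*mu)
var_lin = np.sum((g * sd) ** 2)

# Monte Carlo reference.
xs = rng.normal(mu, sd, size=(100_000, 2))
ys = qoi(xs[:, 0], xs[:, 1])
print(f"analytical: mean={mean_lin:.4f}  var={var_lin:.5f}")
print(f"MC        : mean={ys.mean():.4f}  var={ys.var():.5f}")
```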
The objective of this research is to quantify the impact of both aleatory and epistemic uncertainties on the performance of multidisciplinary systems. Aleatory uncertainty comes from inherent natural variability, while epistemic uncertainty comes from a lack of knowledge. Although intensive research has been conducted on aleatory uncertainty, few studies on epistemic uncertainty have been reported. In this work, the two types of uncertainty are analyzed. Aleatory uncertainty is modeled by probability distributions, while epistemic uncertainty is modeled by intervals. Probabilistic analysis (PA) and interval analysis (IA) are integrated to capture the effect of the two types of uncertainty. The First Order Reliability Method is employed for PA, while nonlinear optimization is used for IA. The unified uncertainty analysis, which consists of PA and IA, is employed to develop new sensitivity analysis methods for the mixture of the two types of uncertainty. The methods are able to quantify the contribution of each input variable with either epistemic or aleatory uncertainty. The analysis results can then support better decisions on how to effectively mitigate the effect of uncertainty. The other major contribution of this research is the extension of the unified uncertainty analysis to reliability analysis for multidisciplinary systems. The major findings of this research are as follows. (1) Sensitivity analysis is an effective tool for reducing the impact of epistemic uncertainty. (2) The proposed new reliability sensitivity indexes can easily measure the changes in output uncertainty with respect to those in input uncertainty. (3) The effect of aleatory uncertainty is primarily measured by the distribution of a performance measure, and the effect of epistemic uncertainty by the bounds of that distribution. (4) The unified uncertainty analysis methods for single-disciplinary systems can be extended to reliability analysis for multidisciplinary systems. (5) All the proposed methods can ultimately be integrated with multidisciplinary design optimization.
    Uncertainty Quantification
    Sensitivity Analysis
    Interval arithmetic
    Propagation of uncertainty
    Citations (1)
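To make the unified analysis concrete, here is a rough sketch of the double-loop idea, with brute-force sampling and an interval sweep standing in for the paper's FORM and nonlinear-optimization machinery. The limit state and numbers are illustrative assumptions.

```python
# Unified (probabilistic + interval) analysis sketch: aleatory inputs get
# distributions, epistemic inputs get intervals, and the output is a BOUND
# on the failure probability rather than a single number. The paper uses
# FORM (inner loop) and nonlinear optimization (outer loop); both are
# replaced by brute force here for clarity.
import numpy as np

rng = np.random.default_rng(1)

def limit_state(x, e):
    # g < 0 means failure; x is aleatory, e is epistemic.
    return 3.0 - x - e

x_samples = rng.normal(1.0, 0.5, size=50_000)   # aleatory: N(1, 0.5^2)
e_interval = (0.5, 1.5)                          # epistemic: interval only

# Outer loop: sweep the epistemic interval (stand-in for optimization).
pf = [np.mean(limit_state(x_samples, e) < 0.0)
      for e in np.linspace(*e_interval, 51)]

print(f"failure probability bounds: [{min(pf):.4f}, {max(pf):.4f}]")
```

The width of the resulting probability interval is exactly the signature of epistemic uncertainty that the abstract's finding (3) describes.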
This paper presents a computational framework for uncertainty characterization and propagation, as well as sensitivity analysis under the presence of aleatory and epistemic uncertainties, and it develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Research Center Uncertainty Quantification Challenge problem, which deals with an uncertainty analysis of a generic transport model. First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation, specified in the NASA Langley Research Center Uncertainty Quantification Challenge, reflects practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all at once. Instead, only one variable is identified first, and then Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all four variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained and numerically demonstrated before the methodology is applied to the NASA Langley Research Center Uncertainty Quantification Challenge problem.
    Uncertainty Quantification
    Sensitivity Analysis
    Propagation of uncertainty
    Statistical Inference
    Citations (7)
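A toy rendering of the sequential-refinement loop may help: rank the epistemic variables by a variance-based importance measure, refine only the top one, and repeat. The range-shrinking step below is a crude stand-in for the Bayesian inference described in the abstract, and the system model is invented.

```python
# Sequential-refinement sketch: at each step, rank epistemic variables by a
# crude variance-based importance measure, refine only the top-ranked one
# (here: shrink its range, mimicking the effect of new data), and repeat.
import numpy as np

rng = np.random.default_rng(2)

def system(e):
    # Toy system response with 4 epistemic inputs of unequal influence.
    return 4.0 * e[:, 0] + 2.0 * e[:, 1] ** 2 + 1.0 * e[:, 2] + 0.2 * e[:, 3]

ranges = np.array([[0.0, 1.0]] * 4)  # initial epistemic ranges

for step in range(4):
    e = rng.uniform(ranges[:, 0], ranges[:, 1], size=(20_000, 4))
    total_var = system(e).var()
    # Crude importance index: variance drop when variable i is fixed mid-range.
    indices = []
    for i in range(4):
        e_fix = e.copy()
        e_fix[:, i] = ranges[i].mean()
        indices.append(1.0 - system(e_fix).var() / total_var)
    top = int(np.argmax(indices))
    mid, half = ranges[top].mean(), np.ptp(ranges[top]) / 4
    ranges[top] = [mid - half, mid + half]     # "refine" the top variable
    print(f"step {step}: refined e{top}, indices={np.round(indices, 3)}")
```

Note how the ranking can change between steps once a dominant variable has been refined, which is the core argument for sequential rather than all-at-once refinement.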
Contents: Some Important Definitions; Probability Functions; Other Probability Functions; Evaluation of Measurement Data; Propagation of Errors/Uncertainty; Uncertainty and Calibration of Instruments; Calculation of Uncertainty; Uncertainty in Calibration of a Surface Plate; Uncertainty in Mass Measurement; Uncertainty in Volumetric Measurement; Uncertainty in Calibration of Some More Physical Instruments; Uncertainty in Calibration of Electrical Instruments.
    Propagation of uncertainty
    Standard uncertainty
    Sensitivity Analysis
    Measuring instrument
    Citations (39)
Aiming to provide more accurate information for the first restoration, overhaul, and maintenance of airborne products, this paper gives a calendar life assessment method for airborne products that considers epistemic and aleatory uncertainty, based on an investigation and analysis of the factors that affect the uncertainty of calendar life assessment. The uncertainty of calendar life is modeled as two types: epistemic uncertainty and aleatory uncertainty. An aleatory uncertainty model of calendar life based on interval analysis and blind number theory is given, with parameter sensitivity analysis. Considering the designer's incomplete knowledge of the failure mechanism, a measurement method for epistemic uncertainty is given based on the concept of confidence reliability.
    Uncertainty Quantification
    Interval arithmetic
    Possibility theory
    Citations (0)
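Interval analysis, one of the ingredients named in the abstract, can be shown in a few lines. The life model and interval values below are illustrative assumptions, not the paper's assessment model, and blind number theory is not reproduced here.

```python
# Minimal interval-arithmetic sketch: propagate interval-valued inputs
# through a simple calendar-life expression and report the life bounds.
# Model form and numbers are illustrative assumptions only.
def interval_mul(a, b):
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

base_life = (8.0, 12.0)          # nominal life in years (interval)
env_factor = (0.7, 0.9)          # environmental severity factor (interval)

life = interval_mul(base_life, env_factor)
print(f"calendar life bounds: [{life[0]:.1f}, {life[1]:.1f}] years")
```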
    This paper presents a comprehensive methodology that combines uncertainty quantification, uncertainty propagation, and design optimization using a Bayesian framework. The epistemic uncertainty due to input data uncertainty is considered. Two types of uncertainty models for input variables and/or their distribution parameters are addressed: 1) uncertainty modeled as family of distributions, and 2) uncertainty modeled as interval data. A Bayesian approach is adopted to update the uncertainty models, where the likelihood functions are constructed using limited experimental data. Global sensitivity analysis, which previously only considered aleatory inputs in the context of probabilistic representation, is extended in this paper to quantify the contributions of both aleatory and epistemic uncertainty sources for multioutput problems using an auxiliary variable approach. Gaussian process surrogate modeling is employed to replace the expensive physics models and improve the computational efficiency. A previously developed bias-minimization technique, which only dealt with single-output functions, is extended to reduce the surrogate model error for a multioutput function. A decoupled robustness-based design optimization framework is developed to include both aleatory and epistemic uncertainties. The proposed methodology is illustrated using the NASA Langley Research Center's multidisciplinary uncertainty quantification challenge problem.
    Uncertainty Quantification
    Sensitivity Analysis
    Propagation of uncertainty
    Robustness
    Surrogate model
    Sobol sequence
    Citations (15)
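The Bayesian updating step described above, where a distribution-family parameter is inferred from limited experimental data, can be sketched with a simple grid posterior. The normal model, prior, and data values below are assumptions for illustration, not the paper's framework.

```python
# Grid-based Bayesian update of an uncertain distribution parameter: an
# input is known only through a FAMILY of distributions N(theta, 1) with
# uncertain mean theta, and a handful of observations updates the
# posterior over theta. Generic sketch, not the paper's full framework.
import numpy as np
from scipy import stats

data = np.array([0.8, 1.3, 1.1])              # limited experimental data
thetas = np.linspace(-2.0, 4.0, 601)          # grid over the uncertain mean
dtheta = thetas[1] - thetas[0]
prior = stats.norm.pdf(thetas, loc=0.0, scale=2.0)

# Likelihood of all data points for each candidate theta (i.i.d. normal model).
like = np.prod(stats.norm.pdf(data[:, None], loc=thetas, scale=1.0), axis=0)

post = prior * like
post /= post.sum() * dtheta                    # normalize on the grid

print(f"posterior mean of theta ~ {(thetas * post).sum() * dtheta:.3f}")
```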
Results from risk analyses are always uncertain, and to make good decisions about risk, it is important that this uncertainty is understood by the decision-maker. A discussion is provided of uncertainty and, specifically, of its main contributors and sources. Epistemic and aleatory uncertainty are introduced, and model uncertainty, parameter uncertainty, and completeness uncertainty are described and discussed. Methods for uncertainty analysis are briefly introduced, as is sensitivity analysis as a tool for investigating the effect of uncertainty on the results of a risk analysis.
    Uncertainty Quantification
    Sensitivity Analysis
Completeness
    Decision maker
    Uncertainty
    Propagation of uncertainty
A comprehensive uncertainty analysis is an important part of any engineering calculation because it allows stakeholders to assess the confidence in the conclusions that come from the calculation. Without this necessary step in hypothesis testing and model validation, the true accuracy and confidence of predictions cannot be completely quantified. An experimental and analytical approach is presented in this work to determine the kinetics and energetics of pyrolysis, which form the foundation for larger-scale models of material burning. A detailed uncertainty quantification utilizing the generalized polynomial chaos expansion method is presented to propagate uncertainty through the pyrolysis-model representation of a thermal analysis experiment. Using this methodology, the uncertainty in the predictions was found to be comparable to the uncertainty in the experimental thermal analysis data within calibration conditions. Sensitivity analyses revealed that the largest contributor to uncertainty in heat flow rate predictions was the specific heat capacity of all components. Comparison of the uncertainty quantification and sensitivity analyses to well-known results validated the method of uncertainty propagation for more complex scenarios.
    Uncertainty Quantification
    Propagation of uncertainty
    Polynomial Chaos
    Sensitivity Analysis
    Representation
    Experimental data
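A minimal generalized-polynomial-chaos example, assuming a single standard-normal input and a toy exponential response in place of the pyrolysis model: coefficients are computed by Gauss-Hermite quadrature, and the mean and variance are read directly off the expansion.

```python
# 1-D gPC illustration: expand a model output in probabilists' Hermite
# polynomials He_k of a standard-normal germ xi, compute coefficients by
# Gauss-Hermite quadrature, and recover mean/variance from them.
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermegauss, hermeval

def model(xi):
    # Toy stand-in response (NOT the pyrolysis model from the paper).
    return np.exp(0.3 * xi)

order = 5
nodes, weights = hermegauss(order + 5)       # probabilists' Hermite quadrature
weights = weights / np.sqrt(2.0 * np.pi)     # normalize to the N(0,1) measure

# Spectral coefficients: c_k = E[model(xi) * He_k(xi)] / k!
# (He_k are orthogonal w.r.t. N(0,1) with norm E[He_k^2] = k!)
coeffs = []
for k in range(order + 1):
    he_k = hermeval(nodes, [0.0] * k + [1.0])
    coeffs.append(np.sum(weights * model(nodes) * he_k) / factorial(k))

mean_pce = coeffs[0]
var_pce = sum(factorial(k) * coeffs[k] ** 2 for k in range(1, order + 1))
print(f"gPC  : mean={mean_pce:.5f}  var={var_pce:.6f}")
a2 = 0.3 ** 2                                # exact lognormal moments for check
print(f"exact: mean={np.exp(a2 / 2):.5f}  var={np.exp(a2) * (np.exp(a2) - 1):.6f}")
```

Because the expansion is polynomial in the germ, sensitivity information (as used in the abstract's analysis of heat-capacity contributions) falls out of the same coefficients at no extra sampling cost.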
Uncertainty is ubiquitous in tolerance analysis problems. This paper deals with the formulation of tolerance analysis and, more particularly, with the uncertainty that must be taken into account in the foundation of this formulation. It presents: (i) a brief view of the uncertainty classification, in which aleatory uncertainty comes from inherently uncertain nature and phenomena, while epistemic uncertainty comes from a lack of knowledge; (ii) a formulation of the tolerance analysis problem based on this classification; and (iii) its development, in which aleatory uncertainty is modeled by probability distributions and epistemic uncertainty by intervals, with Monte Carlo simulation employed for the probabilistic analysis and nonlinear optimization used for the interval analysis.
    Uncertainty Quantification
    Interval arithmetic
    Sensitivity Analysis
    Propagation of uncertainty
Foundation
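A short sketch of the mixed treatment the abstract formulates, with Monte Carlo for the aleatory dimensions and an interval sweep standing in for the nonlinear optimization over the epistemic parameter; all dimensions and tolerances are invented for illustration.

```python
# Tolerance stack-up with mixed uncertainty: aleatory dimensions get
# probability distributions (propagated by Monte Carlo), an epistemic
# parameter gets an interval (swept here in place of the paper's
# nonlinear optimization). Numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)

def gap(d1, d2, d3, offset):
    # Linear tolerance stack: clearance = housing - part1 - part2 - offset.
    return d3 - d1 - d2 - offset

# Aleatory dimensions: normal distributions around nominal (mm).
d1 = rng.normal(10.00, 0.02, size=100_000)
d2 = rng.normal(15.00, 0.03, size=100_000)
d3 = rng.normal(25.20, 0.02, size=100_000)

# Epistemic offset known only as an interval (mm).
lo, hi = 0.05, 0.10

# Sweep the interval; each value yields a distribution of the gap,
# so the interference rate is reported as a bound, not a single number.
rates = [np.mean(gap(d1, d2, d3, off) < 0.0) for off in np.linspace(lo, hi, 11)]
print(f"P(interference) bounds: [{min(rates):.4f}, {max(rates):.4f}]")
```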