Publications

Results 26–43 of 43

Dependence in probabilistic modeling, Dempster-Shafer theory, and probability bounds analysis

Oberkampf, William L.

This report summarizes methods for incorporating information (or the lack of it) about inter-variable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computing bounds on distribution functions under a specified dependence model, formulating parametric and empirical dependence models, and applying bounding approaches when information about inter-variable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.
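
As a minimal, hedged illustration of the first of these topics, the sketch below uses a Gaussian copula to simulate correlated variates with specified marginals; the marginal distributions and the target correlation are assumptions chosen for exposition, not values from the report.

    import numpy as np
    from scipy import stats

    # Illustrative sketch (assumed inputs): induce dependence between two
    # variates with arbitrary marginals via a Gaussian copula.
    rho = 0.7                                   # assumed normal-space correlation
    rng = np.random.default_rng(0)
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=10000)
    u = stats.norm.cdf(z)                       # map to correlated uniforms
    x = stats.lognorm.ppf(u[:, 0], s=0.5)       # assumed marginal for variable 1
    y = stats.beta.ppf(u[:, 1], a=2, b=5)       # assumed marginal for variable 2
    rank_corr, _ = stats.spearmanr(x, y)        # rank correlation carried through
    print(f"Spearman rank correlation of simulated variates: {rank_corr:.2f}")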

AIAA committee on standards for computational fluid dynamics - Status and plans

AIAA Paper

Cosner, Raymond R.; Oberkampf, William L.; Rahaim, Christopher P.; Shih, Tom I.P.

Computational simulation methods in areas such as fluid dynamics have become a critical element of the aerospace vehicle development process. However, engineering groups are reluctant to make critical design decisions based solely on Computational Fluid Dynamics (CFD). Instead, they mitigate the perceived risk of deficiencies in CFD data by acquiring similar data from independent sources, such as wind tunnel testing. Verification and validation of CFD codes and calculations is the process of determining the level of confidence that can be placed in the resulting CFD data. The AIAA Committee on Standards for CFD has been a significant contributor to the development of sound practices for CFD verification and validation. A summary of the recent work of this Committee is presented here.

Verification, validation, and predictive capability in computational engineering and physics

Applied Mechanics Reviews

Oberkampf, William L.; Trucano, Timothy G.; Hirsch, Charles

The state of the art in verification and validation (V&V) in computational physics is discussed. These views are presented within a framework in which predictive capability relies on V&V as well as on other contributing factors. Research topics addressed include the development of improved procedures for using the phenomena identification and ranking table (PIRT) to prioritize V&V activities, and the method of manufactured solutions for code verification. Also addressed are the development and use of hierarchical validation diagrams and the construction of validation metrics that incorporate statistical measures.
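
As a brief, hedged illustration of the manufactured-solutions idea mentioned above (the model problem is assumed for exposition, not taken from the paper): choose a solution, derive the source term it implies, and confirm that the code reproduces the chosen solution at its formal order of accuracy. For the model problem $-u''(x) = s(x)$ on $[0,1]$ with $u(0) = u(1) = 0$, pick the manufactured solution $u_m(x) = \sin(\pi x)$; substitution gives $s(x) = -u_m''(x) = \pi^2 \sin(\pi x)$, and the discretization error $\lVert u_h - u_m \rVert$ on successively refined grids should then decay at the scheme's formal order.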

On the role of code comparisons in verification and validation

Trucano, Timothy G.; Pilch, Martin P.; Oberkampf, William L.

This report presents a perspective on the role of code comparison activities in verification and validation. We formally define the act of code comparison as the Code Comparison Principle (CCP) and investigate its application in both verification and validation. One of our primary conclusions is that the use of code comparisons for validation is improper and dangerous. We also conclude that while code comparisons may be argued to provide a beneficial component in code verification activities, there are higher quality code verification tasks that should take precedence. Finally, we provide a process for application of the CCP that we believe is minimal for achieving benefit in verification processes.

An exploration of alternative approaches to the representation of uncertainty in model predictions

Proposed for publication in Reliability Engineering and System Safety.

Oberkampf, William L.; Helton, J.C.; Johnson, J.D.

Several simple test problems are used to explore the following approaches to representing the uncertainty in model predictions that derives from uncertainty in model inputs: probability theory, evidence theory, possibility theory, and interval analysis. Each of the test problems has rather diffuse characterizations of the uncertainty in model inputs obtained from one or more equally credible sources. These uncertainty characterizations are translated into the mathematical structure associated with each of the indicated approaches and then propagated through the model with Monte Carlo techniques to obtain the corresponding representation of the uncertainty in one or more model predictions. The different approaches can lead to representations of prediction uncertainty that appear very different even though the starting information is exactly the same for each approach. To avoid misunderstandings and, potentially, bad decisions, these representations must be interpreted in the context of the theory and procedure from which they derive.
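
A minimal sketch of this workflow, assuming a toy model y = a*b and illustrative input ranges (not the paper's test problems): the same inputs yield a sampled distribution under a probabilistic treatment but only bounds under interval analysis.

    import numpy as np

    # Toy model and assumed input characterizations (illustrative only).
    def model(a, b):
        return a * b

    # Probabilistic treatment: assume uniform distributions over the input
    # ranges and propagate by Monte Carlo sampling.
    rng = np.random.default_rng(0)
    a_s = rng.uniform(1.0, 2.0, 10000)
    b_s = rng.uniform(3.0, 4.0, 10000)
    y_s = model(a_s, b_s)
    print("probabilistic: mean %.2f, 95%% range %s"
          % (y_s.mean(), np.round(np.percentile(y_s, [2.5, 97.5]), 2)))

    # Interval treatment: same ranges, but no distribution assumed; for this
    # monotone model the prediction bounds come from the interval endpoints.
    print("interval: [%.2f, %.2f]" % (model(1.0, 3.0), model(2.0, 4.0)))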

Constructing Probability Boxes and Dempster-Shafer Structures

Oberkampf, William L.

This report summarizes a variety of the most useful and commonly applied methods for obtaining Dempster-Shafer structures, and their mathematical kin, probability boxes, from empirical information or theoretical knowledge. The report includes a review of aggregation methods for handling agreement and conflict when multiple such objects are obtained from different sources.
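
A minimal sketch of one such method, assuming equally credible interval estimates from several sources (the interval values are illustrative): assign each interval equal mass to form a Dempster-Shafer structure, then read off the probability box as bounds on the cumulative distribution function.

    # Illustrative Dempster-Shafer structure: equally credible interval
    # estimates from different sources, each carrying equal mass.
    intervals = [(0.5, 2.0), (1.0, 3.0), (1.5, 2.5), (2.0, 4.0)]   # assumed values
    mass = 1.0 / len(intervals)

    def pbox(x):
        # Lower CDF bound: mass of intervals that must lie at or below x.
        # Upper CDF bound: mass of intervals that could lie at or below x.
        lower = mass * sum(hi <= x for lo, hi in intervals)
        upper = mass * sum(lo <= x for lo, hi in intervals)
        return lower, upper

    for x in (1.0, 2.0, 3.0, 4.0):
        lower, upper = pbox(x)
        print(f"P(X <= {x}) lies in [{lower:.2f}, {upper:.2f}]")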

Combination of Evidence in Dempster-Shafer Theory

Sentz, Kari S.; Oberkampf, William L.

Dempster-Shafer theory offers an alternative to traditional probability theory for the mathematical representation of uncertainty. The significant innovation of this framework is that it allows probability mass to be allocated to sets or intervals; Dempster-Shafer theory does not require an assumption about the probability of the individual constituents of the set or interval. This makes it a potentially valuable tool for evaluating risk and reliability in engineering applications when it is not possible to obtain a precise measurement from experiments, or when knowledge is obtained from expert elicitation. An important aspect of this theory is the combination of evidence obtained from multiple sources and the modeling of conflict between them. This report surveys a number of possible combination rules for Dempster-Shafer structures and provides examples of the implementation of these rules for discrete and interval-valued data.
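
A minimal sketch of the classical combination rule, Dempster's rule, for two basic probability assignments over a small frame of discernment; the mass values below are illustrative assumptions, and the report surveys this rule alongside several alternatives.

    from itertools import product

    # Two basic probability assignments (BPAs) over the frame {A, B, C};
    # the masses are illustrative, not taken from the report.
    m1 = {frozenset("A"): 0.6, frozenset("AB"): 0.3, frozenset("ABC"): 0.1}
    m2 = {frozenset("B"): 0.5, frozenset("AB"): 0.4, frozenset("ABC"): 0.1}

    combined, conflict = {}, 0.0
    for (s1, w1), (s2, w2) in product(m1.items(), m2.items()):
        inter = s1 & s2
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w1 * w2
        else:
            conflict += w1 * w2                 # mass falling on the empty set

    # Dempster's rule renormalizes the non-conflicting mass by (1 - conflict).
    combined = {s: w / (1.0 - conflict) for s, w in combined.items()}
    print(f"conflict: {conflict:.2f}")
    for s, w in sorted(combined.items(), key=lambda kv: -kv[1]):
        print(sorted(s), round(w, 3))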

General Concepts for Experimental Validation of ASCI Code Applications

Trucano, Timothy G.; Pilch, Martin P.; Oberkampf, William L.

This report presents general concepts in a broadly applicable methodology for validation of Accelerated Strategic Computing Initiative (ASCI) codes for Defense Programs applications at Sandia National Laboratories. The concepts are defined and analyzed within the context of their relative roles in an experimental validation process. Examples of applying the proposed methodology to three existing experimental validation activities are provided in appendices, using an appraisal technique recommended in this report.

Verification and Validation in Computational Fluid Dynamics

Oberkampf, William L.; Trucano, Timothy G.

Verification and validation (V and V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V and V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V and V, and develops a number of extensions to existing ideas. The review of the development of V and V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V and V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized.
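
To make the verification point concrete, the sketch below computes an observed order of accuracy against an analytical solution for an assumed second-order finite-difference discretization of a simple boundary-value problem; the problem and scheme are illustrative, not codes discussed in the paper.

    import numpy as np

    # Solve -u'' = pi^2 sin(pi x) on [0, 1] with u(0) = u(1) = 0 using
    # second-order central differences; the exact solution is sin(pi x).
    def max_error(n):
        h = 1.0 / n
        x = np.linspace(0.0, 1.0, n + 1)
        A = (np.diag(np.full(n - 1, 2.0))
             - np.diag(np.ones(n - 2), 1)
             - np.diag(np.ones(n - 2), -1)) / h**2
        u = np.zeros(n + 1)
        u[1:-1] = np.linalg.solve(A, np.pi**2 * np.sin(np.pi * x[1:-1]))
        return np.max(np.abs(u - np.sin(np.pi * x)))    # discretization error

    errors = {n: max_error(n) for n in (20, 40, 80)}
    for coarse, fine in ((20, 40), (40, 80)):
        order = np.log2(errors[coarse] / errors[fine])  # refinement factor of 2
        print(f"observed order between n={coarse} and n={fine}: {order:.2f}")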

Methodology for characterizing modeling and discretization uncertainties in computational simulation

Alvin, Kenneth F.; Oberkampf, William L.; Rutherford, Brian M.; Diegert, Kathleen V.

This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.
