Publications

33 Results

V&V framework

Hills, Richard G.; Maniaci, David C.; Naughton, Jonathan W.

A Verification and Validation (V&V) framework is presented for the development and execution of coordinated modeling and experimental programs to assess the predictive capability of computational models of complex systems through focused, well-structured, and formal processes. The elements of the framework are based on established V&V methodology developed by various organizations, including the Department of Energy, the National Aeronautics and Space Administration, the American Institute of Aeronautics and Astronautics, and the American Society of Mechanical Engineers. Four main topics are addressed: 1) program planning based on expert elicitation of the modeling physics requirements, 2) experimental design for model assessment, 3) uncertainty quantification for experimental observations and computational model simulations, and 4) assessment of the model predictive capability. The audience for this document includes program planners, modelers, experimentalists, V&V specialists, and customers of the modeling results.


Addressing Model Form Error for Time-Dependent Conservation Equations

Hills, Richard G.

Model form error of the type considered here is error due to an approximate or incorrect representation of physics by a computational model. Typical approaches to adjust a model based on observed differences between experiment and prediction are to calibrate the model parameters utilizing the observed discrepancies and to develop parameterized additive corrections to the model output. These approaches are generally not suitable if significant physics is missing from the model and the desired quantities of interest for an application are different than those used for calibration. The approach developed here is to build a corrected surrogate solver through a multi-step process: 1) sampled simulation results are used to develop a surrogate computational solver that maintains the overall conservation principles of the unmodified governing equations, 2) the surrogate solver is applied to candidate linear and non-linear corrector terms to develop corrections that are consistent with the original conservation principles, 3) constant multipliers on these terms are calibrated using the experimental observations, and 4) the resulting surrogate solver is used to predict application response for the quantity of interest. This approach and several other calibration-based approaches were applied to an example problem based on the diffusive Burgers' equation. While all the approaches provided some model correction when the measured/calibration quantity was the same as that for an application, only the present approach was able to adequately correct the computational simulation (CompSim) results when the prediction quantity was different from the calibration quantity.
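The calibration step of the abstract (constant multipliers on candidate corrector terms, fit against experimental observations) can be illustrated with a minimal least-squares sketch. All function names, corrector shapes, and data below are illustrative stand-ins, not the paper's actual solver or terms.

```python
import numpy as np

def corrected_response(c, base, correctors):
    """Corrected surrogate output: base solution plus weighted corrector terms."""
    return base + correctors @ c

# Illustrative data: an uncorrected surrogate prediction and two candidate
# corrector fields evaluated at the measurement locations (hypothetical).
x = np.linspace(0.0, 1.0, 11)
base = 1.0 - x                               # uncorrected surrogate prediction
correctors = np.column_stack([x * (1 - x),   # candidate corrector term 1
                              np.sin(np.pi * x)])  # candidate corrector term 2
y_exp = base + 0.3 * x * (1 - x)             # synthetic "experimental" data

# Step 3 of the abstract, sketched: least-squares calibration of the
# constant multipliers so the corrected surrogate matches the observations.
c_hat, *_ = np.linalg.lstsq(correctors, y_exp - base, rcond=None)
```

With this synthetic data the fit recovers a weight of 0.3 on the first corrector and essentially zero on the second, since the discrepancy lies entirely in the span of the first term.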


V&V Framework Part 1 Release

Hills, Richard G.; Maniaci, David C.; Naughton, Jonathan W.

The objective of this document is to support accurate prediction, assessment, and optimization of wind plant performance utilizing high-performance computing (HPC) tools developed in a community-based, open-source simulation environment. The aim is to understand and accurately predict the fundamental physics and complex flows of the atmospheric boundary layer, its interaction with the wind plant, and the response of individual turbines to the complex flows within that plant.


Roll-up of validation results to a target application

Hills, Richard G.

Suites of experiments are performed over a validation hierarchy to test computational simulation models for complex applications. Experiments within the hierarchy can be performed at different conditions and configurations than those for an intended application, with each experiment testing only part of the physics relevant for the application. The purpose of the present work is to develop methodology to roll-up validation results to an application, and to assess the impact the validation hierarchy design has on the roll-up results. The roll-up is accomplished through the development of a meta-model that relates validation measurements throughout a hierarchy to the desired response quantities for the target application. The meta-model is developed using the computational simulation models for the experiments and the application. The meta-model approach is applied to a series of example transport problems that represent complete and incomplete coverage of the physics of the target application by the validation experiments.
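The roll-up idea can be sketched in a few lines: use the simulation models to build a meta-model that maps hierarchy measurements to the application response. The linear meta-model, the stand-in experiment and application models, and all numbers below are hypothetical illustrations, not the paper's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def exp_model(theta):
    # Two validation experiments, each exercising part of the physics (illustrative).
    return np.array([2.0 * theta[0], theta[0] + 3.0 * theta[1]])

def app_model(theta):
    # Target application combining all the physics (illustrative).
    return 5.0 * theta[0] - 2.0 * theta[1]

# Sample the shared physics parameters and evaluate both simulation models.
thetas = rng.normal(size=(200, 2))
Y_exp = np.array([exp_model(t) for t in thetas])
y_app = np.array([app_model(t) for t in thetas])

# Fit a linear meta-model y_app ~ a0 + Y_exp @ a by least squares.
A = np.column_stack([np.ones(len(Y_exp)), Y_exp])
coef, *_ = np.linalg.lstsq(A, y_app, rcond=None)

# Roll-up: predict the application response from hierarchy measurements.
y_meas = exp_model(np.array([1.0, 0.5]))
y_pred = coef[0] + y_meas @ coef[1:]
```

Because the toy application response is an exact linear combination of the two experiment responses, the meta-model here reproduces the application model; with incomplete coverage, part of the application physics would be unresolvable from the measurements.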


Development of a fourth generation predictive capability maturity model

Hills, Richard G.; Witkowski, Walter R.; Rider, William J.; Trucano, Timothy G.; Urbina, Angel U.

The Predictive Capability Maturity Model (PCMM) is an expert elicitation tool designed to characterize and communicate completeness of the approaches used for computational model definition, verification, validation, and uncertainty quantification associated with an intended application. The primary application of this tool at Sandia National Laboratories (SNL) has been for physics-based computational simulations in support of nuclear weapons applications. The two main goals of a PCMM evaluation are 1) the communication of computational simulation capability, accurately and transparently, and 2) the development of input for effective planning. As a result of the increasing importance of computational simulation to SNL's mission, the PCMM has evolved through multiple generations with the goal to provide more clarity, rigor, and completeness in its application. This report describes the approach used to develop the fourth generation of the PCMM.


Relation of validation experiments to applications

Numerical Heat Transfer, Part B: Fundamentals

Hamilton, J.R.; Hills, Richard G.

Model validation efforts often use a suite of experiments to provide data to test models for predictive use for a targeted application. A question that naturally arises is: "Does the experimental suite provide data to adequately test the target application model?" The goal of this article is to develop methodology to partially address this question. The methodology utilizes computational models for the individual test suite experiments and for the target application to assess coverage. The impact of uncertainties in model parameters on the assessment is addressed. Simple linear and nonlinear heat conduction examples of the methodology are provided. Copyright © Taylor & Francis Group, LLC.
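One simple way to pose the coverage question computationally is to compare parameter sensitivities of the application model with those of the test-suite experiments: if the application's sensitivity vector lies largely in the span of the experiments' sensitivities, the suite exercises the relevant physics. This is a minimal hypothetical sketch, not the article's actual metric; all sensitivity values are invented.

```python
import numpy as np

# Rows: sensitivity of each experiment's response to three model parameters
# (hypothetical values for illustration).
S_exp = np.array([[1.0, 0.0, 0.0],    # experiment 1 exercises parameter 1
                  [0.0, 1.0, 0.0]])   # experiment 2 exercises parameter 2
s_app = np.array([0.6, 0.6, 0.5])     # application depends on all three

# Project the application sensitivity vector onto the row space of S_exp.
proj = S_exp.T @ np.linalg.lstsq(S_exp.T, s_app, rcond=None)[0]

# Fraction of the application's sensitivity resolved by the suite.
coverage = np.linalg.norm(proj) / np.linalg.norm(s_app)
```

Here the third parameter is untested by either experiment, so the coverage ratio falls short of one, flagging physics the suite does not resolve.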


Overview of ASME V&V 20-2009 standard for verification and validation in computational fluid mechanics and heat transfer

Hills, Richard G.

The objective of this Standard is the specification of a verification and validation approach that quantifies the degree of accuracy inferred from the comparison of solution and data for a specified variable at a specified validation point. The approach uses the concepts from experimental uncertainty analysis to consider the errors and uncertainties in both the solution and the data. The scope of this Standard is the quantification of the degree of accuracy of simulation of specified validation variables at a specified validation point for cases in which the conditions of the actual experiment are simulated. Consideration of solution accuracy at points within a domain other than the validation points, for example, interpolation/extrapolation in a domain of validation, is a matter of engineering judgment specific to each family of problems and is beyond the scope of this Standard.
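The comparison at a single validation point in ASME V&V 20 centers on the comparison error E = S - D and a validation uncertainty that combines numerical, input, and experimental uncertainties in quadrature. The numbers below are purely illustrative:

```python
import math

S = 105.3   # simulation result for the validation variable (illustrative)
D = 100.0   # experimental data value (illustrative)

u_num = 1.2     # numerical (discretization/iteration) uncertainty
u_input = 2.0   # uncertainty propagated from simulation inputs
u_D = 1.5       # experimental measurement uncertainty

E = S - D                                              # comparison error
u_val = math.sqrt(u_num**2 + u_input**2 + u_D**2)      # validation uncertainty
```

When |E| is large relative to u_val, the comparison indicates model-form error of roughly the size of E; when |E| is within u_val, the available uncertainties cannot distinguish model error from noise.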


Political dynamics determined by interactions between political leaders and voters

Bernard, Michael L.; Backus, George A.; Hills, Richard G.

The political dynamics associated with an election are typically a function of the interplay between political leaders and voters, as well as endogenous and exogenous factors that impact the perceptions and goals of the electorate. This paper describes an effort by Sandia National Laboratories to model the attitudes and behaviors of various political groups along with that population's primary influencers, such as government leaders. To accomplish this, Sandia National Laboratories is creating a hybrid system dynamics-cognitive model to simulate systems- and individual-level political dynamics in a hypothetical society. The model is based on well-established psychological theory, applied to both individuals and groups within the modeled society. Confidence management processes are being incorporated into the model design process to increase the utility of the tool and assess its performance. This project will enhance understanding of how political dynamics are determined in democratic society.


Some guidance on preparing validation plans for the DART Full System Models

Gray, Genetha A.; Hills, Richard G.

Planning is an important part of computational model verification and validation (V&V) and the requisite planning document is vital for effectively executing the plan. The document provides a means of communicating intent to the typically large group of people, from program management to analysts to test engineers, who must work together to complete the validation activities. This report provides guidelines for writing a validation plan. It describes the components of such a plan and includes important references and resources. While the initial target audience is the DART Full System Model teams in the nuclear weapons program, the guidelines are generally applicable to other modeling efforts. Our goal in writing this document is to provide a framework for consistency in validation plans across weapon systems, different types of models, and different scenarios. Specific details contained in any given validation plan will vary according to application requirements and available resources.


Relation of validation experiments to applications

Hills, Richard G.

Computational and mathematical models are developed in engineering to represent the behavior of physical systems to various system inputs and conditions. These models are often used to predict at other conditions, rather than to just reproduce the behavior of data obtained at the experimental conditions. For example, the boundary or initial conditions, time of prediction, geometry, material properties, and other model parameters can be different at test conditions than those for an anticipated application of a model. Situations for which the conditions may differ include those for which (1) one is in the design phase and a prototype of the system has not been constructed and tested under the anticipated conditions, (2) only one version of a final system can be built and destructive testing is not feasible, or (3) the anticipated design conditions are variable and one cannot easily reproduce the range of conditions with a limited number of carefully controlled experiments. Because data from these supporting experiments have value in model validation, even if the model was tested at different conditions than an anticipated application, methodology is required to evaluate the ability of the validation experiments to resolve the critical behavior for the anticipated application. The methodology presented uses models for the validation experiments and a model for the application to address how well the validation experiments resolve the application. More specifically, the methodology investigates the tradeoff that exists between the uncertainty (variability) in the behavior of the resolved critical variables for the anticipated application and the ability of the validation experiments to resolve this behavior. The important features of this approach are demonstrated through simple linear and non-linear heat conduction examples.


Statistical Validation of Engineering and Scientific Models: A Maximum Likelihood Based Metric

Hills, Richard G.; Trucano, Timothy G.

Two major issues associated with model validation are addressed here. First, we present a maximum likelihood approach to define and evaluate a model validation metric. The advantages of this approach are that it is more easily applied to nonlinear problems than the methods presented earlier by Hills and Trucano (1999, 2001); that it is based on optimization, for which software packages are readily available; and that it can more easily be extended to handle measurement uncertainty and prediction uncertainty with different probability structures. Several examples are presented utilizing this metric. We show conditions under which this approach reduces to the approach developed previously by Hills and Trucano (2001). Second, we expand our earlier discussions (Hills and Trucano, 1999, 2001) on the impact of multivariate correlation and the effect of this on model validation metrics. We show that ignoring correlation in multivariate data can lead to misleading results, such as rejecting a good model when sufficient evidence to do so is not available.
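The correlation point in the abstract can be illustrated with a chi-square-like statistic r^T C^{-1} r on model-minus-data residuals, computed once with the full covariance and once with correlation ignored. The residuals and covariance below are invented for illustration, not data from the report.

```python
import numpy as np

r = np.array([1.0, 1.1, 0.9])           # model-minus-data residuals (illustrative)
C = np.array([[1.0, 0.9, 0.8],          # strongly correlated measurement
              [0.9, 1.0, 0.9],          # uncertainties (illustrative)
              [0.8, 0.9, 1.0]])

q_full = r @ np.linalg.solve(C, r)      # statistic with full covariance
q_diag = r @ (r / np.diag(C))           # statistic ignoring off-diagonal correlation
```

Ignoring the correlation treats the three residuals as independent pieces of evidence, which here roughly doubles the statistic; against a chi-square threshold that inflation can wrongly reject a model that is actually consistent with the data.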


Description of the Sandia Validation Metrics Project

Trucano, Timothy G.; Easterling, Robert G.; Dowding, Kevin J.; Paez, Thomas L.; Urbina, Angel U.; Romero, Vicente J.; Rutherford, Brian M.; Hills, Richard G.

This report describes the underlying principles and goals of the Sandia ASCI Verification and Validation Program Validation Metrics Project. It also gives a technical description of two case studies, one in structural dynamics and the other in thermomechanics, that serve to focus the technical work of the project in Fiscal Year 2001.
