Publications

Results 1–50 of 81

Economic analysis of model validation for a challenge problem

Journal of Verification, Validation and Uncertainty Quantification

Hu, Kenneth; Paez, Thomas L.; Paez, Paul J.; Hasselman, Timothy K.

It is now commonplace for engineers to build mathematical models of the systems they are designing, building, or testing. And it is nearly universally accepted that phenomenological models of physical systems must be validated prior to use for prediction in consequential scenarios. Yet there are certain situations in which testing only, or neither testing nor modeling, may be an economically viable alternative to modeling and its associated testing. This paper develops an economic framework within which benefit–cost can be evaluated for modeling and model validation relative to these other options. The development is presented in terms of a challenge problem, and we provide a numerical example that quantifies when modeling, calibration, and validation yield higher benefit–cost than the testing-only and no-modeling, no-testing options.
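
As a hedged illustration of the kind of benefit–cost comparison the abstract describes, the sketch below scores three acquisition options with assumed costs, success probabilities, and payoffs; none of these numbers, nor the simple expected-net-benefit scoring, come from the paper itself.

```python
# Hypothetical benefit-cost comparison in the spirit of the framework above.
# All dollar figures and probabilities are illustrative assumptions.

options = {
    # option: (cost of the option, probability the fielded design performs, payoff if it performs)
    "model + calibrate + validate": (1.5e6, 0.95, 10.0e6),
    "testing only":                 (3.0e6, 0.90, 10.0e6),
    "no modeling, no testing":      (0.0,   0.60, 10.0e6),
}

def expected_net_benefit(cost, p_success, payoff):
    """Expected payoff weighted by success probability, minus the option's cost."""
    return p_success * payoff - cost

for name, (cost, p, payoff) in options.items():
    enb = expected_net_benefit(cost, p, payoff)
    print(f"{name:30s} expected net benefit = ${enb / 1e6:5.2f}M")
```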

More Details

Why Do Verification and Validation?

Journal of Verification, Validation and Uncertainty Quantification

Hu, Kenneth; Paez, Thomas L.

In this discussion paper, we explore different ways to assess the value of verification and validation (V&V) of engineering models. We first present a literature review on the value of V&V and then use value chains and decision trees to show how value can be assessed from a decision maker’s perspective. In this context, the value is what the decision maker is willing to pay for V&V analysis with the understanding that the V&V results are uncertain. The 2014 Sandia V&V Challenge Workshop is used to illustrate these ideas.
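
A minimal sketch of the decision-tree view of V&V value described above: the most a decision maker should pay for V&V is the gain in expected value the analysis provides. The probabilities and payoffs are invented for illustration, and the sketch uses the simpler perfect-information bound rather than the paper's treatment of uncertain V&V results.

```python
# Decision-tree sketch of the "value of V&V" idea; all numbers are assumed.

p_model_adequate = 0.7          # prior belief that the model is adequate
payoff_deploy_good = 5.0e6      # deploy a design supported by an adequate model
payoff_deploy_bad = -8.0e6      # deploy a design supported by an inadequate model
payoff_do_not_deploy = 0.0

# Without V&V: choose the action with the larger prior expected value.
ev_deploy = p_model_adequate * payoff_deploy_good + (1 - p_model_adequate) * payoff_deploy_bad
ev_without = max(ev_deploy, payoff_do_not_deploy)

# With (idealized, perfect) V&V information: learn the model's adequacy before deciding.
ev_with = (p_model_adequate * max(payoff_deploy_good, payoff_do_not_deploy)
           + (1 - p_model_adequate) * max(payoff_deploy_bad, payoff_do_not_deploy))

print(f"Expected value without V&V: ${ev_without / 1e6:5.2f}M")
print(f"Expected value with V&V:    ${ev_with / 1e6:5.2f}M")
print(f"Upper bound on willingness to pay for V&V: ${(ev_with - ev_without) / 1e6:5.2f}M")
```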

More Details

Laser tracker TSPI uncertainty quantification via centrifuge trajectory

Proceedings of SPIE - The International Society for Optical Engineering

Romero, Edward; Paez, Thomas L.; Brown, Timothy; Miller, Timothy J.

Sandia National Laboratories currently utilizes two laser tracking systems to provide time-space-position-information (TSPI) and high-speed digital imaging of test units under flight. These laser trackers have been in operation for decades under the premise of theoretical accuracies based on system design and operator estimates. Advances in optical imaging and atmospheric tracking technology have enabled opportunities to provide more precise six-degree-of-freedom measurements from these trackers. Applying these technologies to the laser trackers requires a quantified understanding of their current errors and uncertainty. It was well understood that an assortment of variables contributed to laser tracker uncertainty, but the magnitudes of these contributions were not quantified and documented. A series of experiments was performed at Sandia National Laboratories' large centrifuge complex to quantify the TSPI uncertainties of Sandia's laser tracker III. The centrifuge was used to provide repeatable and economical test-unit trajectories for TSPI comparison and uncertainty analysis. On a centrifuge, test units undergo a known trajectory continuously at a known angular velocity. Each revolution may represent an independent test, which may be repeated many times over to produce quantities of data practical for statistical analysis. Previously these tests were performed at Sandia's rocket sled track facility, but they were found to be costly and presented challenges in measuring ground-truth TSPI. The centrifuge, along with on-board measurement equipment, was used to provide the known ground-truth position of the test units. This paper discusses the experimental design and techniques used to arrive at measures of laser tracker error and uncertainty. © 2009 SPIE - The International Society for Optical Engineering.
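
The sketch below illustrates the per-revolution comparison idea in the abstract: a test unit follows a known circular trajectory, each revolution of tracker data is treated as an independent error sample, and summary statistics are computed. The radius, angular velocity, noise level, and number of revolutions are assumptions for illustration, not values from the experiments.

```python
# Per-revolution error statistics against a known centrifuge trajectory.
# Radius, angular velocity, noise level, and revolution count are assumed.
import numpy as np

rng = np.random.default_rng(0)
radius, omega = 8.0, 2.0 * np.pi                 # m, rad/s (assumed)
t = np.linspace(0.0, 1.0, 500, endpoint=False)   # samples over one revolution

errors = []
for _ in range(50):                              # 50 revolutions, treated as independent tests
    truth = np.column_stack([radius * np.cos(omega * t), radius * np.sin(omega * t)])
    measured = truth + rng.normal(scale=0.01, size=truth.shape)   # assumed 1 cm tracker noise
    errors.append(np.linalg.norm(measured - truth, axis=1))

errors = np.concatenate(errors)
print(f"mean position error = {errors.mean() * 1000:.2f} mm")
print(f"95th percentile     = {np.percentile(errors, 95) * 1000:.2f} mm")
```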

More Details

Epistemic uncertainty quantification tutorial

Conference Proceedings of the Society for Experimental Mechanics Series

Swiler, Laura P.; Paez, Thomas L.; Mayes, Randall L.

This paper presents a basic tutorial on epistemic uncertainty quantification methods. Epistemic uncertainty, which characterizes lack of knowledge, is prevalent in many engineering applications. However, the methods available for analyzing and propagating epistemic uncertainty are not nearly as widely used or as well understood as methods for propagating aleatory uncertainty (e.g., inherent variability characterized by probability distributions). We examine three methods used in propagating epistemic uncertainties: interval analysis, Dempster-Shafer evidence theory, and second-order probability. We demonstrate examples of their use on a problem in structural dynamics. © 2009 Society for Experimental Mechanics Inc.
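
A minimal sketch, assuming a toy response y = k/m, of two of the three methods named in the abstract: interval analysis and second-order probability. All parameter intervals, distributions, and sample sizes are assumptions chosen for illustration.

```python
# Two epistemic UQ methods on a toy response y = k / m; all inputs are assumed.
import numpy as np

rng = np.random.default_rng(1)

# --- Interval analysis: epistemic parameters known only to lie within intervals.
k_lo, k_hi = 900.0, 1100.0              # stiffness interval, N/m (assumed)
m_lo, m_hi = 0.9, 1.1                   # mass interval, kg (assumed)
y_lo, y_hi = k_lo / m_hi, k_hi / m_lo   # the model is monotone, so endpoints bound y
print(f"interval analysis:  y in [{y_lo:.1f}, {y_hi:.1f}]")

# --- Second-order probability: epistemic outer loop over an uncertain mean,
#     aleatory inner loop over unit-to-unit stiffness variability (mass fixed at 1 kg).
p95_values = []
for _ in range(200):                                  # epistemic samples
    k_mean = rng.uniform(950.0, 1050.0)               # epistemic: uncertain mean stiffness
    k_samples = rng.normal(k_mean, 25.0, size=2000)   # aleatory: stiffness variability
    p95_values.append(np.percentile(k_samples / 1.0, 95))
print(f"second-order prob.: 95th percentile of y spans [{min(p95_values):.1f}, {max(p95_values):.1f}]")
```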

More Details

Introduction to model validation

Conference Proceedings of the Society for Experimental Mechanics Series

Paez, Thomas L.

The discipline of mathematical model validation is increasing in importance as the value of accurate models of physical systems increases. The fundamental activity of model validation is the comparison of predictions from a mathematical model of a system to the measured behavior of the system. This discussion motivates the need for model validation and introduces some of its preliminary elements. This is the first in a sequence of six tutorial presentations on model validation and introduces the five presentations that follow. © 2009 Society for Experimental Mechanics Inc.

More Details

Probabilistic methods in model validation

Conference Proceedings of the Society for Experimental Mechanics Series

Paez, Thomas L.; Swiler, Laura P.

Extensive experimentation over the past decade has shown that fabricated physical systems that are intended to be identical, and are nominally identical, in fact differ from one another, sometimes substantially. This fact makes it difficult to validate a mathematical model for any system and results in the requirement to characterize physical system behavior using the tools of uncertainty quantification. Further, because system, component, and material uncertainty exists, the mathematical models of these elements sometimes seek to reflect that uncertainty. This presentation introduces some of the methods of probability and statistics and shows how they can be applied in engineering modeling and data analysis. The ideas of randomness and some basic means for measuring and modeling it are presented. The ideas of random experiment, random variable, mean, variance and standard deviation, and probability distribution are introduced. These ideas are introduced in the framework of a practical, yet simple, example; measured data are included. This presentation is the third in a sequence of tutorial discussions on mathematical model validation. The example introduced here is also used in later presentations. © 2009 Society for Experimental Mechanics Inc.
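
A short sketch of the basic quantities named above (sample mean, variance, standard deviation, and a simple distribution model) computed from a set of measurements; the data here are synthetic stand-ins, not the measured data used in the presentation.

```python
# Sample statistics from a set of measurements; data are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(2)
# e.g., peak responses measured on 30 nominally identical units (assumed values)
measurements = rng.normal(loc=12.0, scale=1.5, size=30)

mean = measurements.mean()
variance = measurements.var(ddof=1)       # unbiased sample variance
std_dev = np.sqrt(variance)

print(f"sample mean               = {mean:.2f}")
print(f"sample variance           = {variance:.2f}")
print(f"sample standard deviation = {std_dev:.2f}")
# A simple probability-distribution model for the population: Normal(mean, std_dev).
```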

More Details

Validation of mathematical models using weighted response measures

Conference Proceedings of the Society for Experimental Mechanics Series

Paez, Thomas L.; Massad, Jordan; Hinnerichs, Terry D.; O'Gorman, Chris; Hunter, Patrick

Advancements in our capabilities to accurately model physical systems using high resolution finite element models have led to increasing use of models for prediction of physical system responses. Yet models are typically not used without first demonstrating their accuracy or, at least, adequacy. In high consequence applications where model predictions are used to make decisions or control operations involving human life or critical systems, a movement toward accreditation of mathematical model predictions via validation is taking hold. Model validation is the activity wherein the predictions of mathematical models are demonstrated to be accurate or adequate for use within a particular regime. Though many types of predictions can be made with mathematical models, not all predictions have the same impact on the usefulness of a model. For example, predictions where the response of a system is greatest may be most critical to the adequacy of a model. Therefore, a model that makes accurate predictions in some environments and poor predictions in other environments may be perfectly adequate for certain uses. The current investigation develops a general technique for validating mathematical models where the measures of response are weighted in some logical manner. A combined experimental and numerical example that demonstrates the validation of a system using both weighted and non-weighted response measures is presented.
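
The sketch below shows one way a weighted validation metric of the kind described above might be formed: response measures from test and model are compared using weights that emphasize the largest responses. The specific metric, weighting scheme, and numbers are assumptions, not the technique developed in the paper.

```python
# Weighted vs. unweighted comparison of model predictions to test measurements.
# The response measures, weights, and metric form are illustrative assumptions.
import numpy as np

measured  = np.array([0.8, 2.1, 5.0, 9.5, 3.2])   # response measures from test
predicted = np.array([0.9, 2.0, 4.6, 8.7, 4.0])   # corresponding model predictions

# Weight each measure by its relative magnitude so the largest responses,
# which matter most to model adequacy here, dominate the assessment.
weights = measured / measured.sum()

relative_error = np.abs(predicted - measured) / np.abs(measured)
print(f"unweighted mean relative error = {relative_error.mean():.3f}")
print(f"weighted mean relative error   = {np.sum(weights * relative_error):.3f}")
```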

More Details

Validation of a viscoelastic model for foam encapsulated component response over a wide temperature range

Conference Proceedings of the Society for Experimental Mechanics Series

Hinnerichs, Terry D.; Urbina, Angel U.; Paez, Thomas L.; O'Gorman, Chris; Hunter, Patrick

Accurate material models are fundamental to predictive structural finite element models. Because potting foams are routinely used to mitigate shock and vibration of encapsulated components in electro/mechanical systems, accurate material models of foams are needed. A linear-viscoelastic foam constitutive model has been developed to represent the foam's stiffness and damping throughout an application space defined by temperature, strain rate or frequency, and strain level. Validation of this linear-viscoelastic model, which is integrated into the Salinas structural dynamics code, is being achieved by modeling and testing a series of structural geometries of increasing complexity that have been designed to ensure sensitivity to material parameters. Both experimental and analytical uncertainties are being quantified to ensure a fair assessment of model validity. Quantitative model validation metrics are being developed to provide a means of comparing analytical model predictions to observations made in the experiments. This paper is one of several recent papers documenting the validation process for simple to complex structures with foam encapsulated components. This paper specifically focuses on model validation over a wide temperature range, using a simple dumbbell structure for modal testing and simulation. Material variations of density and modulus have been included. A double-blind validation process is described that brings together test data with model predictions.
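
As a hedged illustration of a linear-viscoelastic representation of stiffness and damping over temperature and frequency, the sketch below evaluates a Prony-series complex modulus with a WLF-style temperature shift. This is a generic textbook form with assumed coefficients, not the constitutive model implemented in Salinas.

```python
# Generic Prony-series complex modulus with a WLF-style temperature shift.
# All coefficients are assumed; this is not the Salinas foam model.
import numpy as np

E_inf = 5.0e6                                   # long-term (rubbery) modulus, Pa
prony_E   = np.array([8.0e6, 4.0e6, 2.0e6])     # Prony stiffnesses, Pa
prony_tau = np.array([1e-3, 1e-2, 1e-1])        # relaxation times at T_ref, s

def shift_factor(T, T_ref=20.0, C1=10.0, C2=100.0):
    """WLF-style time-temperature shift with assumed constants."""
    return 10.0 ** (-C1 * (T - T_ref) / (C2 + (T - T_ref)))

def complex_modulus(freq_hz, T):
    """Storage (real) and loss (imaginary) modulus at a frequency and temperature."""
    w = 2.0 * np.pi * freq_hz
    tau = prony_tau * shift_factor(T)
    terms = prony_E * (1j * w * tau) / (1.0 + 1j * w * tau)
    return E_inf + terms.sum()

for T in (-40.0, 20.0, 60.0):
    E = complex_modulus(100.0, T)
    print(f"T = {T:6.1f} C: storage = {E.real / 1e6:6.2f} MPa, "
          f"loss = {E.imag / 1e6:5.2f} MPa, tan(delta) = {E.imag / E.real:.3f}")
```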

More Details

The history of random vibrations through 1958

Mechanical Systems and Signal Processing

Paez, Thomas L.

Interest in the analysis of random vibrations of mechanical systems started to grow about a half century ago in response to the need for a theory that could accurately predict structural response to jet engine noise and missile launch-induced environments. However, the work that enabled development of the theory of random vibrations started about a half century earlier. This paper discusses contributions to the theory of random vibrations from the time of Einstein to the time of an MIT workshop that was organized by Crandall in 1958. © 2006 Elsevier Ltd. All rights reserved.

More Details

Top-down vs. bottom-up uncertainty quantification for validation of a mechanical joint model

Conference Proceedings of the Society for Experimental Mechanics Series

Hasselman, Timothy; Wathugala, G.W.; Urbina, Angel; Paez, Thomas L.

Mechanical systems behave randomly and it is desirable to capture this feature when making response predictions. Currently, there is an effort to develop predictive mathematical models and test their validity through the assessment of their predictive accuracy relative to experimental results. Traditionally, the approach to quantify modeling uncertainty is to examine the uncertainty associated with each of the critical model parameters and to propagate this through the model to obtain an estimate of uncertainty in model predictions. This approach is referred to as the "bottom-up" approach. However, parametric uncertainty does not account for all sources of the differences between model predictions and experimental observations, such as model form uncertainty and experimental uncertainty due to the variability of test conditions, measurements and data processing. Uncertainty quantification (UQ) based directly on the differences between model predictions and experimental data is referred to as the "top-down" approach. This paper discusses both the top-down and bottom-up approaches and uses the respective stochastic models to assess the validity of a joint model with respect to experimental data not used to calibrate the model, i.e. random vibration versus sine test data. Practical examples based on joint modeling and testing performed by Sandia are presented and conclusions are drawn as to the pros and cons of each approach.
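
A small sketch contrasting the two approaches on a toy model y = k·x: the bottom-up branch propagates assumed parameter uncertainty through the model, while the top-down branch characterizes the differences between nominal predictions and assumed test observations directly. Everything numerical here is illustrative, not the Sandia joint data.

```python
# Bottom-up vs. top-down uncertainty on a toy model y = k * x; all numbers assumed.
import numpy as np

rng = np.random.default_rng(3)
x = 2.0                                          # input condition

# --- Bottom-up: propagate assumed uncertainty in the parameter k through the model.
k_samples = rng.normal(10.0, 0.5, size=5000)
y_bottom_up = k_samples * x
print(f"bottom-up: mean = {y_bottom_up.mean():.2f}, std = {y_bottom_up.std(ddof=1):.2f}")

# --- Top-down: characterize prediction-vs-test differences directly; the residuals
#     lump together parameter, model-form, and experimental uncertainty.
y_nominal = 10.0 * x
y_tests = np.array([20.8, 18.9, 21.5, 19.7, 22.3])   # assumed test observations
residuals = y_tests - y_nominal
print(f"top-down:  bias = {residuals.mean():.2f}, std = {residuals.std(ddof=1):.2f}")
```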

More Details

Inductive model development for lithium-ion batteries to predict life and performance

Proposed for publication in the Electrochemical Society Symposium Publication.

Paez, Thomas L.; Jungst, Rudolph G.; Doughty, Daniel H.

Sandia National Laboratories has been conducting studies on the performance of laboratory and commercial lithium-ion cells and other types of electrochemical cells using inductive models [1]. The objectives of these investigations are: (1) to develop procedures and techniques to rapidly determine performance degradation rates while these cells undergo life tests; (2) to model cell voltage and capacity in order to simulate cell performance characteristics under variable load and temperature conditions; (3) to model rechargeable battery degradation under charge/discharge cycles and many other conditions. The inductive model and methodology are particularly useful when complicated cell performance behaviors are involved, which are often difficult to interpret using simple empirical approaches. We find that the inductive model can be used effectively: (1) to enable efficient predictions of battery life; (2) to characterize system behavior. Inductive models provide convenient tools to characterize system behavior using experimentally or analytically derived data in an efficient and robust framework. The approach does not require detailed phenomenological development. There are certain advantages unique to this approach, among them the ability to avoid measuring hard-to-determine physical parameters or having to understand cell processes sufficiently to write mathematical functions describing their behavior. We used artificial neural networks for inductive modeling, along with ancillary mathematical tools to improve their accuracy. This paper summarizes efforts to use inductive tools for cell and battery modeling. Examples of numerical results will be presented. One of them relates to high-power lithium-ion batteries tested under the U.S. Department of Energy Advanced Technology Development Program for hybrid vehicle applications. Sandia National Laboratories is involved in the development of accelerated life testing and thermal abuse tests to enhance the understanding of power and capacity fade issues and to predict the life of the battery under nominal use conditions. This paper will use the power and capacity fade behaviors of a Ni-oxide-based lithium-ion battery system to illustrate how effectively the inductive model can interpret cell behavior and provide predictions of life. We will discuss the analysis of the fading behavior associated with cell performance and explain how the model can predict cell performance.
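
As a hedged illustration of the inductive-modeling idea, the sketch below fits a small neural network (scikit-learn's MLPRegressor standing in for the authors' network and ancillary tools) to synthetic capacity-fade data and uses it to predict later-cycle capacity. The fade curve, noise, train/test split, and network settings are assumptions, not ATD program data or the authors' architecture.

```python
# A small neural-network regression on synthetic capacity-fade data.
# The fade curve, noise, split, and network settings are all assumed.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)
cycles = np.arange(0, 600, 10, dtype=float)
capacity = 1.0 - 0.25 * (cycles / 600.0) ** 0.8 + rng.normal(0.0, 0.005, cycles.size)

x = (cycles / 600.0).reshape(-1, 1)              # normalized cycle count as the input
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
model.fit(x[:40], capacity[:40])                 # train on early-life cycles only

predicted = model.predict(x[40:])                # predict later-life capacity
rms_error = np.sqrt(np.mean((predicted - capacity[40:]) ** 2))
print(f"RMS prediction error on held-out cycles: {rms_error:.4f} (normalized capacity)")
```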

More Details

Status and Integrated Road-Map for Joints Modeling Research

Segalman, Daniel J.; Smallwood, David O.; Sumali, Hartono (Anton); Paez, Thomas L.; Urbina, Angel U.

The constitutive behavior of mechanical joints is largely responsible for the energy dissipation and vibration damping in weapons systems. For reasons arising from the dramatically different length scales associated with those dissipative mechanisms and the length scales characteristic of the overall structure, this physics cannot be captured adequately through direct simulation of the contact mechanics within a structural dynamics analysis. The only practical method for accommodating the nonlinear nature of joint mechanisms within structural dynamic analysis is through constitutive models employing degrees of freedom natural to the scale of structural dynamics. This document discusses a road-map for developing such constitutive models.

More Details