Publications

49 Results

Network Uncertainty Quantification for Analysis of Multi-Component Systems

ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part B: Mechanical Engineering

Tencer, John T.; efrojas; Schroeder, Benjamin B.

To impact physical mechanical system design decisions and realize the full promise of high-fidelity computational tools, simulation results must be integrated at the earliest stages of the design process. This is particularly challenging when dealing with uncertainty and optimizing for system-level performance metrics, as full-system models (often notoriously expensive and time-consuming to develop) are generally required to propagate uncertainties to system-level quantities of interest. Methods for propagating parameter and boundary condition uncertainty in networks of interconnected components hold promise for enabling design under uncertainty in real-world applications. These methods avoid the need for time-consuming mesh generation of full-system geometries when changes are made to components or subassemblies. Additionally, they explicitly tie full-system model predictions to component/subassembly validation data, which is valuable for qualification. These methods leverage the fact that many engineered systems are inherently modular, being composed of a hierarchy of components and subassemblies that are individually modified or replaced to define new system designs. By doing so, they enable rapid model development and the incorporation of uncertainty quantification earlier in the design process. The resulting formulation of the uncertainty propagation problem is iterative. We express the system model as a network of interconnected component models, which exchange solution information at component boundaries. We present a pair of approaches for propagating uncertainty in this type of decomposed system and provide implementations in the form of an open-source software library. We demonstrate these tools on a variety of applications and show the impact of problem-specific details on the performance and accuracy of the resulting UQ analysis. This work represents the most comprehensive investigation of these network uncertainty propagation methods to date.
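
As a rough illustration of the iterative scheme the abstract describes, the sketch below couples two toy component models that exchange boundary values until the interface converges, then repeats that fixed-point solve for each Monte Carlo sample. The component physics, parameter names, and distributions are hypothetical; this is not the paper's open-source library.

```python
import numpy as np

rng = np.random.default_rng(0)

def component_a(k, t_boundary):
    # Toy component model: boundary temperature given a conductance-like
    # parameter k and the neighboring component's boundary temperature.
    return 300.0 + 0.5 * k * (t_boundary - 300.0)

def component_b(h, t_boundary):
    return 350.0 - 0.3 * h * (t_boundary - 300.0)

def propagate_sample(k, h, tol=1e-8, max_iter=100):
    # Fixed-point iteration: components exchange boundary values until the
    # coupled network solution stops changing.
    t_a, t_b = 300.0, 300.0
    for _ in range(max_iter):
        t_a_new = component_a(k, t_b)
        t_b_new = component_b(h, t_a_new)
        if abs(t_a_new - t_a) + abs(t_b_new - t_b) < tol:
            break
        t_a, t_b = t_a_new, t_b_new
    return t_b  # system-level quantity of interest

# Parameter uncertainty is propagated by re-solving the coupled network
# for each sampled (k, h) pair.
qoi = [propagate_sample(k, h)
       for k, h in zip(rng.normal(1.0, 0.1, 1000), rng.normal(1.0, 0.1, 1000))]
print(f"QoI mean = {np.mean(qoi):.2f}, std = {np.std(qoi):.3f}")
```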

More Details

Credible, Automated Meshing of Images (CAMI)

Roberts, Scott A.; Donohoe, Brendan D.; Martinez, Carianne M.; Krygier, Michael K.; Hernandez-Sanchez, Bernadette A.; Foster, Collin W.; Collins, Lincoln; Greene, Benjamin G.; Noble, David R.; Norris, Chance A.; Potter, Kevin M.; Roberts, Christine C.; Neal, Kyle D.; Bernard, Sylvain R.; Schroeder, Benjamin B.; Trembacki, Bradley L.; LaBonte, Tyler L.; Sharma, Krish S.; Ganter, Tyler G.; Jones, Jessica E.; Smith, Matthew D.

Abstract not provided.

Multi-fidelity electrochemical modeling of thermally activated battery cells

Journal of Power Sources

Voskuilen, Tyler V.; Moffat, Harry K.; Schroeder, Benjamin B.

Thermally activated batteries undergo a series of coupled physical changes during activation that influence battery performance. These processes include energetic material burning, heat transfer, electrolyte phase change, capillary-driven two-phase porous flow, ion transport, electrochemical reactions, and electrical transport. Several of these processes are strongly coupled and have a significant effect on battery performance, but others have minimal impact or may be suitably represented by reduced-order models. Assessing the relative importance of these phenomena must be based on comparisons to a high-fidelity model including all known processes. In this work, we first present and demonstrate a high-fidelity, multi-physics model of electrochemical performance. This novel multi-physics model enables predictions of how competing physical processes affect battery performance and provides unique insights into the difficult-to-measure processes that happen during battery activation. We introduce four categories of model fidelity that include different physical simplifications, assumptions, and reduced-order models to decouple or remove costly elements of the simulation. Using this approach, we show an order-of-magnitude reduction in computational cost while preserving all design-relevant quantities of interest within 5 percent. The validity of this approach and these model reductions is demonstrated by comparison between results from the full fidelity model and the different reduced models.
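
A minimal sketch of the fidelity-screening criterion described above: a reduced-order model is accepted only if every design-relevant quantity of interest stays within a 5% relative tolerance of the high-fidelity result. The QoI names and values here are illustrative, not from the paper.

```python
# Pass/fail screening of a reduced-order model against the high-fidelity
# reference; QoIs and values are hypothetical placeholders.
high_fidelity = {"peak_voltage": 2.10, "activation_time": 0.52, "capacity": 1.48}
reduced_model = {"peak_voltage": 2.06, "activation_time": 0.54, "capacity": 1.44}

def within_tolerance(reference, candidate, rel_tol=0.05):
    # Accept only if every QoI's relative error is within rel_tol.
    return all(abs(candidate[q] - reference[q]) / abs(reference[q]) <= rel_tol
               for q in reference)

print("reduced model acceptable:", within_tolerance(high_fidelity, reduced_model))
```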

More Details

Robust importance sampling for Bayesian model calibration with spatiotemporal data

International Journal for Uncertainty Quantification

Neal, Kyle D.; Schroeder, Benjamin B.; Mullins, Joshua; Subramanian, Abhinav; Mahadevan, Sankaran

This paper addresses two challenges in Bayesian calibration: (1) the computational speed of existing sampling algorithms and (2) calibration with spatiotemporal responses. The commonly used Markov chain Monte Carlo (MCMC) approaches require many sequential model evaluations, making the computational expense prohibitive. This paper proposes an efficient sampling algorithm: iterative importance sampling with genetic algorithm (IISGA). While iterative importance sampling enables computational efficiency, the genetic algorithm enables robustness by preventing sample degeneration and avoiding entrapment in multimodal search spaces. An inflated likelihood further enables robustness in high-dimensional parameter spaces by enlarging the target distribution. Spatiotemporal data complicate both surrogate modeling, which is necessary for expensive computational models, and likelihood estimation. In this work, singular value decomposition is investigated for reducing the high-dimensional field data to a lower-dimensional space prior to Bayesian calibration. The likelihood is then formulated and Bayesian inference is performed in the lower-dimensional latent space. An illustrative example is provided to demonstrate IISGA relative to existing sampling methods, and then IISGA is employed to calibrate a thermal battery model with 26 uncertain calibration parameters and spatiotemporal response data.
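
A minimal sketch of the dimension-reduction step, assuming snapshot matrices of flattened space-time fields: a truncated SVD projects the high-dimensional field data into a latent space where the likelihood can then be evaluated. Array shapes and the variance threshold are illustrative, and the full IISGA sampler is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
# Rows = model runs, columns = flattened space-time field response.
# Synthetic low-rank data stands in for expensive model output fields.
snapshots = rng.normal(size=(200, 10)) @ rng.normal(size=(10, 5000))
snapshots += 0.01 * rng.normal(size=snapshots.shape)

mean_field = snapshots.mean(axis=0)
u, s, vt = np.linalg.svd(snapshots - mean_field, full_matrices=False)

# Keep enough singular vectors to capture ~99% of the variance.
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.99)) + 1
basis = vt[:r]  # (r, 5000) reduced basis

def to_latent(field):
    # Map a full field response to its r latent coordinates.
    return basis @ (field - mean_field)

observed = snapshots[0]  # stand-in for measured field data
print(f"retained {r} of {len(s)} modes; latent shape: {to_latent(observed).shape}")
```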

More Details

Optimizing a falling particle receiver geometry using CFD simulations to maximize the thermal efficiency

AIP Conference Proceedings

Mills, Brantley M.; Schroeder, Benjamin B.; Yue, Lindsey; Shaeffer, Reid; Ho, Clifford K.

A strategy to optimize the thermal efficiency of falling particle receivers (FPRs) in concentrating solar power applications is described in this paper. FPRs are a critical component of a falling particle system, and receiver designs with high thermal efficiencies (~90%) for particle outlet temperatures > 700°C have been targeted for next-generation systems. Advective losses are one of the most significant loss mechanisms for FPRs; hence, this optimization aims to find receiver geometries that passively minimize these losses. The optimization strategy consists of a series of simulations varying different geometric parameters of a conceptual receiver design for the Generation 3 Particle Pilot Plant (G3P3) project, using simplified CFD models of the flow. A linear polynomial surrogate model was fit to the resulting data set, and a global optimization routine was then executed on the surrogate to reveal an optimized receiver geometry that minimized advective losses. This optimized geometry was then evaluated with more rigorous CFD models, revealing a thermal efficiency of 86.9% for an average particle temperature increase of 193.6°C and advective losses less than 3.5% of the total incident thermal power in quiescent conditions.
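
A minimal sketch of the surrogate-then-optimize workflow under stated assumptions: sample a toy stand-in for the CFD advective-loss evaluation over a two-parameter design box, fit a linear polynomial surrogate by least squares, and run a global optimizer on the surrogate. The parameter bounds and loss function are hypothetical, not the G3P3 geometry.

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(2)
bounds = [(1.0, 3.0), (0.2, 1.0)]  # e.g., aperture height, depth ratio (invented)

def cfd_advective_loss(x):
    # Stand-in for an expensive CFD evaluation of fractional advective loss.
    return 0.08 - 0.02 * x[0] + 0.05 * x[1] + 0.002 * rng.normal()

# Sample the design space and fit the linear surrogate by least squares.
X = rng.uniform([b[0] for b in bounds], [b[1] for b in bounds], size=(30, 2))
y = np.array([cfd_advective_loss(x) for x in X])
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def surrogate(x):
    return coef[0] + coef[1] * x[0] + coef[2] * x[1]

# Global optimization is cheap on the surrogate, not the CFD model.
result = differential_evolution(surrogate, bounds, seed=2)
print("optimized geometry:", result.x, "predicted advective loss:", result.fun)
```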

More Details

A causal perspective on reliability assessment

Reliability Engineering and System Safety

Hund, Lauren H.; Schroeder, Benjamin B.

Causality in an engineered system pertains to how a system output changes due to a controlled change or intervention on the system or system environment. Engineered systems designs reflect a causal theory regarding how a system will work, and predicting the reliability of such systems typically requires knowledge of this underlying causal structure. The aim of this work is to introduce causal modeling tools that inform reliability predictions based on biased data sources. We present a novel application of the popular structural causal modeling (SCM) framework to reliability estimation in an engineering application, illustrating how this framework can inform whether reliability is estimable and how to estimate reliability given a set of data and assumptions about the subject matter and data generating mechanism. When data are insufficient for estimation, sensitivity studies based on problem-specific knowledge can inform how much reliability estimates can change due to biases in the data and what information should be collected next to provide the most additional information. We apply the approach to a pedagogical example related to a real, but proprietary, engineering application, considering how two types of biases in data can influence a reliability calculation.
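
As a toy illustration of how a causal adjustment can correct a biased data source when estimating reliability: suppose test articles were over-sampled at one level of an observed covariate Z; reweighting the stratified pass rates by the fielded Z distribution recovers the fielded reliability. All numbers here are invented for illustration, not the proprietary application.

```python
# Pass rates observed in the (biased) test data, stratified by covariate Z.
pass_rate_given_z = {"low_stress": 0.99, "high_stress": 0.90}
# Covariate distribution in the fielded population (not the test population).
fielded_z_dist = {"low_stress": 0.7, "high_stress": 0.3}

# Adjustment formula: reweight stratum-specific pass rates by the fielded
# covariate distribution to estimate fielded reliability.
reliability = sum(pass_rate_given_z[z] * fielded_z_dist[z] for z in fielded_z_dist)
print(f"adjusted reliability estimate: {reliability:.3f}")
```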

More Details

The need for credibility guidance for analyses quantifying margin and uncertainty

Conference Proceedings of the Society for Experimental Mechanics Series

Schroeder, Benjamin B.; Hund, Lauren H.; Kittinger, Robert

Current quantification of margin and uncertainty (QMU) guidance lacks a consistent framework for communicating the credibility of analysis results. Recent efforts at providing QMU guidance have pushed for broadening the analyses supporting QMU results beyond extrapolative statistical models to include a more holistic picture of risk, including information garnered from both experimental campaigns and computational simulations. Credibility guidance would assist in the consideration of belief-based aspects of an analysis. Such guidance exists for presenting computational simulation-based analyses and is under development for the integration of experimental data into computational simulations (calibration or validation), but is absent for the ultimate QMU product resulting from experimental or computational analyses. A QMU credibility assessment framework comprising five elements is proposed: requirement definitions and quantity-of-interest selection, data quality, model uncertainty, calibration/parameter estimation, and validation. By considering and reporting on these elements during a QMU analysis, the decision-maker receives a more complete description of the analysis and is better positioned to understand the risks involved in using the analysis to support a decision. A molten salt battery application is used to demonstrate the proposed QMU credibility framework.

More Details

Separability of Mesh Bias and Parametric Uncertainty for a Full System Thermal Analysis

Journal of Verification, Validation and Uncertainty Quantification

Schroeder, Benjamin B.; Silva, Humberto; Smith, Kyle D.

When making computational simulation predictions of multiphysics engineering systems, sources of uncertainty in the prediction need to be acknowledged and included in the analysis within the current paradigm of striving for simulation credibility. A thermal analysis of an aerospace geometry was performed at Sandia National Laboratories. For this analysis, a verification, validation, and uncertainty quantification (VVUQ) workflow provided structure, resulting in the quantification of significant uncertainty sources including spatial numerical error and material property parametric uncertainty. It was hypothesized that the parametric uncertainty and numerical errors were independent and separable for this application. This hypothesis was supported by performing uncertainty quantification (UQ) simulations at multiple mesh resolutions, while being limited by resources to minimize the number of medium- and high-resolution simulations. Based on this supported hypothesis, a prediction including parametric uncertainty and a systematic mesh bias is used to make a margin assessment that avoids obscuring the results with unnecessary uncertainty and optimizes the use of computing resources.
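
A minimal sketch of the separability assumption in practice: run the parametric UQ study on the inexpensive coarse mesh, estimate a systematic mesh bias from a few matched coarse/fine runs at nominal parameters, and shift the coarse-mesh distribution by that bias before the margin assessment. All values are illustrative, not the aerospace analysis.

```python
import numpy as np

rng = np.random.default_rng(3)
coarse_uq = rng.normal(540.0, 12.0, size=1000)  # coarse-mesh UQ samples (K)

# A handful of matched nominal-parameter runs on each mesh resolution.
coarse_nominal, fine_nominal = 540.0, 533.5
mesh_bias = fine_nominal - coarse_nominal       # systematic spatial error

# Separability hypothesis: the mesh bias adds to the parametric spread.
corrected = coarse_uq + mesh_bias
requirement = 580.0                             # temperature limit (K), invented
margin = requirement - np.percentile(corrected, 97.5)
print(f"mesh bias {mesh_bias:.1f} K, margin to requirement {margin:.1f} K")
```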

More Details

Simple effective conservative treatment of uncertainty from sparse samples of random functions

ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part B: Mechanical Engineering

Romero, Vicente J.; Schroeder, Benjamin B.; Dempsey, James F.; Lewis, John R.; Breivik, Nicole L.; Orient, George E.; Antoun, Bonnie R.; Winokur, Justin W.; Glickman, Matthew R.; Red-Horse, John R.

This paper examines the variability of predicted responses when multiple stress-strain curves (reflecting variability from replicate material tests) are propagated through a finite element model of a ductile steel can being slowly crushed. Over 140 response quantities of interest (including displacements, stresses, strains, and calculated measures of material damage) are tracked in the simulations. Each response quantity’s behavior varies according to the particular stress-strain curves used for the materials in the model. We desire to estimate response variability when only a few stress-strain curve samples are available from material testing. Propagation of just a few samples will usually result in significantly underestimated response uncertainty relative to propagation of a much larger population that adequately samples the presiding random-function source. A simple classical statistical method, Tolerance Intervals, is tested for effectively treating sparse stress-strain curve data. The method is found to perform well on the highly nonlinear input-to-output response mappings and non-standard response distributions in the can-crush problem. The results and discussion in this paper support a proposition that the method will apply similarly well for other sparsely sampled random variable or function data, whether from experiments or models. Finally, the simple Tolerance Interval method is also demonstrated to be very economical.
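
A minimal sketch of a two-sided normal tolerance interval using Howe's approximation for the k-factor: an interval intended to bound the central p fraction of the population with confidence gamma, computed from a handful of samples. The strain values below are invented, not the can-crush data.

```python
import numpy as np
from scipy.stats import norm, chi2

def tolerance_interval(x, p=0.95, gamma=0.90):
    # Two-sided normal tolerance interval, Howe's approximation:
    # k = sqrt(nu * (1 + 1/n) * z^2 / chi2_{1-gamma, nu}), nu = n - 1.
    n = len(x)
    nu = n - 1
    z = norm.ppf((1.0 + p) / 2.0)
    k = np.sqrt(nu * (1.0 + 1.0 / n) * z**2 / chi2.ppf(1.0 - gamma, nu))
    xbar, s = np.mean(x), np.std(x, ddof=1)
    return xbar - k * s, xbar + k * s

peak_strain = np.array([0.062, 0.058, 0.071, 0.066, 0.060])  # 5 sparse samples
lo, hi = tolerance_interval(peak_strain)
print(f"95%/90% tolerance interval: [{lo:.4f}, {hi:.4f}]")
```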

More Details

Separability of Mesh Bias and Parametric Uncertainty for a Full System Thermal Analysis

Schroeder, Benjamin B.; Silva, Humberto; Smith, Kyle D.

When making computational simulation predictions of multi-physics engineering systems, sources of uncertainty in the prediction need to be acknowledged and included in the analysis within the current paradigm of striving for simulation credibility. A thermal analysis of an aerospace geometry was performed at Sandia National Laboratories. For this analysis, a verification, validation, and uncertainty quantification workflow provided structure, resulting in the quantification of significant uncertainty sources including spatial numerical error and material property parametric uncertainty. It was hypothesized that the parametric uncertainty and numerical errors were independent and separable for this application. This hypothesis was supported by performing uncertainty quantification simulations at multiple mesh resolutions, while being limited by resources to minimize the number of medium- and high-resolution simulations. Based on this supported hypothesis, a prediction including parametric uncertainty and a systematic mesh bias is used to make a margin assessment that avoids obscuring the results with unnecessary uncertainty and optimizes computing resources.

More Details

Evaluation of a Class of Simple and Effective Uncertainty Methods for Sparse Samples of Random Variables and Functions

Romero, Vicente J.; Bonney, Matthew S.; Schroeder, Benjamin B.; Weirs, Vincent G.

When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: the central 95% of response, and the 10^-4 probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depend on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large database and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
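
A minimal sketch of the evaluation idea described above: repeated random trials draw sparse samples from a known (here skewed) source distribution and count how often a bounding method, such as the tolerance interval sketched earlier, actually covers the true central 95% of response. The distribution choice and trial settings are illustrative, not the report's test suite.

```python
import numpy as np
from scipy.stats import lognorm, norm, chi2

def ti_bound(x, p=0.95, gamma=0.90):
    # Two-sided normal tolerance interval (Howe's approximation).
    n, nu = len(x), len(x) - 1
    z = norm.ppf((1 + p) / 2)
    k = np.sqrt(nu * (1 + 1 / n) * z**2 / chi2.ppf(1 - gamma, nu))
    return np.mean(x) - k * np.std(x, ddof=1), np.mean(x) + k * np.std(x, ddof=1)

dist = lognorm(s=0.5)                                # skewed "true" source
true_lo, true_hi = dist.ppf(0.025), dist.ppf(0.975)  # true central 95% range

rng = np.random.default_rng(4)
n_trials, n_samples, covered = 10_000, 10, 0
for _ in range(n_trials):
    lo, hi = ti_bound(dist.rvs(size=n_samples, random_state=rng))
    covered += (lo <= true_lo) and (hi >= true_hi)
print(f"bounding reliability with {n_samples} samples: {covered / n_trials:.3f}")
```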

More Details

Robust approaches to quantification of margin and uncertainty for sparse data

Hund, Lauren H.; Schroeder, Benjamin B.; Rumsey, Kelin R.; Murchison, Nicole M.

Characterizing the tails of probability distributions plays a key role in quantification of margins and uncertainties (QMU), where the goal is characterization of low probability, high consequence events based on continuous measures of performance. When data are collected using physical experimentation, probability distributions are typically fit using statistical methods based on the collected data, and these parametric distributional assumptions are often used to extrapolate about the extreme tail behavior of the underlying probability distribution. In this project, we characterize the risk associated with such tail extrapolation. Specifically, we conducted a scaling study to demonstrate the large magnitude of the risk; then, we developed new methods for communicating risk associated with tail extrapolation from unvalidated statistical models; lastly, we proposed a Bayesian data-integration framework to mitigate tail extrapolation risk through integrating additional information. We conclude that decision-making using QMU is a complex process that cannot be achieved using statistical analyses alone.
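
A small sketch of the tail-extrapolation risk the report characterizes: fitting a normal model to a modest sample drawn from a heavier-tailed truth and extrapolating to a deep-tail exceedance threshold can understate the exceedance probability by many orders of magnitude. The distributions, sample size, and threshold are illustrative.

```python
import numpy as np
from scipy.stats import norm, t as student_t

rng = np.random.default_rng(5)
truth = student_t(df=3)                   # heavy-tailed "true" population
data = truth.rvs(size=50, random_state=rng)

# Fit a normal model to the observed data (the unvalidated assumption).
mu, sigma = np.mean(data), np.std(data, ddof=1)
threshold = truth.ppf(1 - 1e-4)           # true 1e-4 exceedance threshold

p_fitted = norm(mu, sigma).sf(threshold)  # extrapolated tail probability
p_true = truth.sf(threshold)              # true exceedance probability
print(f"fitted tail estimate {p_fitted:.2e} vs true {p_true:.1e}")
```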

More Details

Summary of the 2014 Sandia V&V Challenge Workshop

Journal of Verification, Validation and Uncertainty Quantification

Schroeder, Benjamin B.; Hu, Kenneth H.; Mullins, Joshua; Winokur, Justin W.

A discussion of the five responses to the 2014 Sandia Verification and Validation (V&V) Challenge Problem, presented within this special issue, is provided hereafter. Overviews of the challenge problem workshop, workshop participants, and the problem statement are also included. Brief summations of the teams' responses to the challenge problem are provided. Issues that arose throughout the responses that are deemed applicable to the general verification, validation, and uncertainty quantification (VVUQ) community are the main focal point of this paper. The discussion is organized into a big-picture comparison of data and model usage, VVUQ activities, and the differentiating conceptual themes behind the teams' VVUQ strategies. Significant differences are noted in the teams' approaches toward all VVUQ activities, and those deemed most relevant are discussed. Beyond the specific details of VVUQ implementations, thematic concepts are found to create differences among the approaches; some of the major themes are discussed. Lastly, an encapsulation of the key contributions, the lessons learned, and advice for the future is presented.

More Details

Can-crush model and simulations for verifying uncertainty quantification method for sparse stress-strain curve data

ASME International Mechanical Engineering Congress and Exposition, Proceedings (IMECE)

Dempsey, James F.; Romero, Vicente J.; Breivik, Nicole L.; Orient, G.; Antoun, Bonnie R.; Schroeder, Benjamin B.; Lewis, John R.; Winokur, Justin W.

This work examines the variability of predicted responses when multiple stress-strain curves (reflecting variability from replicate material tests) are propagated through a transient dynamics finite element model of a ductile steel can being slowly crushed. An elastic-plastic constitutive model is employed in the large-deformation simulations. The present work assigns the same material to all the can parts: lids, walls, and weld. Time histories of 18 response quantities of interest (including displacements, stresses, strains, and calculated measures of material damage) at several locations on the can and various points in time are monitored in the simulations. Each response quantity's behavior varies according to the particular stress-strain curves used for the materials in the model. We estimate response variability due to variability of the input material curves. When only a few stress-strain curves are available from material testing, response variance will usually be significantly underestimated. This is undesirable for many engineering purposes. This paper describes the can-crush model and simulations used to evaluate a simple classical statistical method, Tolerance Intervals (TIs), for effectively compensating for sparse stress-strain curve data in the can-crush problem. Using the simulation results presented here, the accuracy and reliability of the TI method are being evaluated on the highly nonlinear input-to-output response mappings and non-standard response distributions in the can-crush UQ problem.

More Details