Publications


Improved Uncertainty Quantification with Advanced Reactor Application

Mousseau, Vincent A.

This document provides an overview of the economic and technical challenges of bringing small modular reactors to market and then presents an outline for how to address those challenges. The purpose of this project was to proactively design software for its intended use, strategically positioning future work. This project seeks to augment the short-term, stop-gap approach of using legacy software well outside its range of applicability.


Simple Heat Pipe Model

Mousseau, Vincent A.; Clark, Andrew

This is a simple model designed to run fast while still maintaining the key physics and feedback mechanisms of a heat pipe. First, the capillary pressure is a function of the liquid working-fluid volume fraction. Second, boiling and condensation are driven by the saturation temperature, which is determined by the heat pipe pressure. When the pressure rises, the saturation temperature rises and the vapor rains on the wick. When the pressure falls, the saturation temperature falls and the liquid throughout the wick boils. This is how the heat pipe adjusts to stay robust under different temperatures and heat fluxes.
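
A minimal sketch of those two feedback mechanisms, assuming a Clausius-Clapeyron saturation curve; the function names and the roughly sodium-like property values are hypothetical stand-ins, not taken from the model itself:

```python
import math

# Illustrative constants (hypothetical, roughly sodium-like values)
P_REF = 1.0e4    # reference pressure [Pa]
T_REF = 1154.0   # saturation temperature at P_REF [K]
H_FG = 3.9e6     # latent heat of vaporization [J/kg]
R_V = 361.0      # specific gas constant of the vapor [J/(kg K)]
SIGMA = 0.12     # surface tension [N/m]
R_PORE = 1.0e-5  # effective wick pore radius [m]

def t_sat(pressure):
    """Saturation temperature from pressure via Clausius-Clapeyron:
    higher pressure gives a higher saturation temperature."""
    return 1.0 / (1.0 / T_REF - (R_V / H_FG) * math.log(pressure / P_REF))

def capillary_pressure(alpha_liquid):
    """Capillary pressure as a function of the liquid volume fraction in the
    wick: a drier wick (smaller alpha) pumps harder, up to the pore limit."""
    return (2.0 * SIGMA / R_PORE) * (1.0 - alpha_liquid)

def phase_change_rate(t_wick, pressure, k_evap=1.0e-3):
    """Evaporation (positive) or condensation (negative) driven by the gap
    between the wick temperature and the saturation temperature. When the
    pressure rises, t_sat rises and vapor condenses ("rains") on the wick;
    when the pressure falls, t_sat falls and the liquid in the wick boils."""
    return k_evap * (t_wick - t_sat(pressure))

if __name__ == "__main__":
    print(f"capillary pressure at alpha = 0.6: {capillary_pressure(0.6):.0f} Pa")
    for p in (0.5e4, 1.0e4, 2.0e4):
        print(f"P = {p:8.0f} Pa -> T_sat = {t_sat(p):7.1f} K, "
              f"phase change at 1160 K: {phase_change_rate(1160.0, p):+.2e} kg/s")
```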


Preliminary Implementation of Two-Dimensional Cartesian Solver in CTF-R

Bullerwell, Lance B.; Porter, N.W.; Mousseau, Vincent A.

Sub-channel codes are among the modeling and simulation tools used for thermal-hydraulic analysis of nuclear reactors; a few examples are the COolant Boiling in Rod Arrays (COBRA) family of codes. The approximations used to simplify the fluid conservation equations into sub-channel form, mainly that of axially dominated flow, lead to noticeable limitations of sub-channel solvers for problems with significant flow in lateral directions. In this report, a two-dimensional Cartesian solver is developed and implemented within CTF-R, the residual solver in the North Carolina State University version of COBRA-TF (CTF). The new solver will enable CTF to simulate flow that is not axially dominated. The appropriate Cartesian forms of the conservation equations are derived and implemented in the solver, and the process of constructing the matrix system is altered to solve a two-dimensional staggered-grid system. A simple case, with no source terms and no flow in the lateral direction, was used to test that the two-dimensional Cartesian solver is accurate. The results show that the solver was able to run the simple case and converge to a steady-state solution. Future work will focus on testing existing capabilities with cases that include transients and equation cross-terms, and on adding capabilities such as source terms and three-dimensional cases.
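
For illustration, a minimal sketch of a mass-conservation residual on a two-dimensional staggered Cartesian grid, with cell-centered densities and face-centered velocities. This is a generic first-order upwind discretization under assumed boundary conditions, not the CTF-R implementation:

```python
import numpy as np

def mass_residual_2d(rho, u, v, rho_old, dx, dy, dt):
    """Mass-conservation residual on a 2D staggered Cartesian grid.
    Density rho lives at cell centers (nx, ny); the x-velocity u lives on
    vertical faces (nx+1, ny) and the y-velocity v on horizontal faces
    (nx, ny+1). Face densities are first-order upwind; boundary fluxes are
    assumed zero for simplicity."""
    nx, ny = rho.shape

    # Upwind mass flux on interior x-faces
    flux_x = np.zeros((nx + 1, ny))
    uf = u[1:nx, :]
    flux_x[1:nx, :] = np.where(uf >= 0.0, rho[:-1, :], rho[1:, :]) * uf

    # Upwind mass flux on interior y-faces
    flux_y = np.zeros((nx, ny + 1))
    vf = v[:, 1:ny]
    flux_y[:, 1:ny] = np.where(vf >= 0.0, rho[:, :-1], rho[:, 1:]) * vf

    # Residual: time derivative plus net outflow per unit volume
    div = (flux_x[1:, :] - flux_x[:-1, :]) / dx \
        + (flux_y[:, 1:] - flux_y[:, :-1]) / dy
    return (rho - rho_old) / dt + div

if __name__ == "__main__":
    nx, ny = 4, 3
    rho = np.ones((nx, ny))
    res = mass_residual_2d(rho, np.zeros((nx + 1, ny)), np.zeros((nx, ny + 1)),
                           rho.copy(), dx=0.1, dy=0.1, dt=1.0e-3)
    # A uniform, no-flow state should be an exact steady solution
    print("max |residual| for a uniform no-flow state:", np.abs(res).max())
```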


Validation Metrics for Fixed Effects and Mixed-Effects Calibration

Journal of Verification, Validation and Uncertainty Quantification

Porter, N.W.; Maupin, Kathryn A.; Swiler, Laura P.; Mousseau, Vincent A.

The modern scientific process often involves the development of a predictive computational model. To improve its accuracy, a computational model can be calibrated to a set of experimental data. A variety of validation metrics can be used to quantify this process. Some of these metrics have direct physical interpretations and a history of use, while others, especially those for probabilistic data, are more difficult to interpret. In this work, a variety of validation metrics are used to quantify the accuracy of different calibration methods. Frequentist and Bayesian perspectives are used with both fixed-effects and mixed-effects statistical models. Through a quantitative comparison of the resulting distributions, the most accurate calibration method can be selected. Two examples are included that compare the results of various validation metrics for different calibration methods. It is quantitatively shown that, in the presence of significant laboratory biases, a fixed-effects calibration is significantly less accurate than a mixed-effects calibration. This is because the mixed-effects statistical model better characterizes the underlying parameter distributions than the fixed-effects model. The results suggest that validation metrics can be used to select the most accurate calibration model for a particular empirical model with corresponding experimental data.
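
As one concrete example of such a metric, the area validation metric integrates the absolute difference between the empirical CDFs of model predictions and experimental data. The sketch below is illustrative only, with synthetic distributions standing in for the biased and unbiased calibrations; it is not the full set of metrics compared in the paper:

```python
import numpy as np

def area_validation_metric(model_samples, data_samples):
    """Area between the empirical CDFs of model predictions and experimental
    data: zero means the two distributions agree; larger values indicate a
    less accurate calibration."""
    grid = np.sort(np.concatenate([model_samples, data_samples]))
    f_model = np.searchsorted(np.sort(model_samples), grid, side="right") / model_samples.size
    f_data = np.searchsorted(np.sort(data_samples), grid, side="right") / data_samples.size
    # Both empirical CDFs are step functions that only change at grid points,
    # so a left-rectangle rule integrates |F_model - F_data| exactly.
    return np.sum(np.abs(f_model - f_data)[:-1] * np.diff(grid))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.normal(1.0, 0.2, 200)       # stand-in "experimental" data
    biased = rng.normal(1.3, 0.2, 200)     # a calibration with a residual bias
    unbiased = rng.normal(1.0, 0.25, 200)  # a calibration that absorbs the bias
    print("biased calibration:  ", area_validation_metric(biased, data))
    print("unbiased calibration:", area_validation_metric(unbiased, data))
```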


Local Truncation Error-Informed Code Verification

Journal of Verification, Validation and Uncertainty Quantification

Krueger, Aaron M.; Mousseau, Vincent A.; Hassan, Yassin H.

The method of manufactured solutions (MMS) has become increasingly popular for conducting code verification studies on predictive codes, such as nuclear power system codes and computational fluid dynamics codes. The approach is popular because it can be used when an analytical solution is not available. Using MMS, code developers are able to verify that their code is free of coding errors that impact the observed order of accuracy. While MMS is an excellent tool for code verification, it does not identify coding errors that are of the same order as the numerical method. This paper presents a method that combines MMS with modified equation analysis (MEA), which calculates the local truncation error (LTE), to identify coding errors up to and including the order of the numerical method. This method is referred to as the modified equation analysis method of manufactured solutions (MEAMMS). MEAMMS is then applied to a custom-built code that solves the shallow water equations to test the performance of the code verification method. MEAMMS is able to detect all coding errors that impact the implementation of the numerical scheme. To show how MEAMMS differs from MMS, both are applied to the same first-order numerical method with a first-order coding error; only MEAMMS is able to identify it. In short, MEAMMS identifies a larger set of coding errors while still identifying all of the coding errors that MMS can.
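
A minimal sketch of the MEAMMS idea for forward-Euler, first-order upwind advection: apply the discrete operator to the exact manufactured solution, subtract the MMS source term, and compare the remainder against the leading truncation error predicted by modified equation analysis. The manufactured solution and scheme here are illustrative choices, not those of the paper's shallow-water code:

```python
import math

# Manufactured solution for u_t + a*u_x = s(x, t), with a = const.
A = 1.0
def u_ms(x, t):   return math.sin(x) * math.cos(t)
def u_x(x, t):    return math.cos(x) * math.cos(t)
def u_xx(x, t):   return -math.sin(x) * math.cos(t)
def u_t(x, t):    return -math.sin(x) * math.sin(t)
def u_tt(x, t):   return -math.sin(x) * math.cos(t)
def source(x, t): return u_t(x, t) + A * u_x(x, t)

def discrete_lte(x, t, dx, dt):
    """Apply the forward-Euler / first-order-upwind operator to the exact
    manufactured solution and subtract the source: the remainder is the
    local truncation error of the scheme at (x, t)."""
    lhs = (u_ms(x, t + dt) - u_ms(x, t)) / dt \
        + A * (u_ms(x, t) - u_ms(x - dx, t)) / dx
    return lhs - source(x, t)

def mea_lte(x, t, dx, dt):
    """Leading-order truncation error predicted by modified equation
    analysis, (dt/2) u_tt - (a dx/2) u_xx, evaluated with the analytic
    derivatives of the manufactured solution."""
    return 0.5 * dt * u_tt(x, t) - 0.5 * A * dx * u_xx(x, t)

if __name__ == "__main__":
    x0, t0 = 0.7, 0.3
    for dx in (1e-2, 5e-3, 2.5e-3):
        dt = 0.5 * dx / A  # fixed CFL number
        d, m = discrete_lte(x0, t0, dx, dt), mea_lte(x0, t0, dx, dt)
        print(f"dx={dx:.4e}: discrete LTE={d:+.3e}, MEA prediction={m:+.3e}, "
              f"ratio={d/m:.3f}")  # ratio -> 1 as the grid is refined
```

A coding error of the same order as the scheme (for example, a wrong factor in the upwind difference) shifts the discrete LTE away from the MEA prediction, even though the observed order of accuracy stays first order.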


Bayesian calibration of empirical models common in MELCOR and other nuclear safety codes

18th International Topical Meeting on Nuclear Reactor Thermal Hydraulics, NURETH 2019

Porter, N.W.; Mousseau, Vincent A.

In modern scientific analyses, physical experiments are often supplemented with computational modeling and simulation. This is especially true in the nuclear power industry, where experiments are prohibitively expensive, or impossible, due to extreme scales, high temperatures, high pressures, and the presence of radiation. To qualify these computational tools, it is necessary to perform software quality assurance, verification, validation, and uncertainty quantification. As part of this broad process, the uncertainty of empirically derived models must be quantified. In this work, three commonly used thermal hydraulic models are calibrated to experimental data. The empirical equations are used to determine single phase friction factor in smooth tubes, single phase heat transfer coefficient for forced convection, and the transfer of mass between two phases. Bayesian calibration methods are used to estimate the posterior distribution of the parameters given the experimental data. In cases where it is appropriate, mixed-effects hierarchical calibration methods are utilized. The analyses presented in this work result in justified and reproducible joint parameter distributions which can be used in future uncertainty analysis of nuclear thermal hydraulic codes. When using these joint distributions, uncertainty in the output will be lower than traditional methods of determining parameter uncertainty. The lower uncertainties are more representative of the state of knowledge for the phenomena analyzed in this work.
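
As a toy illustration of the Bayesian approach, and not the paper's actual models, data, or sampler, a random-walk Metropolis sampler can calibrate a Blasius-like friction-factor correlation f = C * Re^-m to synthetic data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "experimental" friction-factor data around a Blasius-like law
re = np.logspace(4, 5, 30)
f_obs = 0.316 * re**-0.25 * np.exp(rng.normal(0.0, 0.05, re.size))

def log_posterior(theta):
    """Log posterior for theta = (log C, m) of f = C * Re^-m, with a
    Gaussian likelihood in log space and weak uniform priors."""
    log_c, m = theta
    if not (-3.0 < log_c < 1.0 and 0.0 < m < 1.0):
        return -np.inf
    resid = np.log(f_obs) - (log_c - m * np.log(re))
    return -0.5 * np.sum((resid / 0.05) ** 2)

# Random-walk Metropolis sampling of the posterior
theta = np.array([np.log(0.3), 0.2])
samples, lp = [], log_posterior(theta)
for _ in range(20000):
    prop = theta + rng.normal(0.0, [0.01, 0.005])
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # accept/reject step
        theta, lp = prop, lp_prop
    samples.append(theta.copy())
samples = np.array(samples[5000:])  # discard burn-in

c_post = np.exp(samples[:, 0])
print(f"C: mean={c_post.mean():.3f} sd={c_post.std():.3f}")
print(f"m: mean={samples[:, 1].mean():.3f} sd={samples[:, 1].std():.3f}")
```

The resulting joint samples of (C, m), rather than independent per-parameter intervals, are what carry the reduced, correlated uncertainty forward into downstream analyses.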


Rigorous code verification: An additional tool to use with the method of manufactured solutions

ASME 2019 Verification and Validation Symposium, VVS 2019

Krueger, Aaron M.; Mousseau, Vincent A.; Hassan, Yassin A.

The Method of Manufactured Solutions (MMS) has proven to be useful for completing code verification studies. MMS allows the code developer to verify that the observed order-of-accuracy matches the theoretical order-of-accuracy. Even though the solution to the partial differential equation is not intuitive, it provides an exact solution to a problem that most likely could not be solved analytically. The code developer can then use the exact solution as a debugging tool. While the order-of-accuracy test has historically been treated as the most rigorous of all code verification methods, it fails to indicate code "bugs" that are of the same order as the theoretical order-of-accuracy. The only way to test for these types of code bugs is to verify that the theoretical local truncation error for a particular grid matches the difference between the manufactured solution (MS) and the solution on that grid. The theoretical local truncation error can be computed by using modified equation analysis (MEA) with the MS and its analytic derivatives, an approach we call the modified equation analysis method of manufactured solutions (MEAMMS). In addition to describing the MEAMMS process, this study shows the results of completing a code verification study on a conservation of mass code. The code was able to compute the leading truncation error term as well as additional higher-order terms. When the code verification process was complete, not only did the observed order-of-accuracy match the theoretical order-of-accuracy for all numerical schemes implemented in the code, but the method was also able to cancel the discretization error to within round-off error for a 64-bit system.
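
For reference, the order-of-accuracy test mentioned above computes the observed order from discretization errors on successively refined grids. A minimal sketch with illustrative error values, not results from the paper:

```python
import math

def observed_order(err_coarse, err_fine, refinement_ratio=2.0):
    """Observed order of accuracy from errors on two grids:
    p = log(E_coarse / E_fine) / log(r)."""
    return math.log(err_coarse / err_fine) / math.log(refinement_ratio)

# Hypothetical errors under repeated halving of the grid spacing;
# a first-order scheme should give p -> 1 as the grid is refined.
errors = [4.1e-2, 2.06e-2, 1.03e-2]
for e_c, e_f in zip(errors, errors[1:]):
    print(f"observed order p = {observed_order(e_c, e_f):.3f}")
```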


Uncertainty quantification study of CTF for the OECD/NEA LWR uncertainty analysis in modeling benchmark

Nuclear Science and Engineering

Porter, Nathan W.; Avramova, Maria N.; Mousseau, Vincent A.

This work describes the results of a quantitative uncertainty analysis of the thermal-hydraulic subchannel code for nuclear engineering applications, Coolant Boiling in Rod Arrays-Three Field (COBRA-TF). The analysis uses CTF, a version of COBRA-TF developed in cooperation between the Consortium for Advanced Simulation of Light Water Reactors and North Carolina State University. Four steady-state cases from Phase II Exercise 3 of the Organisation for Economic Co-operation and Development/Nuclear Energy Agency Light Water Reactor Uncertainty Analysis in Modeling (UAM) Benchmark are analyzed using the statistical analysis tool Design Analysis Kit for Optimization and Terascale Applications (Dakota). The input parameters include boundary condition, geometry, and modeling uncertainties, which are selected using a sensitivity study and then defined based on expert judgment. A forward uncertainty quantification method with Latin hypercube sampling (LHS) is used, where the sample size is based on available computational resources. The means and standard deviations of thermal-hydraulic quantities of interest are reported, as well as the Spearman rank correlation coefficients between the inputs and outputs. The means and standard deviations are accompanied by their respective standard errors, and the correlation coefficients are tested for statistical significance. The quantities of interest include void fractions, temperatures, and pressure drops. The predicted uncertainty remains relatively low for all quantities of interest. The dominant sources of uncertainty are identified. For cases based on experiments, two different validation metrics are used to quantify the difference between measured and predicted void fractions. The results compare well with past studies, but with a number of improvements: the use of an updated CTF input deck based on the current UAM specification and the most recent version of CTF, the use of an LHS method, an analysis of standard errors for the statistical results, and a quantitative comparison to experimental data. Though the statistical uncertainty analysis framework presented herein is applied to thermal-hydraulic analyses, it is generally applicable to any simulation tool. Given a specified amount of computational resources, it can be used to quantify statistical significance through the use of fundamental statistical analyses. This is in contrast with the prevailing methods in nuclear engineering, which instead provide the sample size necessary to achieve a specified level of statistical certainty.
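
A minimal sketch of the statistical core of this workflow, using SciPy in place of Dakota, with hypothetical input ranges and a stand-in code output: Latin hypercube sampling of uncertain inputs followed by Spearman rank correlations between inputs and output:

```python
import numpy as np
from scipy.stats import qmc, spearmanr

rng = np.random.default_rng(2)

# Latin hypercube sample of three hypothetical uncertain input multipliers
# (stand-ins for boundary-condition, geometry, and model uncertainties)
sampler = qmc.LatinHypercube(d=3, seed=2)
unit = sampler.random(n=200)
lower = np.array([0.95, 0.90, 0.80])
upper = np.array([1.05, 1.10, 1.20])
x = qmc.scale(unit, lower, upper)

# Stand-in "code output" (e.g., a void fraction) with unequal sensitivities;
# in practice each row of x would drive one code run.
y = 0.4 * x[:, 0] + 0.1 * x[:, 1] + 0.02 * x[:, 2] + rng.normal(0, 0.005, 200)

print(f"mean = {y.mean():.4f} +/- {y.std(ddof=1) / np.sqrt(y.size):.4f} (standard error)")
for i in range(3):
    rho, p = spearmanr(x[:, i], y)  # rank correlation and its p-value
    print(f"input {i}: Spearman rho = {rho:+.3f}, p-value = {p:.2e}")
```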


Uncertainty quantification's role in modeling and simulation planning, and credibility assessment through the predictive capability maturity model

Handbook of Uncertainty Quantification

Rider, William J.; Witkowski, Walter R.; Mousseau, Vincent A.

The importance of credible, trustworthy numerical simulations is obvious, especially when the results are used to make high-consequence decisions. Determining the credibility of such numerical predictions is much more difficult and requires a systematic approach to assessing predictive capability, associated uncertainties, and overall confidence in the computational simulation process for the intended use of the model. This process begins with an evaluation of the computational modeling of the identified, important physics of the simulation for its intended use. This is commonly done through a Phenomena Identification Ranking Table (PIRT). Then an assessment of the evidence basis supporting the ability to computationally simulate these physics can be performed using various frameworks such as the Predictive Capability Maturity Model (PCMM). Several critical activities follow in the areas of code and solution verification, validation, and uncertainty quantification, which are described in detail in the following sections. The subject matter is introduced for general applications, but specifics are given for the failure prediction project. The first task that must be completed in the verification and validation procedure is a credibility assessment to fully understand the requirements and limitations of the current computational simulation capability for the specific intended use. The PIRT and PCMM are tools used at Sandia National Laboratories (SNL) to perform such an assessment in a consistent manner. Ideally, all stakeholders should be represented and should contribute to an accurate credibility assessment. The PIRT and PCMM are both described briefly below, and the resulting assessments for an example project are given.


CASL Verification and Validation Plan

Mousseau, Vincent A.; Dinh, Nam D.

This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. This is a living document that will track CASL's progress on verification and validation for both the CASL codes (including MPACT, CTF, BISON, and MAMBA) and the CASL challenge problems (CIPS, PCI, and DNB). The CASL codes and challenge problems are at differing levels of maturity with respect to verification and validation. The gap analysis will summarize additional work that needs to be done, and additional VVUQ work will be performed as resources permit. This report is prepared for the Department of Energy's (DOE's) CASL program in support of milestone CASL.P13.02.
