Publications

Results 1–25 of 49

Development and implementation of a CTF code verification suite

Nuclear Engineering and Design

Porter, N.W.; Salko, Robert K.; Pilch, Martin P.

CTF is a thermal hydraulic subchannel code developed to predict light water reactor (LWR) core behavior. It is a version of Coolant Boiling in Rod Arrays (COBRA) developed by Oak Ridge National Laboratory (ORNL) and North Carolina State University (NCSU) and used in the Consortium for Advanced Simulation of LWRs (CASL). In this work, the existing CTF code verification matrix is expanded, which ensures that the code is a faithful representation of the underlying mathematical model. The suite of code verification tests is mapped to the underlying conservation equations of CTF, and significant gaps are addressed. To fill these gaps, five new problems are incorporated: isokinetic advection, conduction, pressure drop, convection, and pipe boiling. Convergence behavior and numerical errors are quantified for each of the tests, and all tests converge at the correct rate to their corresponding analytic solutions. A new verification utility that generalizes the code verification process is used to incorporate these problems into the CTF automated test suite.
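
As a sketch of the convergence check described in this abstract (not CTF's actual verification utility), the observed order of accuracy can be estimated from errors computed against the analytic solution on successively refined grids:

```python
import math

def observed_order(error_coarse, error_fine, refinement_ratio):
    """Estimate the observed order of accuracy p from errors on two grids.

    For a scheme with error ~ C * h^p, p = log(E_coarse / E_fine) / log(r),
    where r is the grid refinement ratio (h_coarse / h_fine).
    """
    return math.log(error_coarse / error_fine) / math.log(refinement_ratio)

# Hypothetical L2 errors against an analytic solution on grids refined by 2x;
# a formally first-order scheme should give p approaching 1.
errors = [4.0e-2, 2.1e-2, 1.02e-2]
for e_coarse, e_fine in zip(errors, errors[1:]):
    print(f"observed order p = {observed_order(e_coarse, e_fine, 2.0):.2f}")
```

A test of this kind passes when the observed order approaches the scheme's formal order under continued grid refinement.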

Probability of loss of assured safety in systems with multiple time-dependent failure modes

Pilch, Martin P.; Sallaberry, Cedric J.

Weak link (WL)/strong link (SL) systems are important parts of the overall operational design of high-consequence systems. In such designs, the SL system is very robust and is intended to permit operation of the entire system under, and only under, intended conditions. In contrast, the WL system is intended to fail in a predictable and irreversible manner under accident conditions and render the entire system inoperable before an accidental operation of the SL system. The likelihood that the WL system will fail to deactivate the entire system before the SL system fails (i.e., degrades into a configuration that could allow an accidental operation of the entire system) is referred to as probability of loss of assured safety (PLOAS). Representations for PLOAS for situations in which both link physical properties and link failure properties are time-dependent are derived and numerically evaluated for a variety of WL/SL configurations, including PLOAS defined by (i) failure of all SLs before failure of any WL, (ii) failure of any SL before failure of any WL, (iii) failure of all SLs before failure of all WLs, and (iv) failure of any SL before failure of all WLs. The effects of aleatory uncertainty and epistemic uncertainty in the definition and numerical evaluation of PLOAS are considered.
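
As an illustration of how such a probability can be evaluated, the following is a minimal Monte Carlo sketch with hypothetical exponential failure-time models, not the quadrature-based representations derived in the paper:

```python
import random

def ploas_mc(sample_sl, sample_wl, n_sl, n_wl, trials=100_000):
    """Monte Carlo estimate of PLOAS under definition (i):
    all SLs fail before any WL fails, i.e. max(SL times) < min(WL times).

    sample_sl / sample_wl each draw one failure time for a single link.
    """
    count = 0
    for _ in range(trials):
        sl_times = [sample_sl() for _ in range(n_sl)]
        wl_times = [sample_wl() for _ in range(n_wl)]
        if max(sl_times) < min(wl_times):
            count += 1
    return count / trials

# Hypothetical exponential failure-time models; the WLs are designed to
# fail sooner (larger rate), so the estimated PLOAS should be small.
random.seed(1)
p = ploas_mc(lambda: random.expovariate(1.0),   # SL failure rate
             lambda: random.expovariate(5.0),   # WL fails ~5x faster
             n_sl=2, n_wl=2)
print(f"estimated PLOAS, definition (i): {p:.4f}")
```

The other three definitions follow by swapping max and min over the SL and WL failure times.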

We underestimate uncertainties in our predictions

Pilch, Martin P.

Prediction is defined in the American Heritage Dictionary as follows: 'To state, tell about, or make known in advance, especially on the basis of special knowledge.' What special knowledge do we demand of modeling and simulation to assert that we have a predictive capability for high-consequence applications? The 'special knowledge' question can be answered in two dimensions: the process and rigor by which modeling and simulation is executed, and the assessment results for the specific application. Here we focus on the process and rigor dimension and address predictive capability in terms of six attributes: (1) geometric and representational fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) validation, and (6) uncertainty quantification. This presentation will demonstrate through mini-tutorials, simple examples, and numerous case studies how each attribute creates opportunities for errors, biases, or uncertainties to enter into simulation results. The demonstrations will motivate a set of practices that minimize the risk of using modeling and simulation for high-consequence applications while defining important research directions. It is recognized that there are cultural, technical, infrastructure, and resource barriers that prevent analysts from performing all analyses at the highest levels of rigor. Consequently, the audience for this talk is (1) analysts, so they can know what is expected of them, (2) decision makers, so they can know what to expect from modeling and simulation, and (3) the R&D community, so they can address the technical and infrastructure issues that prevent analysts from executing analyses in a practical, timely, and high-quality manner.

Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan: ASC software quality engineering practices, Version 3.0

Turgeon, Jennifer T.; Minana, Molly A.; Pilch, Martin P.

The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Administration (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. The plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress toward their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. The plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.

Formulation of the thermal problem

Computer Methods in Applied Mechanics and Engineering

Dowding, Kevin J.; Pilch, Martin P.; Hills, Richard G.

This paper describes the thermal problem and presents the experimental data for validation. The thermal problem involves validating a model for heat conduction in a solid. The mathematical model is based on one-dimensional, linear heat conduction in a solid slab with heat flux boundary conditions. Experimental data from a series of material characterization, validation, and accreditation experiments related to the mathematical model are provided. The objective is to use the series of experiments to assess the model, and then use the model to predict regulatory performance relative to a regulatory requirement. The regulatory requirement is defined in terms of the probability that a surface temperature will not exceed a specified temperature at the regulatory conditions.
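
A minimal sketch of the regulatory metric follows, assuming hypothetical material properties, a simple explicit finite-difference model of the slab, and an illustrative uncertainty in thermal conductivity; none of these values or models come from the paper itself:

```python
import random

def surface_temp(k, rho_c, q_flux, L, t_end, T0=25.0, nx=20):
    """Heated-surface temperature of a 1-D slab with a constant applied
    flux at x=0 and an insulated back face, via explicit finite differences."""
    dx = L / (nx - 1)
    alpha = k / rho_c                      # thermal diffusivity, m^2/s
    dt = 0.4 * dx**2 / alpha               # stable explicit time step
    T = [T0] * nx
    t = 0.0
    while t < t_end:
        Tn = T[:]
        for i in range(1, nx - 1):
            Tn[i] = T[i] + alpha * dt / dx**2 * (T[i+1] - 2.0*T[i] + T[i-1])
        Tn[0] = Tn[1] + q_flux * dx / k    # applied heat-flux boundary at x=0
        Tn[-1] = Tn[-2]                    # insulated back face
        T = Tn
        t += dt
    return T[0]

# Probability that the surface temperature exceeds a regulatory limit,
# with a hypothetical (illustrative) uncertainty on conductivity k.
random.seed(0)
limit, n_samples, exceed = 160.0, 100, 0
for _ in range(n_samples):
    k = random.gauss(0.5, 0.05)            # W/m-K, illustrative
    if surface_temp(k, rho_c=4.0e5, q_flux=3.5e3, L=0.02, t_end=200.0) > limit:
        exceed += 1
print(f"P(T_surface > {limit} C) ~ {exceed / n_samples:.2f}")
```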

Toward a more rigorous application of margins and uncertainties within the nuclear weapons life cycle: a Sandia perspective

Diegert, Kathleen V.; Klenke, S.E.; Paulsen, Robert A.; Pilch, Martin P.; Trucano, Timothy G.

This paper presents the conceptual framework that is being used to define quantification of margins and uncertainties (QMU) for application in the nuclear weapons (NW) work conducted at Sandia National Laboratories. The conceptual framework addresses the margins and uncertainties throughout the NW life cycle and includes the definition of terms related to QMU and to figures of merit. Potential applications of QMU consist of analyses based on physical data and on modeling and simulation. Appendix A provides general guidelines for addressing cases in which significant and relevant physical data are available for QMU analysis. Appendix B gives the specific guidance that was used to conduct QMU analyses in cycle 12 of the annual assessment process. Appendix C offers general guidelines for addressing cases in which appropriate models are available for use in QMU analysis. Appendix D contains an example that highlights the consequences of different treatments of uncertainty in model-based QMU analyses.
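
A common summary figure of merit in QMU is the ratio of margin to uncertainty; the following worked example uses purely illustrative numbers, not values or guidance from the report or its appendices:

```python
# Hypothetical QMU figure of merit: margin M over uncertainty U.
# All values are illustrative, not from the report.
requirement = 100.0          # minimum acceptable performance (arbitrary units)
best_estimate = 130.0        # predicted performance
uncertainty = 15.0           # combined uncertainty in the prediction

margin = best_estimate - requirement          # M = 30.0
confidence_ratio = margin / uncertainty       # M/U = 2.0
print(f"M = {margin}, U = {uncertainty}, M/U = {confidence_ratio:.1f}")
# M/U > 1 suggests the requirement is met with margin exceeding uncertainty.
```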

Predictive Capability Maturity Model for computational modeling and simulation

Pilch, Martin P.; Oberkampf, William L.; Trucano, Timothy G.

The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies specified application requirements.
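
A minimal sketch of how a PCMM assessment might be recorded follows, with the six elements taken from the abstract; the numeric 0-3 scoring and the summary line are assumptions for illustration, since the PCMM itself prescribes attribute descriptions per maturity level rather than a single score:

```python
# Record a hypothetical PCMM assessment: each of the six elements is
# assigned one of four maturity levels (0-3). Scores are illustrative.
ELEMENTS = [
    "representation and geometric fidelity",
    "physics and material model fidelity",
    "code verification",
    "solution verification",
    "model validation",
    "uncertainty quantification and sensitivity analysis",
]

assessment = {elem: 0 for elem in ELEMENTS}   # start every element at level 0
assessment["code verification"] = 2           # hypothetical scores
assessment["model validation"] = 1

for elem, level in assessment.items():
    print(f"level {level}: {elem}")
print(f"lowest maturity across elements: level {min(assessment.values())}")
```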
