Publications

Results 51–75 of 76

U.S. Nuclear Regulatory Commission Extremely Low Probability of Rupture pilot study: xLPR framework model user's guide

Mattie, Patrick D.; Sallaberry, Cedric J.; McClellan, Yvonne M.

For the U.S. Nuclear Regulatory Commission (NRC) Extremely Low Probability of Rupture (xLPR) pilot study, Sandia National Laboratories (SNL) was tasked to develop and evaluate a probabilistic framework using a commercial software package for Version 1.0 of the xLPR code. Version 1.0 is focused on assessing the probability of rupture due to primary water stress corrosion cracking in dissimilar metal welds in pressurizer surge nozzles. Future versions of the framework will expand its capabilities to other cracking mechanisms and other piping systems for both pressurized water reactors and boiling water reactors. The goal of the pilot study is to plan the xLPR framework transition from Version 1.0 to Version 2.0; hence the initial Version 1.0 framework and code development will be used to define the requirements for Version 2.0. The software documented in this report has been developed and tested solely for this purpose. The framework and demonstration problem will be used to evaluate the commercial software's capabilities and its applicability for use in creating the final version of the xLPR framework. This report details the design, the system requirements, and the steps necessary to use the commercial-code-based xLPR framework developed by SNL.


Development, analysis, and evaluation of a commercial software framework for the study of Extremely Low Probability of Rupture (xLPR) events at nuclear power plants

Mattie, Patrick D.; Sallaberry, Cedric J.; Kalinich, Donald A.

Sandia National Laboratories (SNL) participated in a pilot study to examine the process and requirements for creating a software system to assess the extremely low probability of pipe rupture (xLPR) in nuclear power plants. The project was tasked with developing a prototype xLPR model, leveraging existing fracture mechanics models and codes coupled with a commercial software framework, to determine the framework, model, and architecture requirements appropriate for building a modular code. The xLPR pilot study was conducted to demonstrate the feasibility of the proposed development process and framework for a probabilistic code to address degradation mechanisms in piping system safety assessments. The pilot study includes a demonstration problem to assess the probability of rupture of dissimilar metal (DM) pressurizer surge nozzle welds degraded by primary water stress corrosion cracking (PWSCC). The pilot study was designed to define and develop the framework and model, and then to construct a prototype software system based on the proposed model. The second phase of the project will be a longer-term program and code development effort focusing on generic primary piping integrity issues (the xLPR code). The results and recommendations presented in this report will be used to help the U.S. Nuclear Regulatory Commission (NRC) define the requirements for the longer-term program.


Characterization, propagation and analysis of aleatory and epistemic uncertainty in the 2008 performance assessment for the proposed repository for high-level radioactive waste at Yucca Mountain, Nevada

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Hansen, Clifford W.; Helton, Jon C.; Sallaberry, Cedric J.

The 2008 performance assessment (PA) for the proposed repository for high-level radioactive waste at Yucca Mountain (YM), Nevada, illustrates the conceptual structure of risk assessments for complex systems. The 2008 YM PA is based on the following three conceptual entities: (i) a probability space that characterizes aleatory uncertainty; (ii) a function that predicts consequences for individual elements of the sample space for aleatory uncertainty; and (iii) a probability space that characterizes epistemic uncertainty. These entities and their use in the characterization, propagation and analysis of aleatory and epistemic uncertainty are described and illustrated with results from the 2008 YM PA.
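In rough notation (an illustrative sketch in the spirit of the paper; the symbols below are assumptions, not notation taken from the abstract), the three entities and the expected value they combine to produce can be written as:

```latex
% (i)   aleatory uncertainty:   probability space (\mathcal{A}, \mathbb{A}, p_A)
% (ii)  consequence function:   y = f(a \mid e) for aleatory element a, epistemic element e
% (iii) epistemic uncertainty:  probability space (\mathcal{E}, \mathbb{E}, p_E)

% Expected consequence conditional on the epistemic element e,
% with d_A the density associated with the aleatory probability space:
\bar{f}(e) = \int_{\mathcal{A}} f(a \mid e)\, d_A(a \mid e)\, \mathrm{d}a

% Expected value over both aleatory and epistemic uncertainty,
% with d_E the density associated with the epistemic probability space:
\mathbb{E}[f] = \int_{\mathcal{E}} \bar{f}(e)\, d_E(e)\, \mathrm{d}e
```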


Representation of analysis results involving aleatory and epistemic uncertainty

International Journal of General Systems

Helton, Jon C.; Johnson, Jay D.; Oberkampf, William L.; Sallaberry, Cedric J.

Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behaviour of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary CDFs (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (e.g. interval analysis, possibility theory, evidence theory or probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterisations of epistemic uncertainty.
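As a loose illustration of the probabilistic case, the sketch below (a hypothetical two-parameter model; numpy and matplotlib assumed available) draws the family of CCDFs that arises when each sampled epistemic vector induces its own distribution for an aleatory outcome:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

n_epistemic = 20    # sampled epistemic vectors; each yields one CCDF
n_aleatory = 1000   # aleatory samples drawn per epistemic vector

# Hypothetical epistemic parameters: fixed but poorly known mean and
# standard deviation of a random (aleatory) system response.
mu = rng.uniform(1.0, 3.0, n_epistemic)
sigma = rng.uniform(0.2, 1.0, n_epistemic)

y_grid = np.linspace(0.0, 6.0, 200)
for m, s in zip(mu, sigma):
    y = rng.normal(m, s, n_aleatory)               # aleatory variability
    ccdf = (y[:, None] > y_grid).mean(axis=0)      # empirical P(Y > y)
    plt.plot(y_grid, ccdf, color="gray", alpha=0.4)

plt.xlabel("y")
plt.ylabel("prob(Y > y)")
plt.title("Family of CCDFs induced by epistemic uncertainty")
plt.show()
```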


Sensitivity analyses of radionuclide transport in the saturated zone at Yucca Mountain, Nevada

American Nuclear Society - 12th International High-Level Radioactive Waste Management Conference 2008

Arnold, Bill W.; Hadgu, Teklu; Sallaberry, Cedric J.

Simulation of potential radionuclide transport in the saturated zone from beneath the proposed repository at Yucca Mountain to the accessible environment is an important aspect of the total system performance assessment (TSPA) for disposal of high-level radioactive waste at the site. Analyses of uncertainty and sensitivity are integral components of the TSPA and have been conducted at both the sub-system and system levels to identify parameters and processes that contribute to the overall uncertainty in predictions of repository performance. Results of the sensitivity analyses indicate that uncertainty in groundwater specific discharge along the flow path in the saturated zone from beneath the repository is an important contributor to uncertainty in TSPA results and is the dominant source of uncertainty in transport times in the saturated zone for most radionuclides. Uncertainties in parameters related to matrix diffusion in the volcanic units, colloid-facilitated transport, and sorption are also important contributors to uncertainty in transport times to differing degrees for various radionuclides.


Yucca Mountain 2008 performance assessment: Uncertainty and sensitivity analysis for expected dose

American Nuclear Society - 12th International High-Level Radioactive Waste Management Conference 2008

Hansen, C.W.; Brooks, K.; Groves, J.W.; Helton, J.C.; Lee, K.P.; Sallaberry, Cedric J.; Statham, W.; Thorn, C.

Uncertainty and sensitivity analyses of the expected dose to the reasonably maximally exposed individual in the Yucca Mountain 2008 total system performance assessment (TSPA) are presented. Uncertainty results are obtained with Latin hypercube sampling of epistemically uncertain inputs, and partial rank correlation coefficients are used to illustrate sensitivity analysis results.
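A minimal sketch of the partial rank correlation coefficient (PRCC) calculation named here, assuming numpy and scipy are available; the function and variable names are illustrative, not taken from the TSPA software:

```python
import numpy as np
from scipy.stats import pearsonr, rankdata

def prcc(X, y):
    """Partial rank correlation coefficient of each column of X with y.

    X : (n_samples, n_inputs) sampled epistemically uncertain inputs
    y : (n_samples,) analysis outcome (e.g., expected dose at one time)
    """
    R = np.column_stack([rankdata(col) for col in X.T])
    ry = rankdata(y)
    n, k = R.shape
    coeffs = np.empty(k)
    for j in range(k):
        # Regress out the other rank-transformed inputs from both
        # rank(x_j) and rank(y), then correlate the residuals.
        Z = np.column_stack([np.ones(n), np.delete(R, j, axis=1)])
        res_x = R[:, j] - Z @ np.linalg.lstsq(Z, R[:, j], rcond=None)[0]
        res_y = ry - Z @ np.linalg.lstsq(Z, ry, rcond=None)[0]
        coeffs[j] = pearsonr(res_x, res_y)[0]
    return coeffs
```

A PRCC near ±1 indicates a strong monotonic effect of an input on the outcome after the linear rank effects of the other inputs have been removed.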


Yucca Mountain 2008 performance assessment: Conceptual structure and computational organization

American Nuclear Society - 12th International High-Level Radioactive Waste Management Conference 2008

Helton, J.C.; Hansen, C.W.; Sallaberry, Cedric J.

The conceptual structure and computational organization of the 2008 total system performance assessment (TSPA) for the proposed high-level radioactive waste repository at Yucca Mountain, Nevada, are described. This analysis was carried out to support the License Application by the U.S. Department of Energy (DOE) to the U.S. Nuclear Regulatory Commission (NRC) for the indicated repository. In particular, the analysis was carried out to establish compliance with the postclosure requirements specified by the NRC in proposed 10 CFR Part 63. The requirements in 10 CFR Part 63 result in a performance assessment that involves three basic entities: (EN1) a characterization of the uncertainty in the occurrence of future events (e.g., igneous events, seismic events) that could affect the performance of the repository; (EN2) models for predicting the physical behavior and evolution of the repository (e.g., systems of ordinary and partial differential equations); and (EN3) a characterization of the uncertainty associated with analysis inputs that have fixed but imprecisely known values (e.g., the appropriate spatially averaged value for a distribution coefficient). The designators aleatory and epistemic are commonly used for the uncertainties characterized by entities (EN1) and (EN3). The manner in which the preceding entities are defined and organized to produce the 2008 TSPA for the proposed Yucca Mountain repository is described.


Yucca Mountain 2008 performance assessment: Uncertainty and sensitivity analysis for physical processes

American Nuclear Society - 12th International High-Level Radioactive Waste Management Conference 2008

Sallaberry, Cedric J.; Aragon, A.; Bier, A.; Chen, Y.; Groves, J.W.; Hansen, C.W.; Helton, J.C.; Mehta, S.; Miller, S.P.; Min, J.; Vo, P.

The Total System Performance Assessment (TSPA) for the proposed high-level radioactive waste repository at Yucca Mountain, Nevada, uses a sampling-based approach to uncertainty and sensitivity analysis. Specifically, Latin hypercube sampling is used to generate a mapping between epistemically uncertain analysis inputs and analysis outcomes of interest. This results in distributions that characterize the uncertainty in analysis outcomes. Further, the resultant mapping can be explored with sensitivity analysis procedures based on (i) examination of scatterplots, (ii) partial rank correlation coefficients, (iii) R² values and standardized rank regression coefficients obtained in stepwise rank regression analyses, and (iv) other analysis techniques. The TSPA considers over 300 epistemically uncertain inputs (e.g., corrosion properties, solubilities, retardations, defining parameters for Poisson processes, ⋯) and over 70 time-dependent analysis outcomes (e.g., physical properties in waste packages and the engineered barrier system; releases from the engineered barrier system, the unsaturated zone, and the saturated zone for individual radionuclides; and annual dose to the reasonably maximally exposed individual (RMEI) from both individual radionuclides and all radionuclides). The obtained uncertainty and sensitivity analysis results play an important role in facilitating understanding of analysis results, supporting analysis verification, establishing risk importance, and enhancing overall analysis credibility. The uncertainty and sensitivity analysis procedures are illustrated and explained with selected results for releases from the engineered barrier system, the unsaturated zone, and the saturated zone, and also for annual dose to the RMEI.
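For item (iii), a minimal sketch of how R² values and standardized rank regression coefficients can be computed, assuming numpy and scipy (the stepwise variable-selection loop used in the TSPA is omitted here for brevity):

```python
import numpy as np
from scipy.stats import rankdata

def srrc_and_r2(X, y):
    """Standardized rank regression coefficients (SRRCs) and model R^2.

    A single full-model fit on rank-transformed, standardized data; a
    stepwise analysis would instead add inputs one at a time in order
    of the R^2 increase each provides.
    """
    Xr = np.column_stack([rankdata(c) for c in X.T])
    yr = rankdata(y)
    Xs = (Xr - Xr.mean(axis=0)) / Xr.std(axis=0)
    ys = (yr - yr.mean()) / yr.std()
    srrc, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    resid = ys - Xs @ srrc
    r2 = 1.0 - (resid @ resid) / (ys @ ys)
    return srrc, r2
```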


Annual Site Environmental Report, Sandia National Laboratories, Albuquerque, New Mexico, Calendar Year 2007

Arnold, Bill W.; Sallaberry, Cedric J.

Sandia National Laboratories, New Mexico (SNL/NM) is a government-owned/contractor-operated facility. Sandia Corporation (Sandia), a wholly owned subsidiary of Lockheed Martin Corporation, manages and operates the laboratory for the U.S. Department of Energy (DOE), National Nuclear Security Administration (NNSA). The DOE/NNSA Sandia Site Office (SSO) administers the contract and oversees contractor operations at the site. This annual report summarizes data and the compliance status of Sandia Corporation’s environmental protection and monitoring programs through December 31, 2007. Major environmental programs include air quality, water quality, groundwater protection, terrestrial surveillance, waste management, pollution prevention (P2), environmental restoration (ER), oil and chemical spill prevention, and implementation of the National Environmental Policy Act (NEPA). Environmental monitoring and surveillance programs are required by DOE Order 450.1, Environmental Protection Program (DOE 2007a) and DOE Manual 231.1-1A, Environment, Safety, and Health Reporting (DOE 2007).


Extension of Latin hypercube samples with correlated variables

Reliability Engineering and System Safety

Sallaberry, Cedric J.; Helton, J.C.; Hora, S.C.

A procedure for extending the size of a Latin hypercube sample (LHS) with rank correlated variables is described and illustrated. The extension procedure starts with an LHS of size m and associated rank correlation matrix C and constructs a new LHS of size 2m that contains the elements of the original LHS and has a rank correlation matrix that is close to the original rank correlation matrix C. The procedure is intended for use in conjunction with uncertainty and sensitivity analysis of computationally demanding models in which it is important to make efficient use of a necessarily limited number of model evaluations.
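A minimal sketch of the doubling step on the unit hypercube, assuming numpy (a simplified reading of the procedure, not the authors' implementation):

```python
import numpy as np

def extend_lhs(sample, rng=None):
    """Double an existing Latin hypercube sample on [0, 1)^k.

    A simplified sketch of the refined-grid idea: each variable's m
    strata are split into 2m strata, one new point is placed in each
    stratum left empty by the original points, and the new coordinates
    are paired in the same rank order as the original ones so that the
    induced rank correlations are approximately preserved.
    """
    if rng is None:
        rng = np.random.default_rng()
    m, k = sample.shape
    new = np.empty_like(sample)
    for j in range(k):
        occupied = np.floor(sample[:, j] * 2 * m).astype(int)
        empty = np.setdiff1d(np.arange(2 * m), occupied)   # m empty strata
        vals = (empty + rng.random(m)) / (2 * m)           # one point in each
        ranks = np.argsort(np.argsort(sample[:, j]))
        new[:, j] = np.sort(vals)[ranks]   # mimic the original rank pattern
    return np.vstack([sample, new])
```

Because each coarse stratum contributes exactly one occupied and one empty refined stratum, the combined 2m points have one point in each of the 2m strata of every variable and thus again form a Latin hypercube sample.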


Illustration of sampling-based approaches to the calculation of expected dose in performance assessments for the proposed high level radioactive waste repository at Yucca Mountain, Nevada

Sallaberry, Cedric J.

A deep geologic repository for high-level radioactive waste is under development by the U.S. Department of Energy (DOE) at Yucca Mountain (YM), Nevada. As mandated in the Energy Policy Act of 1992, the U.S. Environmental Protection Agency (EPA) has promulgated public health and safety standards (i.e., 40 CFR Part 197) for the YM repository, and the U.S. Nuclear Regulatory Commission has promulgated licensing standards (i.e., 10 CFR Parts 2, 19, 20, etc.) consistent with 40 CFR Part 197 that the DOE must establish are met in order for the YM repository to be licensed for operation. Important requirements in 40 CFR Part 197 and 10 CFR Parts 2, 19, 20, etc. relate to the determination of expected (i.e., mean) dose to a reasonably maximally exposed individual (RMEI) and the incorporation of uncertainty into this determination. This presentation describes and illustrates how general and typically nonquantitative statements in 40 CFR Part 197 and 10 CFR Parts 2, 19, 20, etc. can be given a formal mathematical structure that facilitates both the calculation of expected dose to the RMEI and the appropriate separation in this calculation of aleatory uncertainty (i.e., randomness in the properties of future occurrences such as igneous and seismic events) and epistemic uncertainty (i.e., lack of knowledge about quantities that are poorly known but assumed to have constant values in the calculation of expected dose to the RMEI).
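The separation can be illustrated with a nested-sampling sketch, assuming numpy; the dose model and parameter ranges below are entirely hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

n_epistemic = 100   # sample of poorly known but fixed quantities
n_aleatory = 500    # sample of random future occurrences per element

# Hypothetical epistemic input: an imprecisely known but fixed rate for
# a Poisson process of disruptive events (events per year; illustrative).
event_rate = rng.uniform(1e-5, 1e-3, n_epistemic)

def dose_given_events(n_events, rng):
    # Hypothetical consequence model: dose scales with the event count.
    return n_events * rng.lognormal(mean=0.0, sigma=0.5)

expected_dose = np.empty(n_epistemic)
for i, lam in enumerate(event_rate):
    # Inner (aleatory) loop: sample event counts over 10,000 years and
    # average the dose, giving the expected dose for this epistemic element.
    counts = rng.poisson(lam * 1.0e4, n_aleatory)
    expected_dose[i] = np.mean([dose_given_events(c, rng) for c in counts])

# expected_dose is a distribution, over epistemic uncertainty, of the
# expected (over aleatory uncertainty) dose; its mean and quantiles are
# the kinds of quantities regulatory comparisons are made against.
print(expected_dose.mean(), np.percentile(expected_dose, [5, 50, 95]))
```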


Survey of sampling-based methods for uncertainty and sensitivity analysis

Reliability Engineering and System Safety

Helton, J.C.; Johnson, J.D.; Sallaberry, Cedric J.; Storlie, C.B.

Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (i) definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (ii) generation of samples from uncertain analysis inputs, (iii) propagation of sampled inputs through an analysis, (iv) presentation of uncertainty analysis results, and (v) determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two-dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top-down coefficient of concordance, and variance decomposition.
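Steps (i) through (v) can be strung together in a short sketch, assuming scipy's qmc module and a stand-in model (the distributions and the model itself are illustrative):

```python
import numpy as np
from scipy.stats import qmc, spearmanr

# (i)/(ii) define input ranges and generate a Latin hypercube sample
sampler = qmc.LatinHypercube(d=3, seed=0)
u = sampler.random(n=200)                        # sample on [0, 1)^3
lower = np.array([0.1, 1.0, 0.0])
upper = np.array([2.0, 5.0, 1.0])
x = qmc.scale(u, lower, upper)                   # map to the input ranges

# (iii) propagate the sample through the model (a stand-in function here)
y = x[:, 0] ** 2 + 3.0 * x[:, 1] + np.sin(2.0 * np.pi * x[:, 2])

# (iv) uncertainty analysis: summarize the induced output distribution
print("mean:", y.mean(), " 5%/95% quantiles:", np.quantile(y, [0.05, 0.95]))

# (v) sensitivity analysis: rank correlation of each input with the output
for j in range(x.shape[1]):
    rho, _ = spearmanr(x[:, j], y)
    print(f"input {j}: rank correlation = {rho:.2f}")
```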


Measurement and modeling of energetic-material mass transfer to soil-pore water - Project CP-1227 final technical report

Webb, Stephen W.; Phelan, James M.; Stein, Joshua S.; Sallaberry, Cedric J.

Military test and training ranges operate with live-fire engagements to provide realism important to the maintenance of key tactical skills. Ordnance detonations during these operations typically produce minute residues of parent explosive chemical compounds. Occasional low-order detonations also disperse solid-phase energetic material onto the surface soil. These detonation remnants are implicated in chemical contamination impacts to groundwater on a limited set of ranges where environmental characterization projects have occurred. Key questions arise regarding how these residues and the environmental conditions (e.g., weather and geostratigraphy) contribute to groundwater pollution. This final report documents the results of experimental and simulation model development for evaluating mass transfer processes from solid-phase energetics to soil-pore water.


A method for extending the size of a Latin Hypercube Sample

Conference Proceedings of the Society for Experimental Mechanics Series

Sallaberry, Cedric J.; Helton, Jon C.

Latin Hypercube Sampling (LHS) is widely used as a sampling-based method for probabilistic calculations. The method has clear advantages over classical random sampling (RS) that derive from its efficient stratification properties. However, one of its limitations is that the size of an initial sample cannot be extended by simply adding new simulations, as this would destroy the efficient stratification associated with LHS. We describe a new method to extend the size of an LHS to n (n >= 2) times its original size while preserving both the LHS structure and any induced correlations between the input parameters. The method introduces a refined grid for the original sample and then fills the empty rows and columns with new data in a way that conserves both the LHS structure and any induced correlations. An estimate of the bounds on the resulting correlation between two variables is derived for n = 2. This result shows that the final correlation is close to the average of the correlations from the original sample and the new sample used to fill the empty rows and columns.
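An illustrative spot-check of the n = 2 claim, assuming scipy and reusing the extend_lhs sketch given under "Extension of Latin hypercube samples with correlated variables" above (this only samples the comparison; it is not the paper's derivation):

```python
import numpy as np
from scipy.stats import qmc, spearmanr

rng = np.random.default_rng(2)

# Double a 2-variable LHS, then compare the rank correlation of the
# combined sample with the average of the correlations of the two halves.
original = qmc.LatinHypercube(d=2, seed=2).random(n=100)
extended = extend_lhs(original, rng)    # sketch defined in the entry above
new_half = extended[100:]

r_orig, _ = spearmanr(original[:, 0], original[:, 1])
r_new, _ = spearmanr(new_half[:, 0], new_half[:, 1])
r_all, _ = spearmanr(extended[:, 0], extended[:, 1])
print(f"halves: {r_orig:.3f}, {r_new:.3f}  average: {(r_orig + r_new) / 2:.3f}")
print(f"combined sample: {r_all:.3f}")
```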
