Publications

Deployment of Multifidelity Uncertainty Quantification for Thermal Battery Assessment Part I: Algorithms and Single Cell Results

Eldred, Michael S.; Adams, Brian M.; Geraci, Gianluca G.; Portone, Teresa P.; Ridgway, Elliott M.; Stephens, John A.; Wildey, Timothy M.

This report documents the results of an FY22 ASC V&V level 2 milestone demonstrating new algorithms for multifidelity uncertainty quantification. Part I of the report describes the algorithms, studies their performance on a simple model problem, and then deploys the methods to a thermal battery example from the open literature. Part II (restricted distribution) applies the multifidelity UQ methods to specific thermal batteries of interest to the NNSA/ASC program.

Dakota, A Multilevel Parallel Object-Oriented Framework for Design Optimization, Parameter Estimation, Uncertainty Quantification, and Sensitivity Analysis: Version 6.15 User's Manual

Adams, Brian M.; Bohnhoff, William B.; Dalbey, Keith D.; Ebeida, Mohamed S.; Eddy, John E.; Eldred, Michael E.; Hooper, Russell H.; Hough, Patricia H.; Hu, Kenneth H.; Jakeman, John J.; Khalil, Mohammad K.; Maupin, Kathryn M.; Monschke, Jason A.; Ridgway, Elliott R.; Rushdi, Ahmad A.; Seidl, Daniel S.; Stephens, John A.; Winokur, Justin W.

The Dakota toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers.

Dakota, A Multilevel Parallel Object-Oriented Framework for Design Optimization, Parameter Estimation, Uncertainty Quantification, and Sensitivity Analysis: Version 6.14 User's Manual

Adams, Brian M.; Bohnhoff, William B.; Dalbey, Keith D.; Ebeida, Mohamed S.; Eddy, John E.; Eldred, Michael E.; Hooper, Russell H.; Hough, Patricia H.; Hu, Kenneth H.; Jakeman, John J.; Khalil, Mohammad K.; Maupin, Kathryn M.; Monschke, Jason A.; Ridgway, Elliott R.; Rushdi, Ahmad A.; Seidl, Daniel S.; Stephens, John A.; Winokur, Justin W.

The Dakota toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

Dakota, A Multilevel Parallel Object-Oriented Framework for Design Optimization, Parameter Estimation, Uncertainty Quantification, and Sensitivity Analysis: Version 6.13 User's Manual

Adams, Brian M.; Bohnhoff, William B.; Dalbey, Keith D.; Ebeida, Mohamed S.; Eddy, John E.; Eldred, Michael E.; Hooper, Russell H.; Hough, Patricia H.; Hu, Kenneth H.; Jakeman, John J.; Khalil, Mohammad K.; Maupin, Kathryn M.; Monschke, Jason A.; Ridgway, Elliott R.; Rushdi, Ahmad A.; Seidl, Daniel S.; Stephens, John A.; Winokur, Justin W.

The Dakota toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user’s manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

Geometric uncertainty quantification and robust design for 2D satellite shielding

International Conference on Mathematics and Computational Methods Applied to Nuclear Science and Engineering, M and C 2019

Pautz, Shawn D.; Adams, Brian M.; Bruss, Donald E.

The design of satellites usually includes the objective of minimizing mass due to high launch costs, which is challenging due to the need to protect sensitive electronics from the space radiation environment by means of radiation shielding. This is further complicated by the need to account for uncertainties, e.g., in manufacturing. There is growing interest in automated design optimization and uncertainty quantification (UQ) techniques to help achieve that objective. Traditional optimization and UQ approaches that rely exclusively on response functions (e.g., dose calculations) can be quite expensive when applied to transport problems. Previously we showed how adjoint-based transport sensitivities used in conjunction with gradient-based optimization algorithms can be quite effective in designing mass-efficient electron and/or proton shields in one- or two-dimensional Cartesian geometries. In this paper we extend that work to UQ and to robust design (i.e., optimization that considers uncertainties) in 2D. This consists primarily of using the sensitivities to geometric changes, originally derived for optimization, within relevant algorithms for UQ and robust design. We perform UQ analyses on previous optimized designs given some assumed manufacturing uncertainties. We also conduct a new optimization exercise that accounts for the same uncertainties. Our results show much improved computational efficiencies over previous approaches.

Sample Generation for Nuclear Data

Swiler, Laura P.; Adams, Brian M.; Wieselquist, William W.

This report summarizes a NEAMS (Nuclear Energy Advanced Modeling and Simulation) project focused on developing a sampling capability that can handle the challenges of generating samples from nuclear cross-section data. The covariance information between energy groups tends to be very ill-conditioned and thus poses a problem for traditional methods of generating correlated samples. This report outlines a method that addresses sample generation from cross-section covariance matrices. The treatment allows one to assume the cross sections are distributed with a multivariate normal distribution, lognormal distribution, or truncated normal distribution.
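The report's algorithm is not reproduced here, but the core difficulty it addresses — drawing correlated samples when a covariance matrix is nearly singular — can be illustrated with a simple eigenvalue-clipping sketch. All function names and tolerances below are illustrative assumptions, not the report's method:

```python
import numpy as np

def sample_mvn_illconditioned(mean, cov, n_samples, eig_floor=1e-12, seed=0):
    """Draw multivariate-normal samples from a possibly ill-conditioned
    covariance via eigendecomposition, flooring tiny/negative eigenvalues.

    A plain Cholesky factorization can fail when `cov` is numerically
    indefinite; the symmetric eigendecomposition with a floor does not.
    """
    cov = 0.5 * (cov + cov.T)               # enforce exact symmetry
    vals, vecs = np.linalg.eigh(cov)        # real symmetric eigendecomposition
    vals = np.clip(vals, eig_floor, None)   # clip tiny/negative eigenvalues
    factor = vecs * np.sqrt(vals)           # factor @ factor.T ~= cov
    rng = np.random.default_rng(seed)
    z = rng.standard_normal((n_samples, len(mean)))
    return mean + z @ factor.T

# A nearly rank-deficient 3x3 covariance (strong cross-group correlation).
cov = np.array([[1.0, 0.999, 0.5],
                [0.999, 1.0, 0.5],
                [0.5, 0.5, 1.0]])
samples = sample_mvn_illconditioned(np.zeros(3), cov, n_samples=50000)
print(np.round(np.cov(samples.T), 2))       # should approximate cov
```

Lognormal or truncated variants (mentioned in the abstract) would wrap or reject these Gaussian draws; that machinery is omitted here.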

Treatment of Nuclear Data Covariance Information in Sample Generation

Swiler, Laura P.; Adams, Brian M.; Wieselquist, William W.

This report summarizes a NEAMS (Nuclear Energy Advanced Modeling and Simulation) project focused on developing a sampling capability that can handle the challenges of generating samples from nuclear cross-section data. The covariance information between energy groups tends to be very ill-conditioned and thus poses a problem for traditional methods of generating correlated samples. This report outlines a method that addresses sample generation from cross-section covariance matrices.

User Guidelines and Best Practices for CASL VUQ Analysis Using Dakota

Adams, Brian M.; Coleman, Kayla M.; Hooper, Russell H.; Khuwaileh, Bassam A.; Lewis, Allison L.; Smith, Ralph S.; Swiler, Laura P.; Turinsky, Paul J.; Williams, Brian W.

Sandia's Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically, it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to enhance understanding of risk, improve products, and assess simulation credibility. This manual offers Consortium for Advanced Simulation of Light Water Reactors (CASL) partners a guide to conducting Dakota-based VUQ studies for CASL problems. It motivates various classes of Dakota methods and includes examples of their use on representative application problems. On reading, a CASL analyst should understand why and how to apply Dakota to a simulation problem.

Overview of selected DOE/NNSA predictive science initiatives: The predictive science academic alliance program and the DAKOTA project

53rd AIAA Aerospace Sciences Meeting

Eldred, Michael S.; Swiler, Laura P.; Adams, Brian M.; Jakeman, J.D.

This paper supports a special session on "Frontiers of Uncertainty Management for Complex Aerospace Systems" with the intent of summarizing two aspects of the DOE/NNSA Accelerated Strategic Computing (ASC) program, each of which is focused on predictive science using complex simulation models. The first aspect is academic outreach, as enabled by the Predictive Science Academic Alliance Program (PSAAP). The second aspect is the Dakota project at Sandia National Laboratories, which develops and deploys uncertainty quantification capabilities focused on high fidelity modeling and simulation on large-scale parallel computers.

Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis

Adams, Brian M.; Jakeman, John D.; Swiler, Laura P.; Stephens, John A.; Vigil, Dena V.; Wildey, Timothy M.; Bauman, Lara E.; Bohnhoff, William J.; Dalbey, Keith D.; Eddy, John P.; Ebeida, Mohamed S.; Eldred, Michael S.; Hough, Patricia D.; Hu, Kenneth H.

The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis: version 6.0 theory manual

Adams, Brian M.; Jakeman, John D.; Swiler, Laura P.; Stephens, John A.; Vigil, Dena V.; Wildey, Timothy M.; Bauman, Lara E.; Bohnhoff, William J.; Dalbey, Keith D.; Eddy, John P.; Ebeida, Mohamed S.; Eldred, Michael S.; Hough, Patricia D.; Hu, Kenneth H.

The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the Dakota software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of Dakota-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of Dakota's iterative analysis capabilities.

User guidelines and best practices for CASL VUQ analysis using Dakota

Adams, Brian M.; Hooper, Russell H.

Sandia's Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically, it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to enhance understanding of risk, improve products, and assess simulation credibility. This manual offers Consortium for Advanced Simulation of Light Water Reactors (CASL) partners a guide to conducting Dakota-based VUQ studies for CASL problems. It motivates various classes of Dakota methods and includes examples of their use on representative application problems. On reading, a CASL analyst should understand why and how to apply Dakota to a simulation problem. This SAND report constitutes the product of CASL milestone L3:VUQ.V&V.P8.01 and is also being released as a CASL unlimited release report with number CASL-U-2014-0038-000.

ALEGRA Update: Modernization and Resilience Progress

Robinson, Allen C.; Petney, Sharon P.; Drake, Richard R.; Weirs, Vincent G.; Adams, Brian M.; Vigil, Dena V.; Carpenter, John H.; Garasi, Christopher J.; Wong, Michael K.; Robbins, Joshua R.; Siefert, Christopher S.; Strack, Otto E.; Wills, Ann E.; Trucano, Timothy G.; Bochev, Pavel B.; Summers, Randall M.; Stewart, James R.; Ober, Curtis C.; Rider, William J.; Haill, Thomas A.; Lemke, Raymond W.; Cochrane, Kyle C.; Desjarlais, Michael P.; Love, Edward L.; Voth, Thomas E.; Mosso, Stewart J.; Niederhaus, John H.

Abstract not provided.

Sensitivity analysis techniques applied to a system of hyperbolic conservation laws

Reliability Engineering and System Safety

Weirs, V.G.; Kamm, James R.; Swiler, Laura P.; Tarantola, Stefano; Ratto, Marco; Adams, Brian M.; Rider, William J.; Eldred, Michael S.

Sensitivity analysis comprises techniques to quantify the effects of the input variables on a set of outputs. In particular, sensitivity indices can be used to infer which input parameters most significantly affect the results of a computational model. With continually increasing computing power, sensitivity analysis has become an important technique by which to understand the behavior of large-scale computer simulations. Many sensitivity analysis methods rely on sampling from distributions of the inputs. Such sampling-based methods can be computationally expensive, requiring many evaluations of the simulation; in this case, the Sobol method provides an easy and accurate way to compute variance-based measures, provided a sufficient number of model evaluations are available. As an alternative, meta-modeling approaches have been devised to approximate the response surface and estimate various measures of sensitivity. In this work, we consider a variety of sensitivity analysis methods, including different sampling strategies, different meta-models, and different ways of evaluating variance-based sensitivity indices. The problem we consider is the 1-D Riemann problem. By a careful choice of inputs, discontinuous solutions are obtained, leading to discontinuous response surfaces; such surfaces can be particularly problematic for meta-modeling approaches. The goal of this study is to compare the estimated sensitivity indices with exact values and to evaluate the convergence of these estimates with increasing sample sizes and under an increasing number of meta-model evaluations.
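The variance-based sensitivity indices discussed above can be estimated with a standard pick-freeze (Saltelli-style) Monte Carlo estimator. The sketch below applies it to the Ishigami function, a common sensitivity-analysis test problem, rather than the paper's Riemann problem; the estimator shown is one standard formulation, not necessarily the exact one the authors used:

```python
import numpy as np

def first_order_sobol(f, dim, n, seed=0):
    """Estimate first-order Sobol indices S_i with a pick-freeze
    (Saltelli-style) estimator from two independent sample matrices."""
    rng = np.random.default_rng(seed)
    A = rng.uniform(-np.pi, np.pi, (n, dim))
    B = rng.uniform(-np.pi, np.pi, (n, dim))
    fA, fB = f(A), f(B)
    var = np.var(np.concatenate([fA, fB]))  # total output variance
    S = np.empty(dim)
    for i in range(dim):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                 # vary only input x_i between fA and f(ABi)
        S[i] = np.mean(fB * (f(ABi) - fA)) / var
    return S

# Ishigami test function; analytic indices are about (0.31, 0.44, 0.0).
def ishigami(x, a=7.0, b=0.1):
    return np.sin(x[:, 0]) + a * np.sin(x[:, 1])**2 + b * x[:, 2]**4 * np.sin(x[:, 0])

print(np.round(first_order_sobol(ishigami, 3, 100000), 2))
```

Note the cost pattern the abstract alludes to: this estimator needs n * (dim + 2) model evaluations, which is exactly what motivates meta-model alternatives for expensive simulations.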

CASL L1 Milestone report: CASL.P4.01, sensitivity and uncertainty analysis for CIPS with VIPRE-W and BOA

Adams, Brian M.

The CASL Level 1 Milestone CASL.P4.01, successfully completed in December 2011, aimed to 'conduct, using methodologies integrated into VERA, a detailed sensitivity analysis and uncertainty quantification of a crud-relevant problem with baseline VERA capabilities (ANC/VIPRE-W/BOA).' The VUQ focus area led this effort, in partnership with AMA, and with support from VRI. DAKOTA was coupled to existing VIPRE-W thermal-hydraulics and BOA crud/boron deposit simulations representing a pressurized water reactor (PWR) that previously experienced crud-induced power shift (CIPS). This work supports understanding of CIPS by exploring the sensitivity and uncertainty in BOA outputs with respect to uncertain operating and model parameters. This report summarizes work coupling the software tools, characterizing uncertainties, and analyzing the results of iterative sensitivity and uncertainty studies. These studies focused on sensitivity and uncertainty of CIPS indicators calculated by the current version of the BOA code used in the industry. Challenges with this kind of analysis are identified to inform follow-on research goals and VERA development targeting crud-related challenge problems.

DAKOTA: a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis

Adams, Brian M.; Bohnhoff, William J.; Dalbey, Keith D.; Eddy, John P.; Eldred, Michael S.; Hough, Patricia D.; Lefantzi, Sophia L.; Swiler, Laura P.; Vigil, Dena V.

The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the DAKOTA software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of DAKOTA-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of DAKOTA's iterative analysis capabilities.

CASL L2 milestone report: VUQ.Y1.03, "Enable statistical sensitivity and UQ demonstrations for VERA."

Adams, Brian M.; Witkowski, Walter R.

The CASL Level 2 Milestone VUQ.Y1.03, 'Enable statistical sensitivity and UQ demonstrations for VERA,' was successfully completed in March 2011. The VUQ focus area led this effort, in close partnership with AMA, and with support from VRI. DAKOTA was coupled to VIPRE-W thermal-hydraulics simulations representing reactors of interest to address crud-related challenge problems in order to understand the sensitivity and uncertainty in simulation outputs with respect to uncertain operating and model form parameters. This report summarizes work coupling the software tools, characterizing uncertainties, selecting sensitivity and uncertainty quantification algorithms, and analyzing the results of iterative studies. These demonstration studies focused on sensitivity and uncertainty of mass evaporation rate calculated by VIPRE-W, a key predictor for crud-induced power shift (CIPS).

Assessing the Near-Term Risk of Climate Uncertainty: Interdependencies among the U.S. States

Backus, George A.; Trucano, Timothy G.; Robinson, David G.; Adams, Brian M.; Richards, Elizabeth H.; Siirola, John D.; Boslough, Mark B.; Taylor, Mark A.; Conrad, Stephen H.; Kelic, Andjelka; Roach, Jesse D.; Warren, Drake E.; Ballantine, Marissa D.; Stubblefield, W.A.; Snyder, Lillian A.; Finley, Ray E.; Horschel, Daniel S.; Ehlen, Mark E.; Klise, Geoffrey T.; Malczynski, Leonard A.; Stamber, Kevin L.; Tidwell, Vincent C.; Vargas, Vanessa N.; Zagonel, Aldo A.

Abstract not provided.

Final report for "High performance computing for advanced national electric power grid modeling and integration of solar generation resources", LDRD Project No. 149016

Schoenwald, David A.; Richardson, Bryan T.; Riehm, Andrew C.; Wolfenbarger, Paul W.; Adams, Brian M.; Reno, Matthew J.; Hansen, Clifford H.; Oldfield, Ron A.; Stamp, Jason E.; Stein, Joshua S.; Hoekstra, Robert J.; Munoz-Ramos, Karina M.; McLendon, William C.; Russo, Thomas V.; Phillips, Laurence R.

Design and operation of the electric power grid (EPG) relies heavily on computational models. High-fidelity, full-order models are used to study transient phenomena on only a small part of the network. Reduced-order dynamic and power flow models are used when analyses involving thousands of nodes are required, due to the computational demands of simulating large numbers of nodes. The level of complexity of the future EPG will dramatically increase due to large-scale deployment of variable renewable generation, active load and distributed generation resources, adaptive protection and control systems, and price-responsive demand. High-fidelity modeling of this future grid will require significant advances in coupled, multi-scale tools and their use on high performance computing (HPC) platforms. This LDRD report demonstrates SNL's capability to apply HPC resources to three tasks: (1) high-fidelity, large-scale modeling of power system dynamics; (2) statistical assessment of grid security via Monte Carlo simulations of cyber attacks; and (3) development of models to predict variability of solar resources at locations where little or no ground-based measurements are available.

Surrogate modeling with surfpack

Adams, Brian M.; Dalbey, Keith D.; Swiler, Laura P.

Surfpack is a library of multidimensional function approximation methods useful for efficient surrogate-based sensitivity/uncertainty analysis or calibration/optimization. I will survey current Surfpack meta-modeling capabilities for continuous variables and describe recent progress generalizing to both continuous and categorical factors, including relevant test problems and analysis comparisons.
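Surfpack itself is a compiled library and its API is not shown here; the general idea of a cheap surrogate replacing expensive simulations can be sketched with a generic least-squares quadratic response surface (all names below are illustrative, not Surfpack's interface):

```python
import numpy as np

def fit_quadratic_surrogate(X, y):
    """Fit a least-squares quadratic response surface in d inputs:
    y ~ c0 + sum_i c_i x_i + sum_{i<=j} c_ij x_i x_j.
    Returns a callable surrogate (a generic sketch, not Surfpack's API)."""
    def basis(X):
        n, d = X.shape
        cols = [np.ones(n)]
        cols += [X[:, i] for i in range(d)]
        cols += [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
        return np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(basis(X), y, rcond=None)
    return lambda Xnew: basis(np.atleast_2d(Xnew)) @ coef

# Fit to noisy samples of a known quadratic, then query the surrogate cheaply.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (200, 2))
y = (1.0 + 2.0 * X[:, 0] - X[:, 1] + 0.5 * X[:, 0] * X[:, 1]
     + 0.01 * rng.standard_normal(200))
surrogate = fit_quadratic_surrogate(X, y)
print(surrogate(np.array([[0.5, -0.5]])))   # close to the true value 2.375
```

Extending such a fit to handle categorical factors (e.g., via per-category dummy columns in the basis) is one simple reading of the generalization the abstract describes.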

DAKOTA: a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis. Version 5.0, user's manual

Adams, Brian M.; Dalbey, Keith D.; Eldred, Michael S.; Swiler, Laura P.; Bohnhoff, William J.; Eddy, John P.; Haskell, Karen H.; Hough, Patricia D.

The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the DAKOTA software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

DAKOTA: a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis. Version 5.0, user's reference manual

Adams, Brian M.; Dalbey, Keith D.; Eldred, Michael S.; Swiler, Laura P.; Bohnhoff, William J.; Eddy, John P.; Haskell, Karen H.; Hough, Patricia D.

The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a reference manual for the commands specification for the DAKOTA software, providing input overviews, option descriptions, and example specifications.

DAKOTA: a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis. Version 5.0, developers manual

Adams, Brian M.; Dalbey, Keith D.; Eldred, Michael S.; Swiler, Laura P.; Bohnhoff, William J.; Eddy, John P.; Haskell, Karen H.; Hough, Patricia D.

The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a developers manual for the DAKOTA software and describes the DAKOTA class hierarchies and their interrelationships. It derives directly from annotation of the actual source code and provides detailed class documentation, including all member functions and attributes.

Distributed micro-releases of bioterror pathogens : threat characterizations and epidemiology from uncertain patient observables

Adams, Brian M.; Devine, Karen D.; Najm, H.N.; Marzouk, Youssef M.

Terrorist attacks using an aerosolized pathogen preparation have gained credibility as a national security concern since the anthrax attacks of 2001. The ability to characterize the parameters of such attacks, i.e., to estimate the number of people infected, the time of infection, the average dose received, and the rate of disease spread in contemporary American society (for contagious diseases), is important when planning a medical response. For non-contagious diseases, we address the characterization problem by formulating a Bayesian inverse problem predicated on a short time-series of diagnosed patients exhibiting symptoms. To keep the approach relevant for response planning, we limit ourselves to 3.5 days of data. In computational tests performed for anthrax, we usually find these observation windows sufficient, especially if the outbreak model employed in the inverse problem is accurate. For contagious diseases, we formulated a Bayesian inversion technique to infer both pathogenic transmissibility and the social network from outbreak observations, ensuring that the two determinants of spreading are identified separately. We tested this technique on data collected from a 1967 smallpox epidemic in Abakaliki, Nigeria. We inferred, probabilistically, different transmissibilities in the structured Abakaliki population, the social network, and the chain of transmission. Finally, we developed an individual-based epidemic model to realistically simulate the spread of a rare (or eradicated) disease in a modern society. This model incorporates the mixing patterns observed in an (American) urban setting and accepts, as model input, pathogenic transmissibilities estimated from historical outbreaks that may have occurred in socio-economic environments with little resemblance to contemporary society. 
Techniques were also developed to simulate disease spread on static and sampled network reductions of the dynamic social networks originally in the individual-based model, yielding faster, though approximate, network-based epidemic models. These reduced-order models are useful in scenario analysis for medical response planning, as well as in computationally intensive inverse problems.

More Details

Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis: version 4.0 developers manual

Brown, Shannon L.; Griffin, Joshua G.; Hough, Patricia D.; Kolda, Tamara G.; Martinez-Canales, Monica L.; Williams, Pamela J.; Adams, Brian M.; Dunlavy, Daniel D.; Swiler, Laura P.; Giunta, Anthony A.; Hart, William E.; Watson, Jean-Paul W.; Eddy, John P.

The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a developers manual for the DAKOTA software and describes the DAKOTA class hierarchies and their interrelationships. It derives directly from annotation of the actual source code and provides detailed class documentation, including all member functions and attributes.

More Details

Solution-verified reliability analysis and design of bistable MEMS using error estimation and adaptivity

Adams, Brian M.; Wittwer, Jonathan W.; Bichon, Barron J.; Carnes, Brian C.; Copps, Kevin D.; Eldred, Michael S.; Hopkins, Matthew M.; Neckels, David C.; Notz, Patrick N.; Subia, Samuel R.

This report documents the results for an FY06 ASC Algorithms Level 2 milestone combining error estimation and adaptivity, uncertainty quantification, and probabilistic design capabilities applied to the analysis and design of bistable MEMS. Through the use of error estimation and adaptive mesh refinement, solution verification can be performed in an automated and parameter-adaptive manner. The resulting uncertainty analysis and probabilistic design studies are shown to be more accurate, efficient, reliable, and convenient.
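
The estimate-then-refine loop at the heart of solution verification can be sketched in miniature. The one-dimensional adaptive quadrature below is only an analogy for the report's PDE-level error estimation and adaptive mesh refinement, chosen because it shows the same pattern: estimate the local error, and refine only where the estimate exceeds the tolerance:

```python
import math

def adaptive_trapezoid(f, a, b, tol=1e-6):
    """Toy adaptive refinement: estimate the error on each interval by
    comparing one trapezoid to two, and split only where the estimate
    exceeds the (spread) tolerance."""
    def trap(lo, hi):
        return 0.5 * (hi - lo) * (f(lo) + f(hi))

    def recurse(lo, hi, whole, tol):
        mid = 0.5 * (lo + hi)
        left, right = trap(lo, mid), trap(mid, hi)
        err = abs(left + right - whole)   # local error estimator
        if err < 3.0 * tol:               # accept: estimate below tolerance
            return left + right
        # refine: split the interval and halve the tolerance budget
        return recurse(lo, mid, left, tol / 2) + recurse(mid, hi, right, tol / 2)

    return recurse(a, b, trap(a, b), tol)

print(adaptive_trapezoid(math.sin, 0.0, math.pi))  # ≈ 2.0
```

The factor of 3 comes from Richardson extrapolation for the second-order trapezoid rule; the milestone work performs the analogous estimate-and-refine cycle on finite element meshes, automated across parameter studies.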

More Details

DAKOTA, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis: version 4.0 user's manual

Swiler, Laura P.; Giunta, Anthony A.; Hart, William E.; Watson, Jean-Paul W.; Eddy, John P.; Griffin, Joshua G.; Hough, Patricia D.; Kolda, Tamara G.; Martinez-Canales, Monica L.; Williams, Pamela J.; Eldred, Michael S.; Brown, Shannon L.; Adams, Brian M.; Dunlavy, Daniel D.

The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the DAKOTA software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

More Details

DAKOTA, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis: version 4.0 reference manual

Brown, Shannon L.; Griffin, Joshua G.; Hough, Patricia D.; Kolda, Tamara G.; Martinez-Canales, Monica L.; Williams, Pamela J.; Adams, Brian M.; Dunlavy, Daniel D.; Swiler, Laura P.; Giunta, Anthony A.; Hart, William E.; Watson, Jean-Paul W.; Eddy, John P.

The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a reference manual for the commands specification for the DAKOTA software, providing input overviews, option descriptions, and example specifications.
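
The "interface between simulation codes and iterative analysis methods" is typically realized as a user-supplied analysis driver. The sketch below assumes DAKOTA's fork-interface convention of a parameters file containing value/descriptor pairs and a results file of response values; the parsing is deliberately simplified (a real driver would use DAKOTA's own interfacing utilities), and the Rosenbrock function stands in for an actual simulation:

```python
def read_params(path):
    """Collect 'value descriptor' pairs from a parameters file.
    Bookkeeping lines (counts, active-set entries) are also captured
    as pairs but simply go unused below -- a simplification."""
    params = {}
    with open(path) as fh:
        for line in fh:
            tokens = line.split()
            if len(tokens) == 2:
                try:
                    params[tokens[1]] = float(tokens[0])
                except ValueError:
                    pass  # no leading numeric value on this line; skip
    return params

def rosenbrock(x1, x2):
    """Stand-in 'simulation': the classic two-variable test function."""
    return 100.0 * (x2 - x1 ** 2) ** 2 + (1.0 - x1) ** 2

def run_driver(params_file, results_file):
    """Read the variables, run the 'simulation', and write the response
    back in a one-value-per-line results format."""
    p = read_params(params_file)
    f = rosenbrock(p["x1"], p["x2"])
    with open(results_file, "w") as fh:
        fh.write(f"{f:.10e} f\n")
```

The toolkit then drives this black box repeatedly, whether the outer method is optimization, sampling, or a parameter study; the driver never needs to know which.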

More Details