Publications

31 Results

Optimization of large-scale heterogeneous system-of-systems models

Gray, Genetha A.; Hart, William E.; Hough, Patricia D.; Parekh, Ojas D.; Phillips, Cynthia A.; Siirola, John D.; Swiler, Laura P.; Watson, Jean-Paul W.

Decision makers increasingly rely on large-scale computational models to simulate and analyze complex man-made systems. For example, computational models of national infrastructures are being used to inform government policy, assess economic and national security risks, evaluate infrastructure interdependencies, and plan for the growth and evolution of infrastructure capabilities. A major challenge for decision makers is the analysis of national-scale models that are composed of interacting systems: effective integration of system models is difficult, there are many parameters to analyze in these systems, and fundamental modeling uncertainties complicate analysis. This project is developing optimization methods to effectively represent and analyze large-scale heterogeneous system of systems (HSoS) models, which have emerged as a promising approach for describing such complex man-made systems. These optimization methods enable decision makers to predict future system behavior, manage system risk, assess tradeoffs between system criteria, and identify critical modeling uncertainties.

QMU as an approach to strengthening the predictive capabilities of complex models

Gray, Genetha A.; Boggs, Paul T.; Grace, Matthew G.

Complex systems are made up of multiple interdependent parts, and the behavior of the entire system cannot always be directly inferred from the behavior of the individual parts. They are nonlinear, and system responses are not necessarily additive. Examples of complex systems include energy, cyber and telecommunication infrastructures, human and animal social structures, and biological structures such as cells. To meet the goals of infrastructure development, maintenance, and protection for cyber-related complex systems, novel modeling and simulation (M&S) technology is needed. Sandia has shown success using M&S in the nuclear weapons (NW) program. However, complex systems represent a significant challenge and a relative departure from classical M&S exercises, and many of the scientific and mathematical M&S processes must be re-envisioned. Specifically, in the NW program, requirements and acceptable margins for performance, resilience, and security are well-defined and given quantitatively from the start. The Quantification of Margins and Uncertainties (QMU) process helps to assess whether or not these safety, reliability, and performance requirements have been met after a system has been developed. In this sense, QMU is used as a check that requirements have been met once the development process is completed. In contrast, performance requirements and margins may not have been defined a priori for many complex systems (e.g., the Internet, electrical distribution grids), particularly not in quantitative terms. This project addresses this fundamental difference by investigating the use of QMU at the start of the design process for complex systems. Three major tasks were completed. First, the characteristics of the cyber infrastructure problem were collected and considered in the context of QMU-based tools. Second, UQ methodologies for the quantification of model discrepancies were considered in the context of statistical models of cyber activity. Third, Bayesian methods for optimal testing in the QMU framework were developed. The completion of this project represents an increased understanding of how to apply and use the QMU process as a means for improving model predictions of the behavior of complex systems.
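As background on the QMU terminology used above, the process is often summarized by a margin-to-uncertainty ratio: a requirement is judged to be met with confidence when the margin between the best-estimate performance and the requirement threshold exceeds the combined uncertainty in that estimate. A minimal sketch of this bookkeeping, with purely illustrative numbers, is:

    # Minimal QMU bookkeeping sketch (illustrative values only, not from the report).
    # Margin M: distance from the best-estimate response to the requirement threshold.
    # Uncertainty U: combined uncertainty in that estimate.
    # Confidence factor CF = M / U; CF > 1 indicates the margin exceeds the uncertainty.

    def confidence_factor(best_estimate, threshold, uncertainty):
        margin = best_estimate - threshold  # assumes a "larger is better" requirement
        return margin / uncertainty

    cf = confidence_factor(best_estimate=12.0, threshold=9.0, uncertainty=2.0)
    print(f"CF = {cf:.2f}")  # CF = 1.50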

Hybrid optimization schemes for simulation-based problems

Gray, Genetha A.

The inclusion of computer simulations in the study and design of complex engineering systems has created a need for efficient approaches to simulation-based optimization. For example, in water resources management, optimization problems regularly consist of objective functions and constraints that rely on output from a PDE-based simulator. Various assumptions can be made to simplify either the objective function or the physical system so that gradient-based methods apply; however, realistic objective functions can be accommodated given the availability of derivative-free optimization methods. A wide variety of derivative-free methods exist, and each has both advantages and disadvantages. Therefore, to address such problems, we propose a hybrid approach that combines beneficial elements of multiple methods in order to search the design space more efficiently. Specifically, in this paper, we illustrate the capabilities of two novel algorithms: one that hybridizes pattern search optimization with Gaussian process emulation, and another that hybridizes pattern search with a genetic algorithm. We describe the hybrid methods and give numerical results for a hydrological application which illustrate that the hybrids find an optimal solution under conditions for which traditional optimization methods fail.
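A minimal sketch of the pattern-search/Gaussian-process hybrid idea follows, assuming NumPy and scikit-learn are available for the GP emulator. It is not the paper's implementation, only an illustration of how a surrogate fit to previously evaluated points can propose an extra trial point alongside the usual poll directions:

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    def hybrid_pattern_search(f, x0, step=0.5, budget=60, seed=0):
        # Coordinate pattern search augmented with one GP-suggested trial point per iteration.
        rng = np.random.default_rng(seed)
        n = len(x0)
        X, y = [np.array(x0, float)], [f(np.array(x0, float))]
        x_best, f_best = X[0], y[0]
        while len(y) < budget and step > 1e-4:
            trials = [x_best + step * d for d in np.vstack([np.eye(n), -np.eye(n)])]
            if len(y) > n + 1:  # once enough points exist, let the GP emulator propose a candidate
                gp = GaussianProcessRegressor(normalize_y=True).fit(np.array(X), np.array(y))
                cands = x_best + step * rng.uniform(-2.0, 2.0, size=(50, n))
                mu, sigma = gp.predict(cands, return_std=True)
                trials.append(cands[np.argmin(mu - sigma)])  # crude exploration/exploitation blend
            improved = False
            for t in trials:
                ft = f(t)
                X.append(t)
                y.append(ft)
                if ft < f_best:
                    x_best, f_best, improved = t, ft, True
            step = step if improved else step * 0.5  # contract after an unsuccessful poll
        return x_best, f_best

    # Toy usage with a cheap function standing in for an expensive simulator.
    x, fx = hybrid_pattern_search(lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2, [0.0, 0.0])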

Some guidance on preparing validation plans for the DART Full System Models

Gray, Genetha A.; Hills, Richard G.

Planning is an important part of computational model verification and validation (V&V) and the requisite planning document is vital for effectively executing the plan. The document provides a means of communicating intent to the typically large group of people, from program management to analysts to test engineers, who must work together to complete the validation activities. This report provides guidelines for writing a validation plan. It describes the components of such a plan and includes important references and resources. While the initial target audience is the DART Full System Model teams in the nuclear weapons program, the guidelines are generally applicable to other modeling efforts. Our goal in writing this document is to provide a framework for consistency in validation plans across weapon systems, different types of models, and different scenarios. Specific details contained in any given validation plan will vary according to application requirements and available resources.

SNL-NUMO collaborative : development of a deterministic site characterization tool using multi-model ranking and inference

Arnold, Bill W.; Gray, Genetha A.; Grace, Matthew G.; Ahlmann, Michael A.

Uncertainty in site characterization arises from a lack of data and knowledge about a site and includes uncertainty in the boundary conditions; uncertainty in the characteristics, location, and behavior of major features within an investigation area (e.g., major faults as barriers or conduits); uncertainty in the geologic structure; and differences in numerical implementation (e.g., 2-D versus 3-D, finite difference versus finite element, grid resolution, deterministic versus stochastic). Since the true condition at a site can never be known, selection of the best conceptual model is very difficult. In addition, limiting the understanding to a single conceptualization too early in the process, or before data can support that conceptualization, may lead to unwarranted confidence in a characterization as well as to data collection efforts and field investigations that are misdirected and/or redundant. Using a series of numerical modeling experiments, this project examined the application and use of information criteria within the site characterization process. The numerical experiments are based on models of varying complexity that were developed to represent one of two synthetically developed groundwater sites: (1) a fully hypothetical site representing a complex, multi-layer, multi-faulted system, and (2) a site based on the Horonobe site in northern Japan. Each synthetic site was modeled in detail to provide increasingly informative 'field' data to the representing numerical models over successive iterations. The representing numerical models were calibrated to the synthetic site data and then ranked and compared using several different information criteria approaches. Results show that, for the early phases of site characterization, low-parameterized models ranked highest while more complex models generally ranked lowest; predictive capabilities were also better with the low-parameterized models. For later iterations, when more data were available, the information criteria rankings tended to converge on the more highly parameterized models. Analysis of the numerical experiments suggests that information criteria rankings can be extremely useful for site characterization, but only when the rankings are placed in context and when the contribution of each bias term is understood.
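For readers unfamiliar with the information criteria mentioned above, the sketch below ranks calibrated models with AIC- and BIC-style scores written in terms of the residual sum of squares (valid under a Gaussian-residual assumption). The candidate models and residuals are placeholders, not the synthetic-site models from this study:

    import numpy as np

    def aic_bic(residuals, k):
        # AIC and BIC for a least-squares calibration with k adjustable parameters,
        # assuming independent Gaussian residuals.
        n = len(residuals)
        rss = float(np.sum(np.asarray(residuals) ** 2))
        aic = n * np.log(rss / n) + 2 * k
        bic = n * np.log(rss / n) + k * np.log(n)
        return aic, bic

    # Hypothetical calibration results: (model name, residuals at observation points, parameter count).
    rng = np.random.default_rng(0)
    fits = [
        ("low-parameterized layered model", rng.normal(0.0, 0.8, 40), 3),
        ("intermediate faulted model", rng.normal(0.0, 0.5, 40), 9),
        ("highly parameterized 3-D model", rng.normal(0.0, 0.45, 40), 20),
    ]

    for aic, name in sorted((aic_bic(res, k)[0], name) for name, res, k in fits):
        print(f"AIC = {aic:8.2f}  {name}")

A lower score indicates a better balance of fit quality against the number of calibrated parameters, which is the trade-off the rankings in the study exploit.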

Penetrator reliability investigation and design exploration : from conventional design processes to innovative uncertainty-capturing algorithms

Swiler, Laura P.; Hough, Patricia D.; Gray, Genetha A.; Chiesa, Michael L.; Heaphy, Robert T.; Thomas, Stephen W.; Trucano, Timothy G.

This project focused on research and algorithmic development in optimization under uncertainty (OUU) problems driven by earth penetrator (EP) designs. While taking into account uncertainty, we addressed three challenges in current simulation-based engineering design and analysis processes. The first challenge required leveraging small local samples, already constructed by optimization algorithms, to build effective surrogate models. We used Gaussian Process (GP) models to construct these surrogates. We developed two OUU algorithms using 'local' GPs (OUU-LGP) and one OUU algorithm using 'global' GPs (OUU-GGP) that appear competitive with or better than current methods. The second challenge was to develop a methodical design process based on multi-resolution, multi-fidelity models. We developed a Multi-Fidelity Bayesian Auto-regressive process (MF-BAP). The third challenge involved the development of tools that are computationally feasible and accessible. We created MATLAB® and initial DAKOTA implementations of our algorithms.
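To make the surrogate-based OUU idea concrete, the sketch below fits a Gaussian process to a small set of samples of a placeholder response and optimizes a robustness measure (mean plus a multiple of the standard deviation over the uncertain parameter) predicted by the surrogate. This is a generic illustration under stated assumptions, not the OUU-LGP or OUU-GGP algorithms themselves:

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    rng = np.random.default_rng(0)

    def expensive_response(x, u):
        # Placeholder for a costly simulation: design variable x, uncertain parameter u.
        return (x - 2.0) ** 2 + 0.5 * np.sin(3.0 * x) * u + 0.1 * u ** 2

    # Fit a GP surrogate on a small set of (x, u) samples, as an optimizer might reuse
    # points it has already evaluated.
    X = rng.uniform([0.0, -1.0], [4.0, 1.0], size=(30, 2))
    y = np.array([expensive_response(x, u) for x, u in X])
    gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)

    def robust_objective(x, lam=2.0, n_u=200):
        # Mean + lam * std of the surrogate prediction over sampled uncertain parameters.
        u = rng.normal(0.0, 0.5, n_u)
        pred = gp.predict(np.column_stack([np.full(n_u, x), u]))
        return pred.mean() + lam * pred.std()

    xs = np.linspace(0.0, 4.0, 81)
    x_star = xs[np.argmin([robust_objective(x) for x in xs])]
    print("robust design x* ~", round(float(x_star), 2))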

Developing a computationally efficient dynamic multilevel hybrid optimization scheme using multifidelity model interactions

Castro, Joseph P.; Gray, Genetha A.; Giunta, Anthony A.; Hough, Patricia D.

Many engineering application problems use optimization algorithms in conjunction with numerical simulators to search for solutions. The formulation of relevant objective functions and constraints dictates the possible optimization algorithms. Often, a gradient-based approach is not possible since objective functions and constraints can be nonlinear, nonconvex, non-differentiable, or even discontinuous, and the simulations involved can be computationally expensive. Moreover, computational efficiency and accuracy are desirable and also influence the choice of solution method. With the advent and increasing availability of massively parallel computers, computational speed has increased tremendously. Unfortunately, the numerical and model complexities of many problems still demand significant computational resources. Moreover, in optimization, these expenses can be a limiting factor since obtaining solutions often requires the completion of numerous computationally intensive simulations. Therefore, we propose a multifidelity optimization (MFO) algorithm designed to improve the computational efficiency of an optimization method for a wide range of applications. In developing the MFO algorithm, we take advantage of the interactions between multifidelity models to develop a dynamic, computationally efficient optimization algorithm. First, a direct search method is applied to the high-fidelity model over a reduced design space. In conjunction with this search, a specialized oracle is employed to map the design space of this high-fidelity model to that of a computationally cheaper low-fidelity model using space mapping techniques. Then, in the low-fidelity space, an optimum is obtained using gradient-based or non-gradient-based optimization, and it is mapped back to the high-fidelity space. In this paper, we describe the theory and implementation details of our MFO algorithm. We also demonstrate our MFO method on some example problems and on two applications: earth penetrators and groundwater remediation.
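The space-mapping step described above can be illustrated with a small sketch: a simple (here, purely additive) map between the high- and low-fidelity design spaces is calibrated so the two models agree on a few sampled points, the cheap model is optimized, and the optimum is mapped back. The model functions and the form of the map are assumptions for illustration, not the MFO implementation:

    import numpy as np
    from scipy.optimize import minimize

    def f_high(x):  # stand-in for an expensive high-fidelity simulation
        return (x[0] - 1.2) ** 2 + (x[1] + 0.7) ** 2

    def f_low(z):   # stand-in for a cheap low-fidelity model with a shifted optimum
        return (z[0] - 0.8) ** 2 + (z[1] + 1.1) ** 2

    # Step 1: sample a few high-fidelity points, as a direct search over a reduced space would.
    rng = np.random.default_rng(0)
    samples = rng.uniform(-2.0, 2.0, size=(8, 2))

    # Step 2: calibrate the map z = x + b so that f_low(x + b) matches f_high(x) on the samples.
    def map_misfit(b):
        return sum((f_low(x + b) - f_high(x)) ** 2 for x in samples)

    b = minimize(map_misfit, x0=np.zeros(2), method="Nelder-Mead").x

    # Step 3: optimize the cheap model in its own space, then map the optimum back.
    z_star = minimize(f_low, x0=np.zeros(2), method="Nelder-Mead").x
    x_star = z_star - b
    print("mapped-back design:", np.round(x_star, 3))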

Implementation of a data fusion algorithm for RODS, a real-time outbreak and disease surveillance system

Gray, Genetha A.; Brown, Douglas J.

Due to the nature of many infectious agents, such as anthrax, symptoms may either take several days to manifest or resemble those of less serious illnesses, leading to misdiagnosis. Thus, bioterrorism attacks that include the release of such agents are particularly dangerous and potentially deadly. For this reason, a system is needed for the quick and correct identification of disease outbreaks. The Real-time Outbreak and Disease Surveillance (RODS) system, initially developed by Carnegie Mellon University and the University of Pittsburgh, was created to meet this need. The RODS software implements different classifiers for pertinent health surveillance data in order to determine whether or not an outbreak has occurred. In an effort to improve the capability of RODS to detect outbreaks, we incorporate a data fusion method. Data fusion is used to improve the results of a single classification by combining the output of multiple classifiers. This paper documents the first stages of the development of a data fusion system that can combine the output of the classifiers included in RODS.
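As a simple illustration of the classifier-fusion idea, the sketch below averages the outbreak probabilities reported by several detectors, weighting each by its assumed historical reliability, and raises an alarm when the fused score crosses a threshold. The detector names, weights, and threshold are illustrative, not RODS components:

    # Hypothetical outbreak probabilities from several independent classifiers for one day.
    detector_scores = {
        "syndrome_counts": 0.62,
        "otc_med_sales": 0.48,
        "er_chief_complaints": 0.71,
    }

    # Weights reflecting assumed historical reliability of each classifier.
    weights = {"syndrome_counts": 0.4, "otc_med_sales": 0.2, "er_chief_complaints": 0.4}

    def fuse(scores, weights):
        # Weighted-average fusion of per-classifier outbreak probabilities.
        total = sum(weights[name] for name in scores)
        return sum(weights[name] * p for name, p in scores.items()) / total

    ALARM_THRESHOLD = 0.6
    fused = fuse(detector_scores, weights)
    print(f"fused score = {fused:.2f}, alarm = {fused >= ALARM_THRESHOLD}")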

Parameter identification for the electrical modeling of semiconductor bridges

Gray, Genetha A.

Semiconductor bridges (SCBs) are commonly used as initiators for explosive and pyrotechnic devices. Their advantages include reduced voltage and energy requirements and exceptional safety features. Moreover, the design of systems that implement SCBs can be expedited using electrical simulation software. Successful use of this software requires that certain parameters be correctly chosen. In this paper, we explain how these parameters can be identified using optimization. We describe the problem, focusing on the application of a direct optimization method for its solution, and present some numerical results.
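The parameter-identification formulation can be sketched as a small calibration problem: unknown model parameters are chosen so that the simulated response best matches measured data, using a derivative-free optimizer since the simulator is treated as a black box. The two-parameter response model and synthetic data below are placeholders, not the actual SCB electrical model:

    import numpy as np
    from scipy.optimize import minimize

    t = np.linspace(0.0, 5.0, 60)  # time samples (arbitrary units)

    def simulated_response(params, t):
        # Placeholder electrical response with two unknown parameters (amplitude, decay time).
        a, tau = params
        return a * np.exp(-t / tau)

    # Synthetic 'measured' trace generated from known parameters plus noise,
    # standing in for bench measurements.
    rng = np.random.default_rng(0)
    measured = simulated_response([3.0, 1.5], t) + rng.normal(0.0, 0.05, t.size)

    def misfit(params):
        return float(np.sum((simulated_response(params, t) - measured) ** 2))

    fit = minimize(misfit, x0=[1.0, 1.0], method="Nelder-Mead")  # derivative-free search
    print("identified parameters:", np.round(fit.x, 3))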

Model-building codes for membrane proteins

Brown, William M.; Faulon, Jean-Loup M.; Gray, Genetha A.; Hunt, Thomas W.; Schoeniger, Joseph S.; Slepoy, Alexander S.; Young, Malin M.

We have developed a novel approach to modeling the transmembrane spanning helical bundles of integral membrane proteins using only a sparse set of distance constraints, such as those derived from MS3-D, dipolar EPR, and FRET experiments. Algorithms have been written for searching the conformational space of membrane protein folds matching the set of distance constraints, which provides initial structures for local conformational searches. Local conformational search is achieved by optimizing these candidates against a custom penalty function that incorporates both measures derived from statistical analysis of solved membrane protein structures and distance constraints obtained from experiments. This results in refined helical bundles to which the interhelical loops and amino acid side-chains are added. Using a set of only 27 distance constraints extracted from the literature, our methods successfully recover the structure of dark-adapted rhodopsin to within 3.2 Å of the crystal structure.

APPSPACK 4.0 : asynchronous parallel pattern search for derivative-free optimization

Gray, Genetha A.

APPSPACK is software for solving unconstrained and bound constrained optimization problems. It implements an asynchronous parallel pattern search method that has been specifically designed for problems characterized by expensive function evaluations. Using APPSPACK to solve optimization problems has several advantages: No derivative information is needed; the procedure for evaluating the objective function can be executed via a separate program or script; the code can be run in serial or parallel, regardless of whether or not the function evaluation itself is parallel; and the software is freely available. We describe the underlying algorithm, data structures, and features of APPSPACK version 4.0 as well as how to use and customize the software.
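The sketch below illustrates the asynchronous flavor of parallel pattern search on a toy problem: poll points are farmed out to worker processes, and the search reacts to each evaluation as it completes rather than waiting for the whole batch. It is a much-simplified stand-in for what APPSPACK does, not the APPSPACK implementation or its interface:

    import numpy as np
    from concurrent.futures import FIRST_COMPLETED, ProcessPoolExecutor, wait

    def objective(x):
        # Stand-in for an expensive black-box simulation (e.g., run via a separate script).
        return (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2

    def async_pattern_search(x0, step=1.0, budget=80, workers=4):
        x_best = np.array(x0, float)
        f_best = objective(x_best)
        dirs = np.vstack([np.eye(len(x0)), -np.eye(len(x0))])
        evals = 1
        with ProcessPoolExecutor(max_workers=workers) as pool:
            pending = {pool.submit(objective, x_best + step * d): x_best + step * d for d in dirs}
            while pending and evals < budget:
                done, _ = wait(pending, return_when=FIRST_COMPLETED)
                for fut in done:  # react to whichever evaluations finish first
                    x_trial = pending.pop(fut)
                    evals += 1
                    if fut.result() < f_best:
                        x_best, f_best = x_trial, fut.result()
                        trial_step = step        # success: poll around the new best point
                    else:
                        trial_step = step * 0.5  # failure: submit a contracted trial
                    d = dirs[evals % len(dirs)]
                    pending[pool.submit(objective, x_best + trial_step * d)] = x_best + trial_step * d
        return x_best, f_best

    if __name__ == "__main__":
        print(async_pattern_search([0.0, 0.0]))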

Optimal bundling of transmembrane helices using sparse distance constraints

Protein Science

Sale, Ken; Faulon, Jean-Loup M.; Gray, Genetha A.; Schoeniger, Joseph S.; Young, Malin M.

We present a two-step approach to modeling the transmembrane spanning helical bundles of integral membrane proteins using only sparse distance constraints, such as those derived from chemical cross-linking, dipolar EPR, and FRET experiments. In Step 1, using an algorithm we developed, the conformational space of membrane protein folds matching a set of distance constraints is explored to provide initial structures for local conformational searches. In Step 2, these structures are refined against a custom penalty function that incorporates both measures derived from statistical analysis of solved membrane protein structures and distance constraints obtained from experiments. We begin by describing the statistical analysis of the solved membrane protein structures from which the theoretical portion of the penalty function was derived. We then describe the penalty function and, using a set of six test cases, demonstrate that it is capable of distinguishing helical bundles that are close to the native bundle from those that are far from the native bundle. Finally, using a set of only 27 distance constraints extracted from the literature, we show that our method successfully recovers the structure of dark-adapted rhodopsin to within 3.2 Å of the crystal structure.
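A greatly simplified sketch of the kind of penalty function described above is shown below: it scores a candidate bundle by how badly it violates the sparse experimental distance constraints, plus a crude packing term standing in for the statistically derived component. The coordinates, constraint list, weights, and the packing term are illustrative assumptions, not the published scoring function:

    import numpy as np

    def bundle_penalty(coords, constraints, ideal_radius=5.0, w_stat=0.1):
        # Score a candidate bundle (lower is better).
        # coords      : dict mapping residue label -> 3-D coordinate
        # constraints : list of (residue_i, residue_j, max_distance) from cross-linking/EPR/FRET
        exp_term = 0.0
        for i, j, d_max in constraints:
            d = np.linalg.norm(coords[i] - coords[j])
            exp_term += max(0.0, d - d_max) ** 2  # penalize only violated upper bounds

        # Statistical term (placeholder): pull labeled sites toward a typical bundle radius.
        pts = np.array(list(coords.values()))
        radii = np.linalg.norm(pts - pts.mean(axis=0), axis=1)
        stat_term = float(np.mean((radii - ideal_radius) ** 2))

        return exp_term + w_stat * stat_term

    # Toy usage with three labeled residues and two hypothetical constraints.
    coords = {"A65": np.array([0.0, 0.0, 0.0]),
              "B120": np.array([8.0, 1.0, 0.0]),
              "C210": np.array([4.0, 9.0, 0.0])}
    constraints = [("A65", "B120", 10.0), ("B120", "C210", 8.0)]
    print(round(bundle_penalty(coords, constraints), 3))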
