Publications

44 Results

Digital quantum simulation of molecular dynamics and control

Physical Review Research

Magann, Alicia B.; Grace, Matthew G.; Rabitz, Herschel A.; Sarovar, Mohan S.

Optimally shaped electromagnetic fields have the capacity to coherently control the dynamics of quantum systems and thus offer a promising means for controlling molecular transformations relevant to chemical, biological, and materials applications. Currently, advances in this area are hindered by the prohibitive cost of the quantum dynamics simulations needed to explore the principles and possibilities of molecular control. However, the emergence of nascent quantum-computing devices suggests that efficient simulations of quantum dynamics may be on the horizon. In this article, we study how quantum computers could be employed to design optimally shaped fields to control molecular systems. We introduce a hybrid algorithm that utilizes a quantum computer for simulating the field-induced quantum dynamics of a molecular system in polynomial time, in combination with a classical optimization approach for updating the field. Qubit encoding methods relevant for molecular control problems are described, and procedures for simulating the quantum dynamics and obtaining the simulation results are discussed. Numerical illustrations are then presented that explicitly treat paradigmatic vibrational and rotational control problems, and also consider how optimally shaped fields could be used to elucidate the mechanisms of energy transfer in light-harvesting complexes. Resource estimates, as well as a numerical assessment of the impact of hardware noise and the prospects of near-term hardware implementations, are provided for the latter task.
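
The hybrid algorithm described in the abstract alternates between a quantum simulation of the field-driven dynamics and a classical update of the field. Below is a minimal classical sketch of that loop, assuming a two-level stand-in for the molecule and replacing the quantum-processor propagation step with direct numerical propagation; the model, parameters, and function names are illustrative, not from the paper.

```python
# Hybrid-loop sketch: a classical simulator stands in for the quantum
# processor, and a classical optimizer updates the control field.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
H0 = np.diag([0.0, 1.0])                    # drift Hamiltonian (two-level stand-in)
mu = np.array([[0.0, 1.0], [1.0, 0.0]])     # dipole operator
psi0 = np.array([1.0, 0.0], dtype=complex)  # initial state |0>
target = np.array([0.0, 1.0], dtype=complex)
dt, n_steps = 0.05, 200                     # piecewise-constant field discretization

def propagate(field):
    """Piecewise-constant propagation; on hardware, this step would be the
    quantum simulation of the field-induced dynamics."""
    psi = psi0.copy()
    for eps in field:
        w, v = np.linalg.eigh(H0 - eps * mu)             # H is Hermitian
        psi = v @ (np.exp(-1j * w * dt) * (v.conj().T @ psi))
    return psi

def infidelity(field):
    return 1.0 - abs(np.vdot(target, propagate(field))) ** 2

# Classical outer loop: update the field to minimize the control objective.
result = minimize(infidelity, x0=0.5 * rng.normal(size=n_steps), method="L-BFGS-B")
print("final infidelity:", result.fun)
```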


Probabilistic models for feedback systems

Grace, Matthew G.

In previous work, we developed a Bayesian-based methodology to analyze the reliability of hierarchical systems. The output of the procedure is a statistical distribution of the reliability, which allows many questions to be answered. The principal advantage of the approach is that, along with an estimate of the reliability, it provides statements of confidence in that estimate. The model is quite general: it allows general representations of all of the distributions involved, incorporates prior knowledge into the models, allows errors in the 'engineered' nodes of a system to be determined by the data, and supports the determination of optimal testing strategies. In this report, we provide the preliminary steps necessary to extend this approach to systems with feedback. Feedback is an essential component of 'complexity' and poses interesting challenges in modeling the time-dependent action of a feedback loop. We provide a mechanism for doing this and analyze a simple case. We then consider extensions to more interesting examples in which local control affects the entire system. Finally, we discuss the status of the research.
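
As a concrete illustration of the underlying idea (a posterior distribution of reliability rather than a point estimate), here is a minimal sketch assuming simple pass/fail data and a conjugate Beta prior; the prior and test data are invented for illustration and are not from the report.

```python
# Bayesian reliability sketch: the posterior is a full distribution,
# from which confidence statements follow directly.
import numpy as np

rng = np.random.default_rng(0)
a_prior, b_prior = 1.0, 1.0                 # uniform Beta(1, 1) prior on reliability
successes, failures = 47, 3                 # hypothetical pass/fail test results

# Conjugate update: the posterior is Beta(a + successes, b + failures).
post = rng.beta(a_prior + successes, b_prior + failures, size=100_000)

print("posterior mean reliability:", post.mean())
print("90% credible interval:", np.percentile(post, [5, 95]))
print("P(reliability > 0.9):", (post > 0.9).mean())
```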


QMU as an approach to strengthening the predictive capabilities of complex models

Gray, Genetha A.; Boggs, Paul T.; Grace, Matthew G.

Complex systems are made up of multiple interdependent parts, and the behavior of the entire system cannot always be directly inferred from the behavior of the individual parts. They are nonlinear, and system responses are not necessarily additive. Examples of complex systems include energy, cyber, and telecommunication infrastructures; human and animal social structures; and biological structures such as cells. To meet the goals of infrastructure development, maintenance, and protection for cyber-related complex systems, novel modeling and simulation (M&S) technology is needed. Sandia has shown success using M&S in the nuclear weapons (NW) program. However, complex systems represent a significant challenge and a departure from classical M&S exercises, and many of the scientific and mathematical M&S processes must be re-envisioned. Specifically, in the NW program, requirements and acceptable margins for performance, resilience, and security are well-defined and given quantitatively from the start. The Quantification of Margins and Uncertainties (QMU) process helps to assess whether these safety, reliability, and performance requirements have been met after a system has been developed. In this sense, QMU is used as a check that requirements have been met once the development process is completed. In contrast, performance requirements and margins may not have been defined a priori for many complex systems (e.g., the Internet, electrical distribution grids), particularly not in quantitative terms. This project addresses this fundamental difference by investigating the use of QMU at the start of the design process for complex systems. Three major tasks were completed. First, the characteristics of the cyber infrastructure problem were collected and considered in the context of QMU-based tools. Second, uncertainty quantification (UQ) methodologies for the quantification of model discrepancies were considered in the context of statistical models of cyber activity. Third, Bayesian methods for optimal testing in the QMU framework were developed. The completion of this project represents an increased understanding of how to apply and use the QMU process as a means for improving model predictions of the behavior of complex systems.
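
For readers unfamiliar with QMU, the sketch below illustrates its core metric, a margin-to-uncertainty ratio, on hypothetical simulation output; the requirement value, response model, and 2-sigma uncertainty convention are assumptions chosen for illustration.

```python
# QMU sketch: margin M between a requirement and the best-estimate
# response, divided by the uncertainty U in that response.
import numpy as np

rng = np.random.default_rng(1)
requirement = 100.0                          # performance must stay below this bound
response = rng.normal(80.0, 6.0, 10_000)     # hypothetical simulated system responses

M = requirement - response.mean()            # margin: requirement minus best estimate
U = 2.0 * response.std()                     # uncertainty: here a 2-sigma band

print(f"margin M = {M:.1f}, uncertainty U = {U:.1f}, confidence ratio M/U = {M / U:.2f}")
# M/U > 1 indicates the requirement is met with margin exceeding the uncertainty.
```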


Robustness of optimally controlled unitary quantum gates

Grace, Matthew G.

A unitary quantum gate is the basic functioning element of a quantum computer. Summary of results: (1) robustness of a general n-qubit gate: 1 - F ∝ 2^n; (2) robustness of a universal gate with complete isolation of one- and two-qubit subgates: 1 - F ∝ n; and (3) robustness of a universal gate with small unwanted couplings between the qubits is unclear.
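
The fidelity F referenced in the summary can be made concrete with a small numerical sketch; the overlap-fidelity definition |Tr(U_target^† U_actual)|^2 / d^2 and the random-perturbation noise model below are common conventions, assumed here for illustration rather than taken from the report.

```python
# Gate robustness sketch: infidelity 1 - F of a perturbed n-qubit unitary.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(2)

def gate_infidelity(n, noise=1e-3):
    """Infidelity 1 - F of an n-qubit gate under a random Hermitian
    perturbation of strength `noise`; identity is the stand-in target."""
    d = 2 ** n
    U_target = np.eye(d, dtype=complex)
    A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    H_err = noise * (A + A.conj().T) / 2.0             # random Hermitian error
    U_actual = expm(-1j * H_err) @ U_target
    F = abs(np.trace(U_target.conj().T @ U_actual)) ** 2 / d ** 2
    return 1.0 - F

for n in range(1, 6):
    print(f"n = {n}: 1 - F = {gate_infidelity(n):.2e}")
# For this noise model the infidelity grows roughly like 2**n at fixed noise.
```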


Practical reliability and uncertainty quantification in complex systems : final report

Grace, Matthew G.; Red-Horse, John R.; Pebay, Philippe P.; Ringland, James T.; Zurn, Rena M.; Diegert, Kathleen V.

The purpose of this project was to investigate the use of Bayesian methods for the estimation of the reliability of complex systems. The goals were to find methods for dealing with continuous data, rather than simple pass/fail data; to avoid assumptions of specific probability distributions, especially Gaussian (normal) distributions; to compute not only an estimate of the reliability of the system, but also a measure of the confidence in that estimate; to develop procedures to address time-dependent or aging aspects in such systems; and to use these models and results to derive optimal testing strategies. The system is assumed to be a system of systems, i.e., a system with discrete components that are themselves systems. Furthermore, the system is 'engineered' in the sense that each node is designed to do something and that we have a mathematical description of that process. In the time-dependent case, the assumption is that we have a general, nonlinear, time-dependent function describing the process. The major results of the project are described in this report. In summary, we developed a sophisticated mathematical framework based on modern probability theory and Bayesian analysis. This framework encompasses all aspects of epistemic uncertainty and easily incorporates steady-state and time-dependent systems. Using Markov chain Monte Carlo (MCMC) methods, we devised a computational strategy for general probability density estimation in the steady-state case. This enabled us to compute a distribution of the reliability from which many questions, including confidence, could be addressed. We then extended this to the time domain and implemented procedures to estimate the reliability over time, including the use of the method to predict the reliability at a future time. Finally, we used certain aspects of Bayesian decision analysis to create a novel method for determining an optimal testing strategy; e.g., we can estimate the 'best' location for the next test to minimize the risk of making a wrong decision about the fitness of a system. We conclude this report by proposing additional fruitful areas of research.
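
A minimal sketch of the MCMC-based density estimation idea follows, assuming hypothetical continuous lifetime data, an exponential failure model, and a Gamma prior; none of these modeling choices are from the report.

```python
# Random-walk Metropolis sketch: sample a posterior over a failure rate
# from continuous data, then derive a distribution of the reliability.
import numpy as np

rng = np.random.default_rng(3)
lifetimes = rng.exponential(scale=50.0, size=20)     # continuous 'test' data

def log_post(lam):
    """Log posterior for an exponential failure model, Gamma(2, 0.01) prior."""
    if lam <= 0:
        return -np.inf
    log_like = len(lifetimes) * np.log(lam) - lam * lifetimes.sum()
    log_prior = np.log(lam) - 0.01 * lam             # up to an additive constant
    return log_like + log_prior

samples, lam = [], 0.02                              # initial guess for failure rate
for _ in range(20_000):
    prop = lam + rng.normal(0.0, 0.005)
    if np.log(rng.random()) < log_post(prop) - log_post(lam):
        lam = prop
    samples.append(lam)

# Distribution of reliability at mission time t = 40 (R = exp(-lam * t)),
# from which confidence statements follow directly.
R = np.exp(-np.array(samples[5_000:]) * 40.0)
print("mean reliability:", R.mean(), " 90% interval:", np.percentile(R, [5, 95]))
```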


Sandia National Laboratories environmental fluid dynamics code : sediment transport user manual

Thanh, Phi X.; Grace, Matthew G.

This document describes the sediment transport subroutines and input files for the Sandia National Laboratories Environmental Fluid Dynamics Code (SNL-EFDC). Detailed descriptions of the input files containing data from Sediment Erosion at Depth flume (SEDflume) measurements are provided, along with a description of the source code implementing sediment transport. Both the theoretical treatment of sediment transport employed in SNL-EFDC and the source code are described. This user manual is meant to be used in conjunction with the EFDC manual (Hamrick 1996), as it makes no reference to the hydrodynamics in EFDC. Through this document, the authors aim to provide the information needed for new users to implement sediment transport in EFDC and obtain a clear understanding of the source code.
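
As context for the SEDflume inputs, the sketch below shows one common power-law erosion formulation that SEDflume measurements are often fit to; the coefficients, layer structure, and function names are invented placeholders, not values or code from SNL-EFDC.

```python
# Per-layer power-law parameters: (A, n, critical shear stress tau_cr [Pa]).
# Values are invented placeholders, not SEDflume data from the manual.
layers = {
    0: (1.0e-4, 2.2, 0.20),   # surface layer
    1: (4.0e-5, 2.5, 0.35),   # deeper, more consolidated layer
}

def erosion_rate(layer, tau):
    """Erosion rate E = A * tau**n in cm/s; zero below the critical shear stress."""
    A, n, tau_cr = layers[layer]
    return A * tau ** n if tau > tau_cr else 0.0

for tau in (0.1, 0.5, 1.0):
    print(f"tau = {tau:.1f} Pa -> E = {erosion_rate(0, tau):.2e} cm/s")
```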


SNL-NUMO collaborative : development of a deterministic site characterization tool using multi-model ranking and inference

Arnold, Bill W.; Gray, Genetha A.; Grace, Matthew G.; Ahlmann, Michael A.

Uncertainty in site characterization arises from a lack of data and knowledge about a site and includes uncertainty in the boundary conditions; uncertainty in the characteristics, location, and behavior of major features within an investigation area (e.g., major faults as barriers or conduits); uncertainty in the geologic structure; and differences in numerical implementation (e.g., 2-D versus 3-D, finite difference versus finite element, grid resolution, deterministic versus stochastic). Since the true condition at a site can never be known, selection of the best conceptual model is very difficult. In addition, limiting the understanding to a single conceptualization too early in the process, or before data can support that conceptualization, may lead to unwarranted confidence in a characterization, as well as to data collection efforts and field investigations that are misdirected and/or redundant. Using a series of numerical modeling experiments, this project examined the application and use of information criteria within the site characterization process. The numerical experiments are based on models of varying complexity that were developed to represent one of two synthetic groundwater sites: (1) a fully hypothetical site representing a complex, multi-layer, multi-faulted setting, and (2) a site based on the Horonobe site in northern Japan. Each synthetic site was modeled in detail to provide increasingly informative 'field' data over successive iterations to the representing numerical models. The representing numerical models were calibrated to the synthetic site data and then ranked and compared using several different information criteria. Results show that, for the early phases of site characterization, low-parameterized models ranked highest, while more complex models generally ranked lowest. In addition, predictive capabilities were better with the low-parameterized models. For later iterations, when more data were available, the information criteria rankings tended to converge on the more highly parameterized models. Analysis of the numerical experiments suggests that information criteria rankings can be extremely useful for site characterization, but only when the rankings are placed in context and when the contribution of each bias term is understood.
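
The information-criteria ranking described above can be sketched with the standard AIC/BIC formulas for a least-squares calibration; the models, residuals, and parameter counts below are hypothetical, not from the study.

```python
# Model-ranking sketch: score competing calibrated models by fit
# penalized for parameter count, then compare.
import numpy as np

def aic_bic(rss, k, n):
    """AIC and BIC for a least-squares calibration with Gaussian errors:
    rss = residual sum of squares, k = parameter count, n = observations."""
    log_like = -0.5 * n * (np.log(2.0 * np.pi * rss / n) + 1.0)
    return 2 * k - 2 * log_like, k * np.log(n) - 2 * log_like

n_obs = 30                                   # e.g. head measurements at one iteration
models = {"2-layer": (4.1, 3),               # (rss, parameter count), hypothetical
          "multi-fault": (2.8, 9),
          "full 3-D": (2.5, 20)}

for name, (rss, k) in models.items():
    aic, bic = aic_bic(rss, k, n_obs)
    print(f"{name:12s} AIC = {aic:7.1f}  BIC = {bic:7.1f}")
# With few observations the penalty terms favor the low-parameter model;
# as data accumulate, better-fitting complex models can overtake it.
```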
