Publications

Results 201–235 of 235

Efficient uncertainty quantification methodologies for high-dimensional climate land models

Sargsyan, Khachik S.; Safta, Cosmin S.; Berry, Robert D.; Ray, Jaideep R.; Debusschere, Bert D.; Najm, H.N.

In this report, we proposed, examined, and implemented approaches for performing efficient uncertainty quantification (UQ) in climate land models. Specifically, we applied a Bayesian compressive sensing framework to polynomial chaos spectral expansions, enhanced it with an iterative basis-reduction algorithm, and investigated the results on test models as well as on the Community Land Model (CLM). Furthermore, we discussed the construction of efficient quadrature rules for forward propagation of uncertainties from a high-dimensional, constrained input space to output quantities of interest. The work lays the groundwork for efficient forward UQ for high-dimensional, strongly non-linear, and computationally costly climate models. Moreover, to investigate parameter inference approaches, we applied two variants of the Markov chain Monte Carlo (MCMC) method to a soil moisture dynamics submodel of the CLM. The evaluation of these algorithms gave us a good foundation for further building out the Bayesian calibration framework toward the goal of robust component-wise calibration.
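
To illustrate the iterative basis-reduction idea described in the abstract, here is a minimal NumPy sketch that recovers a sparse polynomial chaos (PC) expansion by repeatedly fitting and discarding negligible terms. This is a simplified thresholded least-squares stand-in, not the report's Bayesian compressive sensing algorithm; the toy model, degree, and threshold are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Legendre polynomials on [-1, 1] via the three-term recurrence
def legendre(n, x):
    if n == 0:
        return np.ones_like(x)
    if n == 1:
        return x
    return ((2 * n - 1) * x * legendre(n - 1, x)
            - (n - 1) * legendre(n - 2, x)) / n

# 2-D tensor-product PC basis up to total degree 3 (10 terms)
terms = [(i, j) for i in range(4) for j in range(4) if i + j <= 3]

def design_matrix(X, active):
    return np.column_stack([legendre(i, X[:, 0]) * legendre(j, X[:, 1])
                            for i, j in (terms[a] for a in active)])

# Toy "model output": sparse in the PC basis (only 3 active terms)
X = rng.uniform(-1, 1, size=(200, 2))
y = 1.0 + 0.5 * legendre(2, X[:, 0]) - 0.8 * legendre(1, X[:, 1])

# Iterative basis reduction: fit, drop near-zero coefficients, refit
active = list(range(len(terms)))
for _ in range(10):
    coeffs, *_ = np.linalg.lstsq(design_matrix(X, active), y, rcond=None)
    keep = [a for a, c in zip(active, coeffs) if abs(c) > 1e-8]
    if keep == active:
        break
    active = keep

surviving = sorted(terms[a] for a in active)
```

In this noiseless toy setting, the loop prunes the basis down to exactly the multi-indices that generated the data; a Bayesian treatment would additionally attach posterior uncertainty to each retained coefficient.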

Real-time characterization of partially observed epidemics using surrogate models

Safta, Cosmin S.; Ray, Jaideep R.; Sargsyan, Khachik S.; Lefantzi, Sophia L.

We present a statistical method, predicated on the use of surrogate models, for the 'real-time' characterization of partially observed epidemics. Observations consist of counts of symptomatic patients, diagnosed with the disease, that may be available in the early epoch of an ongoing outbreak. Characterization, in this context, refers to estimation of epidemiological parameters that can be used to provide short-term forecasts of the ongoing epidemic, as well as to provide gross information on the dynamics of the etiologic agent in the affected population, e.g., the time-dependent infection rate. The characterization problem is formulated as a Bayesian inverse problem, and epidemiological parameters are estimated as distributions using a Markov chain Monte Carlo (MCMC) method, thus quantifying the uncertainty in the estimates. In some cases, the inverse problem can be computationally expensive, primarily due to the epidemic simulator used inside the inversion algorithm. We present a method, based on replacing the epidemiological model with computationally inexpensive surrogates, that can reduce the computational time to minutes without a significant loss of accuracy. The surrogates are created by projecting the output of an epidemiological model on a set of polynomial chaos bases; thereafter, computations involving the surrogate model reduce to evaluations of a polynomial. We find that the epidemic characterizations obtained with the surrogate models are very close to those obtained with the original model. We also find that the number of projections required to construct a surrogate model is O(10)-O(10^2) smaller than the number of samples required by the MCMC to construct a stationary posterior distribution; thus, depending upon the epidemiological models in question, it may be possible to forgo the offline creation and caching of surrogate models prior to their use in an inverse problem. The technique is demonstrated on synthetic data as well as observations from the 1918 influenza pandemic collected at Camp Custer, Michigan.
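
The projection step behind such a surrogate can be sketched in a few lines. Below, a cheap analytic function stands in for the expensive epidemic simulator (an assumption for illustration only); its output is projected onto Legendre polynomial chaos bases by Gauss-Legendre quadrature, after which every surrogate evaluation is just polynomial arithmetic.

```python
import numpy as np

# Stand-in for an expensive epidemic simulator: a smooth map from one
# uncertain parameter q in [-1, 1] to an observable (assumption for
# illustration; the real simulator would be an epidemiological model).
def simulator(q):
    return np.exp(q)

def P(k, x):  # k-th Legendre polynomial
    return np.polynomial.legendre.legval(x, np.eye(k + 1)[k])

# Non-intrusive projection: c_k = (2k+1)/2 * integral of f(q) P_k(q) dq,
# computed with Gauss-Legendre quadrature (uniform measure on [-1, 1])
order = 8
nodes, weights = np.polynomial.legendre.leggauss(order + 1)
f_nodes = simulator(nodes)              # the only "simulator" evaluations
coeffs = [(2 * k + 1) / 2 * np.sum(weights * f_nodes * P(k, nodes))
          for k in range(order + 1)]

# Evaluating the surrogate is now just a polynomial evaluation
def surrogate(q):
    return np.polynomial.legendre.legval(q, coeffs)

q_grid = np.linspace(-1, 1, 101)
max_err = np.max(np.abs(surrogate(q_grid) - simulator(q_grid)))
```

Only order+1 simulator runs are needed to build the surrogate, which is the source of the O(10)-O(10^2) savings relative to running the simulator inside every MCMC step.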

Uncertainty quantification given discontinuous climate model response and a limited number of model runs

Sargsyan, Khachik S.; Safta, Cosmin S.; Debusschere, Bert D.; Najm, H.N.

Uncertainty quantification in complex climate models is challenged by the sparsity of available climate model predictions due to the high computational cost of model runs. Another feature that prevents classical uncertainty analysis from being readily applicable is bifurcative behavior in climate model response with respect to certain input parameters. A typical example is the Atlantic Meridional Overturning Circulation. The predicted maximum overturning stream function exhibits discontinuity across a curve in the space of two uncertain parameters, namely climate sensitivity and CO2 forcing. We outline a methodology for uncertainty quantification given discontinuous model response and a limited number of model runs. Our approach is two-fold. First, we detect the discontinuity with Bayesian inference, thus obtaining a probabilistic representation of the discontinuity curve shape and location for arbitrarily distributed input parameter values. Then, we construct spectral representations of uncertainty, using Polynomial Chaos (PC) expansions on either side of the discontinuity curve, leading to an averaged-PC representation of the forward model that allows efficient uncertainty quantification. The approach is enabled by a Rosenblatt transformation that maps each side of the discontinuity to regular domains where desirable orthogonality properties for the spectral bases hold. We obtain PC modes by either orthogonal projection or Bayesian inference, and argue for a hybrid approach that targets a balance between the accuracy provided by the orthogonal projection and the flexibility provided by the Bayesian inference, where the latter allows obtaining reasonable expansions without extra forward model runs. The model output, and its associated uncertainty at specific design points, are then computed by taking an ensemble average over PC expansions corresponding to possible realizations of the discontinuity curve. The methodology is tested on synthetic examples of discontinuous model data with adjustable sharpness and structure.
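
A toy comparison shows why separate expansions on either side of the discontinuity outperform a single global one. The jump location and the smooth branches below are assumptions chosen for illustration, not climate model output.

```python
import numpy as np

# Toy response with a jump at q = 0.3 (in practice the jump location would
# itself be inferred, with uncertainty, via Bayesian inference)
jump = 0.3
def model(q):
    return np.where(q < jump, np.sin(q), 2.0 + np.cos(q))

q = np.linspace(-1, 1, 401)
fit = lambda x, deg=8: np.polynomial.legendre.Legendre.fit(x, model(x), deg)

# One global degree-8 expansion vs one expansion per side of the jump
err_global = np.max(np.abs(fit(q)(q) - model(q)))
left, right = q[q < jump], q[q >= jump]
piecewise = np.where(q < jump, fit(left)(q), fit(right)(q))
err_piecewise = np.max(np.abs(piecewise - model(q)))
```

The global expansion oscillates near the jump (the familiar Gibbs-type behavior), while each one-sided expansion sees only a smooth function and converges spectrally.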

Uncertainty quantification for large-scale ocean circulation predictions

Safta, Cosmin S.; Sargsyan, Khachik S.; Debusschere, Bert D.; Najm, H.N.

Uncertainty quantification in climate models is challenged by the sparsity of the available climate data due to the high computational cost of the model runs. Another feature that prevents classical uncertainty analyses from being easily applicable is the bifurcative behavior in the climate data with respect to certain parameters. A typical example is the Meridional Overturning Circulation in the Atlantic Ocean. The maximum overturning stream function exhibits a discontinuity across a curve in the space of two uncertain parameters, namely climate sensitivity and CO2 forcing. We develop a methodology that performs uncertainty quantification in the presence of limited data that have a discontinuous character. Our approach is two-fold. First, we detect the discontinuity location with Bayesian inference, thus obtaining a probabilistic representation of the discontinuity curve location in the presence of arbitrarily distributed input parameter values. Second, we develop a spectral approach that relies on Polynomial Chaos (PC) expansions on each side of the discontinuity curve, leading to an averaged-PC representation of the forward model that allows efficient uncertainty quantification and propagation. The methodology is tested on synthetic examples of discontinuous data with adjustable sharpness and structure.

Uncertainty quantification of cinematic imaging for development of predictive simulations of turbulent combustion

Frank, Jonathan H.; Lawson, Matthew L.; Sargsyan, Khachik S.; Debusschere, Bert D.; Najm, H.N.

Recent advances in high frame rate complementary metal-oxide-semiconductor (CMOS) cameras coupled with high repetition rate lasers have enabled laser-based imaging measurements of the temporal evolution of turbulent reacting flows. This measurement capability provides new opportunities for understanding the dynamics of turbulence-chemistry interactions, which is necessary for developing predictive simulations of turbulent combustion. However, quantitative imaging measurements using high frame rate CMOS cameras require careful characterization of their noise, non-linear response, and variations in this response from pixel to pixel. We develop a noise model and calibration tools to mitigate these problems and to enable quantitative use of CMOS cameras. We have demonstrated proof of principle for image de-noising using both wavelet methods and Bayesian inference. The results offer new approaches for quantitative interpretation of imaging measurements from noisy data acquired with non-linear detectors. These approaches are potentially useful in many areas of scientific research that rely on quantitative imaging measurements.
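
As a flavor of wavelet-based de-noising in this spirit, here is a generic one-level Haar soft-thresholding sketch on a 1-D signal. It is an illustrative textbook technique, not the noise model or calibration tools developed in this work; the signal, noise level, and threshold are assumptions.

```python
import numpy as np

# One-level Haar wavelet transform with soft thresholding of the detail band
def haar_denoise(signal, thresh):
    a = (signal[0::2] + signal[1::2]) / np.sqrt(2)   # approximation coeffs
    d = (signal[0::2] - signal[1::2]) / np.sqrt(2)   # detail coeffs
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)  # soft threshold
    out = np.empty_like(signal)
    out[0::2] = (a + d) / np.sqrt(2)                 # inverse transform
    out[1::2] = (a - d) / np.sqrt(2)
    return out

rng = np.random.default_rng(4)
x = np.repeat([1.0, 3.0, 2.0, 5.0], 64)       # piecewise-constant "image row"
noisy = x + 0.2 * rng.normal(size=x.size)
denoised = haar_denoise(noisy, thresh=0.5)

mse_noisy = np.mean((noisy - x) ** 2)
mse_denoised = np.mean((denoised - x) ** 2)
```

Thresholding suppresses the noise-dominated detail coefficients while leaving the piecewise-constant structure largely intact, reducing the mean-squared error.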

Uncertainty quantification in the presence of limited climate model data with discontinuities

Sargsyan, Khachik S.; Safta, Cosmin S.; Debusschere, Bert D.; Najm, H.N.

Uncertainty quantification in climate models is challenged by the prohibitive cost of a large number of model evaluations for sampling. Another feature that often prevents classical uncertainty analysis from being readily applicable is the bifurcative behavior in the climate data with respect to certain parameters. A typical example is the Meridional Overturning Circulation in the Atlantic Ocean. The maximum overturning stream function exhibits a discontinuity across a curve in the space of two uncertain parameters, namely climate sensitivity and CO2 forcing. In order to propagate uncertainties from model parameters to model output we use polynomial chaos (PC) expansions to represent the maximum overturning stream function in terms of the uncertain climate sensitivity and CO2 forcing parameters. Since the spectral methodology assumes a certain degree of smoothness, the presence of discontinuities suggests that separate PC expansions on each side of the discontinuity will lead to more accurate descriptions of the climate model output compared to global PC expansions. We propose a methodology that first finds a probabilistic description of the discontinuity given a number of data points. Assuming the discontinuity curve is a polynomial, the algorithm is based on Bayesian inference of its coefficients. Markov chain Monte Carlo sampling is used to obtain joint distributions for the polynomial coefficients, effectively parameterizing the distribution over all possible discontinuity curves. Next, we apply the Rosenblatt transformation to the irregular parameter domains on each side of the discontinuity. This transformation maps a space of uncertain parameters with specific probability distributions to a space of i.i.d. standard random variables where orthogonal projections can be used to obtain PC coefficients. In particular, we use uniform random variables that are compatible with PC expansions based on Legendre polynomials. The Rosenblatt transformation and the corresponding PC expansions for the model output on either side of the discontinuity are applied successively for several realizations of the discontinuity curve. The climate model output and its associated uncertainty at specific design points are then computed by taking a quadrature-based integration average over PC expansions corresponding to possible realizations of the discontinuity curve.
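
The Rosenblatt step admits a closed form for simple geometries. In the sketch below, the irregular region is taken to be the triangle 0 <= x2 <= x1 <= 1 with a uniform density on it (an assumed stand-in for one side of a discontinuity curve); successive conditional CDFs map it to i.i.d. standard uniforms, and the inverse map pushes i.i.d. uniforms back onto the triangle.

```python
import numpy as np

# Rosenblatt transform for (x1, x2) uniform on the triangle 0 <= x2 <= x1 <= 1:
# u1 = F1(x1), u2 = F2|1(x2 | x1), built from successive conditional CDFs.
def rosenblatt(x1, x2):
    u1 = x1 ** 2        # marginal CDF of x1: density 2*x1 on [0, 1]
    u2 = x2 / x1        # conditional CDF of x2 | x1: uniform on [0, x1]
    return u1, u2

def rosenblatt_inverse(u1, u2):
    x1 = np.sqrt(u1)
    return x1, u2 * x1

# Sample the irregular domain by pushing i.i.d. uniforms through the inverse
rng = np.random.default_rng(1)
u1 = rng.uniform(1e-12, 1.0, size=10000)
u2 = rng.uniform(size=10000)
x1, x2 = rosenblatt_inverse(u1, u2)
v1, v2 = rosenblatt(x1, x2)          # forward map recovers the uniforms
```

Once the parameters live in (u1, u2), standard Legendre-based PC machinery applies directly, exactly as the abstract describes.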

Advanced methods for uncertainty quantification in tail regions of climate model predictions

Sargsyan, Khachik S.; Safta, Cosmin S.; Debusschere, Bert D.; Najm, H.N.

Conventional methods for uncertainty quantification are generally challenged in the 'tails' of probability distributions. This is specifically an issue for many climate observables, since extensive sampling to obtain reasonable accuracy in tail regions is especially costly in climate models. Moreover, the accuracy of spectral representations of uncertainty is weighted in favor of the more probable ranges of the underlying basis variable, which, with conventional bases, does not particularly target tail regions. Therefore, what is ideally desired is a methodology that requires only a limited number of full computational model evaluations while remaining sufficiently accurate in the tail region. To develop such a methodology, we explore the use of surrogate models based on non-intrusive Polynomial Chaos expansions and Galerkin projection. We consider non-conventional and custom basis functions, orthogonal with respect to probability distributions that exhibit fat-tailed regions. We illustrate how the use of non-conventional basis functions and surrogate model analysis improves the accuracy of the spectral expansions in the tail regions. Finally, we demonstrate these methodologies using precipitation data from CCSM simulations.
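
The custom-basis idea can be sketched numerically: given samples of a fat-tailed variable (here a Student-t, an assumed stand-in for a heavy-tailed climate observable), Gram-Schmidt on the monomials yields polynomials orthonormal under that measure, rather than under a conventional Gaussian or uniform one.

```python
import numpy as np

rng = np.random.default_rng(2)
samples = rng.standard_t(df=10, size=100000)   # fat-tailed stand-in measure

# Gram-Schmidt on 1, x, x^2, x^3 under the empirical inner product
# <p, q> = E[p(x) q(x)]; 'basis' holds each polynomial evaluated at the samples
deg = 3
V = np.vander(samples, deg + 1, increasing=True)
basis = []
for k in range(deg + 1):
    p = V[:, k].copy()
    for b in basis:
        p -= np.mean(p * b) * b        # remove component along earlier polys
    basis.append(p / np.sqrt(np.mean(p * p)))

# Check orthonormality under the fat-tailed measure
gram = np.array([[np.mean(bi * bj) for bj in basis] for bi in basis])
```

Because orthogonality is enforced under the fat-tailed measure itself, the resulting expansion does not systematically down-weight the tail region the way a fixed Legendre or Hermite basis would.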

Bayesian methods for discontinuity detection in climate model predictions

Safta, Cosmin S.; Debusschere, Bert D.; Najm, H.N.; Sargsyan, Khachik S.

Discontinuity detection is an important component in many fields, including image recognition, digital signal processing, and climate change research. Current methods have several shortcomings: they are restricted to one- or two-dimensional settings, require uniformly spaced and/or dense input data, and give deterministic answers without quantifying the uncertainty. Spectral methods for uncertainty quantification with global, smooth bases are challenged by discontinuities in model simulation results. Domain decomposition reduces the impact of nonlinearities and discontinuities; however, while gaining more smoothness in each subdomain, current domain refinement methods require prohibitively many simulations. Therefore, detecting discontinuities up front and refining accordingly provides a substantial improvement over current methodologies.

Quantifying prediction fidelity in multiscale multiphysics simulations

Adalsteinsson, Helgi A.; Debusschere, Bert D.; Najm, H.N.; Jones, Reese E.; Sargsyan, Khachik S.

Multiscale multiphysics problems arise in a host of application areas of significant relevance to DOE, including electrical storage systems (membranes and electrodes in fuel cells, batteries, and ultracapacitors), water surety, chemical analysis and detection systems, and surface catalysis. Multiscale methods aim to provide detailed physical insight into these complex systems by incorporating coupled effects of relevant phenomena on all scales. However, many sources of uncertainty and modeling inaccuracies hamper the predictive fidelity of multiscale multiphysics simulations. These include parametric and model uncertainties in the models on all scales, and errors associated with coupling, or information transfer, across scales/physics. This presentation introduces our work on the development of uncertainty quantification methods for spatially decomposed atomistic-to-continuum (A2C) multiscale simulations. The key thrusts of this research effort are: inference of uncertain parameters or observables from experimental or simulation data; propagation of uncertainty through particle models; propagation of uncertainty through continuum models; propagation of information and uncertainty across model/scale interfaces; and numerical and computational analysis and control. To enable the bidirectional coupling between the atomistic and continuum simulations, a general formulation has been developed for the characterization of sampling noise due to intrinsic variability in particle simulations, and for the propagation of both this sampling noise and parametric uncertainties through coupled A2C multiscale simulations. Simplified tests of noise quantification in particle computations are conducted through Bayesian inference of diffusion rates in an idealized isothermal binary material system. 
A proof of concept is finally presented based on application of the present formulation to the propagation of uncertainties in a model plane Couette flow, where the near wall region is handled with molecular dynamics while the bulk region is handled with continuum methods.

Uncertainty quantification in the presence of limited climate model data with discontinuities

Sargsyan, Khachik S.; Safta, Cosmin S.; Debusschere, Bert D.; Najm, H.N.

Uncertainty quantification in climate models is challenged by the sparsity of the available climate data due to the high computational cost of the model runs. Another feature that prevents classical uncertainty analyses from being easily applicable is the bifurcative behavior in the climate data with respect to certain parameters. A typical example is the Meridional Overturning Circulation in the Atlantic Ocean. The maximum overturning stream function exhibits discontinuity across a curve in the space of two uncertain parameters, namely climate sensitivity and CO2 forcing. We develop a methodology that performs uncertainty quantification in this context in the presence of limited data.

Predictability and reduced order modeling in stochastic reaction networks

Sargsyan, Khachik S.; Debusschere, Bert D.; Najm, H.N.

Many systems involving chemical reactions between small numbers of molecules exhibit inherent stochastic variability. Such stochastic reaction networks are at the heart of processes such as gene transcription, cell signaling or surface catalytic reactions, which are critical to bioenergy, biomedical, and electrical storage applications. The underlying molecular reactions are commonly modeled with chemical master equations (CMEs), representing jump Markov processes, or stochastic differential equations (SDEs), rather than ordinary differential equations (ODEs). As such reaction networks are often inferred from noisy experimental data, it is not uncommon to encounter large parametric uncertainties in these systems. Further, a wide range of time scales introduces the need for reduced order representations. Despite the availability of mature tools for uncertainty/sensitivity analysis and reduced order modeling in deterministic systems, there is a lack of robust algorithms for such analyses in stochastic systems. In this talk, we present advances in algorithms for predictability and reduced order representations for stochastic reaction networks and apply them to bistable systems of biochemical interest. To study the predictability of a stochastic reaction network in the presence of both parametric uncertainty and intrinsic variability, an algorithm was developed to represent the system state with a spectral polynomial chaos (PC) expansion in the stochastic space representing parametric uncertainty and intrinsic variability. Rather than relying on a non-intrusive collocation-based Galerkin projection [1], this PC expansion is obtained using Bayesian inference, which is ideally suited to handle noisy systems through its probabilistic formulation. To accommodate state variables with multimodal distributions, an adaptive multiresolution representation is used [2]. 
As the PC expansion directly relates the state variables to the uncertain parameters, the formulation lends itself readily to sensitivity analysis. Reduced order modeling in the time dimension is accomplished using a Karhunen-Loeve (KL) decomposition of the stochastic process in terms of the eigenmodes of its covariance matrix. Subsequently, a Rosenblatt transformation relates the random variables in the KL decomposition to a set of independent random variables, allowing the representation of the system state with a PC expansion in those independent random variables. An adaptive clustering method is used to handle multimodal distributions efficiently, and is well suited for high-dimensional spaces. The spectral representation of the stochastic reaction networks makes these systems more amenable to analysis, enabling a detailed understanding of their functionality, and robustness under experimental data uncertainty and inherent variability.
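
The KL step can be sketched with a synthetic ensemble (a two-mode toy process, assumed purely for illustration; in the setting above the trajectories would come from CME/SDE simulations): eigenmodes of the empirical covariance matrix give a low-dimensional representation of the stochastic process.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 100)

# Synthetic ensemble of 500 trajectories: two random modes plus small noise
xi = rng.normal(size=(500, 2))
paths = (xi[:, :1] * np.sin(np.pi * t)
         + 0.3 * xi[:, 1:] * np.cos(2 * np.pi * t)
         + 0.01 * rng.normal(size=(500, 100)))

mean = paths.mean(axis=0)
C = np.cov(paths, rowvar=False)                 # 100 x 100 covariance matrix
evals, evecs = np.linalg.eigh(C)
evals, evecs = evals[::-1], evecs[:, ::-1]      # sort descending

# Truncated KL: two eigenmodes capture almost all of the variance
k = 2
captured = evals[:k].sum() / evals.sum()
kl_coeffs = (paths - mean) @ evecs[:, :k]       # random KL coefficients
recon = mean + kl_coeffs @ evecs[:, :k].T
max_err = np.max(np.abs(recon - paths))
```

The KL coefficients per trajectory are the random variables that, after a Rosenblatt transformation to independent variables, would be fed into the PC representation described above.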
