Publications

Multiparameter spectral representation of noise-induced competence in Bacillus subtilis

IEEE/ACM Transactions on Computational Biology and Bioinformatics

Sargsyan, Khachik S.; Safta, Cosmin S.; Debusschere, Bert D.; Najm, H.N.

In this work, the problem of representing a stochastic forward model output with respect to a large number of input parameters is considered. The methodology is applied to a stochastic reaction network of competence dynamics in Bacillus subtilis bacterium. In particular, the dependence of the competence state on rate constants of underlying reactions is investigated. We base our methodology on Polynomial Chaos (PC) spectral expansions that allow effective propagation of input parameter uncertainties to outputs of interest. Given a number of forward model training runs at sampled input parameter values, the PC modes are estimated using a Bayesian framework. As an outcome, these PC modes are described with posterior probability distributions. The resulting expansion can be regarded as an uncertain response function and can further be used as a computationally inexpensive surrogate instead of the original reaction model for subsequent analyses such as calibration or optimization studies. Furthermore, the methodology is enhanced with a classification-based mixture PC formulation that overcomes the difficulties associated with representing potentially nonsmooth input-output relationships. Finally, the global sensitivity analysis based on the multiparameter spectral representation of an observable of interest provides biological insight and reveals the most important reactions and their couplings for the competence dynamics. © 2013 IEEE.
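The core construction above — estimating PC modes from a limited set of training runs and attaching uncertainty to them — can be sketched in a few lines. The forward model, sample count, and noise level below are illustrative stand-ins, and a least-squares fit with a Gaussian posterior on the modes stands in for the paper's full Bayesian framework:

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(0)

# Toy forward model standing in for the stochastic reaction network output.
def forward(x):
    return np.exp(x)

# Training runs at sampled input parameter values in [-1, 1].
x_train = rng.uniform(-1.0, 1.0, 40)
y_train = forward(x_train)

# Legendre PC basis up to order 5; estimate the modes by least squares.
order = 5
A = legendre.legvander(x_train, order)        # design matrix of basis evaluations
coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)

# Gaussian posterior covariance of the modes under an assumed noise level,
# giving each PC mode an uncertainty (a stand-in for the full Bayesian fit).
sigma2 = 1e-4
post_cov = sigma2 * np.linalg.inv(A.T @ A)

# The expansion now acts as a cheap surrogate for the forward model.
x_test = np.linspace(-1, 1, 11)
surrogate = legendre.legval(x_test, coef)
print(np.max(np.abs(surrogate - forward(x_test))))
```

The resulting `coef` array defines a surrogate that can replace the forward model in subsequent calibration or optimization loops, with `post_cov` quantifying the uncertainty in each mode.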

Automated exploration of the mechanism of elementary reactions

Najm, H.N.; Zador, Judit Z.

Optimization of new transportation fuels and engine technologies requires the characterization of the combustion chemistry of a wide range of fuel classes. Theoretical studies of elementary reactions — the building blocks of complex reaction mechanisms — are essential to accurately predict important combustion processes such as autoignition of biofuels. The current bottleneck for these calculations is a user-intensive exploration of the underlying potential energy surface (PES), which relies on the “chemical intuition” of the scientist to propose initial guesses for the relevant chemical configurations. For newly emerging fuels, this approach cripples the rate of progress because of the system size and complexity. The KinBot program package aims to accelerate the detailed chemical kinetic description of combustion, and enables large-scale systematic studies on the sub-mechanism level.

Efficient uncertainty quantification methodologies for high-dimensional climate land models

Sargsyan, Khachik S.; Safta, Cosmin S.; Berry, Robert D.; Ray, Jaideep R.; Debusschere, Bert D.; Najm, H.N.

In this report, we proposed, examined, and implemented approaches for performing efficient uncertainty quantification (UQ) in climate land models. Specifically, we applied a Bayesian compressive sensing framework to polynomial chaos spectral expansions, enhanced it with an iterative basis-reduction algorithm, and investigated the results on test models as well as on the Community Land Model (CLM). Furthermore, we discussed the construction of efficient quadrature rules for forward propagation of uncertainties from a high-dimensional, constrained input space to output quantities of interest. The work lays the groundwork for efficient forward UQ in high-dimensional, strongly non-linear, and computationally costly climate models. Moreover, to investigate parameter inference approaches, we applied two variants of the Markov chain Monte Carlo (MCMC) method to a soil moisture dynamics submodel of the CLM. The evaluation of these algorithms gave us a good foundation for further building out the Bayesian calibration framework toward the goal of robust component-wise calibration.
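The iterative basis-reduction idea can be sketched as follows; the toy model, dimensions, and threshold are illustrative, and a plain thresholded least-squares loop stands in for the Bayesian compressive sensing machinery:

```python
import numpy as np
from itertools import product
from numpy.polynomial.legendre import legval

rng = np.random.default_rng(1)

# 4 uncertain inputs, total-order-2 Legendre basis defined by multi-indices.
dim, order = 4, 2
multis = [m for m in product(range(order + 1), repeat=dim) if sum(m) <= order]

def basis_matrix(X, multis):
    # Evaluate each tensorized Legendre basis term at the sample points.
    cols = []
    for m in multis:
        col = np.ones(X.shape[0])
        for d, deg in enumerate(m):
            c = np.zeros(deg + 1)
            c[deg] = 1.0
            col *= legval(X[:, d], c)
        cols.append(col)
    return np.column_stack(cols)

# Sparse "true" model: only two of the basis terms are actually active.
X = rng.uniform(-1, 1, (80, dim))
y = 2.0 * X[:, 0] + 0.7 * X[:, 1] * X[:, 2]

# Iterative basis reduction: fit, discard small-coefficient terms, refit.
active = list(range(len(multis)))
for _ in range(5):
    A = basis_matrix(X, [multis[i] for i in active])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    keep = [i for i, c in zip(active, coef) if abs(c) > 1e-6]
    if keep == active:
        break
    active = keep

print([multis[i] for i in active])
```

On this noiseless toy problem the loop recovers exactly the two active basis terms, illustrating how pruning keeps the expansion tractable as the input dimension grows.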

TChem - A Software Toolkit for the Analysis of Complex Kinetic Models

Safta, Cosmin S.; Najm, H.N.

The TChem toolkit is a software library that enables numerical simulations using complex chemistry and facilitates the analysis of detailed kinetic models. The toolkit provides capabilities for evaluating thermodynamic properties based on NASA polynomials and computing species production/consumption rates. It incorporates methods that can selectively modify reaction parameters for sensitivity analysis. The library also contains several functions that provide analytically computed Jacobian matrices, necessary for the efficient time advancement and analysis of detailed kinetic models.
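As an illustration of why analytic Jacobians matter for stiff kinetic integrators, the sketch below hand-codes the source term and Jacobian of a made-up two-reaction mass-action system (not TChem's actual API) and checks the Jacobian against finite differences:

```python
import numpy as np

# Toy mass-action system, a stand-in for a TChem-style source term:
#   A -> B        (rate k1*[A])
#   B + B -> C    (rate k2*[B]^2)
k1, k2 = 1.5, 0.8

def rhs(y):
    a, b, c = y
    r1, r2 = k1 * a, k2 * b * b
    return np.array([-r1, r1 - 2.0 * r2, r2])

def jac(y):
    # Analytic Jacobian d(rhs)/dy, of the kind TChem supplies to implicit
    # time integrators for detailed mechanisms.
    a, b, c = y
    return np.array([[-k1, 0.0, 0.0],
                     [k1, -4.0 * k2 * b, 0.0],
                     [0.0, 2.0 * k2 * b, 0.0]])

# Sanity check against a central finite-difference Jacobian.
y0 = np.array([1.0, 0.5, 0.1])
eps = 1e-6
J_fd = np.column_stack([(rhs(y0 + eps * e) - rhs(y0 - eps * e)) / (2 * eps)
                        for e in np.eye(3)])
print(np.max(np.abs(jac(y0) - J_fd)))
```

The analytic form avoids the extra source-term evaluations and truncation error of finite differencing, which adds up quickly for mechanisms with hundreds of species.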

Uncertainty quantification given discontinuous climate model response and a limited number of model runs

Sargsyan, Khachik S.; Safta, Cosmin S.; Debusschere, Bert D.; Najm, H.N.

Uncertainty quantification in complex climate models is challenged by the sparsity of available climate model predictions due to the high computational cost of model runs. Another feature that prevents classical uncertainty analysis from being readily applicable is bifurcative behavior in climate model response with respect to certain input parameters. A typical example is the Atlantic Meridional Overturning Circulation. The predicted maximum overturning stream function exhibits discontinuity across a curve in the space of two uncertain parameters, namely climate sensitivity and CO2 forcing. We outline a methodology for uncertainty quantification given discontinuous model response and a limited number of model runs. Our approach is two-fold. First we detect the discontinuity with Bayesian inference, thus obtaining a probabilistic representation of the discontinuity curve shape and location for arbitrarily distributed input parameter values. Then, we construct spectral representations of uncertainty, using Polynomial Chaos (PC) expansions on either side of the discontinuity curve, leading to an averaged-PC representation of the forward model that allows efficient uncertainty quantification. The approach is enabled by a Rosenblatt transformation that maps each side of the discontinuity to regular domains where desirable orthogonality properties for the spectral bases hold. We obtain PC modes by either orthogonal projection or Bayesian inference, and argue for a hybrid approach that targets a balance between the accuracy provided by the orthogonal projection and the flexibility provided by the Bayesian inference - where the latter allows obtaining reasonable expansions without extra forward model runs. The model output, and its associated uncertainty at specific design points, are then computed by taking an ensemble average over PC expansions corresponding to possible realizations of the discontinuity curve. 
The methodology is tested on synthetic examples of discontinuous model data with adjustable sharpness and structure.

Uncertainty quantification for large-scale ocean circulation predictions

Safta, Cosmin S.; Sargsyan, Khachik S.; Debusschere, Bert D.; Najm, H.N.

Uncertainty quantification in climate models is challenged by the sparsity of the available climate data due to the high computational cost of the model runs. Another feature that prevents classical uncertainty analyses from being easily applicable is the bifurcative behavior in the climate data with respect to certain parameters. A typical example is the Meridional Overturning Circulation in the Atlantic Ocean. The maximum overturning stream function exhibits discontinuity across a curve in the space of two uncertain parameters, namely climate sensitivity and CO2 forcing. We develop a methodology that performs uncertainty quantification in the presence of limited data that have a discontinuous character. Our approach is two-fold. First, we detect the discontinuity location with Bayesian inference, thus obtaining a probabilistic representation of the discontinuity curve location in the presence of arbitrarily distributed input parameter values. Second, we develop a spectral approach that relies on Polynomial Chaos (PC) expansions on each side of the discontinuity curve, leading to an averaged-PC representation of the forward model that allows efficient uncertainty quantification and propagation. The methodology is tested on synthetic examples of discontinuous data with adjustable sharpness and structure.

Uncertainty quantification of cinematic imaging for development of predictive simulations of turbulent combustion

Frank, Jonathan H.; Lawson, Matthew L.; Sargsyan, Khachik S.; Debusschere, Bert D.; Najm, H.N.

Recent advances in high frame rate complementary metal-oxide-semiconductor (CMOS) cameras coupled with high repetition rate lasers have enabled laser-based imaging measurements of the temporal evolution of turbulent reacting flows. This measurement capability provides new opportunities for understanding the dynamics of turbulence-chemistry interactions, which is necessary for developing predictive simulations of turbulent combustion. However, quantitative imaging measurements using high frame rate CMOS cameras require careful characterization of their noise, non-linear response, and variations in this response from pixel to pixel. We develop a noise model and calibration tools to mitigate these problems and to enable quantitative use of CMOS cameras. We have demonstrated proof of principle for image de-noising using both wavelet methods and Bayesian inference. The results offer new approaches for quantitative interpretation of imaging measurements from noisy data acquired with non-linear detectors. These approaches are potentially useful in many areas of scientific research that rely on quantitative imaging measurements.
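A minimal sketch of the wavelet route to de-noising, using a hand-rolled one-level Haar transform with soft thresholding on a synthetic signal (the signal, noise level, and threshold are illustrative, not the paper's data or method details):

```python
import numpy as np

rng = np.random.default_rng(2)

def haar(x):
    # One level of the (orthonormal) Haar wavelet transform.
    s = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return s, d

def ihaar(s, d):
    x = np.empty(2 * s.size)
    x[0::2] = (s + d) / np.sqrt(2.0)
    x[1::2] = (s - d) / np.sqrt(2.0)
    return x

# Smooth line-intensity "signal" plus detector noise.
t = np.linspace(0, 1, 256)
clean = np.exp(-((t - 0.5) / 0.1) ** 2)
noisy = clean + 0.05 * rng.standard_normal(t.size)

# Soft-threshold the detail coefficients, keep the smooth part.
s, d = haar(noisy)
thr = 0.1
d = np.sign(d) * np.maximum(np.abs(d) - thr, 0.0)
denoised = ihaar(s, d)

err_noisy = np.sqrt(np.mean((noisy - clean) ** 2))
err_denoised = np.sqrt(np.mean((denoised - clean) ** 2))
print(err_noisy, err_denoised)
```

In practice a multi-level transform and a noise-adapted threshold would be used, but even this one-level version reduces the RMS error on the synthetic trace.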

Uncertainty quantification in the presence of limited climate model data with discontinuities

Sargsyan, Khachik S.; Safta, Cosmin S.; Debusschere, Bert D.; Najm, H.N.

Uncertainty quantification in climate models is challenged by the prohibitive cost of a large number of model evaluations for sampling. Another feature that often prevents classical uncertainty analysis from being readily applicable is the bifurcative behavior in the climate data with respect to certain parameters. A typical example is the Meridional Overturning Circulation in the Atlantic Ocean. The maximum overturning stream function exhibits a discontinuity across a curve in the space of two uncertain parameters, namely climate sensitivity and CO2 forcing. In order to propagate uncertainties from model parameters to model output we use polynomial chaos (PC) expansions to represent the maximum overturning stream function in terms of the uncertain climate sensitivity and CO2 forcing parameters. Since the spectral methodology assumes a certain degree of smoothness, the presence of discontinuities suggests that separate PC expansions on each side of the discontinuity will lead to more accurate descriptions of the climate model output compared to global PC expansions. We propose a methodology that first finds a probabilistic description of the discontinuity given a number of data points. Assuming the discontinuity curve is a polynomial, the algorithm is based on Bayesian inference of its coefficients. Markov chain Monte Carlo sampling is used to obtain joint distributions for the polynomial coefficients, effectively parameterizing the distribution over all possible discontinuity curves. Next, we apply the Rosenblatt transformation to the irregular parameter domains on each side of the discontinuity. This transformation maps a space of uncertain parameters with specific probability distributions to a space of i.i.d standard random variables where orthogonal projections can be used to obtain PC coefficients. In particular, we use uniform random variables that are compatible with PC expansions based on Legendre polynomials. 
The Rosenblatt transformation and the corresponding PC expansions for the model output on either side of the discontinuity are applied successively for several realizations of the discontinuity curve. The climate model output and its associated uncertainty at specific design points are then computed by taking a quadrature-based integration average over PC expansions corresponding to possible realizations of the discontinuity curve.
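The Rosenblatt-plus-projection step can be sketched in one dimension: map a non-uniformly distributed parameter to a standard uniform variable through its inverse CDF, then obtain Legendre PC coefficients by Gauss quadrature. The parameter distribution and forward model below are illustrative:

```python
import numpy as np
from numpy.polynomial import legendre

# Uncertain parameter q on [0, 1] with density 2q (CDF q^2), a stand-in for
# an irregularly distributed input. The Rosenblatt map sends q to a uniform
# variable xi on [-1, 1]; its inverse is q(xi) = sqrt((xi + 1)/2).
def q_of_xi(xi):
    return np.sqrt(0.5 * (xi + 1.0))

def model(q):                          # toy forward model on one side
    return np.cos(q)

# Orthogonal projection onto Legendre polynomials in xi via Gauss-Legendre
# quadrature: c_k = (2k + 1)/2 * integral of model(q(xi)) P_k(xi) dxi.
order = 8
nodes, weights = legendre.leggauss(32)
vals = model(q_of_xi(nodes))
coef = np.array([(2 * k + 1) / 2.0 *
                 np.sum(weights * vals * legendre.legval(nodes, np.eye(order + 1)[k]))
                 for k in range(order + 1)])

# The PC surrogate reproduces the model on the mapped domain.
xi_test = np.linspace(-1.0, 1.0, 9)
err = np.max(np.abs(legendre.legval(xi_test, coef) - model(q_of_xi(xi_test))))
print(err)
```

Because the mapped variable is uniform, the Legendre basis is orthogonal with respect to its distribution, which is exactly the property the transformation is used to restore on each side of the discontinuity.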

Advanced methods for uncertainty quantification in tail regions of climate model predictions

Sargsyan, Khachik S.; Safta, Cosmin S.; Debusschere, Bert D.; Najm, H.N.

Conventional methods for uncertainty quantification are generally challenged in the 'tails' of probability distributions. This is specifically an issue for many climate observables since extensive sampling to obtain a reasonable accuracy in tail regions is especially costly in climate models. Moreover, the accuracy of spectral representations of uncertainty is weighted in favor of more probable ranges of the underlying basis variable, which, in conventional bases does not particularly target tail regions. Therefore, what is ideally desired is a methodology that requires only a limited number of full computational model evaluations while remaining accurate enough in the tail region. To develop such a methodology, we explore the use of surrogate models based on non-intrusive Polynomial Chaos expansions and Galerkin projection. We consider non-conventional and custom basis functions, orthogonal with respect to probability distributions that exhibit fat-tailed regions. We illustrate how the use of non-conventional basis functions, and surrogate model analysis, improves the accuracy of the spectral expansions in the tail regions. Finally, we also demonstrate these methodologies using precipitation data from CCSM simulations.
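The construction of custom bases orthogonal with respect to a fat-tailed measure can be sketched with Gram-Schmidt on a quadrature grid; the Laplace weight below is an illustrative stand-in for the heavier-tailed distributions considered in the work:

```python
import numpy as np

# Build polynomials orthonormal w.r.t. a weight with heavier tails than the
# Gaussian, here the Laplace density, via Gram-Schmidt under a discrete
# (grid-based) inner product.
x = np.linspace(-20.0, 20.0, 4001)
dx = x[1] - x[0]
w = 0.5 * np.exp(-np.abs(x))           # Laplace density

def inner(f, g):
    return np.sum(f * g * w) * dx

order = 4
polys = []
for k in range(order + 1):
    p = x ** k
    for q in polys:                    # orthogonalize against earlier terms
        p = p - inner(p, q) * q
    polys.append(p / np.sqrt(inner(p, p)))

# The Gram matrix of the constructed basis should be the identity.
G = np.array([[inner(p, q) for q in polys] for p in polys])
print(np.max(np.abs(G - np.eye(order + 1))))
```

Weighting the basis by the target measure is what shifts the accuracy of the spectral expansion toward the tail regions that conventional Legendre or Hermite bases under-resolve.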

Eigenvalue analysis of uncertain ODE systems

Debusschere, Bert D.; Berry, Robert D.; Najm, H.N.

The Polynomial chaos expansion provides a means of representing any L2 random variable as a sum of polynomials that are orthogonal with respect to a chosen measure. Examples include the Hermite polynomials with Gaussian measure on the real line and the Legendre polynomials with uniform measure on an interval. Polynomial chaos can be used to reformulate an uncertain ODE system, using Galerkin projection, as a new, higher-dimensional, deterministic ODE system which describes the evolution of each mode of the polynomial chaos expansion. It is of interest to explore the eigenstructure of the original and reformulated ODE systems by studying the eigenvalues and eigenvectors of their Jacobians. In this talk, we study the distribution of the eigenvalues of the two Jacobians. We outline in general the location of the eigenvalues of the new system with respect to those of the original system, and examine the effect of expansion order on this distribution.
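For the simplest case, a scalar ODE with one uncertain rate, the reformulated system and its Jacobian eigenvalues can be written out directly. The values below are illustrative; the observed behavior is that the eigenvalues of the Galerkin-projected system fall within the range spanned by the uncertain eigenvalue of the original system:

```python
import numpy as np
from numpy.polynomial import legendre

# Uncertain scalar ODE du/dt = -k u with k = k0 + k1*xi, xi ~ U[-1, 1].
# Galerkin projection onto Legendre PC modes u(t, xi) = sum_i u_i(t) P_i(xi)
# yields a deterministic linear system du_i/dt = -sum_j K_ij u_j.
k0, k1 = 3.0, 1.0
order = 6
nodes, weights = legendre.leggauss(order + 2)   # exact for these integrands
kvals = k0 + k1 * nodes

def P(i, x):
    return legendre.legval(x, np.eye(order + 1)[i])

K = np.zeros((order + 1, order + 1))
for i in range(order + 1):
    norm_i = np.sum(weights * P(i, nodes) ** 2)
    for j in range(order + 1):
        K[i, j] = np.sum(weights * kvals * P(i, nodes) * P(j, nodes)) / norm_i

# Eigenvalues of the reformulated Jacobian -K are real and lie inside the
# range of the original uncertain eigenvalue -k, here (-4, -2).
eigs = np.linalg.eigvals(-K)
print(np.sort(eigs.real))
```

For this linear-in-xi rate the projected Jacobian is similar to a symmetric matrix, so its spectrum is real and confined to the interval of the original uncertain eigenvalue; the expansion order controls how densely that interval is sampled.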

Uncertainty quantification in reacting flow

Najm, H.N.; Berry, Robert D.; Debusschere, Bert D.

Chemically reacting flow models generally involve inputs and parameters that are determined from empirical measurements, and therefore exhibit a certain degree of uncertainty. Estimating the propagation of this uncertainty into computational model output predictions is crucial for purposes of reacting flow model validation, model exploration, as well as design optimization. Recent years have seen great developments in probabilistic methods and tools for efficient uncertainty quantification (UQ) in computational models. These tools are grounded in the use of Polynomial Chaos (PC) expansions for representation of random variables. The utility and effectiveness of PC methods have been demonstrated in a range of physical models, including structural mechanics, transport in porous media, fluid dynamics, aeronautics, heat transfer, and chemically reacting flow. While high-dimensionality remains nominally an ongoing challenge, great strides have been made in dealing with moderate dimensionality along with non-linearity and oscillatory dynamics. In this talk, I will give an overview of UQ in chemical systems. I will cover both: (1) the estimation of uncertain input parameters from empirical data, and (2) the forward propagation of parametric uncertainty to model outputs. I will cover the basics of forward PC UQ methods with examples of their use. I will also highlight the need for accurate estimation of the joint probability density over the uncertain parameters, in order to arrive at meaningful estimates of model output uncertainties. Finally, I will discuss recent developments on the inference of this density given partial information from legacy experiments, in the absence of raw data.

Bayesian methods for discontinuity detection in climate model predictions

Safta, Cosmin S.; Debusschere, Bert D.; Najm, H.N.; Sargsyan, Khachik S.

Discontinuity detection is an important component in many fields, including image recognition, digital signal processing, and climate change research. Current methods have several shortcomings: they are restricted to one- or two-dimensional settings, require uniformly spaced and/or dense input data, and give deterministic answers without quantifying the uncertainty. Spectral methods for uncertainty quantification with global, smooth bases are challenged by discontinuities in model simulation results. Domain decomposition reduces the impact of nonlinearities and discontinuities. However, while gaining more smoothness in each subdomain, current domain refinement methods require prohibitively many simulations. Therefore, detecting discontinuities up front and refining accordingly offers a substantial improvement over current methodologies.
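A one-dimensional sketch of probabilistic discontinuity detection: evaluate a gridded, flat-prior Bayesian posterior over the jump location given sparse, non-uniformly spaced samples. The side values and noise level are assumed known for brevity, where the full method would infer them jointly:

```python
import numpy as np

rng = np.random.default_rng(3)

# Sparse, non-uniformly spaced 1-D samples of a model with a jump at x0 = 0.4.
x0_true, sigma = 0.4, 0.1
x = np.sort(rng.uniform(0.0, 1.0, 50))
y = np.where(x < x0_true, 1.0, 3.0) + sigma * rng.standard_normal(x.size)

# Gridded posterior over the jump location under a flat prior; the side
# values and noise level are taken as known here, whereas a full treatment
# would infer them jointly (e.g. by MCMC).
grid = np.linspace(0.01, 0.99, 981)
loglik = np.array([
    -0.5 * np.sum((y - np.where(x < x0, 1.0, 3.0)) ** 2) / sigma ** 2
    for x0 in grid])
post = np.exp(loglik - loglik.max())
post /= post.sum()

x0_map = grid[np.argmax(post)]
print(x0_map)
```

Unlike a deterministic edge detector, the normalized `post` array carries the full uncertainty about the jump location, which matters when the data are sparse and irregularly placed.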

Data-free inference of uncertain model parameters

Debusschere, Bert D.; Najm, H.N.; Berry, Robert D.; Adalsteinsson, Helgi A.

It is known that, in general, the correlation structure in the joint distribution of model parameters is critical to the uncertainty analysis of that model. Very often, however, studies in the literature only report nominal values for parameters inferred from data, along with confidence intervals for these parameters, but no details on the correlation or full joint distribution of these parameters. When neither posterior nor data are available, but only summary statistics such as nominal values and confidence intervals, a joint PDF must be chosen. Given the summary statistics, it may be neither reasonable nor necessary to assume the parameters are independent random variables. We demonstrate, using a Bayesian inference procedure, how to construct a posterior density for the parameters exhibiting self-consistent correlations, in the absence of data, given (1) the fit-model, (2) nominal parameter values, (3) bounds on the parameters, and (4) a postulated statistical model, around the fit-model, for the missing data. Our approach ensures external Bayesian updating while marginalizing over possible data realizations. We then address the matching of given parameter bounds through the choice of hyperparameters, which are introduced in postulating the statistical model, but are not given nominal values. We discuss some possible approaches, including (1) inferring them in a separate Bayesian inference loop and (2) optimization. We also perform an empirical evaluation of the algorithm showing the posterior obtained with this data-free inference compares well with the true posterior obtained from inference against the full data set.
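A linear-Gaussian toy version of the procedure, with an assumed fit-model y = a + b*x, postulated design points, and a fixed noise hyperparameter: synthetic data realizations are drawn around the nominal fit, a posterior is formed for each, and the pooled samples exhibit the self-consistent parameter correlation that independent marginals would miss:

```python
import numpy as np

rng = np.random.default_rng(4)

# Fit-model y = a + b*x with nominal values only (no data available).
a0, b0 = 1.0, 2.0
xd = np.linspace(0.0, 1.0, 10)          # postulated design points
sigma = 0.1                             # postulated noise hyperparameter

# Marginalize over possible data realizations: draw synthetic data from the
# statistical model around the fit-model, infer (a, b) for each realization,
# and pool the posterior samples.
X = np.column_stack([np.ones_like(xd), xd])
cov_post = sigma ** 2 * np.linalg.inv(X.T @ X)   # linear-Gaussian posterior cov
samples = []
for _ in range(200):
    y = a0 + b0 * xd + sigma * rng.standard_normal(xd.size)
    mean_post = np.linalg.solve(X.T @ X, X.T @ y)
    samples.append(rng.multivariate_normal(mean_post, cov_post, 50))
samples = np.vstack(samples)

# The pooled posterior recovers the nominal values with a self-consistent
# (here negative) correlation between intercept and slope.
corr = np.corrcoef(samples.T)[0, 1]
print(samples.mean(axis=0), corr)
```

The negative intercept-slope correlation comes entirely from the fit-model and design, not from any observed data, which is the point of the data-free construction.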

Data-free inference of the joint distribution of uncertain model parameters

Berry, Robert D.; Najm, H.N.; Debusschere, Bert D.; Adalsteinsson, Helgi A.

It is known that, in general, the correlation structure in the joint distribution of model parameters is critical to the uncertainty analysis of that model. Very often, however, studies in the literature only report nominal values for parameters inferred from data, along with confidence intervals for these parameters, but no details on the correlation or full joint distribution of these parameters. When neither posterior nor data are available, but only summary statistics such as nominal values and confidence intervals, a joint PDF must be chosen. Given the summary statistics, it may be neither reasonable nor necessary to assume the parameters are independent random variables. We demonstrate, using a Bayesian inference procedure, how to construct a posterior density for the parameters exhibiting self-consistent correlations, in the absence of data, given (1) the fit-model, (2) nominal parameter values, (3) bounds on the parameters, and (4) a postulated statistical model, around the fit-model, for the missing data. Our approach ensures external Bayesian updating while marginalizing over possible data realizations. We then address the matching of given parameter bounds through the choice of hyperparameters, which are introduced in postulating the statistical model, but are not given nominal values. We discuss some possible approaches, including (1) inferring them in a separate Bayesian inference loop and (2) optimization. We also perform an empirical evaluation of the algorithm showing the posterior obtained with this data-free inference compares well with the true posterior obtained from inference against the full data set.

Uncertainty quantification in reacting flow

Najm, H.N.

Chemically reacting flow models generally involve inputs and parameters that are determined from empirical measurements, and therefore exhibit a certain degree of uncertainty. Estimating the propagation of this uncertainty into computational model output predictions is crucial for purposes of reacting flow model validation, model exploration, as well as design optimization. Recent years have seen great developments in probabilistic methods and tools for efficient uncertainty quantification (UQ) in computational models. These tools are grounded in the use of Polynomial Chaos (PC) expansions for representation of random variables. The utility and effectiveness of PC methods have been demonstrated in a range of physical models, including structural mechanics, transport in porous media, fluid dynamics, aeronautics, heat transfer, and chemically reacting flow. While high-dimensionality remains nominally an ongoing challenge, great strides have been made in dealing with moderate dimensionality along with non-linearity and oscillatory dynamics. In this talk, I will give an overview of UQ in chemical systems. I will cover both: (1) the estimation of uncertain input parameters from empirical data, and (2) the forward propagation of parametric uncertainty to model outputs. I will cover the basics of forward PC UQ methods with examples of their use. I will also highlight the need for accurate estimation of the joint probability density over the uncertain parameters, in order to arrive at meaningful estimates of model output uncertainties. Finally, I will discuss recent developments on the inference of this density given partial information from legacy experiments, in the absence of raw data.

Quantifying prediction fidelity in multiscale multiphysics simulations

Adalsteinsson, Helgi A.; Debusschere, Bert D.; Najm, H.N.; Jones, Reese E.; Sargsyan, Khachik S.

Multiscale multiphysics problems arise in a host of application areas of significant relevance to DOE, including electrical storage systems (membranes and electrodes in fuel cells, batteries, and ultracapacitors), water surety, chemical analysis and detection systems, and surface catalysis. Multiscale methods aim to provide detailed physical insight into these complex systems by incorporating coupled effects of relevant phenomena on all scales. However, many sources of uncertainty and modeling inaccuracies hamper the predictive fidelity of multiscale multiphysics simulations. These include parametric and model uncertainties in the models on all scales, and errors associated with coupling, or information transfer, across scales/physics. This presentation introduces our work on the development of uncertainty quantification methods for spatially decomposed atomistic-to-continuum (A2C) multiscale simulations. The key thrusts of this research effort are: inference of uncertain parameters or observables from experimental or simulation data; propagation of uncertainty through particle models; propagation of uncertainty through continuum models; propagation of information and uncertainty across model/scale interfaces; and numerical and computational analysis and control. To enable the bidirectional coupling between the atomistic and continuum simulations, a general formulation has been developed for the characterization of sampling noise due to intrinsic variability in particle simulations, and for the propagation of both this sampling noise and parametric uncertainties through coupled A2C multiscale simulations. Simplified tests of noise quantification in particle computations are conducted through Bayesian inference of diffusion rates in an idealized isothermal binary material system. 
A proof of concept is finally presented based on application of the present formulation to the propagation of uncertainties in a model plane Couette flow, where the near wall region is handled with molecular dynamics while the bulk region is handled with continuum methods.

Uncertainty quantification in the presence of limited climate model data with discontinuities

Sargsyan, Khachik S.; Safta, Cosmin S.; Debusschere, Bert D.; Najm, H.N.

Uncertainty quantification in climate models is challenged by the sparsity of the available climate data due to the high computational cost of the model runs. Another feature that prevents classical uncertainty analyses from being easily applicable is the bifurcative behavior in the climate data with respect to certain parameters. A typical example is the Meridional Overturning Circulation in the Atlantic Ocean. The maximum overturning stream function exhibits discontinuity across a curve in the space of two uncertain parameters, namely climate sensitivity and CO2 forcing. We develop a methodology that performs uncertainty quantification in this context in the presence of limited data.

Time integration of reacting flows with CSP tabulation

Debusschere, Bert D.; Najm, H.N.

This paper presents recent progress on the use of Computational Singular Perturbation (CSP) techniques for time integration of stiff chemical systems. The CSP integration approach removes fast time scales from the reaction system, thereby enabling integration with explicit time stepping algorithms. For further efficiency improvements, a tabulation strategy was developed to allow reuse of the relevant CSP quantities. This paper outlines the method and demonstrates its use on the simulation of hydrogen-air ignition.
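The core stability argument can be sketched on a linear stiff system: project the state onto the slow subspace (re-applying the projection each step, since round-off reintroduces the fast mode), after which an explicit step sized for the slow scale suffices. The matrix and step size below are illustrative, not the paper's hydrogen-air system:

```python
import numpy as np

# Stiff linear system dy/dt = A y with eigenvalues -1 (slow) and -1000
# (fast); explicit Euler on the full system would need dt < 2/1000.
A = np.array([[-500.5, 499.5],
              [499.5, -500.5]])
w, V = np.linalg.eigh(A)
slow = V[:, np.argmax(w)]               # eigenvector of the -1 eigenvalue

# CSP-style integration: exhaust the fast component, then take explicit
# steps sized by the slow time scale, re-projecting each step because
# round-off keeps reintroducing the (explosively amplified) fast mode.
y = slow * (slow @ np.array([1.0, 0.0]))
dt, nsteps = 0.01, 100
for _ in range(nsteps):
    y = y + dt * (A @ y)                # explicit Euler, stable at dt = 0.01
    y = slow * (slow @ y)               # remove the fast direction again

# Compare with the exact slow-mode solution at t = 1.
exact = slow * (slow @ np.array([1.0, 0.0])) * np.exp(-1.0)
print(np.max(np.abs(y - exact)))
```

In the CSP setting the fast/slow subspaces vary with the state and are expensive to recompute, which is what motivates the tabulation strategy described above.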

Skeletal mechanism generation with CSP and validation for premixed n-heptane flames

Proceedings of the Combustion Institute

Prager, Jens; Najm, H.N.; Valorani, Mauro; Goussis, Dimitris A.

An automated procedure has been previously developed to generate simplified skeletal reaction mechanisms for the combustion of n-heptane/air mixtures at equivalence ratios between 0.5 and 2.0 and different pressures. The algorithm is based on a Computational Singular Perturbation (CSP)-generated database of importance indices computed from homogeneous n-heptane/air ignition solutions. In this paper, we examine the accuracy of these simplified mechanisms when they are used for modeling laminar n-heptane/air premixed flames. The objective is to evaluate the accuracy of the simplified models when transport processes lead to local mixture compositions that are not necessarily part of the comprehensive homogeneous ignition databases. The detailed mechanism was developed by Curran et al. and involves 560 species and 2538 reactions. The smallest skeletal mechanism considered consists of 66 species and 326 reactions. We show that these skeletal mechanisms yield good agreement with the detailed model for premixed n-heptane flames, over a wide range of equivalence ratios and pressures, for global flame properties. They also exhibit good accuracy in predicting certain elements of internal flame structure, especially the profiles of temperature and major chemical species. On the other hand, we find larger errors in the concentrations of many minor/radical species, particularly in the region where low-temperature chemistry plays a significant role. We also observe that the low-temperature chemistry of n-heptane can play an important role at very lean or very rich mixtures, reaching these limits first at high pressure. This has implications for numerical simulations of non-premixed flames where these lean and rich regions occur naturally. © 2009 The Combustion Institute. Published by Elsevier Inc. All rights reserved.
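An importance-index style pruning step can be sketched on a made-up stoichiometry: score each reaction by its largest relative contribution to any species' net production rate and drop those below a threshold. This is a simplified stand-in for the CSP importance indices used in the paper:

```python
import numpy as np

# Toy importance-index pruning in the spirit of CSP-based skeletal
# reduction. Stoichiometric matrix (species x reactions) and reaction rates
# for an invented 4-species, 5-reaction mechanism.
S = np.array([[-1, 0, 0, -1, 0],
              [1, -1, 0, 0, 0],
              [0, 1, -1, 0, 1],
              [0, 0, 1, 1, -1]], dtype=float)
rates = np.array([10.0, 8.0, 5.0, 0.01, 0.02])

# Relative contribution of each reaction to each species' production,
# then take each reaction's worst-case (largest) importance over species.
contrib = np.abs(S * rates)
denom = contrib.sum(axis=1, keepdims=True)
importance = np.where(denom > 0, contrib / denom, 0.0).max(axis=0)

# Keep only reactions whose importance exceeds the threshold.
tol = 0.05
skeletal = np.flatnonzero(importance > tol)
print(skeletal)
```

In the real procedure the indices are accumulated over a database of ignition solutions spanning the targeted range of equivalence ratios and pressures, rather than a single state as here.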

Identification of viruses using microfluidic protein profiling and bayesian classification

Analytical Chemistry

Fruetel, Julia A.; West, Jason A.A.; Debusschere, Bert D.; Hukari, Kyle; Lane, Todd L.; Najm, H.N.; Ortega, Jose; Renzi, Ronald F.; Shokair, Isaac R.; VanderNoot, Victoria A.

We present a rapid method for the identification of viruses using microfluidic chip gel electrophoresis (CGE) of high-copy number proteins to generate unique protein profiles. Viral proteins are solubilized by heating at 95°C in borate buffer containing detergent (5 min), then labeled with fluorescamine dye (10 s), and analyzed using the μChemLab CGE system (5 min). Analyses of closely related T2 and T4 bacteriophage demonstrate sufficient assay sensitivity and peak resolution to distinguish the two phage. CGE analyses of four additional viruses - MS2 bacteriophage, Epstein-Barr, respiratory syncytial, and vaccinia viruses - demonstrate reproducible and visually distinct protein profiles. To evaluate the suitability of the method for unique identification of viruses, we employed a Bayesian classification approach. Using a subset of 126 replicate electropherograms of the six viruses and phage for training purposes, successful classification with non-training data was 66/69 or 95% with no false positives. The classification method is based on a single attribute (elution time), although other attributes such as peak width, peak amplitude, or peak shape could be incorporated and may improve performance further. The encouraging results suggest a rapid and simple way to identify viruses without requiring specialty reagents such as PCR probes and antibodies. © 2008 American Chemical Society.
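The single-attribute Bayesian classifier can be sketched with Gaussian class-conditionals over elution time; the class means, spreads, and replicate counts below are invented for illustration, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(5)

# Model each virus class with a Gaussian over the single attribute
# (elution time); values here are illustrative, not the measured profiles.
classes = {"T2": (12.0, 0.3), "T4": (13.5, 0.3), "MS2": (9.0, 0.4)}

def log_gauss(t, mu, s):
    return -0.5 * ((t - mu) / s) ** 2 - np.log(s)

def classify(t):
    # Flat prior over classes: the posterior argmax is the likelihood argmax.
    return max(classes, key=lambda c: log_gauss(t, *classes[c]))

# Simulate replicate electropherogram peaks and score the classifier.
truth, obs = [], []
for name, (mu, s) in classes.items():
    for _ in range(20):
        truth.append(name)
        obs.append(mu + s * rng.standard_normal())
pred = [classify(t) for t in obs]
acc = np.mean([p == t for p, t in zip(pred, truth)])
print(acc)
```

Additional attributes (peak width, amplitude, shape) would simply extend the class-conditional model to a multivariate density, with the same argmax-posterior decision rule.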

Computational and experimental study of nanoporous membranes for water desalination and decontamination

Debusschere, Bert D.; Zendejas, Frank Z.; Adalsteinsson, Helgi A.; Tran, Huu T.; Najm, H.N.; Chinn, Douglas A.; Kent, Michael S.; Simmons, Blake S.

Fundamentals of ion transport in nanopores were studied through a joint experimental and computational effort. The study evaluated both nanoporous polymer membranes and track-etched nanoporous polycarbonate membranes. The track-etched membranes provide a geometrically well characterized platform, while the polymer membranes are more closely related to ion exchange systems currently deployed in RO and ED applications. The experimental effort explored transport properties of the different membrane materials. Poly(aniline) membranes showed that flux could be controlled by templating with molecules of defined size. Track-etched polycarbonate membranes were modified using oxygen plasma treatments, UV-ozone exposure, and UV-ozone with thermal grafting, providing an avenue to functionalized membranes, increased wettability, and improved surface characteristic lifetimes. The modeling effort resulted in a novel multiphysics multiscale simulation model for field-driven transport in nanopores. This model was applied to a parametric study of the effects of pore charge and field strength on ion transport and charge exclusion in a nanopore representative of a track-etched polycarbonate membrane. The goal of this research was to uncover the factors that control the flux of ions through a nanoporous material and to develop tools and capabilities for further studies. Continuation studies will build toward more specific applications, such as polymers with attached sulfonate groups, and complex modeling methods and geometries.

More Details

Distributed micro-releases of bioterror pathogens: threat characterizations and epidemiology from uncertain patient observables

Adams, Brian M.; Devine, Karen D.; Najm, H.N.; Marzouk, Youssef M.

Terrorist attacks using an aerosolized pathogen preparation have gained credibility as a national security concern since the anthrax attacks of 2001. The ability to characterize the parameters of such attacks, i.e., to estimate the number of people infected, the time of infection, the average dose received, and the rate of disease spread in contemporary American society (for contagious diseases), is important when planning a medical response. For non-contagious diseases, we address the characterization problem by formulating a Bayesian inverse problem predicated on a short time-series of diagnosed patients exhibiting symptoms. To keep the approach relevant for response planning, we limit ourselves to 3.5 days of data. In computational tests performed for anthrax, we usually find these observation windows sufficient, especially if the outbreak model employed in the inverse problem is accurate. For contagious diseases, we formulated a Bayesian inversion technique to infer both pathogenic transmissibility and the social network from outbreak observations, ensuring that the two determinants of spreading are identified separately. We tested this technique on data collected from a 1967 smallpox epidemic in Abakaliki, Nigeria. We inferred, probabilistically, different transmissibilities in the structured Abakaliki population, the social network, and the chain of transmission. Finally, we developed an individual-based epidemic model to realistically simulate the spread of a rare (or eradicated) disease in a modern society. This model incorporates the mixing patterns observed in an (American) urban setting and accepts, as model input, pathogenic transmissibilities estimated from historical outbreaks that may have occurred in socio-economic environments with little resemblance to contemporary society. 
Techniques were also developed to simulate disease spread on static and sampled network reductions of the dynamic social networks originally in the individual-based model, yielding faster, though approximate, network-based epidemic models. These reduced-order models are useful in scenario analysis for medical response planning, as well as in computationally intensive inverse problems.
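A minimal sketch of the non-contagious-disease inversion described above: infer an attack time from a short series of symptom-onset times under an assumed incubation-period model (here a lognormal with a 4-day median). All numbers are hypothetical, and the study's actual formulation infers more than this (number infected, dose received).

```python
import math

# Assumed lognormal incubation-period model (days); illustrative only.
MU, SIGMA = math.log(4.0), 0.4
# Hypothetical symptom-onset times (days) for diagnosed patients.
onsets = [3.2, 4.5, 5.1, 3.8, 4.9, 6.0, 3.5, 4.2]

def log_incubation_pdf(u):
    """Log of the lognormal incubation-period density."""
    if u <= 0:
        return -math.inf
    return (-math.log(u * SIGMA * math.sqrt(2 * math.pi))
            - (math.log(u) - MU) ** 2 / (2 * SIGMA ** 2))

def log_posterior(t0):
    """Flat prior on the attack time t0, so the posterior is the likelihood."""
    return sum(log_incubation_pdf(t - t0) for t in onsets)

# Grid evaluation of the posterior over candidate attack times.
grid = [i * 0.05 - 3.0 for i in range(121)]          # t0 in [-3, 3] days
logp = [log_posterior(t0) for t0 in grid]
t0_map = grid[logp.index(max(logp))]
```

In the study this kind of posterior is built from only a few days of data; here the grid maximum plays the role of the point estimate, and the spread of `logp` around it quantifies the uncertainty.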

More Details

Predictability and reduced order modeling in stochastic reaction networks

Sargsyan, Khachik S.; Debusschere, Bert D.; Najm, H.N.

Many systems involving chemical reactions between small numbers of molecules exhibit inherent stochastic variability. Such stochastic reaction networks are at the heart of processes such as gene transcription, cell signaling or surface catalytic reactions, which are critical to bioenergy, biomedical, and electrical storage applications. The underlying molecular reactions are commonly modeled with chemical master equations (CMEs), representing jump Markov processes, or stochastic differential equations (SDEs), rather than ordinary differential equations (ODEs). As such reaction networks are often inferred from noisy experimental data, it is not uncommon to encounter large parametric uncertainties in these systems. Further, a wide range of time scales introduces the need for reduced order representations. Despite the availability of mature tools for uncertainty/sensitivity analysis and reduced order modeling in deterministic systems, there is a lack of robust algorithms for such analyses in stochastic systems. In this talk, we present advances in algorithms for predictability and reduced order representations for stochastic reaction networks and apply them to bistable systems of biochemical interest. To study the predictability of a stochastic reaction network in the presence of both parametric uncertainty and intrinsic variability, an algorithm was developed to represent the system state with a spectral polynomial chaos (PC) expansion in the stochastic space representing parametric uncertainty and intrinsic variability. Rather than relying on a non-intrusive collocation-based Galerkin projection [1], this PC expansion is obtained using Bayesian inference, which is ideally suited to handle noisy systems through its probabilistic formulation. To accommodate state variables with multimodal distributions, an adaptive multiresolution representation is used [2]. 
As the PC expansion directly relates the state variables to the uncertain parameters, the formulation lends itself readily to sensitivity analysis. Reduced order modeling in the time dimension is accomplished using a Karhunen-Loeve (KL) decomposition of the stochastic process in terms of the eigenmodes of its covariance matrix. Subsequently, a Rosenblatt transformation relates the random variables in the KL decomposition to a set of independent random variables, allowing the representation of the system state with a PC expansion in those independent random variables. An adaptive clustering method is used to handle multimodal distributions efficiently, and is well suited for high-dimensional spaces. The spectral representation of the stochastic reaction networks makes these systems more amenable to analysis, enabling a detailed understanding of their functionality, and robustness under experimental data uncertainty and inherent variability.
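The Karhunen-Loeve step above can be sketched on a toy two-mode process whose covariance matrix is known in closed form, so the dominant eigenmode extracted numerically can be checked exactly. This is a generic illustration of KL truncation, not the paper's algorithm.

```python
import math

# Toy process on a periodic time grid:
#   X(t) = a*sin(t)*xi1 + b*cos(t)*xi2,  xi1, xi2 ~ N(0,1) independent,
# with illustrative amplitudes a, b (assumptions). Its covariance matrix
# is a^2 s s^T + b^2 c c^T, so the KL eigenpairs are known exactly.
a, b, n = 2.0, 1.0, 8
t = [2 * math.pi * i / n for i in range(n)]
C = [[a**2 * math.sin(ti) * math.sin(tj) + b**2 * math.cos(ti) * math.cos(tj)
      for tj in t] for ti in t]

def power_iteration(M, iters=200):
    """Dominant eigenpair of a symmetric PSD matrix by power iteration."""
    v = [float(i + 1) for i in range(len(M))]   # start not orthogonal to mode 1
    lam = 0.0
    for _ in range(iters):
        w = [sum(M[i][j] * v[j] for j in range(len(M))) for i in range(len(M))]
        lam = math.sqrt(sum(x * x for x in w))
        v = [x / lam for x in w]
    return lam, v

lam1, phi1 = power_iteration(C)
total_var = sum(C[i][i] for i in range(n))   # trace = sum of KL eigenvalues
# lam1 / total_var is the variance fraction captured by a one-term KL truncation.
```

Here the top mode carries 80% of the variance (eigenvalues 16 and 4), so truncating the KL expansion after one term, with its coefficient treated as a random variable, is the scalar analogue of the reduced-order representation described above.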

More Details

Analysis and reduction of chemical models under uncertainty

Debusschere, Bert D.; Najm, H.N.

While models of combustion processes have been successful in developing engines with improved fuel economy, more costly simulations are required to accurately model pollution chemistry. These simulations will also involve significant parametric uncertainties. Computational singular perturbation (CSP) and polynomial chaos-uncertainty quantification (PC-UQ) can be used to mitigate the additional computational cost of modeling combustion with uncertain parameters. PC-UQ was used to interrogate and analyze the Davis-Skodje model, where the deterministic parameter in the model was replaced with an uncertain parameter. In addition, PC-UQ was combined with CSP to explore how model reduction could be combined with uncertainty quantification to understand how reduced models are affected by parametric uncertainty.
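The Davis-Skodje model mentioned above is a standard two-variable stiff test problem with an exactly known slow manifold, z = y/(1+y). The sketch below integrates it and propagates an assumed uncertainty in the stiffness parameter by plain sampling; the paper's PC machinery is not reproduced here.

```python
def davis_skodje_rhs(y, z, gamma):
    """Davis-Skodje test model; gamma sets the fast/slow time-scale ratio."""
    dy = -y
    dz = -gamma * z + ((gamma - 1.0) * y + gamma * y * y) / (1.0 + y) ** 2
    return dy, dz

def integrate(y, z, gamma, t_end, dt=1e-3):
    """Classical RK4 integration of the two-variable system."""
    t = 0.0
    while t < t_end - 1e-12:
        k1 = davis_skodje_rhs(y, z, gamma)
        k2 = davis_skodje_rhs(y + 0.5 * dt * k1[0], z + 0.5 * dt * k1[1], gamma)
        k3 = davis_skodje_rhs(y + 0.5 * dt * k2[0], z + 0.5 * dt * k2[1], gamma)
        k4 = davis_skodje_rhs(y + dt * k3[0], z + dt * k3[1], gamma)
        y += dt * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]) / 6
        z += dt * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]) / 6
        t += dt
    return y, z

# Crude uncertainty propagation: sample the uncertain parameter gamma over
# an assumed range and look at the spread of z at t = 2.
samples = [integrate(2.0, 2.0, g, 2.0)[1] for g in (5.0, 10.0, 20.0)]
```

Because the slow manifold itself does not depend on gamma, the late-time observable is nearly insensitive to this parameter: exactly the kind of interaction between model reduction and parametric uncertainty the abstract describes.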

More Details

Analysis of NO structure in a methane-air edge flame

Najm, H.N.; Prager, Jens

We present computations of a methane-air edge flame stabilized against an incoming flow mixing layer, using detailed methane-air chemistry. We analyze the computed edge flame, with a focus on NO structure. We examine the spatial distribution of NO and its production/consumption rate. We investigate the breakdown of the NO source term among the thermal, prompt, N2O, and NO2 pathways. We examine the contributions of the four pathways at different locations, as the edge flame structure changes with downstream distance, tending to a classical diffusion flame structure. We also examine the dominant reaction flux contributions in each pathway. We compare the results to those in premixed, non-premixed, and opposed-jet triple flames.

More Details

Computationally efficient Bayesian inference for inverse problems

Marzouk, Youssef M.; Najm, H.N.; Rahn, Larry A.

Bayesian statistics provides a foundation for inference from noisy and incomplete data, a natural mechanism for regularization in the form of prior information, and a quantitative assessment of uncertainty in the inferred results. Inverse problems - representing indirect estimation of model parameters, inputs, or structural components - can be fruitfully cast in this framework. Complex and computationally intensive forward models arising in physical applications, however, can render a Bayesian approach prohibitive. This difficulty is compounded by high-dimensional model spaces, as when the unknown is a spatiotemporal field. We present new algorithmic developments for Bayesian inference in this context, showing strong connections with the forward propagation of uncertainty. In particular, we introduce a stochastic spectral formulation that dramatically accelerates the Bayesian solution of inverse problems via rapid evaluation of a surrogate posterior. We also explore dimensionality reduction for the inference of spatiotemporal fields, using truncated spectral representations of Gaussian process priors. These new approaches are demonstrated on scalar transport problems arising in contaminant source inversion and in the inference of inhomogeneous material or transport properties. We also present a Bayesian framework for parameter estimation in stochastic models, where intrinsic stochasticity may be intermingled with observational noise. Evaluation of a likelihood function may not be analytically tractable in these cases, and thus several alternative Markov chain Monte Carlo (MCMC) schemes, operating on the product space of the observations and the parameters, are introduced.

More Details

Model reduction and physical understanding of slowly oscillating processes: The circadian cycle

Multiscale Modeling and Simulation

Goussis, Dimitris A.; Najm, H.N.

A differential system that models the circadian rhythm in Drosophila is analyzed with the computational singular perturbation (CSP) algorithm. Reduced nonstiff models of prespecified accuracy are constructed, the form and size of which are time-dependent. When compared with conventional asymptotic analysis, CSP exhibits superior performance in constructing reduced models, since it can algorithmically identify and apply all the required order of magnitude estimates and algebraic manipulations. A similar performance is demonstrated by CSP in generating data that allow for the acquisition of physical understanding. It is shown that the processes driving the circadian cycle are (i) mRNA translation into monomer protein, and monomer protein destruction by phosphorylation and degradation (along the largest portion of the cycle); and (ii) mRNA synthesis (along a short portion of the cycle). These are slow processes. Their action in driving the cycle is allowed by the equilibration of the fastest processes: (1) the monomer dimerization with the dimer dissociation (along the largest portion of the cycle); and (2) the net production of monomer and dimer proteins with that of mRNA (along the short portion of the cycle). Additional results (regarding the time scales of the established equilibria, their origin, the rate limiting steps, the couplings among the variables, etc.) highlight the utility of CSP for automated identification of the important underlying dynamical features, otherwise accessible only for simple systems whose various suitable simplifications can easily be recognized. © 2006 Society for Industrial and Applied Mathematics.

More Details

Using high-order methods on adaptively refined block-structured meshes - discretizations, interpolations, and filters

Najm, H.N.

Block-structured adaptive mesh refinement (SAMR) strives for efficient resolution of partial differential equations (PDEs) solved on large computational domains by clustering mesh points only where required by large gradients. Previous work has indicated that fourth-order convergence can be achieved on such meshes by using a suitable combination of high-order discretizations, interpolations, and filters, and can deliver significant computational savings over conventional second-order methods at engineering error tolerances. In this paper, we explore the interactions between the errors introduced by discretizations, interpolations, and filters. We develop general expressions for high-order discretizations, interpolations, and filters, in multiple dimensions, using a Fourier approach, facilitating the high-order SAMR implementation. We derive a formulation for the interpolation order necessary for given discretization and derivative orders. We also illustrate this order relationship empirically using one- and two-dimensional model problems on refined meshes. We study the observed increase in accuracy with increasing interpolation order. We also examine the empirically observed order of convergence, as the effective resolution of the mesh is increased by successively adding levels of refinement, with different orders of discretization, interpolation, or filtering.
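The empirical order-of-convergence check described above can be sketched for a single fourth-order central-difference stencil: halve the spacing, take the log-2 ratio of the errors, and read off the order. A generic illustration, not the SAMR implementation.

```python
import math

def d1_fourth_order(f, x, h):
    """Fourth-order central difference approximation to f'(x)."""
    return (-f(x + 2*h) + 8*f(x + h) - 8*f(x - h) + f(x - 2*h)) / (12*h)

def observed_order(f, df, x, h):
    """Empirical convergence order from errors at spacings h and h/2."""
    e1 = abs(d1_fourth_order(f, x, h) - df(x))
    e2 = abs(d1_fourth_order(f, x, h / 2) - df(x))
    return math.log2(e1 / e2)

# Smooth test function with a known derivative; expect an order near 4.
p = observed_order(math.sin, math.cos, 1.0, 0.1)
```

On a refined mesh, the same diagnostic exposes the interpolation error at coarse-fine boundaries: if the interpolation order is too low for the stencil, the observed order degrades below 4.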

More Details

On chain branching and its role in homogeneous ignition and premixed flame propagation

3rd M.I.T. Conference on Computational Fluid and Solid Mechanics

Lee, J.C.; Najm, H.N.; Lefantzi, S.; Ray, J.; Frenklach, M.; Valorani, M.; Goussis, D.A.

The role of chain branching in a chemical kinetic system was investigated by analyzing the eigenvalues of the system. We found that in the homogeneous ignition of the hydrogen/air and methane/air mixtures, the branching mechanism gives rise to explosive modes (eigenvalues with positive real parts) in the induction period as expected; however, in their respective premixed flames, we found none. Thus, their existence is not a necessary condition for the propagation of a premixed flame. © 2005 Elsevier Ltd.

More Details

Stochastic spectral methods for efficient Bayesian solution of inverse problems

AIP Conference Proceedings

Marzouk, Youssef M.; Najm, H.N.; Rahn, Larry A.

The Bayesian setting for inverse problems provides a rigorous foundation for inference from noisy data and uncertain forward models, a natural mechanism for incorporating prior information, and a quantitative assessment of uncertainty in the inferred results. Obtaining useful information from the posterior density - e.g., computing expectations via Markov Chain Monte Carlo (MCMC) - may be a computationally expensive undertaking, however. For complex and high-dimensional forward models, such as those that arise in inverting systems of PDEs, the cost of likelihood evaluations may render MCMC simulation prohibitive. We explore the use of polynomial chaos (PC) expansions for spectral representation of stochastic model parameters in the Bayesian context. The PC construction employs orthogonal polynomials in i.i.d. random variables as a basis for the space of square-integrable random variables. We use a Galerkin projection of the forward operator onto this basis to obtain a PC expansion for the outputs of the forward problem. Evaluation of integrals over the parameter space is recast as Monte Carlo sampling of the random variables underlying the PC expansion. We evaluate the utility of this technique on a transient diffusion problem arising in contaminant source inversion. The accuracy of posterior estimates is examined with respect to the order of the PC representation and the decomposition of the support of the prior. We contrast the computational cost of the new scheme with that of direct sampling. © 2005 American Institute of Physics.
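A one-dimensional sketch of the non-intrusive PC construction above: project a forward model onto Legendre polynomials by Gauss-Legendre quadrature, yielding a cheap surrogate that could replace the model inside MCMC. The exponential stands in for an expensive forward solve, and the degree is an arbitrary choice for illustration.

```python
import math

# 5-point Gauss-Legendre nodes and weights on [-1, 1].
NODES = [-0.9061798459386640, -0.5384693101056831, 0.0,
         0.5384693101056831, 0.9061798459386640]
WEIGHTS = [0.2369268850561891, 0.4786286704993665, 0.5688888888888889,
           0.4786286704993665, 0.2369268850561891]

def legendre(k, x):
    """Legendre polynomial P_k(x) via the three-term recurrence."""
    p0, p1 = 1.0, x
    if k == 0:
        return p0
    for n in range(1, k):
        p0, p1 = p1, ((2*n + 1) * x * p1 - n * p0) / (n + 1)
    return p1

def pc_coefficients(G, degree):
    """c_k = (2k+1)/2 * integral of G*P_k over [-1,1], by quadrature."""
    return [(2*k + 1) / 2.0
            * sum(w * G(x) * legendre(k, x) for x, w in zip(NODES, WEIGHTS))
            for k in range(degree + 1)]

def surrogate(coeffs, x):
    """Evaluate the truncated PC expansion at parameter value x."""
    return sum(c * legendre(k, x) for k, c in enumerate(coeffs))

coeffs = pc_coefficients(math.exp, degree=3)   # exp stands in for the model
```

Once `coeffs` is in hand, each likelihood evaluation inside MCMC costs a polynomial evaluation rather than a forward solve, which is the source of the acceleration reported in the abstract.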

More Details

Quantifying uncertainty in chemical systems modeling

International Journal of Chemical Kinetics

Reagan, M.T.; Najm, H.N.; Pébay, P.P.; Knio, O.M.; Ghanem, R.G.

This study compares two techniques for uncertainty quantification in chemistry computations, one based on sensitivity analysis and error propagation, and the other on stochastic analysis using polynomial chaos techniques. The two constructions are studied in the context of H2-O2 ignition under supercritical-water conditions. They are compared in terms of their prediction of uncertainty in species concentrations and the sensitivity of selected species concentrations to given parameters. The formulation is extended to one-dimensional reacting-flow simulations. The computations are used to study sensitivities to both reaction rate pre-exponentials and enthalpies, and to examine how this information must be evaluated in light of known, inherent parametric uncertainties in simulation parameters. The results indicate that polynomial chaos methods provide similar first-order information to conventional sensitivity analysis, while preserving higher-order information that is needed for accurate uncertainty quantification and for assigning confidence intervals on sensitivity coefficients. These higher-order effects can be significant, as the analysis reveals substantial uncertainties in the sensitivity coefficients themselves. © 2005 Wiley Periodicals, Inc.
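The point that first-order sensitivities can miss higher-order uncertainty admits a one-line illustration: a toy observable that depends quadratically on an uncertain parameter has zero local sensitivity at the nominal value yet substantial output variance. A generic example, not the H2-O2 system of the paper.

```python
import random

random.seed(0)

# Toy model: the observable depends quadratically on an uncertain parameter
# around its nominal value (theta = 0). Purely illustrative.
def model(theta):
    return 1.0 + theta**2

# First-order (local) sensitivity by finite differences at the nominal value.
h = 1e-6
local_sens = (model(h) - model(-h)) / (2 * h)   # exactly zero here

# Global uncertainty: Monte Carlo variance for theta ~ U(-1, 1).
vals = [model(random.uniform(-1.0, 1.0)) for _ in range(200000)]
mean = sum(vals) / len(vals)
var = sum((v - mean)**2 for v in vals) / (len(vals) - 1)
```

The exact variance here is 4/45, entirely invisible to the first-order sensitivity; spectral methods retain this higher-order contribution in their second and higher PC modes.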

More Details

The role of explosive modes in homogeneous ignition and premixed flames

Najm, H.N.

We performed calculations to investigate the classical theories of chain branching and thermal runaway that lead to the rapid oxidation of fuels. Mathematically, both theories infer the existence of eigenvalues with positive real parts, i.e., explosive modes. We found in studies of homogeneous hydrogen-air and methane-air mixtures that when ignition is initiated by a sufficiently high initial temperature, the transient response of the system exhibits two stages. The first stage is characterized by the existence of explosive modes. The ensuing second stage consists of fast exponential decay modes that bring the system to its equilibrium point. We demonstrated with two examples that the existence of explosive modes is not a necessary condition for the existence of a premixed flame. Homogeneous ignition calculations for mixtures with an initial concentration of radical species suggest that the diffusive transport of radical species is probably responsible for the lack of explosive modes in premixed flames.
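The two-stage transient described above (an explosive mode during induction, decaying modes on the approach to equilibrium) can be mimicked by a scalar autocatalytic toy model whose single Jacobian eigenvalue changes sign part-way through the run. Illustrative only; this is not the hydrogen or methane chemistry analyzed in the paper.

```python
# Toy "chain-branching-like" model: dy/dt = y*(1 - y), with y a radical-pool
# surrogate. Its 1x1 Jacobian, 1 - 2y, is positive during induction (an
# explosive mode) and negative thereafter (a decaying mode).
def eig(y):
    """Jacobian eigenvalue of the scalar model at state y."""
    return 1.0 - 2.0 * y

y, t, dt = 0.01, 0.0, 1e-3   # small initial radical pool; explicit Euler
t_cross = None
while t < 10.0:
    if t_cross is None and eig(y) <= 0.0:
        t_cross = t          # time at which the explosive mode disappears
    y += dt * y * (1.0 - y)
    t += dt

# Analytically, y reaches 1/2 (eigenvalue sign change) at t = ln(99) ~ 4.6.
```

Tracking the sign of the Jacobian eigenvalues along a trajectory, as in this sketch, is the scalar version of the eigenvalue analysis used to detect explosive modes in the full chemical systems.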

More Details