Publications

Results 51–95 of 95

Null-hypothesis testing using distance metrics for verification of arms-control treaties

2016 IEEE Nuclear Science Symposium, Medical Imaging Conference and Room-Temperature Semiconductor Detector Workshop, NSS/MIC/RTSD 2016

Khalil, Mohammad K.; Brubaker, Erik B.; Hilton, Nathan R.; Kupinski, Matthew A.; MacGahan, Christopher J.; Marleau, Peter M.

We investigate the feasibility of constructing a data-driven distance metric for use in null-hypothesis testing in the context of arms-control treaty verification. The distance metric is used to test the hypothesis that the available data are representative of a certain object or not, as opposed to the binary-classification tasks studied previously. The metric, being of strictly quadratic form, is computed using projections of the data onto a set of optimal vectors, and these projections can be accumulated in list mode. The relatively low number of projections hampers reconstruction of the object and thus access to sensitive information. The projection vectors that channelize the data are optimal in capturing the Mahalanobis squared distance of the data associated with a given object under varying nuisance parameters. The vectors are also chosen such that the resulting metric is insensitive to the difference between the trusted object and another object that is deemed to contain sensitive information. Data used in this study were generated using the GEANT4 toolkit to model gamma transport with a Monte Carlo method. For numerical illustration, the methodology is applied to synthetic data obtained from custom models of plutonium inspection objects. The resulting metric, based on a relatively low number of channels, shows moderate agreement with the Mahalanobis distance metric for the trusted object while enabling the capability to obscure sensitive information.
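The core idea of the abstract above, that a quadratic-form metric can be evaluated from a small number of scalar projections of the data, can be sketched numerically. The following is a minimal illustration with synthetic Gaussian data: projecting onto whitened eigenvectors of the covariance yields a partial sum of the full Mahalanobis squared distance. The eigenvector basis is an assumption for illustration; the paper's channelizing vectors are optimized differently.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "trusted object" statistics: a correlated Gaussian model.
dim = 20
A = rng.normal(size=(dim, dim))
cov = A @ A.T / dim + np.eye(dim)   # positive-definite covariance
mean = rng.normal(size=dim)

# Full Mahalanobis squared distance of one test observation.
x = rng.multivariate_normal(mean, cov)
cov_inv = np.linalg.inv(cov)
d2_full = (x - mean) @ cov_inv @ (x - mean)

# Low-channel approximation: keep k whitened eigenvectors, so the
# metric is a sum of squared scalar projections that could be
# accumulated in list mode, one event at a time.
evals, evecs = np.linalg.eigh(cov)
k = 10
idx = np.argsort(evals)[::-1][:k]
proj = evecs[:, idx] / np.sqrt(evals[idx])   # k channelizing vectors
d2_proj = np.sum((proj.T @ (x - mean)) ** 2)

print(d2_full, d2_proj)  # partial sum never exceeds the full distance
```

With k equal to the full dimension the projection metric recovers the Mahalanobis distance exactly; truncating the basis is what limits object reconstruction.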


Chemical model reduction under uncertainty

Combustion and Flame

Malpica Galassi, Riccardo; Valorani, Mauro; Najm, H.N.; Safta, Cosmin S.; Khalil, Mohammad K.; Ciottoli, Pietro P.

A general strategy for analysis and reduction of uncertain chemical kinetic models is presented, and its utility is illustrated in the context of ignition of hydrocarbon fuel–air mixtures. The strategy is based on a deterministic analysis and reduction method which employs computational singular perturbation analysis to generate simplified kinetic mechanisms, starting from a detailed reference mechanism. We model uncertain quantities in the reference mechanism, namely the Arrhenius rate parameters, as random variables with prescribed uncertainty factors. We propagate this uncertainty to obtain the probability of inclusion of each reaction in the simplified mechanism. We propose probabilistic error measures to compare predictions from the uncertain reference and simplified models, based on the comparison of the uncertain dynamics of the state variables, where the mixture entropy is chosen as progress variable. We employ the construction for the simplification of an uncertain mechanism in an n-butane–air mixture homogeneous ignition case, where a 176-species, 1111-reaction detailed kinetic model for the oxidation of n-butane is used with uncertainty factors assigned to each Arrhenius rate pre-exponential coefficient. This illustration highlights the utility of the construction, and the performance of a family of simplified models produced for different thresholds on the importance and marginal probabilities of the reactions.
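The "probability of inclusion" idea in this abstract can be sketched with a toy Monte Carlo loop: sample uncertain Arrhenius parameters within their uncertainty factors, score each reaction, and count how often it clears the reduction threshold. The importance function below is a made-up stand-in for the CSP analysis, and the parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical reactions: nominal log10 pre-exponential and uncertainty
# factor UF for each (the paper assigns UFs to each Arrhenius A).
nominal_logA = np.array([12.0, 10.5, 8.0, 13.2])
uncertainty_factor = np.array([2.0, 3.0, 2.0, 1.5])

def importance(logA):
    # Made-up importance index in [0, 1]; a real implementation would
    # run CSP on the kinetic model for each parameter sample.
    return 1.0 / (1.0 + np.exp(-(logA - 10.0)))

n_samples = 20_000
threshold = 0.7  # reactions above this importance are retained

# Sample log10 A uniformly within +/- log10(UF) of nominal.
logA = nominal_logA + rng.uniform(-1, 1, size=(n_samples, 4)) * np.log10(uncertainty_factor)
kept = importance(logA) > threshold
p_include = kept.mean(axis=0)  # probability each reaction survives reduction
print(p_include)
```

Reactions whose importance stays above (or below) the threshold over the whole uncertainty range get inclusion probability 1 (or 0); only reactions whose importance straddles the threshold yield intermediate probabilities, which is where the probabilistic treatment adds information over the deterministic reduction.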


Inference of H2O2 thermal decomposition rate parameters from experimental statistics

10th U.S. National Combustion Meeting

Casey, Tiernan A.; Khalil, Mohammad K.; Najm, H.N.

The thermal decomposition of H2O2 is an important process in hydrocarbon combustion, playing a particularly crucial role in providing a source of radicals at high pressure, where it controls the 3rd explosion limit in the H2-O2 system, and also acting as a branching reaction in intermediate-temperature hydrocarbon oxidation. As such, understanding the uncertainty in the rate expression for this reaction is crucial for predictive combustion computations. Raw experimental measurement data, and their associated noise and uncertainty, are typically unreported in investigations of elementary reaction rates, making direct derivation of the joint uncertainty structure of the rate-expression parameters difficult. To overcome this, we employ a statistical inference procedure, relying on maximum entropy and approximate Bayesian computation methods and using a two-level nested Markov chain Monte Carlo algorithm, to arrive at a posterior density on rate parameters for a selected case of laser absorption measurements in a shock tube study, subject to the constraints imposed by the reported experimental statistics. The procedure constructs a set of H2O2 concentration decay profiles consistent with these reported statistics. These consistent data sets are then used to determine the joint posterior density on the rate parameters through straightforward Bayesian inference. Broadly, the method also provides a framework for the replication and comparison of missing data from different experiments, based on reported statistics, for the generation of consensus rate expressions.
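The approximate Bayesian computation ingredient of this abstract, inferring a parameter when only summary statistics of the data are reported, can be sketched with a minimal ABC-rejection loop. The decay model is first-order and the "reported" statistics below are invented numbers, not values from the shock-tube study.

```python
import numpy as np

rng = np.random.default_rng(2)

# Suppose the experiment reports only summary statistics of the fitted
# decay rate (hypothetical numbers), not the raw absorption traces.
reported_mean_logk = np.log(3.0)
reported_std_logk = 0.2

def simulate_summary(logk, n_shots=30):
    # Replicate a data set consistent with the experiment: repeated
    # shots with scatter in the fitted rate, then report the same
    # statistics the paper would report.
    samples = logk + rng.normal(0.0, 0.2, size=n_shots)
    return samples.mean(), samples.std(ddof=1)

# ABC rejection: accept prior draws whose replicated statistics fall
# close to the reported ones.
prior = rng.uniform(np.log(1.0), np.log(10.0), size=50_000)
eps = 0.05
accepted = []
for logk in prior:
    m, s = simulate_summary(logk)
    if abs(m - reported_mean_logk) < eps and abs(s - reported_std_logk) < eps:
        accepted.append(logk)
accepted = np.array(accepted)
print(len(accepted), accepted.mean())
```

The accepted samples approximate a posterior on log k constrained only by the reported statistics; the paper's two-level nested MCMC plays the same role far more efficiently than this rejection sketch.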


Inference of reaction rate parameters based on summary statistics from experiments

Proceedings of the Combustion Institute

Khalil, Mohammad K.; Chowdhary, K.; Safta, Cosmin S.; Sargsyan, Khachik S.; Najm, H.N.

Bayesian inference and maximum entropy methods were employed to estimate the joint probability density of the Arrhenius parameters of the rate coefficient of the H2/O2-mechanism chain-branching reaction H + O2 → OH + O. A consensus joint posterior on the parameters was obtained by pooling the posterior parameter densities given each consistent data set. Efficient surrogates for the OH concentration were constructed using a combination of Padé and polynomial approximants. Gauss-Hermite quadrature with Gaussian proposal probability density functions was used for moment computation, resulting in orders-of-magnitude speedup in data-likelihood evaluation. The consistent data sets resulted in nearly Gaussian conditional parameter probability density functions. The resulting pooled parameter probability density function was propagated through stoichiometric H2-air auto-ignition computations to illustrate the need to account for correlations among the Arrhenius parameters of a single reaction and across the rate parameters of different reactions.
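The Gauss-Hermite moment computation mentioned in this abstract replaces Monte Carlo averaging over a Gaussian proposal with a short weighted sum. A minimal sketch, using a toy exponential stand-in for the OH-concentration surrogate (the numbers are assumptions for illustration):

```python
import numpy as np

def gauss_hermite_moment(f, mu, sigma, order=20):
    # numpy's hermgauss targets the weight exp(-x^2); the change of
    # variables x = mu + sqrt(2)*sigma*t turns the sum into the
    # expectation E[f(X)] for X ~ N(mu, sigma^2).
    t, w = np.polynomial.hermite.hermgauss(order)
    return np.sum(w * f(mu + np.sqrt(2.0) * sigma * t)) / np.sqrt(np.pi)

# Example: mean of exp(X), a toy stand-in for a surrogate evaluated at
# an uncertain rate parameter X ~ N(0.5, 0.3^2).
mu, sigma = 0.5, 0.3
mean_exp = gauss_hermite_moment(np.exp, mu, sigma)

# Analytic check: E[exp(X)] = exp(mu + sigma^2 / 2)
print(mean_exp, np.exp(mu + sigma**2 / 2))
```

Because the surrogate is smooth, a 20-point rule reproduces the moment to near machine precision, which is the source of the likelihood-evaluation speedup relative to sampling.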


Online mapping and forecasting of epidemics using open-source indicators

Ray, Jaideep R.; Lefantzi, Sophia L.; Bauer, Joshua B.; Khalil, Mohammad K.; Rothfuss, Andrew J.; Cauthen, Katherine R.; Finley, Patrick D.; Smith, Halley S.

Open-source indicators have been proposed as a way of tracking and forecasting disease outbreaks. Some, such as meteorological data, are readily available as reanalysis products. Others, such as those derived from our online behavior (web searches, media articles, etc.), are gathered easily and are more timely than public health reporting. In this study we investigate how these datastreams may be combined to provide useful epidemiological information. The investigation is performed by building data assimilation systems to track influenza in California and dengue in India. The first does not suffer from incomplete data and was chosen to explore disease-modeling needs. The second explores the case where observational data are sparse and disease-modeling complexities are secondary. The two test cases thus sit at opposite ends of the disease-tracking spectrum. We find that data assimilation systems that produce disease activity maps can be constructed. Further, combining multiple open-source datastreams is a necessity, as no single one is very informative on its own. The data assimilation systems have very little in common except that they contain disease models, calibration algorithms, and some ability to impute missing data. Thus, while the data assimilation systems share the goal of accurate forecasting, they are in practice designed to compensate for the shortcomings of their datastreams, and we expect them to be disease- and location-specific.


Bayesian analysis of the flutter margin method in aeroelasticity

Journal of Sound and Vibration

Khalil, Mohammad K.; Poirel, Dominique P.; Sarkar, Abhijit S.

A Bayesian statistical framework is presented for the Zimmerman and Weissenburger flutter margin method, accounting for the uncertainties in the aeroelastic modal parameters. The proposed methodology overcomes the limitations of the previously developed least-squares estimation technique, which relies on a Gaussian approximation of the flutter margin probability density function (pdf). Using the measured free-decay responses at subcritical (pre-flutter) airspeeds, the joint non-Gaussian posterior pdf of the modal parameters is sampled using the Metropolis–Hastings (MH) Markov chain Monte Carlo (MCMC) algorithm. The posterior MCMC samples of the modal parameters are then used to obtain the flutter margin pdfs and finally the flutter speed pdf. The usefulness of the Bayesian flutter margin method is demonstrated using synthetic data generated from a two-degree-of-freedom pitch-plunge aeroelastic model, and the robustness of the statistical framework is demonstrated using different sets of measurement data. The probabilistic (Bayesian) approach is shown to reduce the number of test points required to provide a flutter speed estimate of a given accuracy and precision.
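The sampling workflow described in this abstract, drawing posterior samples of a parameter with MH MCMC and pushing them through a nonlinear map to obtain the pdf of a derived quantity, can be sketched minimally. The posterior and the margin-to-speed map below are toy assumptions, not the Zimmerman-Weissenburger relations.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy problem: one "modal parameter" theta with Gaussian measurement
# noise (sd 0.5) around synthetic free-decay-derived data; flat prior.
data = rng.normal(2.0, 0.5, size=50)

def log_post(theta):
    return -0.5 * np.sum((data - theta) ** 2) / 0.5**2

# Metropolis-Hastings with a Gaussian random-walk proposal.
n_steps, step = 20_000, 0.2
chain = np.empty(n_steps)
theta = 0.0
lp = log_post(theta)
for i in range(n_steps):
    prop = theta + step * rng.normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain[i] = theta

# Push post-burn-in samples through a nonlinear map to get the pdf of
# a derived quantity, as the flutter speed pdf is obtained from the
# flutter margin samples (the map here is a hypothetical stand-in).
derived = np.sqrt(np.abs(chain[5000:]))
print(chain[5000:].mean(), derived.mean())
```

No Gaussian approximation of the derived quantity is needed: its full (generally non-Gaussian) distribution is represented directly by the transformed samples, which is the advantage over the least-squares approach the abstract contrasts against.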
