In this report, we proposed, examined, and implemented approaches for performing efficient uncertainty quantification (UQ) in climate land models. Specifically, we applied a Bayesian compressive sensing framework to polynomial chaos spectral expansions, enhanced it with an iterative basis-reduction algorithm, and investigated the results on test models as well as on the Community Land Model (CLM). Furthermore, we discussed the construction of efficient quadrature rules for the forward propagation of uncertainties from a high-dimensional, constrained input space to output quantities of interest. The work lays the groundwork for efficient forward UQ in high-dimensional, strongly nonlinear, and computationally costly climate models. Moreover, to investigate parameter inference approaches, we applied two variants of the Markov chain Monte Carlo (MCMC) method to a soil moisture dynamics submodel of the CLM. The evaluation of these algorithms provides a good foundation for further building out the Bayesian calibration framework towards the goal of robust component-wise calibration.
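To make the iterative basis-reduction idea concrete, the following is a minimal sketch in Python. It assumes a Legendre PC basis with uniform inputs and substitutes a simple thresholded least-squares fit for the Bayesian compressive sensing solver; the toy model, sample size, and threshold are illustrative and are not taken from the CLM study.

```python
# Minimal sketch of an iterative basis-reduction loop for a sparse polynomial
# chaos (PC) surrogate.  Illustrative only: thresholded least squares stands in
# for the Bayesian compressive sensing solver, and a toy 3-parameter model
# stands in for CLM output.
import itertools
import numpy as np
from numpy.polynomial import legendre


def legendre_design(xi, multi_indices):
    """Evaluate a tensorized Legendre basis at samples xi in [-1, 1]^d."""
    n, d = xi.shape
    cols = []
    for alpha in multi_indices:
        col = np.ones(n)
        for j, deg in enumerate(alpha):
            c = np.zeros(deg + 1)
            c[deg] = 1.0
            col *= legendre.legval(xi[:, j], c)
        cols.append(col)
    return np.column_stack(cols)


def iterative_basis_reduction(xi, y, order=4, tol=1e-3, max_iter=10):
    """Fit PC coefficients by least squares, then repeatedly prune negligible terms."""
    d = xi.shape[1]
    basis = [a for a in itertools.product(range(order + 1), repeat=d)
             if sum(a) <= order]
    coeffs = None
    for _ in range(max_iter):
        A = legendre_design(xi, basis)
        coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
        keep = np.abs(coeffs) > tol * np.abs(coeffs).max()
        if keep.all():
            break
        basis = [a for a, k in zip(basis, keep) if k]
        coeffs = coeffs[keep]
    return basis, coeffs


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    xi = rng.uniform(-1.0, 1.0, size=(200, 3))             # 3 uncertain inputs
    y = 1.0 + 0.5 * xi[:, 0] + 0.2 * xi[:, 0] * xi[:, 1]   # toy model output
    basis, coeffs = iterative_basis_reduction(xi, y)
    print(len(basis), "basis terms retained")
```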
In 1781, Monge first posed his L^1 optimal mass transfer problem: to find a mapping of one distribution into another that minimizes the total distance of transporting mass. The problem remained unsolved in R^n until the late 1990s. That result has since been extended to Riemannian manifolds. In both cases, optimal mass transfer relies upon a key lemma providing Lipschitz control on the directions of geodesics. We will discuss the Lipschitz control of geodesics in the (sub-Riemannian) Heisenberg group. This provides an important step towards a potential-theoretic proof of Monge's problem in the Heisenberg group.
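For reference, the classical Monge problem can be stated as follows (a standard modern formulation; the notation is not taken from the abstract):

```latex
% Monge's L^1 optimal transport problem: among all maps T pushing the source
% measure mu forward to the target measure nu, minimize the transport cost
\inf_{T \,:\, T_{\#}\mu = \nu} \; \int_{\mathbb{R}^n} |x - T(x)| \, d\mu(x)
```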
The polynomial chaos expansion provides a means of representing any L2 (finite-variance) random variable as a sum of polynomials that are orthogonal with respect to a chosen measure. Examples include the Hermite polynomials with Gaussian measure on the real line and the Legendre polynomials with uniform measure on an interval. Polynomial chaos can be used to reformulate an uncertain ODE system, using Galerkin projection, as a new, higher-dimensional, deterministic ODE system that describes the evolution of each mode of the polynomial chaos expansion. It is of interest to explore the eigenstructure of the original and reformulated ODE systems by studying the eigenvalues and eigenvectors of their Jacobians. In this talk, we study the distribution of the eigenvalues of the two Jacobians. We outline in general the location of the eigenvalues of the new system with respect to those of the original system, and examine the effect of expansion order on this distribution.
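As a concrete illustration of the reformulation, the sketch below builds the Galerkin-projected system for the scalar ODE du/dt = -k*u with a Legendre PC representation of the uncertain rate k, and compares the eigenvalues of its Jacobian with the range of the deterministic Jacobian -k(xi). The expansion order and rate coefficients are illustrative choices, not those used in the talk.

```python
# Minimal sketch: Galerkin reformulation of du/dt = -k*u with an uncertain rate
# k = k0 + k1*xi (xi uniform on [-1, 1]), using a Legendre PC basis.  We form
# the Jacobian of the reformulated system and compare its eigenvalues with the
# deterministic Jacobian -k(xi) over the range of xi.
import numpy as np
from numpy.polynomial import legendre

P = 5                        # PC expansion order
k0, k1 = 1.0, 0.3            # uncertain rate: k(xi) = k0 + k1*xi

# Gauss-Legendre quadrature on [-1, 1]; weight 1/2 gives the uniform measure.
nodes, weights = legendre.leggauss(2 * P + 2)
weights = 0.5 * weights

# Evaluate the Legendre basis psi_0..psi_P at the quadrature nodes.
psi = np.array([legendre.legval(nodes, np.eye(P + 1)[i]) for i in range(P + 1)])
norms = psi ** 2 @ weights           # <psi_l^2> = 1/(2l+1)

# Jacobian of the Galerkin system
# du_l/dt = -sum_{i,j} k_i u_j <psi_i psi_j psi_l> / <psi_l^2>,
# assembled via quadrature with k evaluated at the nodes.
k_vals = k0 + k1 * nodes
J = np.zeros((P + 1, P + 1))
for l in range(P + 1):
    for j in range(P + 1):
        J[l, j] = -np.sum(weights * k_vals * psi[j] * psi[l]) / norms[l]

print("Galerkin Jacobian eigenvalues:", np.sort(np.linalg.eigvals(J).real))
print("range of deterministic Jacobian -k(xi):", -k0 - k1, "to", -k0 + k1)
```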
Chemically reacting flow models generally involve inputs and parameters that are determined from empirical measurements and therefore exhibit a certain degree of uncertainty. Estimating the propagation of this uncertainty into computational model output predictions is crucial for reacting flow model validation, model exploration, and design optimization. Recent years have seen great developments in probabilistic methods and tools for efficient uncertainty quantification (UQ) in computational models. These tools are grounded in the use of Polynomial Chaos (PC) expansions for the representation of random variables. The utility and effectiveness of PC methods have been demonstrated in a range of physical models, including structural mechanics, transport in porous media, fluid dynamics, aeronautics, heat transfer, and chemically reacting flow. While high dimensionality nominally remains an ongoing challenge, great strides have been made in dealing with moderate dimensionality along with nonlinearity and oscillatory dynamics. In this talk, I will give an overview of UQ in chemical systems, covering both (1) the estimation of uncertain input parameters from empirical data and (2) the forward propagation of parametric uncertainty to model outputs. I will cover the basics of forward PC UQ methods with examples of their use. I will also highlight the need for accurate estimation of the joint probability density over the uncertain parameters in order to arrive at meaningful estimates of model output uncertainties. Finally, I will discuss recent developments on the inference of this density given partial information from legacy experiments, in the absence of raw data.
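As a small example of forward PC propagation in a chemical setting, the sketch below projects an Arrhenius rate constant with a Gaussian-uncertain activation energy onto a Hermite PC basis by non-intrusive spectral projection (Gauss-Hermite quadrature). The rate parameters, temperature, and expansion order are illustrative and are not drawn from the talk.

```python
# Minimal sketch of non-intrusive forward PC propagation: an Arrhenius rate
# k = A*exp(-E/(R*T)) with a Gaussian-uncertain activation energy E, projected
# onto a probabilists' Hermite PC basis via Gauss-Hermite quadrature.
import numpy as np
from numpy.polynomial import hermite_e
from math import factorial

A, R, T = 1.0e13, 8.314, 1000.0        # pre-exponential, gas constant, temperature
E_mean, E_std = 1.5e5, 5.0e3           # uncertain activation energy [J/mol]
P = 6                                  # PC expansion order

# Gauss-Hermite quadrature for a standard normal germ xi.
nodes, weights = hermite_e.hermegauss(2 * P + 2)
weights = weights / np.sqrt(2.0 * np.pi)   # normalize to the standard normal density

# Model evaluated at the quadrature nodes (E = E_mean + E_std*xi).
k_vals = A * np.exp(-(E_mean + E_std * nodes) / (R * T))

# Project onto He_0..He_P:  c_i = <k He_i> / <He_i^2>, with <He_i^2> = i!.
c = np.array([
    np.sum(weights * k_vals * hermite_e.hermeval(nodes, np.eye(P + 1)[i]))
    / factorial(i)
    for i in range(P + 1)
])

mean = c[0]
var = sum(c[i] ** 2 * factorial(i) for i in range(1, P + 1))
print(f"PC mean of k: {mean:.4e},  PC std: {np.sqrt(var):.4e}")
```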
It is known that, in general, the correlation structure in the joint distribution of model parameters is critical to the uncertainty analysis of that model. Very often, however, studies in the literature only report nominal values for parameters inferred from data, along with confidence intervals for these parameters, but no details on the correlation or full joint distribution of these parameters. When neither posterior nor data are available, but only summary statistics such as nominal values and confidence intervals, a joint PDF must be chosen. Given the summary statistics, it may be neither reasonable nor necessary to assume that the parameters are independent random variables. We demonstrate, using a Bayesian inference procedure, how to construct a posterior density for the parameters exhibiting self-consistent correlations, in the absence of data, given (1) the fit-model, (2) nominal parameter values, (3) bounds on the parameters, and (4) a postulated statistical model, around the fit-model, for the missing data. Our approach ensures external Bayesian updating while marginalizing over possible data realizations. We then address the matching of given parameter bounds through the choice of hyperparameters, which are introduced in postulating the statistical model but are not given nominal values. We discuss some possible approaches, including (1) inferring them in a separate Bayesian inference loop and (2) optimization. We also perform an empirical evaluation of the algorithm, showing that the posterior obtained with this data-free inference compares well with the true posterior obtained from inference against the full data set.
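The following is a minimal sketch of the marginalization-over-data idea under strong simplifying assumptions: a linear fit-model, a Gaussian noise model with a fixed hyperparameter sigma, and simple random-walk Metropolis sampling. It is meant only to illustrate how pooling posteriors over hypothetical data realizations yields self-consistent parameter correlations; none of the specific choices are taken from the study.

```python
# Minimal sketch of data-free inference: given only a fit-model, nominal
# parameter values, bounds, and a postulated noise model, generate hypothetical
# data realizations consistent with the nominal fit, infer a posterior from
# each realization, and pool the samples (marginalizing over the missing data).
import numpy as np

rng = np.random.default_rng(1)

def fit_model(theta, x):
    return theta[0] + theta[1] * x            # simple linear fit-model

x_design = np.linspace(0.0, 1.0, 10)          # postulated measurement locations
theta_nom = np.array([1.0, 2.0])              # nominal (reported) parameter values
sigma = 0.1                                   # hyperparameter of the postulated noise model

def log_post(theta, y):
    """Gaussian likelihood around the fit-model; flat prior within broad bounds."""
    if np.any(np.abs(theta - theta_nom) > 5.0):
        return -np.inf
    r = y - fit_model(theta, x_design)
    return -0.5 * np.sum(r ** 2) / sigma ** 2

def metropolis(y, n_steps=4000, step=0.05):
    """Random-walk Metropolis sampling of the posterior for one data realization."""
    theta = theta_nom.copy()
    lp = log_post(theta, y)
    samples = []
    for _ in range(n_steps):
        prop = theta + step * rng.standard_normal(2)
        lp_prop = log_post(prop, y)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples.append(theta.copy())
    return np.array(samples[n_steps // 2:])   # discard burn-in

# Pool posterior samples over many hypothetical data realizations.
pooled = []
for _ in range(20):
    y_k = fit_model(theta_nom, x_design) + sigma * rng.standard_normal(x_design.size)
    pooled.append(metropolis(y_k))
pooled = np.vstack(pooled)

print("pooled posterior mean:", pooled.mean(axis=0))
print("pooled parameter correlation:\n", np.corrcoef(pooled.T))
```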