Publications
Verification, Validation, and Uncertainty/Sensitivity Studies of Fire Modeling at Sandia National Laboratories
Abstract not provided.
Type X and Y Errors and Data & Model Conditioning for Systematic Uncertainty in Model Calibration, Validation, and Extrapolation
Abstract not provided.
A Paradigm of Model Validation and Model Conditioning for Best-Estimate-Plus-Uncertainty Predictions in Hierarchical Modeling
Computer Methods in Applied Mechanics and Engineering
Abstract not provided.
Processing of Experiments and Simulations into a Validated Failure Model (for a Device)
Abstract not provided.
Effective Robust Design Strategy employing Ordinal Variance Minimization and Adaptive Mean Constraint Satisfaction
Abstract not provided.
A paradigm of model validation and validated models for best-estimate-plus-uncertainty predictions in systems engineering
Abstract not provided.
Error estimation approaches for progressive response surfaces - more results
Conference Proceedings of the Society for Experimental Mechanics Series
Response surface functions are often used as simple and inexpensive replacements for computationally expensive computer models that simulate the behavior of a complex system over some parameter space. "Progressive" response surfaces are built up incrementally as global information is added from new sample points added to the previous points in the parameter space. As the response surfaces are globally upgraded, indicators of the convergence of the response surface approximation to the exact (fitted) function can be inferred. Sampling points can be incrementally added in a structured or unstructured fashion. Whatever the approach, it is usually desirable to sample the entire parameter space uniformly (at least in early stages of sampling). At later stages of sampling, depending on the nature of the quantity being resolved, it may be desirable to continue sampling uniformly (progressive response surfaces), or to switch to a focusing/economizing strategy of preferentially sampling certain regions of the parameter space based on information gained in previous stages of sampling ("adaptive" response surfaces). Here we consider progressive response surfaces where a balanced representation of global response over the parameter space is desired. We use Kriging and Moving-Least-Squares methods to fit Halton quasi-Monte-Carlo data samples and interpolate over the parameter space. On 2-D test problems we use the response surfaces to compute various response measures and assess the accuracy/applicability of heuristic error estimates based on convergence behavior of the computed response quantities. Where applicable we apply Richardson Extrapolation for estimates of error, and assess the accuracy of these estimates. We seek to develop a robust methodology for constructing progressive response surface approximations with reliable error estimates.
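As an illustration of the workflow described above, the following is a minimal sketch (not the authors' code): Halton points generated with scipy.stats.qmc, a Kriging-type fit via scikit-learn's GaussianProcessRegressor standing in for the Kriging/MLS fits in the paper, and a heuristic convergence indicator based on successive estimates of an integrated response measure. The test function, batch sizes, kernel settings, and error heuristic are assumptions for illustration only.

```python
# Sketch of a "progressive" response surface study (illustrative assumptions).
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def test_function(x):
    # Placeholder 2-D test function (not from the paper).
    return np.sin(3 * x[:, 0]) * np.cos(2 * x[:, 1]) + x[:, 0] ** 2

halton = qmc.Halton(d=2, seed=0)                 # stateful: successive calls extend the design
grid = qmc.Halton(d=2, seed=1).random(4096)      # dense points for the integrated response measure
x_train = np.empty((0, 2))
y_train = np.empty(0)
estimates = []

for n_new in (16, 16, 32, 64):                   # incremental batches of sample points
    x_new = halton.random(n_new)                 # next points in the Halton sequence
    x_train = np.vstack([x_train, x_new])
    y_train = np.concatenate([y_train, test_function(x_new)])
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), normalize_y=True)
    gp.fit(x_train, y_train)
    mean_estimate = gp.predict(grid).mean()      # integrated response measure over the parameter space
    estimates.append(mean_estimate)
    print(f"n = {len(x_train):4d}  estimated mean response = {mean_estimate:.5f}")

# Heuristic error indicator: change between the last two global upgrades.
print("convergence indicator:", abs(estimates[-1] - estimates[-2]))
```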
Development of a new adaptive ordinal approach to continuous-variable probabilistic optimization
A very general and robust approach to solving continuous-variable optimization problems involving uncertainty in the objective function is through the use of ordinal optimization. At each step in the optimization problem, improvement is based only on a relative ranking of the uncertainty effects on local design alternatives, rather than on precise quantification of the effects. One simply asks "Is that alternative better or worse than this one?", not "HOW MUCH better or worse is that alternative to this one?" The answer to the latter question requires precise characterization of the uncertainty, with the corresponding sampling/integration expense for precise resolution. However, in this report we demonstrate correct decision-making in a continuous-variable probabilistic optimization problem despite extreme vagueness in the statistical characterization of the design options. We present a new adaptive ordinal method for probabilistic optimization in which the trade-off between computational expense and vagueness in the uncertainty characterization can be conveniently managed in various phases of the optimization problem to make cost-effective stepping decisions in the design space. Spatial correlation of uncertainty in the continuous-variable design space is exploited to dramatically increase method efficiency. Under many circumstances the method appears to have favorable robustness and cost-scaling properties relative to other probabilistic optimization methods, and uniquely has mechanisms for quantifying and controlling error likelihood in design-space stepping decisions. The method is asymptotically convergent to the true probabilistic optimum, so could be useful as a reference standard against which the efficiency and robustness of other methods can be compared, analogous to the role that Monte Carlo simulation plays in uncertainty propagation.
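The core ordinal question ("better or worse, to a stated confidence?") and the use of correlated sampling to exploit spatial correlation can be illustrated with a minimal sketch. The response model, noise model, sample count, and confidence level below are assumptions for illustration, not the report's method.

```python
# Ordinal "better or worse?" comparison of two design alternatives under
# uncertainty, using common random numbers (correlated sampling) so the
# paired differences have reduced variance. Illustrative assumptions only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def response(design, noise):
    # Hypothetical objective with design-dependent uncertainty.
    return (design - 0.7 + 0.2 * noise) ** 2

def ordinally_better(design_a, design_b, n_samples=30, alpha=0.05):
    """Return True if design_b is judged better (lower response) than
    design_a at confidence 1 - alpha, using the SAME noise realizations
    for both alternatives (common random numbers)."""
    noise = rng.standard_normal(n_samples)
    diffs = response(design_b, noise) - response(design_a, noise)
    # One-sided paired t-test that the mean difference is negative.
    t_stat, p_two_sided = stats.ttest_1samp(diffs, 0.0)
    p_one_sided = p_two_sided / 2.0 if t_stat < 0 else 1.0 - p_two_sided / 2.0
    return p_one_sided < alpha

print(ordinally_better(0.2, 0.5))   # expect True: 0.5 is closer to the optimum at 0.7
```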
DAKOTA and its use in Computational Experiments
Abstract not provided.
Methodology Status and Needs: Verification, Validation, and Uncertainty Quantification
Abstract not provided.
Modeling Boundary Conditions and Thermocouple Response in a Thermal Experiment
Abstract not provided.
On Model Validation and Extrapolation for Best-Estimate-Plus-Uncertainty Predictions
Abstract not provided.
Efficiencies from spatial correlation of uncertainty and correlated sampling in continuous-variable ordinal optimization under uncertainty
Proposed for publication in AIAA.
Procedures for risk and qualification analysis of complex system response in fires
Abstract not provided.
Pragmatic Experiences integrating V&V into risk and qualification analysis of complex coupled systems
Abstract not provided.
Progressive response surfaces
Abstract not provided.
Advanced nuclear energy analysis technology
A two-year effort focused on applying ASCI technology developed for the analysis of weapons systems to the state-of-the-art accident analysis of a nuclear reactor system was proposed. The Sandia SIERRA parallel computing platform for ASCI codes includes high-fidelity thermal, fluids, and structural codes whose coupling through SIERRA can be specifically tailored to the particular problem at hand to analyze complex multiphysics problems. Presently, however, the suite lacks several physics modules unique to the analysis of nuclear reactors. The NRC MELCOR code, not presently part of SIERRA, was developed to analyze severe accidents in present-technology reactor systems. We attempted to: (1) evaluate the SIERRA code suite for its current applicability to the analysis of next generation nuclear reactors, and the feasibility of implementing MELCOR models into the SIERRA suite, (2) examine the possibility of augmenting ASCI codes or alternatives by coupling to the MELCOR code, or portions thereof, to address physics particular to nuclear reactor issues, especially those facing next generation reactor designs, and (3) apply the coupled code set to a demonstration problem involving a nuclear reactor system. We were successful in completing the first two in sufficient detail to determine that an extensive demonstration problem was not feasible at this time. In the future, completion of this research would demonstrate the feasibility of performing high fidelity and rapid analyses of safety and design issues needed to support the development of next generation power reactor systems.
Application of probabilistic ordinal optimization concepts to a continuous-variable probabilistic optimization problem
A very general and robust approach to solving optimization problems involving probabilistic uncertainty is through the use of Probabilistic Ordinal Optimization. At each step in the optimization problem, improvement is based only on a relative ranking of the probabilistic merits of local design alternatives, rather than on crisp quantification of the alternatives. Thus, we simply ask the question: 'Is that alternative better or worse than this one?' to some level of statistical confidence we require, not: 'HOW MUCH better or worse is that alternative to this one?'. In this paper we illustrate an elementary application of probabilistic ordinal concepts in a 2-D optimization problem. Two uncertain variables contribute to uncertainty in the response function. We use a simple Coordinate Pattern Search non-gradient-based optimizer to step toward the statistical optimum in the design space. We also discuss more sophisticated implementations, and some of the advantages and disadvantages versus non-ordinal approaches for optimization under uncertainty.
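The kind of loop described, a coordinate pattern search whose stepping decisions use only ordinal (better/worse) comparisons of noisy estimates, can be sketched as follows. The test objective, uncertain-variable model, sample counts, and step-size schedule are illustrative assumptions and do not reproduce the paper's problem.

```python
# Coordinate pattern search driven by ordinal comparisons of noisy objective
# estimates. Illustrative assumptions throughout.
import numpy as np

rng = np.random.default_rng(1)

def noisy_objective(design, n_samples=50):
    # Two uncertain variables contribute to uncertainty in the response.
    u = rng.normal(0.0, 0.1, size=(n_samples, 2))
    x = design + u                                   # design perturbed by uncertainty
    return np.mean((x[:, 0] - 1.0) ** 2 + (x[:, 1] + 0.5) ** 2)

def better(candidate, incumbent):
    # Ordinal decision: compare sample estimates, not absolute magnitudes.
    return noisy_objective(candidate) < noisy_objective(incumbent)

design = np.array([0.0, 0.0])
step = 0.5
while step > 1e-2:
    moved = False
    for direction in (np.array([1, 0]), np.array([-1, 0]),
                      np.array([0, 1]), np.array([0, -1])):
        candidate = design + step * direction
        if better(candidate, design):
            design, moved = candidate, True
            break
    if not moved:
        step *= 0.5                                   # contract the pattern
print("approximate statistical optimum near", design)   # expect roughly (1.0, -0.5)
```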
Initial evaluation of Centroidal Voronoi Tessellation method for statistical sampling and function integration
A recently developed Centroidal Voronoi Tessellation (CVT) unstructured sampling method is investigated here to assess its suitability for use in statistical sampling and function integration. CVT efficiently generates a highly uniform distribution of sample points over arbitrarily shaped M-Dimensional parameter spaces. It has recently been shown on several 2-D test problems to provide superior point distributions for generating locally conforming response surfaces. In this paper, its performance as a statistical sampling and function integration method is compared to that of Latin-Hypercube Sampling (LHS) and Simple Random Sampling (SRS) Monte Carlo methods, and Halton and Hammersley quasi-Monte-Carlo sequence methods. Specifically, sampling efficiencies are compared for function integration and for resolving various statistics of response in a 2-D test problem. It is found that on balance CVT performs best of all these sampling methods on our test problems.
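One way to approximate a CVT point set is Lloyd-type iteration, which for a uniform density is equivalent to k-means clustering of a dense random point cloud. The sketch below uses that equivalence and then compares equal-weight integration at the CVT generators against simple random sampling; the integrand, point counts, and equal-weight quadrature are assumptions for illustration.

```python
# CVT-like point set via k-means (Lloyd iterations) on a dense uniform cloud,
# used here for a simple function-integration estimate. Illustrative only.
import numpy as np
from scipy.cluster.vq import kmeans

rng = np.random.default_rng(0)

def integrand(x):
    return np.exp(-np.sum(x ** 2, axis=1))       # smooth 2-D test function

n_points = 64
cloud = rng.random((200_000, 2))                 # dense cloud over the unit square
# k-means centroids of a uniform cloud approximate CVT generators.
generators, _ = kmeans(cloud, n_points, iter=20)

# Equal weights are a simplification; weighting by Voronoi cell volume
# would be more faithful to the tessellation.
cvt_estimate = integrand(generators).mean()
srs_estimate = integrand(rng.random((n_points, 2))).mean()
print("CVT estimate :", cvt_estimate)
print("SRS estimate :", srs_estimate)
```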
Initial application and evaluation of a promising new sampling method for response surface generation: Centroidal Voronoi tessellation
Collection of Technical Papers - AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics and Materials Conference
A recently developed Centroidal Voronoi Tessellation (CVT) sampling method is investigated here to assess its suitability for use in response surface generation. CVT is an unstructured sampling method that can generate nearly uniform point spacing over arbitrarily shaped M-dimensional parameter spaces. For rectangular parameter spaces (hypercubes), CVT appears to extend to higher dimensions more effectively and inexpensively than "Distributed" and "Improved Distributed" Latin Hypercube Monte Carlo methods, and CVT does not appear to suffer from spurious correlation effects in higher dimensions and at high sampling densities as quasi-Monte-Carlo methods such as Halton and Sobol sequences typically do. CVT is described briefly in this paper and its impact on response surface accuracy in a 2-D test problem is compared to the accuracy yielded by Latin Hypercube Sampling (LHS) and a deterministic structured-uniform sampling method. To accommodate the different point patterns over the parameter space given by the different sampling methods, Moving Least Squares (MLS) for interpolation of arbitrarily located data points is used. It is found that CVT performs better than LHS in 11 of 12 test cases investigated here, and as often as not performs better than the structured sampling method with its deterministically uniform point placement over the 2-D parameter space.
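Because the sampling methods compared above produce arbitrarily located points, a scattered-data interpolator such as Moving Least Squares is needed as a common footing. A minimal MLS sketch follows; the Gaussian weight function, bandwidth, and linear basis are assumptions, not the paper's exact formulation.

```python
# Moving Least Squares (MLS) approximation of scattered 2-D samples:
# a locally weighted linear fit evaluated at a query point. Illustrative only.
import numpy as np

def mls_predict(x_query, x_data, y_data, bandwidth=0.2):
    """Locally weighted linear fit evaluated at a single query point."""
    d2 = np.sum((x_data - x_query) ** 2, axis=1)
    w = np.exp(-d2 / bandwidth ** 2)                          # Gaussian weights
    basis = np.hstack([np.ones((len(x_data), 1)), x_data])    # [1, x1, x2]
    # Weighted least squares: solve (B^T W B) c = B^T W y.
    bw = basis * w[:, None]
    coeffs = np.linalg.lstsq(bw.T @ basis, bw.T @ y_data, rcond=None)[0]
    return np.concatenate(([1.0], x_query)) @ coeffs

# Example usage on a hypothetical 2-D test function and a scattered design.
rng = np.random.default_rng(0)
x_data = rng.random((60, 2))
y_data = np.sin(4 * x_data[:, 0]) + x_data[:, 1] ** 2
query = np.array([0.4, 0.6])
print("MLS estimate:", mls_predict(query, x_data, y_data))
print("true value  :", np.sin(4 * 0.4) + 0.6 ** 2)
```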
Description of the Sandia Validation Metrics Project
This report describes the underlying principles and goals of the Sandia ASCI Verification and Validation Program Validation Metrics Project. It also gives a technical description of two case studies, one in structural dynamics and the other in thermomechanics, that serve to focus the technical work of the project in Fiscal Year 2001.
Uncertainty Analysis of Decomposing Polyurethane Foam
Abstract not provided.
Effect of initial seed and number of samples on simple-random and Latin-Hypercube Monte Carlo probabilities (confidence interval considerations)
In order to devise an algorithm for autonomously terminating Monte Carlo sampling when sufficiently small and reliable confidence intervals (CI) are achieved on calculated probabilities, the behavior of CI estimators must be characterized. This knowledge is also required in comparing the accuracy of other probability estimation techniques to Monte Carlo results. Based on 100 trials in a hypothesis test, estimated 95% CI from classical approximate CI theory are empirically examined to determine if they behave as true 95% CI over spectra of probabilities (population proportions) ranging from 0.001 to 0.99 in a test problem. Tests are conducted for population sizes of 500 and 10,000 samples where applicable. Significant differences between true and estimated 95% CI are found to occur at probabilities between 0.1 and 0.9, such that estimated 95% CI can be rejected as not being true 95% CI at less than a 40% chance of incorrect rejection. With regard to Latin Hypercube sampling (LHS), though no general theory has been verified for accurately estimating LHS CI, recent numerical experiments on the test problem have found LHS to be conservatively over an order of magnitude more efficient than simple random sampling (SRS) for similar sized CI on probabilities ranging between 0.25 and 0.75. The efficiency advantage of LHS vanishes, however, as the probability extremes of 0 and 1 are approached.
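The classical approximate CI referred to above is the normal-approximation (Wald) interval on an estimated proportion, and its coverage can be checked empirically over repeated trials. The sketch below does this for simple random sampling; the target probabilities, sample size, and trial count are illustrative assumptions and do not reproduce the study's test problem.

```python
# Normal-approximation (Wald) 95% CI on a Monte Carlo probability estimate,
# with an empirical check of its coverage over repeated trials. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def wald_ci(p_hat, n, z=1.96):
    half = z * np.sqrt(p_hat * (1.0 - p_hat) / n)
    return p_hat - half, p_hat + half

def coverage(p_true, n_samples, n_trials=100):
    hits = 0
    for _ in range(n_trials):
        p_hat = (rng.random(n_samples) < p_true).mean()   # simple random sampling
        lo, hi = wald_ci(p_hat, n_samples)
        hits += (lo <= p_true <= hi)
    return hits / n_trials

for p in (0.001, 0.01, 0.1, 0.5, 0.9):
    print(f"p = {p:5.3f}  empirical coverage of nominal 95% CI: {coverage(p, 500):.2f}")
```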
Application of finite element, global polynomial, and kriging response surfaces in Progressive Lattice Sampling designs
This paper examines the modeling accuracy of finite element interpolation, kriging, and polynomial regression used in conjunction with the Progressive Lattice Sampling (PLS) incremental design-of-experiments approach. PLS is a paradigm for sampling a deterministic hypercubic parameter space by placing and incrementally adding samples in a manner intended to maximally reduce lack of knowledge in the parameter space. When combined with suitable interpolation methods, PLS is a formulation for progressive construction of response surface approximations (RSA) in which the RSA are efficiently upgradable, and upon upgrading, offer convergence information essential in estimating error introduced by the use of RSA in the problem. The three interpolation methods tried here are examined for performance in replicating an analytic test function as measured by several different indicators. The process described here provides a framework for future studies using other interpolation schemes, test functions, and measures of approximation quality.
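The general assessment procedure, fitting surrogates to an incrementally refined design and measuring their error against an analytic test function, can be sketched as follows. The nested full-factorial levels below are only a simple stand-in for the actual Progressive Lattice Sampling designs, and the test function, surrogates, and RMS error metric are illustrative assumptions.

```python
# Surrogate accuracy on an incrementally refined design: a Kriging-type fit
# (scikit-learn GaussianProcessRegressor) versus global quadratic regression,
# measured against an analytic test function on a dense reference grid.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def test_function(x):
    return np.sin(np.pi * x[:, 0]) * np.exp(-x[:, 1])

def quad_basis(x):
    # Full quadratic basis in two variables: [1, x1, x2, x1^2, x2^2, x1*x2].
    return np.column_stack([np.ones(len(x)), x[:, 0], x[:, 1],
                            x[:, 0] ** 2, x[:, 1] ** 2, x[:, 0] * x[:, 1]])

# Dense reference grid for measuring approximation error.
g = np.linspace(0, 1, 41)
ref = np.array(np.meshgrid(g, g)).reshape(2, -1).T
ref_vals = test_function(ref)

for n_per_axis in (3, 5, 9):                       # nested grid refinements (stand-in for PLS levels)
    axis = np.linspace(0, 1, n_per_axis)
    design = np.array(np.meshgrid(axis, axis)).reshape(2, -1).T
    y = test_function(design)

    gp = GaussianProcessRegressor(normalize_y=True).fit(design, y)
    gp_err = np.sqrt(np.mean((gp.predict(ref) - ref_vals) ** 2))

    coeffs, *_ = np.linalg.lstsq(quad_basis(design), y, rcond=None)
    poly_err = np.sqrt(np.mean((quad_basis(ref) @ coeffs - ref_vals) ** 2))

    print(f"{len(design):3d} samples  RMS error: kriging {gp_err:.4f}, quadratic {poly_err:.4f}")
```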