Assessment of a UQ Approach for Handling Sparse Samples of Discrete Random Functions (Material Stress-Strain Curves)
Abstract not provided.
Abstract not provided.
Abstract not provided.
ASME International Mechanical Engineering Congress and Exposition, Proceedings (IMECE)
This work examines the variability of predicted responses when multiple stress-strain curves (reflecting variability from replicate material tests) are propagated through a transient dynamics finite element model of a ductile steel can being slowly crushed. An elastic-plastic constitutive model is employed in the large-deformation simulations. The present work assigns the same material to all of the can parts: lids, walls, and weld. Time histories of 18 response quantities of interest (including displacements, stresses, strains, and calculated measures of material damage) at several locations on the can and at various points in time are monitored in the simulations. Each response quantity's behavior varies according to the particular stress-strain curves used for the materials in the model. We estimate response variability due to variability of the input material curves. When only a few stress-strain curves are available from material testing, response variance will usually be significantly underestimated, which is undesirable for many engineering purposes. This paper describes the can-crush model and simulations used to evaluate a simple classical statistical method, Tolerance Intervals (TIs), for effectively compensating for sparse stress-strain curve data in the can-crush problem. Using the simulation results presented here, the accuracy and reliability of the TI method are evaluated on the highly nonlinear input-to-output response mappings and non-standard response distributions in the can-crush UQ problem.
Abstract not provided.
This SAND report summarizes our work on the Sandia National Laboratories LDRD project titled "Efficient Probability of Failure Calculations for QMU using Computational Geometry" (project #165617, proposal #13-0144). Readers interested in the technical details are encouraged to consult the full published results and to contact the report authors for the status of the software and follow-on projects.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
SIAM Journal on Uncertainty Quantification
Abstract not provided.
Abstract not provided.
This report demonstrates versatile and practical model validation and uncertainty quantification techniques applied to the accuracy assessment of a computational model of heated steel pipes pressurized to failure. The Real Space validation methodology segregates aleatory and epistemic uncertainties to form straightforward model validation metrics especially suited for assessing models to be used in the analysis of performance and safety margins. The methodology handles difficulties associated with representing and propagating interval and/or probabilistic uncertainties from multiple correlated and uncorrelated sources in the experiments and simulations including: material variability characterized by non-parametric random functions (discrete temperature dependent stress-strain curves); very limited (sparse) experimental data at the coupon testing level for material characterization and at the pipe-test validation level; boundary condition reconstruction uncertainties from spatially sparse sensor data; normalization of pipe experimental responses for measured input-condition differences among tests and for random and systematic uncertainties in measurement/processing/inference of experimental inputs and outputs; numerical solution uncertainty from model discretization and solver effects.
Conference Proceedings of the Society for Experimental Mechanics Series
A unique quasi-static, temperature-dependent, low-strain-rate finite element constitutive failure model has been developed at Sandia National Laboratories (Dempsey JF, Antoun B, Wellman G, Romero V, Scherzinger W (2010) Coupled thermal pressurization failure simulations with validation experiments. Presentation at ASME 2010 international mechanical engineering congress & exposition, Vancouver, 12-18 Nov 2010) and is being used to predict failure initiation of pressurized components at high temperature. To assess the accuracy of this constitutive model, validation experiments are performed on a cylindrical stainless steel pipe heated and pressurized to failure. This "pipe bomb" is instrumented with thermocouples and a pressure sensor whereby temperatures and pressure are recorded with time until failure occurs. The pressure and thermocouple temperatures are then mapped to a finite element model of the pipe bomb. The impacts of mesh refinement and temperature mapping on failure pressure prediction, in support of the model validation assessment, are discussed. © The Society for Experimental Mechanics Inc. 2014.
This report discusses the treatment of uncertainties stemming from relatively few samples of random quantities. The importance of this topic extends beyond experimental data uncertainty to situations involving uncertainty in model calibration, validation, and prediction. With very sparse data samples it is not practical to have a goal of accurately estimating the underlying probability density function (PDF). Rather, a pragmatic goal is that the uncertainty representation should be conservative so as to bound a specified percentile range of the actual PDF, say the range between the 0.025 and 0.975 quantiles, with reasonable reliability. A second, opposing objective is that the representation not be overly conservative; that it minimally over-estimate the desired percentile range of the actual PDF. The presence of the two opposing objectives makes the sparse-data uncertainty representation problem interesting and difficult. In this report, five uncertainty representation techniques are characterized for their performance on twenty-one test problems (over thousands of trials for each problem) according to these two opposing objectives and other performance measures. Two of the methods, statistical Tolerance Intervals and a kernel density approach specifically developed for handling sparse data, exhibit significantly better overall performance than the others.
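The tolerance-interval idea referenced in this abstract and in the can-crush work above can be illustrated with a minimal, stdlib-only sketch. This is not the report's implementation: it shows only the classical two-sided normal tolerance interval via Howe's k-factor approximation (with a Wilson-Hilferty approximation for the chi-square quantile), whereas the report evaluates TIs and other representations far more broadly. Function names and the sample data are illustrative.

```python
import math
from statistics import NormalDist, mean, stdev

def chi2_quantile(p, df):
    # Wilson-Hilferty approximation to the chi-square quantile function.
    z = NormalDist().inv_cdf(p)
    a = 2.0 / (9.0 * df)
    return df * (1.0 - a + z * math.sqrt(a)) ** 3

def normal_tolerance_interval(sample, coverage=0.95, confidence=0.95):
    """Two-sided normal tolerance interval (Howe's k-factor approximation).

    Returns bounds intended to cover `coverage` of the population with
    probability `confidence`, given only the sparse sample at hand.
    """
    n = len(sample)
    xbar, s = mean(sample), stdev(sample)
    z = NormalDist().inv_cdf((1.0 + coverage) / 2.0)
    chi2 = chi2_quantile(1.0 - confidence, n - 1)
    k = z * math.sqrt((n - 1) * (1.0 + 1.0 / n) / chi2)
    return xbar - k * s, xbar + k * s

# Illustrative sparse sample: with n = 5 the 95/95 k-factor is roughly 5,
# so the TI is much wider than a naive +/-1.96*sigma band -- this widening
# is what compensates for the underestimated variance noted in the abstract.
lo, hi = normal_tolerance_interval([10.1, 9.8, 10.4, 10.0, 9.7])
```

The key design point is the trade-off the abstract describes: the k-factor grows as the sample shrinks, making the interval conservative enough to bound the target quantile range with stated reliability, while converging toward the plain normal quantile band as data accumulate.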
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Proposed for publication in Springer book - 304742_Antoun/.
Abstract not provided.