Publications

Results 1–25 of 34
A geometric approach for computing tolerance bounds for elastic functional data

Journal of Applied Statistics

Tucker, James D.; Lewis, John R.; King, Caleb; Kurtek, Sebastian

We develop a method for constructing tolerance bounds for functional data with random warping variability. In particular, we define a generative, probabilistic model for the amplitude and phase components of such observations, which parsimoniously characterizes variability in the baseline data. Based on the proposed model, we define two different types of tolerance bounds that are able to measure both types of variability, and as a result, identify when the data has gone beyond the bounds of amplitude and/or phase. The first functional tolerance bounds are computed via a bootstrap procedure on the geometric space of amplitude and phase functions. The second functional tolerance bounds utilize functional Principal Component Analysis to construct a tolerance factor. This work is motivated by two main applications: process control and disease monitoring. The problem of statistical analysis and modeling of functional data in process control is important in determining when a production has moved beyond a baseline. Similarly, in biomedical applications, doctors use long, approximately periodic signals (such as the electrocardiogram) to diagnose and monitor diseases. In this context, it is desirable to identify abnormalities in these signals. We additionally consider a simulated example to assess our approach and compare it to two existing methods.
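As a rough illustration of the bootstrap-based tolerance bounds described in the abstract, here is a minimal sketch. It is a deliberate simplification: the paper bootstraps on the geometric space of amplitude and phase components, whereas this hypothetical helper treats the curves as already aligned and builds a pointwise bootstrap band.

```python
import numpy as np

def bootstrap_tolerance_band(curves, coverage=0.95, n_boot=200, seed=0):
    """Pointwise bootstrap tolerance band for functional data
    (rows = curves, columns = evaluation points).

    Simplified sketch: ignores phase/warping variability, which the
    paper handles via its amplitude-phase decomposition.
    """
    rng = np.random.default_rng(seed)
    n, m = curves.shape
    alpha = 1.0 - coverage
    lo = np.empty((n_boot, m))
    hi = np.empty((n_boot, m))
    for b in range(n_boot):
        # Resample curves with replacement, then take pointwise quantiles.
        sample = curves[rng.integers(0, n, size=n)]
        lo[b] = np.quantile(sample, alpha / 2, axis=0)
        hi[b] = np.quantile(sample, 1 - alpha / 2, axis=0)
    # Aggregate bootstrap replicates into a single lower/upper band.
    return lo.mean(axis=0), hi.mean(axis=0)
```

A new observation falling outside the band at many evaluation points would be flagged as having moved beyond the baseline, e.g. in the process-control setting the abstract describes.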

More Details

Elastic functional principal component regression

Statistical Analysis and Data Mining

Tucker, James D.; Lewis, John R.; Srivastava, Anuj

We study regression using functional predictors in situations where these functions contain both phase and amplitude variability. In other words, the functions are misaligned due to errors in time measurements, and these errors can significantly degrade both model estimation and prediction performance. Current techniques either ignore the phase variability or handle it via preprocessing, that is, by using an off-the-shelf technique for functional alignment and phase removal. We develop a functional principal component regression model that takes a comprehensive approach to handling phase and amplitude variability. The model utilizes a mathematical representation of the data known as the square-root slope function. These functions preserve the L2 norm under warping and are ideally suited for simultaneous estimation of regression and warping parameters. Using both simulated and real-world data sets, we demonstrate our approach and evaluate its prediction performance relative to current models. In addition, we propose an extension to functional logistic and multinomial logistic regression.
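The square-root slope function (SRSF) mentioned in the abstract is the transform q(t) = sign(f′(t))·√|f′(t)|; warping f by γ maps q to (q∘γ)·√γ′, which leaves the L2 norm of q unchanged. A minimal numerical sketch (the function name `srsf` is ours, not from the paper):

```python
import numpy as np

def srsf(f, t):
    """Square-root slope function of a curve f sampled at times t:
    q(t) = sign(f'(t)) * sqrt(|f'(t)|).
    """
    fdot = np.gradient(f, t)  # numerical derivative on the grid t
    return np.sign(fdot) * np.sqrt(np.abs(fdot))
```

The L2-norm invariance can be checked numerically: the SRSF of a warped copy of f has (up to discretization error) the same L2 norm as the SRSF of f itself, which is what makes the representation convenient for joint estimation of regression and warping parameters.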

More Details

Simple effective conservative treatment of uncertainty from sparse samples of random functions

ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems. Part B. Mechanical Engineering

Romero, Vicente J.; Schroeder, Benjamin B.; Dempsey, James F.; Lewis, John R.; Breivik, Nicole L.; Orient, George E.; Antoun, Bonnie R.; Winokur, Justin W.; Glickman, Matthew R.; Red-Horse, John R.

This paper examines the variability of predicted responses when multiple stress-strain curves (reflecting variability from replicate material tests) are propagated through a finite element model of a ductile steel can being slowly crushed. Over 140 response quantities of interest (including displacements, stresses, strains, and calculated measures of material damage) are tracked in the simulations. Each response quantity’s behavior varies according to the particular stress-strain curves used for the materials in the model. We desire to estimate response variability when only a few stress-strain curve samples are available from material testing. Propagation of just a few samples will usually result in significantly underestimated response uncertainty relative to propagation of a much larger population that adequately samples the presiding random-function source. A simple classical statistical method, Tolerance Intervals, is tested for effectively treating sparse stress-strain curve data. The method is found to perform well on the highly nonlinear input-to-output response mappings and non-standard response distributions in the can-crush problem. The results and discussion in this paper support a proposition that the method will apply similarly well for other sparsely sampled random variable or function data, whether from experiments or models. Finally, the simple Tolerance Interval method is also demonstrated to be very economical.
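To make the classical Tolerance Interval method concrete, here is a minimal sketch for the scalar normal case using Howe's well-known approximation to the two-sided tolerance factor k, so that x̄ ± k·s covers a stated fraction of the population with stated confidence. The helper name is ours, and this is the textbook method, not a reconstruction of the paper's full treatment of sparse stress-strain curves.

```python
import numpy as np
from scipy import stats

def tolerance_factor(n, coverage=0.95, confidence=0.95):
    """Approximate two-sided normal tolerance factor (Howe's method).

    With n samples, the interval mean +/- k * std covers at least
    `coverage` of the population with probability `confidence`.
    """
    nu = n - 1
    z = stats.norm.ppf((1 + coverage) / 2)
    # Lower (1 - confidence) quantile of chi-square with nu d.o.f.
    chi2 = stats.chi2.ppf(1 - confidence, nu)
    return z * np.sqrt(nu * (1 + 1 / n) / chi2)
```

The factor grows sharply for small n, which is how the method remains conservative when only a few replicate samples are propagated, and it shrinks toward the plain normal quantile as n grows.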

More Details

Utilizing Distributional Measurements of Material Characteristics from SEM Images for Inverse Prediction

Ries, Daniel R.; Lewis, John R.; Zhang, Adah S.; Anderson-Cook, Christine A.; Wilkerson, Marianne W.; Wagner, Gregory L.; Gravelle, Julie G.; Dorhout, Jacquelyn D.

Abstract not provided.

Selecting an informative/discriminating multivariate response for inverse prediction

Journal of Quality Technology

Thomas, Edward V.; Lewis, John R.; Anderson-Cook, Christine M.; Burr, Tom; Hamada, Michael S.

Inverse prediction is important in a variety of scientific and engineering applications, such as predicting the properties/characteristics of an object from multiple measurements obtained from it. Inverse prediction can be accomplished by inverting parameterized forward models that relate the measurements (responses) to the properties/characteristics of interest. Sometimes forward models are computational/science based; often, however, they are empirically based response surface models obtained by using the results of controlled experimentation. For empirical models, it is important that the experiments provide a sound basis for developing accurate forward models in terms of the properties/characteristics (factors). While nature dictates the causal relationships between factors and responses, experimenters can control the complexity, accuracy, and precision of forward models constructed via selection of factors, factor levels, and the set of trials that are performed. Recognition of the uncertainty in the estimated forward models leads to an errors-in-variables approach for inverse prediction. The forward models (estimated by experiments or science based) can also be used to analyze how well candidate responses complement one another for inverse prediction over the range of the factor space of interest. One may find that some responses are complementary, redundant, or noninformative. Simple analyses and examples illustrate how an informative and discriminating subset of responses can be selected among candidates in cases where the number of responses that can be acquired during inverse prediction is limited by difficulty, expense, and/or availability of material.
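The forward-then-invert workflow in the abstract can be sketched in its simplest form: fit a linear forward model per response from designed-experiment data, then recover the factor value that best reproduces an observed multivariate response. This is a bare-bones illustration with hypothetical helper names; it omits the errors-in-variables weighting the abstract calls for.

```python
import numpy as np

def fit_forward(x, Y):
    """Fit linear forward models y_j = b0_j + b1_j * x for each response
    column of Y via least squares (a stand-in for empirically based
    response-surface models from a designed experiment)."""
    A = np.column_stack([np.ones_like(x), x])
    coef, *_ = np.linalg.lstsq(A, Y, rcond=None)
    return coef  # row 0: intercepts, row 1: slopes

def inverse_predict(coef, y_obs):
    """Invert the fitted forward models: the x minimizing
    sum_j (y_obs[j] - b0_j - b1_j * x)^2 has a closed form for
    linear models.  A full treatment would weight residuals by
    measurement and model uncertainty (errors-in-variables)."""
    b0, b1 = coef
    return np.dot(b1, y_obs - b0) / np.dot(b1, b1)
```

The same machinery also hints at response selection: a candidate response with near-zero slope contributes almost nothing to the inversion (noninformative), while responses with very different slopes complement one another.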

More Details

xLPR Scenario Analysis Report

Eckert, Aubrey C.; Lewis, John R.; Brooks, Dusty M.; Martin, Nevin S.; Hund, Lauren H.; Clark, Andrew; Mariner, Paul M.

This report describes the methods, results, and conclusions of the analysis of 11 scenarios defined to exercise various options available in the xLPR (Extremely Low Probability of Rupture) Version 2.0 code. The scope of the scenario analysis is three-fold: (i) exercise the various options and components of xLPR v2.0 that define each scenario; (ii) develop and exercise methods for analyzing and interpreting xLPR v2.0 outputs; and (iii) exercise the various sampling options available in xLPR v2.0. The simulation workflow template developed during the course of this effort helps to form a basis for applying the xLPR code to problems with similar inputs and probabilistic requirements, and addresses in a systematic manner the three points covered by the scope.

More Details