Publications / Conference

We underestimate uncertainties in our predictions

Pilch, Martin P.

Prediction is defined in the American Heritage Dictionary as follows: 'To state, tell about, or make known in advance, especially on the basis of special knowledge.' What special knowledge do we demand of modeling and simulation before asserting that we have a predictive capability for high-consequence applications? The 'special knowledge' question can be answered along two dimensions: the process and rigor with which modeling and simulation is executed, and the assessment results for the specific application. Here we focus on the process-and-rigor dimension and address predictive capability in terms of six attributes: (1) geometric and representational fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) validation, and (6) uncertainty quantification. This presentation will demonstrate, through mini-tutorials, simple examples, and numerous case studies, how each attribute creates opportunities for errors, biases, or uncertainties to enter simulation results. The demonstrations motivate a set of practices that minimize the risk of using modeling and simulation for high-consequence applications while identifying important research directions. It is recognized that cultural, technical, infrastructure, and resource barriers prevent analysts from performing all analyses at the highest levels of rigor. Consequently, the audience for this talk is (1) analysts, so they know what is expected of them; (2) decision makers, so they know what to expect from modeling and simulation; and (3) the R&D community, so it can address the technical and infrastructure issues that prevent analysts from executing analyses in a practical, timely, and quality manner.
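The abstract names solution verification (attribute 4) only at a high level. As an illustration not drawn from the publication itself, the sketch below shows one widely used solution-verification technique, Richardson extrapolation, which estimates the observed order of convergence and the remaining discretization error from a quantity of interest computed on three systematically refined grids. The grid spacings and solution values here are hypothetical placeholders.

```python
import math

# Hypothetical results for a scalar quantity of interest on three
# systematically refined grids (refinement ratio r = 2). These numbers
# are illustrative only, not from the publication.
h = [0.4, 0.2, 0.1]           # grid spacings, coarse -> fine
f = [0.9713, 0.9921, 0.9973]  # corresponding simulation results

r = h[0] / h[1]  # constant refinement ratio between grid levels

# Observed order of convergence p from three grid levels:
#   p = ln((f_coarse - f_medium) / (f_medium - f_fine)) / ln(r)
p = math.log((f[0] - f[1]) / (f[1] - f[2])) / math.log(r)

# Richardson-extrapolated estimate of the grid-converged value and the
# discretization error remaining on the finest grid.
f_exact = f[2] + (f[2] - f[1]) / (r**p - 1.0)
error_fine = abs(f_exact - f[2])

print(f"observed order of convergence: {p:.2f}")
print(f"extrapolated solution:         {f_exact:.5f}")
print(f"estimated error on fine grid:  {error_fine:.2e}")
```

In practice this error estimate is itself uncertain (a grid convergence index would multiply it by a factor of safety), which is consistent with the talk's thesis that each attribute is an opportunity for unacknowledged error or uncertainty to enter simulation results.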