Title of article :
Calibration, validation, and sensitivity analysis: What's what
Author/Authors :
T.G. Trucano, L.P. Swiler, T. Igusa, W.L. Oberkampf, Martin M. Pilch
Issue Information :
Journal issue, 2006
Abstract :
One very simple interpretation of calibration is to adjust a set of parameters associated with a computational science and engineering code so that the model agreement is maximized with respect to a set of experimental data. One very simple interpretation of validation is to quantify our belief in the predictive capability of a computational code through comparison with a set of experimental data. Uncertainty in both the data and the code is important and must be mathematically understood to correctly perform both calibration and validation. Sensitivity analysis, being an important methodology in uncertainty analysis, is thus important to both calibration and validation. In this paper, we intend to clarify the language just used and express some opinions on the associated issues. We will endeavor to identify some technical challenges that must be resolved for successful validation of a predictive modeling capability. One of these challenges is a formal description of a “model discrepancy” term. Another challenge revolves around the general adaptation of abstract learning theory as a formalism that potentially encompasses both calibration and validation in the face of model uncertainty.
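The abstract's "very simple interpretation" of calibration — adjusting parameters so that model agreement with experimental data is maximized — can be sketched minimally as a least-squares fit. The model form `y = a * x`, the parameter name `a`, and the data below are all illustrative assumptions, not taken from the paper:

```python
def calibrate(xs, ys):
    """Least-squares estimate of the parameter `a` in the model y = a*x.

    Calibration here means choosing `a` to minimize the sum of squared
    discrepancies sum((y_i - a*x_i)^2) against the experimental data;
    for this linear model the minimizer has the closed form
    a = sum(x*y) / sum(x*x).
    """
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# Synthetic "experimental" data, nominally generated near a = 2.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]
a_hat = calibrate(xs, ys)
```

Note the distinction the paper draws: the fitted `a_hat` maximizes agreement with this particular data set (calibration), which by itself says nothing about predictive capability on unseen conditions (validation).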
Journal title :
Reliability Engineering and System Safety