  • Title of article
    Calibration, validation, and sensitivity analysis: What's what

  • Author/Authors
    Trucano, T.G.; Swiler, L.P.; Igusa, T.; Oberkampf, W.L.; Pilch, M.

  • Issue Information
    Journal issue, serial year 2006
  • Pages
    27
  • From page
    1331
  • To page
    1357
  • Abstract
    One very simple interpretation of calibration is to adjust a set of parameters associated with a computational science and engineering code so that the model agreement is maximized with respect to a set of experimental data. One very simple interpretation of validation is to quantify our belief in the predictive capability of a computational code through comparison with a set of experimental data. Uncertainty in both the data and the code is important and must be mathematically understood to correctly perform both calibration and validation. Sensitivity analysis, being an important methodology in uncertainty analysis, is thus important to both calibration and validation. In this paper, we intend to clarify the language just used and express some opinions on the associated issues. We will endeavor to identify some technical challenges that must be resolved for successful validation of a predictive modeling capability. One of these challenges is a formal description of a “model discrepancy” term. Another challenge revolves around the general adaptation of abstract learning theory as a formalism that potentially encompasses both calibration and validation in the face of model uncertainty.
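    The abstract's working definitions can be made concrete with a small sketch: calibration as least-squares parameter adjustment against one data set, validation as agreement with a held-out data set under a simple RMS metric, and sensitivity analysis as local derivatives of the output with respect to the calibrated parameters. The sketch below is illustrative only and is not drawn from the paper; the exponential model form, the synthetic data, and the RMS metric are all assumptions made for the example.

    import numpy as np
    from scipy.optimize import least_squares

    # Hypothetical model: y = a * exp(b * x). The parameters a, b stand in
    # for the code parameters that calibration adjusts (an assumption for
    # this sketch, not the paper's model).
    def model(theta, x):
        a, b = theta
        return a * np.exp(b * x)

    # Synthetic "experimental" data (illustrative only).
    rng = np.random.default_rng(0)
    x_cal = np.linspace(0.0, 1.0, 20)
    y_cal = model([2.0, -1.5], x_cal) + rng.normal(0.0, 0.05, x_cal.size)

    # Calibration: adjust parameters to maximize agreement with the data,
    # here by minimizing the sum of squared residuals.
    fit = least_squares(lambda th: model(th, x_cal) - y_cal, x0=[1.0, -1.0])
    theta_hat = fit.x

    # Validation: compare the calibrated model against held-out data and
    # quantify agreement with a simple RMS error metric.
    x_val = np.linspace(0.0, 1.0, 10) + 0.05
    y_val = model([2.0, -1.5], x_val) + rng.normal(0.0, 0.05, x_val.size)
    rmse = np.sqrt(np.mean((model(theta_hat, x_val) - y_val) ** 2))

    # Sensitivity analysis: finite-difference derivatives of the output
    # with respect to each calibrated parameter, at a nominal input.
    eps = 1e-6
    x0 = 0.5
    sens = [(model(theta_hat + eps * np.eye(2)[i], x0) - model(theta_hat, x0)) / eps
            for i in range(2)]

    print("calibrated parameters:", theta_hat)
    print("validation RMSE:", rmse)
    print("local sensitivities dy/dtheta:", sens)

    Note that this sketch omits the uncertainty treatment the abstract emphasizes: it has no model discrepancy term, so a more faithful formulation would include one and propagate both data and code uncertainty through the comparison.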
  • Journal title
    Reliability Engineering and System Safety
  • Serial Year
    2006
  • Record number
    1569177