## Uncertainty Analysis and Other Inference Tools for Complex Computer Codes

Anthony O'Hagan, Marc C. Kennedy and Jeremy E. Oakley

*University of Nottingham*

**Publication details:**
In *Bayesian Statistics 6*, J. M. Bernardo *et al.* (eds.).
Oxford University Press, 503-524, 1999.

### Abstract

This paper builds on work by Haylock and O'Hagan which developed a Bayesian approach to
uncertainty analysis. The generic problem is to make posterior inference about the output of a
complex computer code, and the specific problem of uncertainty analysis is to make inference
when the "true" values of the input parameters are unknown. Given the distribution of the input
parameters (which is often a subjective distribution derived from expert opinion), we wish to
make inference about the implied distribution of the output. The computer code is sufficiently
complex that the time to compute the output for any input configuration is substantial. The
Bayesian approach was shown to improve dramatically on the classical approach, which is based
on drawing a sample of values of the input parameters and thereby obtaining a sample from
the output distribution. We review the basic Bayesian approach to the generic problem of
inference for complex computer codes, and present some recent advances: inference about the
distribution and quantile functions of the uncertainty distribution, calibration of models, and the
use of runs of the computer code at different levels of complexity to make efficient use of the
quicker, cruder versions of the code. The emphasis is on practical applications.
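The contrast between the two approaches described above can be illustrated with a minimal sketch. The toy `code` function, the input distribution, the design points, and the kernel settings below are all hypothetical stand-ins, not taken from the paper: the classical approach propagates many samples of the inputs through the (expensive) code itself, while the Bayesian approach fits a Gaussian process emulator to a small number of code runs and propagates the input distribution through the cheap emulator instead.

```python
import numpy as np

# Hypothetical stand-in for an expensive computer code; in practice
# each evaluation might take hours, which is what makes brute-force
# Monte Carlo impractical.
def code(x):
    return np.sin(3 * x) + x**2

rng = np.random.default_rng(0)

# Subjective input distribution (e.g. elicited from expert opinion).
x_mc = rng.normal(0.5, 0.2, size=10_000)

# --- Classical approach: Monte Carlo propagation ---
# Requires one expensive code run per sample.
mc_mean = code(x_mc).mean()

# --- Bayesian approach: Gaussian process emulation ---
# Fit a zero-mean GP with a squared-exponential kernel to a small
# design of code runs (8 runs instead of 10,000).
X = np.linspace(0.0, 1.0, 8)
y = code(X)

def k(a, b, ell=0.3, sig=1.0):
    # Squared-exponential covariance between point sets a and b.
    return sig**2 * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

K = k(X, X) + 1e-8 * np.eye(len(X))   # jitter for numerical stability
alpha = np.linalg.solve(K, y)

def emulator_mean(xs):
    # Posterior mean of the GP, used as a cheap surrogate for the code.
    return k(xs, X) @ alpha

# Propagate the same input distribution through the emulator.
gp_mean = emulator_mean(x_mc).mean()
```

On this toy problem the two estimates of the output mean agree closely, but the emulator needed only eight code runs; the paper's point is that with a proper Bayesian treatment the emulator's own uncertainty can also be quantified, rather than ignored as here.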

**Keywords:** COMPUTATIONAL EXPERIMENT; SIMULATION; GAUSSIAN PROCESS; SENSITIVITY
ANALYSIS; UNCERTAINTY DISTRIBUTION; CALIBRATION; MULTILEVEL CODES;
MODEL INADEQUACY.

Updated: 17 February 2000
Maintained by: Tony O'Hagan