Jeremy E. Oakley and Anthony O'Hagan
Department of Probability and Statistics, University of Sheffield, Sheffield, England
Publication details: Journal of the Royal Statistical Society, Series B 66, 751-769, 2004.
In many areas of science and technology, mathematical models are built to simulate complex real-world phenomena. Such models are typically implemented in large computer programs and are often so complex that the way the model responds to changes in its inputs is not transparent. Sensitivity analysis is concerned with understanding how changes in the model inputs influence the outputs. This may be motivated simply by a wish to understand the implications of a complex model, but it often arises because there is uncertainty about the true values of the inputs that should be used for a particular application.
A broad range of measures has been advocated in the literature to quantify and describe the sensitivity of a model's output to variation in its inputs. In practice, the most useful measures are those based on representing uncertainty in the model inputs by a joint probability distribution and then analysing the induced uncertainty in the outputs, an approach known as probabilistic sensitivity analysis. We present a Bayesian framework which unifies the various tools of probabilistic sensitivity analysis.
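As a concrete illustration of the Monte Carlo approach to probabilistic sensitivity analysis (not taken from the paper itself), the Python sketch below estimates the variance-based main-effect index S_i = var{E(Y | X_i)} / var(Y) for a hypothetical three-input toy simulator with independent uniform inputs, using a naive double-loop estimator. The toy model and input distributions are purely illustrative; the point to note is the very large number of model runs the estimator consumes.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # Hypothetical toy simulator standing in for an expensive computer model.
    return (np.sin(x[..., 0])
            + 7.0 * np.sin(x[..., 1]) ** 2
            + 0.1 * x[..., 2] ** 4 * np.sin(x[..., 0]))

def sample_inputs(n):
    # Input uncertainty: independent uniform distributions on (-pi, pi).
    return rng.uniform(-np.pi, np.pi, size=(n, 3))

# Total output variance var(Y) by plain Monte Carlo.
var_y = model(sample_inputs(100_000)).var()

def main_effect_index(i, n_outer=500, n_inner=500):
    # Brute-force double loop: for each sampled value of X_i, estimate
    # E[Y | X_i] by averaging over the remaining inputs, then take the
    # variance of those conditional means.  Costs n_outer * n_inner runs.
    cond_means = np.empty(n_outer)
    for k in range(n_outer):
        x = sample_inputs(n_inner)
        x[:, i] = rng.uniform(-np.pi, np.pi)   # fix X_i at one sampled value
        cond_means[k] = model(x).mean()        # Monte Carlo estimate of E[Y | X_i]
    return cond_means.var() / var_y

for i in range(3):
    print(f"S_{i + 1} ≈ {main_effect_index(i):.3f}")
```

Even for this trivial toy function, the double loop uses hundreds of thousands of model evaluations per input, which is what makes the approach impractical when each run of the real simulator is expensive.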
The Bayesian approach is computationally highly efficient. It allows effective sensitivity analysis to be achieved with far fewer model runs than standard Monte Carlo methods require. Furthermore, all measures of interest may be computed from a single set of runs.
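The sketch below illustrates the emulator idea behind this efficiency claim, again under hypothetical choices: a scikit-learn Gaussian process stands in for the paper's fully Bayesian treatment, the toy simulator and uniform inputs are the same as in the previous sketch, and the sensitivity measures are approximated by Monte Carlo on the emulator's posterior mean rather than by the analytic calculations the authors develop. The simulator is run only a small number of times (60 here); everything else is computed from that single set of runs.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(1)

def model(x):
    # Same hypothetical toy simulator as in the previous sketch.
    return (np.sin(x[..., 0])
            + 7.0 * np.sin(x[..., 1]) ** 2
            + 0.1 * x[..., 2] ** 4 * np.sin(x[..., 0]))

# A small design: only 60 runs of the (notionally expensive) simulator.
x_train = rng.uniform(-np.pi, np.pi, size=(60, 3))
y_train = model(x_train)

# Fit a Gaussian-process emulator to those runs.
gp = GaussianProcessRegressor(
    kernel=ConstantKernel() * RBF(length_scale=[1.0, 1.0, 1.0]),
    normalize_y=True,
)
gp.fit(x_train, y_train)

# Total output variance under the input distribution, from the emulator.
x_big = rng.uniform(-np.pi, np.pi, size=(50_000, 3))
var_y_hat = gp.predict(x_big).var()

def main_effect_index(i, n_outer=300, n_inner=300):
    # Same double-loop estimator as before, but every evaluation now uses
    # the cheap emulator: no further simulator runs beyond the original 60.
    cond_means = np.empty(n_outer)
    for k in range(n_outer):
        x = rng.uniform(-np.pi, np.pi, size=(n_inner, 3))
        x[:, i] = rng.uniform(-np.pi, np.pi)   # fix X_i at one sampled value
        cond_means[k] = gp.predict(x).mean()   # E[Y | X_i] under the emulator
    return cond_means.var() / var_y_hat

for i in range(3):
    print(f"S_{i + 1} (emulator) ≈ {main_effect_index(i):.3f}")
```

Because the expensive simulator appears only in building the training set, all the sensitivity measures of interest can be read off from the fitted emulator, which is the practical force of the efficiency claim above.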