Tony O'Hagan - Consulting

Consultancy in health economics

My expertise

Health economics is a relatively young discipline, and I was fortunate to be involved in its early years in setting the standard for statistical analyses. I have been perhaps the most active and influential developer of Bayesian methods in health economics. I have pioneered statistical methods for assessing cost-effectiveness from trial data (where resource information has been gathered) and for probabilistic sensitivity analysis based on economic modelling. I have also developed new methods for analysing health-related quality of life data, to derive better utility functions.

A course of five lectures by me on Bayesian methods in health economics is available from Henry Stewart Talks. I can also present these lectures, with additional material as appropriate, for in-house training.

My clients

I have advised both the pharmaceutical industry (AstraZeneca, Johnson & Johnson, Merck, Novartis, Pfizer) and the UK cost-effectiveness regulator, NICE.

Typical challenges

In constructing submissions on cost-effectiveness, particularly for NICE but also in some other jurisdictions, companies need to conduct sensitivity analysis. NICE, in particular, demands probabilistic sensitivity analysis, which means careful quantification of uncertainty about all the inputs to the economic model. Simplistic approaches to this, and convenient assumptions made to support the company's case, are easily seen through and can seriously damage the submission. In particular, proper assessment of uncertainty in inputs derived from registry data, meta-analysis, evidence synthesis or expert judgement is not straightforward. Correlations between inputs can be particularly hard to identify, and yet can substantially affect measures such as the cost-effectiveness acceptability curve. I can offer state-of-the-art quantification of uncertainty that will stand up to the closest scrutiny.
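To illustrate the basic mechanics of probabilistic sensitivity analysis, here is a minimal sketch in Python. All numbers are hypothetical: incremental QALY gain and incremental cost are drawn jointly from a multivariate normal, so that their correlation (often the hard part to quantify) is propagated into the cost-effectiveness acceptability curve.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical inputs: incremental QALY gain and incremental cost,
# drawn jointly so their correlation is carried through the model.
mean = np.array([0.30, 4000.0])          # mean QALY gain, mean extra cost
cov = np.array([[0.01,   50.0],          # sd 0.1 QALYs, sd 1000 in cost,
                [50.0, 1.0e6]])          # correlation 0.5 (assumed)
draws = rng.multivariate_normal(mean, cov, size=n)
d_qaly, d_cost = draws[:, 0], draws[:, 1]

# CEAC: probability that net monetary benefit is positive at each
# willingness-to-pay threshold lambda (per QALY gained).
thresholds = np.arange(0, 50_001, 1000)
ceac = [(lam * d_qaly - d_cost > 0).mean() for lam in thresholds]

for lam, p in zip(thresholds[::10], ceac[::10]):
    print(f"lambda = {lam:>6}: P(cost-effective) = {p:.3f}")
```

Changing the assumed correlation (the off-diagonal entries of `cov`) visibly shifts the resulting curve, which is the point made above: the correlation structure matters, not just the marginal uncertainty of each input.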

There are several tricky areas associated with patient-level simulation models (also known as micro-simulation models). These models are essentially stochastic, and there are many pitfalls in constructing the right stochastic structure. They can also take significant amounts of computing time to run, making conventional Monte Carlo approaches to probabilistic sensitivity analysis (using, for instance, TreeAge or Crystal Ball) impractical. My expertise in probabilistic modelling is particularly suited to the first of these problems. I have also developed very powerful and efficient methods for probabilistic sensitivity analysis (not based on Monte Carlo simulation) to address the second. I can offer leading-edge skills in building and using this demanding class of models.
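The specific non-Monte-Carlo methods are not described here; one widely used approach to the problem of slow models is Gaussian process emulation. The sketch below, with a cheap toy function standing in for an expensive patient-level simulation, shows the idea: fit an emulator to a small number of model runs, then do the probabilistic sensitivity analysis on the fast emulator instead of the slow model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for an expensive patient-level simulation (one input only;
# the input-output relation is entirely hypothetical).
def expensive_model(x):
    return np.sin(3 * x) + 0.5 * x

# Step 1: a small design of carefully chosen model runs.
x_design = np.linspace(0.0, 2.0, 8)
y_design = expensive_model(x_design)

# Step 2: fit a Gaussian-process emulator (squared-exponential kernel).
def kernel(a, b, length=0.5, var=1.0):
    return var * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)

K = kernel(x_design, x_design) + 1e-8 * np.eye(len(x_design))
alpha = np.linalg.solve(K, y_design)

def emulator_mean(x_new):
    return kernel(x_new, x_design) @ alpha

# Step 3: probabilistic sensitivity analysis on the cheap emulator:
# draw inputs from their uncertainty distribution and propagate them.
x_samples = rng.normal(1.0, 0.3, size=10_000).clip(0.0, 2.0)
y_samples = emulator_mean(x_samples)
print(f"mean output {y_samples.mean():.3f}, sd {y_samples.std():.3f}")
```

Here ten thousand emulator evaluations replace ten thousand runs of the expensive model; only the eight design runs require the model itself.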

In order to demonstrate cost-effectiveness in many therapeutic areas, the conventional generic utility instruments such as EQ-5D and SF-6D can be insufficiently sensitive to the benefits provided by a new treatment. There is therefore much interest in instruments tailored to particular therapeutic areas. Obtaining reliable mapping of such health state descriptive systems to utility involves interviewing samples of people, which is expensive and requires specialist expertise. I have developed powerful Bayesian methods that potentially allow smaller samples and more faithfully estimate the underlying preferences. This can not only be more efficient but can also lead to better discrimination of treatment efficacy on the QALY scale.

I am happy to consult on these or other problems in health economics.


Updated: 18 January 2017
Maintained by: Tony O'Hagan