Off last week to flood-bound Exeter, for a stimulating
two-day conference led by Martin Pitt at Peninsula Medical School (http://www.hsrlive.org/events/change-by-design-systems-modelling-and-simulation-in-health-care). It was designed to bring together clinicians,
managers and patients with researchers practising those strange sciences of
systems modelling and simulation. These
techniques have been under-used in health, but there was a palpable sense of
excitement over these two days that this was an approach whose time had
come.
This is not new: health planners in the 1950s were using primitive modelling methods for outpatient booking systems. But the latest techniques embrace the complexities of health and social care, its uncertainties, and the multiple interests of commissioners, a range of providers and patients. It is no longer, and perhaps never has been, a two-dimensional numerical exercise.
We heard inspiring stories of how the particular techniques of operational research had been brought to bear on tricky NHS problems. These included using queuing theory to allocate and share scarce specialist mental health assessment slots between teams; applying stochastic modelling techniques to predict ambulance response times and plan rosters; using scenario planning to allocate capacity between medical, surgical and cardiac beds on 'service lines' in paediatric intensive care; and using system dynamics to re-model the entire unscheduled and emergency care system in one locality.
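To give a flavour of the first of these, here is a minimal sketch of the queuing-theory idea in Python: an Erlang-C calculation of how likely a referral is to wait, for different sizes of a shared pool of assessment slots. The arrival rate and slot capacity are invented numbers for illustration only, not figures from the project presented.

```python
import math

# Toy sketch of sizing a shared pool of specialist assessment slots with the
# Erlang-C (M/M/c) queuing formula. All numbers below are assumed for
# illustration, not taken from the conference project.

def erlang_c(servers: int, offered_load: float) -> float:
    """Probability that an arriving referral has to wait in an M/M/c queue."""
    if offered_load >= servers:
        return 1.0  # unstable: the queue grows without bound
    summation = sum(offered_load ** k / math.factorial(k) for k in range(servers))
    top = (offered_load ** servers / math.factorial(servers)) * servers / (servers - offered_load)
    return top / (summation + top)

arrivals_per_week = 40            # assumed referrals per week across all teams
assessments_per_slot_week = 5     # assumed assessments one slot can do per week
load = arrivals_per_week / assessments_per_slot_week  # offered load, in slots

for slots in range(int(load) + 1, int(load) + 6):
    print(f"{slots} shared slots -> P(referral waits) = {erlang_c(slots, load):.2f}")
```

The general lesson such calculations make visible is that pooling slots across teams, rather than ring-fencing them, tends to need less total capacity for the same waiting-time risk.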
There was a great presentation from Paul Harper, using software animations to illustrate the dangers of planning capacity on averages. If you fail to build in variability, a given in most systems dependent on human behaviour, your estimated average wait of 30 minutes in a walk-in centre becomes two hours. Check out his YouTube presentation (http://www.profpaulharper.com/home/research/research-materials). This made me think of a brilliant book I read recently on the dangers of relying on 'common sense' by the US engineer turned sociologist Duncan Watts (http://www.amazon.com/Everything-Obvious-Common-Sense-Fails/dp/0307951790). A common-sense planner would schedule outpatient waits based on the average times from reception, to work-up with a nurse, to seeing the doctor. This would be wrong. A quote from Watts, "the whole trick is to know what variables to look at and then know how to add", could itself be an epigraph for operational research.
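To make Harper's point concrete, here is a back-of-the-envelope simulation in Python (my own toy numbers, not his model): a single walk-in stream where patients arrive on average every 11 minutes and take on average 10 minutes to see. Planned on averages alone, no queue ever forms; add realistic variability at exactly the same averages and the mean wait climbs to well over an hour.

```python
import random

# Toy single-server walk-in clinic, illustrating why planning on averages
# misleads. Numbers are invented: mean gap between arrivals 11 minutes,
# mean consultation 10 minutes.

random.seed(1)

def mean_wait(random_arrivals: bool, random_service: bool, patients: int = 200_000) -> float:
    """Average queuing wait per patient, via Lindley's recursion."""
    wait, total = 0.0, 0.0
    for _ in range(patients):
        gap = random.expovariate(1 / 11) if random_arrivals else 11.0
        service = random.expovariate(1 / 10) if random_service else 10.0
        wait = max(0.0, wait + service - gap)  # wait of the next patient
        total += wait
    return total / patients

print(f"Averages only (no variability): {mean_wait(False, False):.1f} min mean wait")
print(f"Variable arrivals and service:  {mean_wait(True, True):.1f} min mean wait")
```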
One of the best parts of the two-day event was a sandpit exercise in which small groups of service leaders and operational researchers quickly worked up bids for new projects. These were pitched to the room, Dragons' Den style. The outputs were impressive, ranging from using location analysis to site diagnostic services across one region to modelling how best to implement NICE guidelines for DVT care.
I ended the day talking to a paediatrician who had stumbled
on the event, with no prior knowledge of systems modelling, and was inspired to
get analytic help when making a business case for a new specialist epilepsy
nurse and pathway redesign. There is a tension, though, between the very applied, local, problem-driven analytics and a more lasting body of knowledge. Sally Brailsford (mathematician turned nurse turned health modelling academic) had pointed to the paradox: we have a huge body of evidence, but few generalisable outputs. She had identified 1,008 individual papers on re-modelling emergency department flows. Were all these necessary? How can we learn from the best? As well as the embedded local analysts within a health organisation focused on particular problems, we need high-quality research studies to generate national learning, by testing and validating models and carrying out robust evaluations of impact.
And so a long return from Exeter, with rather trying
transport arrangements given the flood damage.
During discussion, some had raised the old argument that healthcare was just too complex to lend itself to mathematical techniques. The same, of course, used to be said of weather forecasting, where forecasts more than three days ahead were notoriously inaccurate. But today's weather modelling techniques, using historic data from multiple sensors and an understanding of the interplay of solar activity, land masses, water temperatures and wind flow, are much better. Applied to health, techniques such as system dynamics can build in uncertainties (such as patient preferences) and variability (patient and clinician behaviours), with a more sophisticated understanding of interactions (through network analysis and other methods), to predict more accurately how services might be used and where savings could be made. Scenario planning can also present various 'what-ifs' to integrate strategic uncertainties, a given in the NHS, into the planning process. Numbers themselves are not enough. But at a time of ever-tighter financial pressures, can we afford to ignore the weathermen?