What is the beating heart of a healthy evidence ecosystem? Professor Jonathan Shepherd finds that systems work best when practitioners' vocation to improve services coalesces with the capacity to test solutions in rigorous experiments. Separating these elements makes it far more difficult to resolve real service problems and uncertainties.
The five new ‘What Works Centres’ fill big gaps in evidence systems and, in doing so, are helping to improve education, economic growth, crime reduction and, fundamentally, quality of life from the earliest years to old age.
Evidence generation, synthesis, translation into guidelines and policy, and implementation are all parts of one process. None of these functions, nor the structures which provide them, should be thought of in isolation; they need to be connected.
This is the basis of my report for the What Works Network, part-funded by the Cabinet Office: ‘How to achieve more effective services: the evidence ecosystem’.
This report is based on my research to identify the elements of the evidence supply chain and then to identify gaps and barriers to evidence production, flow and implementation.
Curiosity, skills and values – the engine of evidence systems
A common but striking theme emerged. The engine which powers these systems can be found where the curiosity of people equipped with the necessary evaluation skills is combined with public service values. Put another way, useful evidence emerges when practitioners and policy makers work alongside researchers with well-honed experimental skills.
Damaging acts: separating evaluators and practitioners
This fusion of practitioner vocation and applied science helps ensure that real problems and uncertainties are identified and potential solutions are properly tested. By the same token, there are likely to be few acts more damaging to problem solving, innovation and the whole evidence system than when evaluators and practitioners are separated – as is the case now in public services such as education and nursing.
Examples of this integration, all of which have obvious benefits, were apparent across sectors.
Integrated services and research – the rise of RCTs in medicine
Milestone randomised trials of clinical interventions such as bypass surgery and treatments for AIDS were led by clinical scientists motivated by concern for their own patients. Reflecting the scaling up of this model in medicine, the feast of randomised trials in healthcare, compared with the comparative famine in other public services, can be explained by the integrated service and research functions of university medical schools, all embedded in hospitals and led by practitioners who are also trained researchers.
Examples of alignment in the social policy sphere
The Behavioural Insights Team’s (BIT) objective – “to use behavioural science to encourage people to make better choices for themselves and society” – exemplifies this blend of public service values and controlled experiment. Working in Her Majesty’s Revenue and Customs, for example, BIT identified an important problem (low tax repayment rates), tested potential solutions in a randomised trial, and delivered one which markedly improved repayment rates.
In the ageing better sector, an academic with a leading role in the My Home Life programme, motivated by his regular contact with the “horrific lives older people often lead”, acts as a knowledge broker for the care home sector. He distils evidence and conveys it through facilitators to 500 care home managers.
This perspective highlights an important difference between the oil industry analogy in my report and the evidence ecosystem. In the former, the oil wells are distant from the motorists who are the end users, whereas in the evidence ecosystem, the evidence wells are also the places where evidence needs to be implemented. Analogies can be taken too far, of course, but another comes to mind: the Harveian principle that, in a circulatory system, pumps are of central importance.
In something as complex as an evidence ecosystem, it would be wrong to single out one component as more important than all the rest. But if this analysis is right, we should be doing all we can to nurture service values and experimental skills and make sure they are exercised in the same place.
Views expressed are the author’s own and do not necessarily represent those of the Alliance for Useful Evidence. Join us (it’s free and open to all) and find out more about how we champion the use of evidence in social policy and practice.