Joining up the dots…making evidence work for users and commissioners


In this guest blog, Stephen Bediako, Founder and Managing Director of The Social Innovation Partnership, attempts to make sense of the sudden splurge of evidence initiatives in the UK. 

It’s an evidence fest. A sudden splurge of evidence-based agendas has spread across the UK. There are so many initiatives: Inspiring Impact, Realising Ambition, Project Oracle, the Early Intervention Foundation, the Alliance for Useful Evidence, Nesta’s Standards of Evidence, and more. Don’t forget organisations like New Philanthropy Capital, which has been delivering advice and training in evidence for charities and foundations for over ten years. Then there are the new What Works Centres that will support the use of evidence in government from 2013, and so the list goes on. However, with all of this momentum, those of us at the heart of this agenda are in danger of saturating the field and putting off the very users we wish to attract.

Do not misunderstand these comments: this is all great work which has helped to raise the importance of evidence in the mind of delivery agents, and it is crucial if the delivery of public services is going to become more effective and efficient.

Yet, as Assistant Director of Project Oracle, I have already heard murmurs from our users: ‘There are so many different standards’, ‘Which kite mark is the best or right one for us?’, ‘Which initiative should we follow?’ and, the one which hit home most for me, ‘We don’t even understand this stuff, and yet there is so much going on’. I know, I know, anecdotal conversations are not exactly reliable data, but common sense says we should explore whether there is an issue and, if so, how we can nip it in the bud.

In a nutshell, it feels like there is a need to create a shop window for all of these initiatives. This shop window could be an online platform that is simple and highly accessible, with minimal complication. So what would this platform do? Three things come to mind:

1. Aggregation. All initiatives that meet agreed criteria could be aggregated in one place. The goal wouldn’t be to get into the detail of each initiative or to interfere with how they rank, conduct, or deliver their work; it would simply be a case of presenting the initiatives in an ordered and accessible way.

2. Production encouragement. The website could encourage the production of evidence, again not professing support for one type over another but simply encouraging organisations to generate it and directing them to the relevant initiative to help them do so.

3. Use. The website could also encourage the use of evidence, again not passing judgment on any initiative but simply shining a light on those that have made it onto the site.

To make this happen, it would be crucial to determine the scope of the exercise: most importantly, to define what we mean by an initiative, to understand whether there is demand, to decide how initiatives would qualify for the shop window, and to determine what any site or service would look like. To understand the demand for such a site, a survey of both the converted and the unconverted would quickly establish preferences and help people to filter initiatives in terms of what they mean and what they are trying to achieve. Let me give you an example: does anyone who is not an expert or interested party in this space know the difference between evidence-based policy, evidence-based programmes, and evidence-based commissioning? I think I do, but don’t quote me! This leads nicely on to another benefit of such an exercise: helping to demystify the language.

There are real benefits for key stakeholders: commissioners might get a better view of what they are funding and whether it offers value for money or meets a need, whilst users will have a clearer view of what they should or should not get involved in, much as the Education Endowment Foundation’s site provides. Altogether, this might improve the design and delivery of services, which is the reason we are doing this work in the first place.

There are risks, and a trail of failed attempts to create such websites. We all know the typical pitfalls: making it too complicated, trying to include too many initiatives, relying on a website alone, and the challenge of driving use of the site. It could end up being just another initiative amid the saturation of work I discussed above.

Nevertheless, this should not stop a sensible exploration aimed at bringing some sense and clarity to all of this work. It could be a bit like a stock exchange for evidence initiatives – there is already one for social impact. Meanwhile, at Project Oracle we will continue our work and search for ways to collaborate with other initiatives, because while the momentum is high we should all seize this opportunity to make the space more accessible for its users.

TSIP recently ran two Roundtables on this issue, which led to the development of this blog, and would like to thank all who attended the Roundtable series.

 

The views are the author’s own and do not necessarily represent those of the Alliance for Useful Evidence.