Evidencing charities’ impact: why a ‘one-size-fits-all’ approach won’t do

Focusing on finding the best method for evidencing impact is distracting, argues Helen Wheatley, Assistant Director, Research and Evaluation, at Catch22. We should be focusing on using mixed methods to capture data on the rich and varied sectors we work in. This approach, writes Wheatley, has allowed Catch22's Realising Ambition programme to stay focused on outcomes for young people while collecting credible impact data.

The voluntary, community and social enterprise sector is a great place to work. It's dynamic, full of passionate and committed people, and generates a rich and varied spectrum of evidence on the impact of our individual and collective work. However, it often feels like a struggle to understand how to gather and use this evidence to best – or indeed any – advantage, both for us and for the people we are working to support.

Evidence wins hearts, minds…and funds

In the past decade we have seen brave moves to change this. As a sector we have begun to grasp how to use the tools we now have available to measure our impact, including grappling with data-crunching tools and getting wise to KPIs. This is something we've been exploring at Catch22 through our Realising Ambition programme, which aims to improve the evidence base of 'what works' in preventing young people from offending. This shift towards better demonstrating our own worth and effectiveness – and, importantly, the impact we have on, in my case, the young people and families we work with – is a sensible one. Having evidence of how your work has directly benefited someone is a powerful tool. It makes winning the hearts, minds and funds of potential investors far more likely.

However, it seems we have some way to go in agreeing on the most effective approach. The idea of having a definitive method – whether it be Randomised Controlled Trials (RCTs), staged progress mapping or longitudinal studies – for determining the effectiveness of what we do might be compelling, but it's distracting us. We are focusing our energies on deciding which approach to gathering evidence is best, rather than on the blended picture that different tools can provide if we 'pick and mix'.

Use mixed methods, not a single method

The value of using a single-method, 'one-size-fits-all' approach to measuring impact is questionable, especially when working with vulnerable young people and families. This work demands that all of us – service providers, funders, policy makers and commissioners alike – reassess whether finding an ultimate tool should be our aim, or whether our time is better spent examining the range of methods available to us and how and when to best use them. We are some way off being able even to agree on a common language and understanding of what constitutes evidence and learning. To see this in action, I suggest you join a group of people working in this area and ask their opinions on the place of RCTs in our world. Then step back and find a safe place from which to watch the inevitable fiery debate!

We have taken this approach within Realising Ambition, where we have brought together a full spectrum of evidence-gathering approaches. We have taken four of the 22 delivery organisations involved through RCTs, supported others to undertake pre- and post-intervention questionnaires, and helped them gather case studies on their work.

Whilst our work on Realising Ambition does not have all of the answers, we – the delivery organisations and the consortium managing the programme – have explored the measurements and tools available to find a flexible model that captures as much as possible.

It's enriched the programme to bring together well-measured, process-driven interventions with less evidenced but promising work, to see how best we can remain focused on outcomes for young people while still credibly measuring impact.

We need a more collegiate approach to examining the breadth and variety of evidence available to us, and we need to find better ways to use it transparently and effectively to improve planning and support. We need to question whether the ultimate answer to evidencing impact is just around the corner. I believe we should accept that the evidence we generate will always be as rich and varied as our sector and the people we work to support.

Views expressed are the author's own and do not necessarily represent those of the Alliance for Useful Evidence. Join us (it's free and open to all) and find out more about how we champion the use of evidence in social policy and practice.