Increasing the evidence base for what does (and doesn’t) work


Dan Corry, CEO of NPC, explains how the Justice Data Lab, run by the Ministry of Justice, is helping charities to measure their impact on their beneficiaries accurately, and is driving a cultural shift towards demanding more evidence.

At NPC, we’re passionate about supporting organisations to measure their impact. By measuring impact, we can see whether all the passion and good intentions of the charity sector are translating into real outcomes for the people we are trying to help. However, even when charities believe in this mission, many still struggle to get far with their measurement.

The challenge

One problem is that measuring outcomes often requires tracking beneficiaries over time to see what happened to them after their interaction with the charity, and doing this can be complex, costly and sometimes just impractical.

Yet the government often holds this data; the problem is that charities can’t access it in a simple way. NPC therefore recently recommended that the Ministry of Justice use the reoffending data that the government and its agencies hold to help charities working in this area. The MoJ responded positively, implementing the Justice Data Lab, a service which enables charities that work with offenders and ex-offenders to track the reoffending rate of the people they have worked with some years later. Furthermore, they can compare the reoffending rate of those they worked with to that of a matched sample of similar offenders and ex-offenders.

Eight months into the one-year pilot of the Justice Data Lab (JDL), here are some of the lessons we have learnt.

Justice Data Lab is increasing awareness and understanding of evaluation methods

Results from the Justice Data Lab are beginning to flow in as organisations use the service. The latest Justice Data Lab report, covering 46 interventions, found that participants in 16 of these interventions had statistically significantly lower reoffending rates than similar offenders who did not participate. In only one intervention did participants have a higher reoffending rate than similar non-participants. As with any quasi-experimental design, we cannot be sure whether this intervention actually increased reoffending, because of potential selection bias: it may have been serving the highest-risk offenders. In any case, these results should help charities better understand where they need to improve, and help them promote themselves to funders and commissioners.

Whilst it is still quite early to discuss what the results mean in aggregate, we have found it interesting to speak (and listen) to organisations that have used, or attempted to use, the Data Lab. For some organisations, going through the Justice Data Lab process has meant they better understand the data that is needed, and they now have a clearer understanding of the advantages and disadvantages of quantitative evaluations.

It has also highlighted the importance of having a clear theory of change for how an intervention affects reoffending. Using a theory of change approach can help organisations understand how their intervention works and develop realistic goals for the difference they seek to make. Adam Moll from SafeGround has blogged that they have been able to further validate their theory of change through the Justice Data Lab. We would encourage similar organisations to develop a model of how their service reduces reoffending alongside submitting their data to the Justice Data Lab.

Supply doesn’t automatically lead to demand

When outlining the case for the Ministry of Justice to develop the Justice Data Lab, we had to estimate the number of charities that might use it. But the estimate, based on answers to a survey of the sector, turned out to be far too large. In reality, the Justice Data Lab has received just over 60 requests, with just over a third of the 30 published reports coming from charities and social enterprises. So why has uptake been so slow?

One factor may be that the results of those who use the JDL are published for all to see, and that this transparency has dampened demand.

We all know that it is tough for charities out there, and with the Transforming Rehabilitation agenda, many charities will be trying to work out how, and with whom, they will be operating in this new space. The risk of finding out that an intervention they have delivered appeared to have no statistically significant impact on reoffending may not be one many want to take right now. But, while understandable, we need to move to a different mindset within the sector: if organisations are working to improve the lives of service users, then undertaking some form of evaluation of their services should be a priority, so that less effective programmes can be changed. Reputational damage would matter less if we could shift attitudes so that organisations are supported to be open about failures rather than hide them.

There are other reasons why take-up may have been slower than anticipated, many of which are documented in our report Creating a Data Lab. Some organisations have been uncertain about whether data can be shared under the Data Protection Act; there have been technical issues around data collection, storage and applicability; and we are picking up a lack of staff time and technical resources. These barriers are not specific to the Justice Data Lab and relate to evaluation more generally, but given that the pilot lasts only until March 2014, we are actively trying to address these challenges to ensure that the service is useful to its users and the wider community, and that the MoJ will continue it.

Both through our work on the Data Lab and our National Offender Management Service (NOMS) project, which aims to improve the use of evidence amongst organisations working within criminal justice, we are helping to raise knowledge and understanding in this important field.

Real opportunity to build up an evidence base for what does and does not work

Given time and more submissions, the Justice Data Lab will build up a library of evaluated interventions. This is a big step forward in the quality of evidence that charities can create and learn from. We believe that if the service is refined and expanded to other government-held data sets, it can dramatically add to our collective knowledge of what works in a number of social policy areas. That is why we are looking to develop further Data Labs in employment and benefits, substance misuse and health, and we would also like to develop one in schools and education. Other steps are also possible: for example, while the service currently assesses impact using a matched sample, it could easily be expanded to allow charities to run small randomised controlled trials (RCTs) at low cost, routinely generating robust evidence.

The Justice Data Lab is currently being reviewed by the Ministry of Justice to decide whether, and in what form, it should be continued. Charities need time to engage and to set up the right systems to participate in the programme. We would welcome support from members of the Alliance for its continuation, as we believe we are at the beginning of a cultural change in measuring impact, one that will benefit us all.


Views expressed are the author’s own and do not necessarily represent those of the Alliance for Useful Evidence. Join us (it’s free and open to all) and find out more about how we champion the use of evidence in social policy and practice.