
How Failure Can Feed Success: Using Evidence on ‘What Does Not Work’ to Improve Services and External Recognition


Daniel Schweppenstedde and Anne-Marie Reid share a case study from the Childhood Development Initiative, demonstrating how evidence that an intervention is not effective can have a valuable purpose.

The use of evidence for improving services is often thought of as a continuous success story, from the conception of an intervention, through its application, to its impact on end users. RAND Europe supports the European Platform for Investing in Children (EPIC), which provides information on such successful interventions — EPIC’s “Practices that Work” section features many evidence-based practices that have demonstrated their effectiveness through rigorous research.

However, one can easily imagine why organisations might be reluctant to get their work evaluated externally. If the evaluation shows that a practice that has been developed and rolled out with much effort is not effective (or even harmful in the worst-case scenario), the implementing organisation could fear being associated with a short-term failure rather than a long-term success. Funders might withdraw their funding or might be sceptical of funding again in the future. In short, being honest can be risky. As a result, instead of having to deal with “negative” evaluation outcomes, avoiding external evaluations might be perceived as the safer option.

It is therefore important to recognise that applying practices and interventions requires a wider process of learning not only about “what does work,” but also about “what does not.” This can be treated as an opportunity to improve organisational performance and external recognition. While negative evaluation findings can be disappointing, the process of acting on them can strengthen an organisation.

Having expertise in conducting a wide range of evaluations, RAND Europe recognises the value of learning from “what does not work.” One EPIC intervention that RAND Europe has come across, and which serves to illustrate the positive impact of this important learning process, is Mate-Tricks by the Childhood Development Initiative (CDI).

The Childhood Development Initiative and the Mate-Tricks Programme

CDI, based in Tallaght, Dublin, supports the delivery of a range of early intervention and prevention programmes to improve health, safety and learning outcomes for children in disadvantaged areas. “Mate-Tricks” was a pro-social behaviour programme within CDI, designed in response to the local community’s concerns about bullying and anti-social behaviour. The programme manual was bespoke, combining elements of the Strengthening Families and Coping Power programmes, both of which had been extensively evaluated in the United States.

Doing the Right Thing: Collecting Evidence on Whether the Programme Works or Not

CDI was keen to know whether its idea worked in practice and had the expected impact on the target group. The programme was evaluated by a team from the Centre for Effective Education at Queen’s University Belfast, using a randomised controlled trial design alongside a process evaluation, involving 630 children. Evaluations were carried out over a three-year period using a rolling cohort design.

How to Handle Underwhelming Evaluation Results: Involving Researchers and All Actors Widely in the Dissemination of the Results

The decision to discontinue the programme came when, after three years of data collection, the results became known. Of the 21 indicators assessed, 16 had not changed, three were moving in the wrong direction, and two showed positive improvements.

The dissemination and discussion of the results was a difficult process for CDI. People involved with Mate-Tricks loved the programme and fully supported it. The process evaluation had been extremely positive, and the expectation was that the programme would be considered a success and widely replicated. No wonder the findings came as a complete shock.

CDI facilitated a process of meetings where stakeholders were able to meet with the research team to ask questions or express their concerns about the findings. While this process took time, it was carefully planned, and gradually those involved were able to recognise the value of dropping the programme. The joint approach between CDI and the research team meant stakeholders had representatives from both sides to answer their questions. CDI also focussed on the needs of the target group during the transition period in order to mitigate any issues arising from the process, for example by offering a counselling service to children who had been involved in the programme.

Why Letting Go Was a Difficult but Important Step That Made the Organisation Stronger: Three Key Messages

Being Able to Use Limited Resources More Efficiently

The Mate-Tricks case highlighted the importance of evaluation in ensuring that a programme achieves its intended outcomes, which remains core to CDI’s mission. It was also an opportunity to allocate resources more effectively: funding for programmes that are proven to be ineffective can be redirected, allowing investments to be maximised.

Increasing Credibility of Other Programmes and the Wider Organisation

The evaluation findings also strengthened the credibility of the organisation and its other programmes, such as Doodle Den, which had been proven to be successful. By discontinuing an ineffective programme and focusing instead on one that was demonstrated to have very positive outcomes, CDI ultimately gained considerable trust from service users and stakeholders.

Providing a Learning Opportunity for Others and Increasing Recognition Externally

The evaluation of Mate-Tricks provided more information on after-school settings as vehicles for delivering pro-social behaviour programmes for 9- and 10-year-olds, an area that has received limited attention to date. The findings offer insights into which approaches do not work and can therefore help other organisations and service providers in designing their own services.

This is a shortened version of a blog post originally published on the RAND Europe website. Read the original post here.


Views expressed are the authors’ own and do not necessarily represent those of the Alliance for Useful Evidence. Join us (it’s free and open to all) and find out more about how we champion the use of evidence in social policy and practice.