Re-cycling is good: what can we learn from the ‘What Works for Children’ initiative?


NICE has recently been joined by some new kids on the block. Some problems are sufficiently wicked to need a few runs round the block, so is there anything to be re-cycled from the evidence centres funded by the ESRC over a decade ago?


One was What Works for Children – a partnership between Barnardo’s, the Child Health Research and Policy Unit at City University London, and the University of York.

What worked in that collaboration? Firstly, there can be no doubt that working closely with front-line practitioners, with the weight of a service-providing charity behind us, was a shot in the arm. Secondly, we made great appointments of (then) junior staff who were brainy, energetic and committed; even though our HR department thought we went over the top in recruitment, it worked for us. Finally, we were well networked, learned from the then still-young Cochrane Collaboration, and were working in a context where funding (risible by today’s standards) went a long way.

What were some of the problems and how did we try to fix them?

Practitioners told us: “We don’t know where to find research evidence/never get to hear about research.” What we tried were Evidence Summaries, rather cheesily called EvidenceNuggets, which signposted relevant research, websites and databases. Because the nuggets relied on high-quality evidence addressing answerable questions, we also worked with practitioners to help them formulate research questions and work out just what it was they needed to know. We then logged practitioners’ questions with a view to providing evidence-based answers. After carrying out a review of what practitioners say they want from research, we compiled a list of the research topics most asked for and compared it with what had actually been funded by five major funders in social care, to see where the gaps were.

We were told: “We don’t have the skills to assess whether research evidence is good or not.” How much attention should they pay to the item on Radio 4’s Today programme saying ‘Research shows…’? Our collaboration provided training and support in assessing evidence and, perhaps more usefully since it was more accessible for most, an evidence guide offering practitioners guidance on locating, appraising and using research findings.

While no organisation is likely to be against the use of research evidence, some service planners told us “our organisation/management isn’t interested in the use of evidence in practice” and that there could be “pressure to direct funding towards particular services irrespective of the evidence.” Mentoring was a particularly popular intervention at the time (cheap, with strong face validity and some heart-warming stories). We were able to share with service planners research evidence showing that not all mentoring had the desired effect. What was great about working with people who ran services was that they didn’t then say ‘so we won’t do it’; instead they worked to adapt their service to counter some of the difficulties that had caused harm to vulnerable children and young people, in cases where mentors had been given insufficient training, or had lost interest or become disillusioned, leaving young people with one more failed relationship.

Not only do we continue to know too little about what works, but there are political and funding pressures for organisations large and small to produce Evaluations-U-Like.  Not surprisingly, these tend to be positive.

Re-inventing wheels

Building an evidence base, and creating, using and implementing the kinds of evidence needed by folk who use services as well as those running them and researching them is a long game.  Re-inventing wheels is often portrayed as wasteful, but building your own wheel, using whatever has worked well in the past and discarding the other bits is often the way to go.

The kind of information which helps us understand what works is different from the equally important information which describes how it works, or whether what works is acceptable to people on the receiving end. There is space for all of those.

There will always be a key place in academic life for blue-skies research and theorising. It is this kind of disinterested scientific enquiry which has brought us some of the greatest advances. But we also need strengthened funding streams for really robust evaluations, and the time-consuming but all-important networks between users, providers of services, and researchers, to ensure we’re answering the questions ‘how does it work?’ and ‘does it matter?’ as well as ‘does it work?’ Without this we are likely to continue to have the same heady mix of good, poor and mediocre research informing policy and practice.


Views expressed are the author’s own and do not necessarily represent those of the Alliance for Useful Evidence. Remember you can join us (it’s free and open to all) and find out more about how we champion the use of evidence in social policy and practice.