As I head for the virtual exit, I have been rummaging through some of the reports in the digital archive. I found eight gems, all chosen for their contemporary usefulness.
One belief that has stuck with me after nine years heading the Alliance for Useful Evidence is this: a well-written, think-tank-style report is a great vehicle for getting your message out. Despite the shift towards open-access academic journals, many articles are still hidden behind paywalls. Even our parliamentary and government bodies don’t all have licences for research databases. And when the research is free and findable, papers can be irksome, lost to academic arcana or lazy jargon. A well-honed, succinct discussion paper can be a valuable thing. Especially when you say what you mean, and mean what you say.
This isn’t just a question of taste, it’s also a matter of evidence. Drawing on linguistics and cognitive science, Harvard Professor Steven Pinker makes a compelling case in his book The Sense of Style for how ‘soggy prose’ can sink your ability to persuade. So much writing is hard to understand because of what Pinker calls the ‘curse of knowledge’:
‘The main cause of incomprehensible prose is the difficulty of imagining what it’s like for someone else not to know something that you know’ (p. 56)
Clear-as-glass writing and care with words can cut through to the content underneath. Whilst it’s nice to have at-a-glance online evidence graphs, maps and toolkits, they’re not the easiest way to communicate nuance and complexity. Texts are much better at conveying subtlety and ideas. And isn’t the best part of Twitter a handy link to a longer-form piece of writing? Or maybe that is just me. I never really got the hang of sharing clever aperçus on the socials.
So I have chosen ‘readability’ as a criterion for the list. All of the texts were crafted with the aid of others, so this list is a good excuse to thank those who wrote, reviewed, or researched these reports.
- Future Directions for Scientific Advice. Who knew that Chief Scientists would play such a prominent role in all our lives during the Covid crisis? This 2013 collection of essays gives tips on how to be a chief scientific adviser. Edited by James Wilsdon, who now runs the Research on Research Institute, and Rob Doubleday, who heads the Cambridge Centre for Science and Policy, the report aimed to influence the thinking of the incoming Government Chief Scientific Adviser, Sir Mark Walport. The messages ring true today – such as Roger Pielke Jr. reminding us that ‘science advisers are not superheroes’; they can only be intermediary fixers and brokers.
- Using Evidence: What Works? This was us walking the talk and being evidence-informed: it was based on a systematic ‘review of reviews’ and a scoping review, conducted by the EPPI-Centre at UCL, in partnership with What Works Wellbeing and Wellcome Trust. The full technical paper is a monster 320 pages. The shorter paper, and accompanying blog, aimed to give practical advice – blended with lots of concrete examples – on the best techniques for communicating research.
- Evidence for the Frontline. This highly readable 2013 think-piece was by former science teacher Professor Jonathan Sharples, who is now at the Education Endowment Foundation. It sets out the importance of research-informed practice and outlines the elements of a well-functioning evidence system. Our keen interest in the professions continued to grow after this report, and snowballed into the signing of the Evidence Declaration for Professional Bodies in 2017. This evidence ‘magna carta’ was written by Professor Jonathan Shepherd of Cardiff University, who helped us bring together 27 signatory bodies such as the medical Royal Colleges, the College of Policing, and the Chartered College of Teaching. Two years after the signing, Helen Mthiyane helped us monitor their progress.
- Using Research Evidence: A Practice Guide. More like a manifesto, with advice on every aspect of sifting and shifting evidence, this 2016 guide has been one of our most downloaded reports. The content served as the spine of our first Evidence Masterclasses, which have since been iterated, tested, and scaled to over 45 government departments. Our commitment to strengthening evidence capability, led by Kuranda Morgan, will continue. The report has been adapted for the scrutiny of local government and for the humanitarian aid sector. To dive deeper, there is more granular advice, such as an inventory of experimental methods by Anna Hopkins, or an introduction to systematic reviews by the EPPI-Centre.
- What Should the What Works Centres Do? This 2013 Nesta publication argued for what these new Centres should focus on. It’s worth revisiting, as many recommendations remain live – such as involving research users in the day-to-day running and governance of the programmes. When Ruth Puttick and Geoff Mulgan – who set up the Alliance – wrote it, there were only six centres in the pipeline. Now there are 13 centres, and others in the wings, all nurtured by David Halpern and Jen Gold at the Cabinet Office What Works Team.
- Lessons from abroad. Based on an exploratory sample of evidence institutions from Europe, the US, Australia and New Zealand, plus the Bretton Woods institutions, there is good advice here for any start-up evidence organisation (check out the conclusion on page 23). Ashley Lenihan, who now works for the Academy of Social Sciences, looked at a whole range of players, such as the Washington State Institute for Public Policy in the US, the Productivity Commission in Australia, and the Netherlands Bureau for Economic Policy Analysis. Our guidance has since evolved, with a checklist of things to consider when setting up new evidence institutions, by Louise Bazalgette at Nesta. There is a lot more we can do to learn from across the UK, as this report by Pippa Coutts at the Carnegie UK Trust sets out, and we should embrace differences across the nations, such as the Scottish Approach to Evidence.
- What counts as good evidence? Constructive criticism also has its role to play – such as this provocation paper by Sandra Nutley and colleagues at the Research Unit for Research Utilisation at the University of St Andrews (now at the University of Edinburgh). She argues that hierarchies of evidence need to address more than questions of what works. Determining which hierarchies and standards of evidence are fit for which purpose is a persistent issue. Ruth Puttick mapped 18 frameworks, scales and standards in the UK, a list currently growing at two a year.
- Social Media and Public Policy: What is the Evidence? This was written in 2013, before the Facebook scandal around a mood manipulation experiment, and indeed before Cambridge Analytica, but is super relevant now. We still get asked about it, as it was ahead of the curve in showing the potential of ‘social media science’ for policymakers.
Whilst these are my favourite reports, they are only one part of the broader contribution that the Alliance has made to championing the better use of evidence for social policy and practice across the UK. I must have missed some others. You can add your favourite book, article, or report to the crowd-sourced list of other good evidence reads by getting in touch with the Transforming Evidence network.
These reports provide a legacy for those who come after us. I am very grateful, for instance, that you can still find highly readable working papers from the UK ESRC Centre for Evidence and Policy, written in the late 1990s. The Centre is long gone, but the reports remain – and are well worth reading. For example, Bill Solesbury taking stock with a short history of the evidence-based policy movement in the UK, or Annette Boaz and Deborah Ashby on how to vet research quality. There is a body of work for people to read, build upon, and use – a legacy we can be proud of as the Alliance for Useful Evidence, and Nesta, prepare for this next chapter.