Richard Clubbe discusses the mixed findings of Sense about Science’s spot check on how transparent government departments are about their evidence use.
Last week Sense about Science published a report on how transparent the UK government has been about the evidence behind its policies. Transparency of Evidence: A spot check of government policy proposals July 2016 to July 2017 looks at 94 policy proposals from 12 departments, each scored according to how easily a reader could tell what evidence informed the decision. The project was supported by funding from the Alliance for Useful Evidence and the Nuffield Foundation, and was conducted in partnership with the Institute for Government.
When the evidence behind policy isn’t shared, citizens are unable to understand the motivations behind policy decisions, decide whether they agree, or contribute to constructive discussion around the issue. On top of this, researchers and analysts find it harder to contribute to the evidence base in a meaningful way, and the government is less able to build on its own past work.
Transparency is a prerequisite for assessing the quality of evidence behind a policy proposal. Our report doesn’t look at the merits of a particular policy or the quality of the evidence considered, but rather whether a reader can identify what evidence was used and how it was assessed.
In 2015, the Institute for Government, Sense about Science and the Alliance for Useful Evidence established an evidence transparency framework, designed to be rapidly applied to any policy proposal, which broadly asks:
‘Can someone outside government see what the government is proposing to do, and why?’
With this question at the core of our assessment, Sense about Science led a citizen-centred review that rewards policy proposals for clarity of reasoning and consideration of the audience. For our spot check, we recruited a group of 27 volunteers to score policy proposals against the transparency framework. No prior knowledge or experience of the policy area was required. Scorers approached the task as if they were members of the public motivated to learn more about the policy.
For this spot check we defined policies as ‘specific interventions intended to change the status quo’. We assessed the documents publicly available at the point when the policy was first set out substantively. The time frame for our assessment was 13 July 2016, when Theresa May became prime minister, to the end of July 2017. From a (very!) long list of potential policy proposals announced during this period, we randomly selected six to eight policies from each of 12 departments for our spot check. The scoring group then read the policy documents and scored each policy against the transparency framework. Scores were verified by the research team from Sense about Science and the Institute for Government.
The framework breaks down policy assessment into four sections – diagnosis, proposal, implementation, and testing and evaluation – with possible scores on a scale from 0 (indicating very little mention of evidence and explanation of its use) to 3 (indicating a consistently well-referenced discussion of the evidence behind the policy, including acknowledgment of uncertainties and gaps in the evidence base).
Reviewers found that departments were capable of attaining a high level of transparency in diverse policymaking situations – showing, importantly, that evidence transparency is achievable. The most consistently high-scoring departments were the Department for Transport (DfT); Department for Business, Energy and Industrial Strategy; Department of Health; Department for Environment, Food and Rural Affairs; and Department for Work and Pensions. DfT produced the highest-scoring policy, ‘Cutting down on noise from night flights’, which received a 3 in every section of the framework. This was a strong example of how it’s possible not only to share the evidence being discussed, but also to explore its limitations in the context of the policy area.
Overall, the review revealed several areas of improvement since our preliminary assessment in 2016, which highlighted good and bad practice across government. One of the shortcomings revealed in that report was that departments had often compiled research during policy development but failed to share it clearly with the reader, whether through poor referencing or complete omission. This year we observed more sharing of the research done by departments. Analysts should welcome this finding – more of their good work is being published by their departments, a trend that reflects well on the policies it informs.
The least transparent area identified was testing and evaluation. To score well in this section of the framework, departments are required to set out what they will measure in order to determine whether an intervention has been successful, along with clear plans for publishing and using those results. Our review revealed very few occasions in which this was discussed in detail. Of the 94 policies assessed, only four scored a 3 for testing and evaluation, while 64 scored a 0 or 1. Testing and evaluation is about considering what is working and helping to produce evidence that informs future policy decisions. Without measurable outcomes, it becomes very difficult to build on past experience to create improved policy solutions in the future.
The transparency framework developed with the Alliance for Useful Evidence and Institute for Government could be used outside reviews such as this one. Over the last year we have used the framework to engage with departments through talks and workshops with senior civil servants, analysts and policy professionals, many of whom are keen to use the framework to ‘mark their own homework’ and improve their department’s scores in time for future reviews. Outside Whitehall, the framework has been shared through facilitated workshops at the away days of nine parliamentary select committees, a meeting of the European Commission’s Regulatory Scrutiny Board and international policy conferences like the What Works Global Summit. It has even been translated into Spanish for use in the Peruvian parliament.
We plan to conduct a further spot check in two years’ time. The time period it will cover, and whether the review might extend to other departments, agencies and devolved governments, will be decided over 2018.
This project was funded by the Nuffield Foundation, but the views expressed are those of the author and not necessarily those of the Foundation or the Alliance.