Lessons from the General Election

Who remembers the general election?

So opened the questioning at a high-level roundtable recently convened at Nesta, where a group of senior figures from the data and evidence world gathered to assess how well evidence-checking initiatives around the general election, such as the manifesto check, had worked.

For six weeks of intense pre-election activity, 120 analysts from Full Fact’s ‘war room’, The Conversation UK, the Institute for Fiscal Studies and others conducted a live rapid-reaction experiment – fact-checking ministerial statements and manifestos to inform debate and engage the public.

The pay-off was immediate, and dramatic. Describing the vast media appetite, delegates recalled midnight questions from the Today programme, and a line corrected during an episode of Newsnight – clarified within the broadcast itself. Journalists and the public responded quickly as key issues, from foodbanks to fracking and uncollected taxes to zero-hours contracts, were critiqued and communicated.

When it comes to the crunch, does it matter?

In a nutshell, yes.

Where genuine mistakes are made, it’s clearly important to correct them to avoid disaster, but often the way facts are presented is nuanced, and dealing with this can be more difficult. What is the correct response, for example, when something written or spoken is technically true but not fair?

Looking at the whole picture can be incredibly valuable, as individual fact checking has its limits. It was agreed that one of the key successes of the initiative was the synthesis document produced by Full Fact. However, it was noted that the timing of the parties’ releases may have prevented this valuable piece from getting out early enough for optimum impact.

It may be that a proactive approach where the experts offer to pre-check party pledges prior to publication would be well-received. Nobody wants to end up with egg on their face in the full media glare, after all, and it could buy valuable time. Why not pilot it during the Welsh election?

Good practice

The Department of Health is setting a great example with a promise to issue a ‘fact document’ for each ministerial speech, a commitment first made in a letter to the Director General for Regulation at the UK Statistics Authority. This followed the Authority’s public intervention over the Secretary of State’s use of unpublished research on excess weekend deaths. In a further positive development, the Royal Statistical Society’s offer of training for MPs is being widely taken up.

The big evidence ‘infrastructure organisations’ – recently mapped by the Alliance – put the facts, and the tools to analyse them, into the public domain, and broadcasters can present both sides of an argument, but in the end individuals must make up their own minds.

One of the biggest achievements highlighted was the trust placed in Full Fact and the wider partnership. The Sunday Express described the project as ground-breaking and it became the ‘go to’ place for journalists. Greater moves towards transparency in areas of public life, and the rise of data journalism are very encouraging trends, and the question is how to build on this effectively.

We often have emotional responses to immigration, welfare, education and healthcare, getting involved when an issue personally affects a child in school or an elderly relative in care. Trusted intermediaries are both important and influential in steering a course through what can be uncharted territory.

The ONS project ‘How well do you know your area?’ is a great example of engagement, starting quite literally from where people are, and there was agreement around the table that building statistical literacy step by step is the way forward.

There are risks, however, that the momentum could be lost. In particular, a financial mechanism for this specialised work needs to be found. One idea, posed over lunch, was for the Alliance for Useful Evidence to bring together potential funders to inform phase 2 of this work, heading into the EU referendum and the 2020 elections.

Measuring success

Full Fact has exciting plans. They are set to automate and systematically track the claims they know about, to study the epidemiology of fact checking and what happens when you intervene or not. The results will be useful to inform next steps, and directly relevant to our upcoming piece of research on what works to enable evidence use in practice.

Answers on a postcard, please

How can this group help the public tell what is nonsense and what is truth? An evidence kitemark could be the way forward. It’s a comforting thought that a logo on an article might guarantee a certain scientific rigour – in an increasingly online world it can be hard to distinguish between big political visions and evidence-based think pieces.

Influencing Cabinet Office guidance is proving difficult, and while the open data agenda is leading to more opportunities than ever for individuals and small organisations to analyse the raw data themselves, they may have limited capacity. Engaging with young people will be critical, and looking at how and what MPs access in terms of information could yield intriguing results.

In the closing remarks, it became clear that this powerful partnership needs to play the longer game. A common, shared understanding of the fundamental issues, and a narrowing of the ‘drift’ between rhetoric and reality, is the pot of gold at the end of this particular rainbow.

Learning the lessons from the general election over lunch was an excellent start; participants agreed that a follow-up dinner, perhaps with an emphasis on the financial stability of this work, could secure the future.


The initiatives reviewed included:

The correspondence between the Department of Health and the UK Statistics Authority can be found on the UK Statistics Authority website (see the letter from Ed Humpherson to Mark Svenson dated 24 July 2015 and the response dated 19 August 2015).