Parody and the science of evaluation

In the parody below, Ray Pawson, Professor of Social Research Methodology at the University of Leeds, explores the complex reality of the science of evaluation, focusing on RCTs and their growing reputation as the ‘gold standard’ methodological approach for evaluations across social policy.

Evaluation researchers are tasked with providing the evidence to guide programme building and to assess programme outcomes. Labouring under the highest expectations, namely to bring independence and objectivity to policy making, they face huge challenges given the complexity of modern interventions and the politicised backdrop to all of their investigations. They have responded with a huge portfolio of research techniques and, through their professional associations, have set up schemes to establish standards for evaluative inquiry and to accredit evaluation practitioners.

Click. We turn on the ‘Today’ programme [BBC Radio 4]. Astonishingly, there’s an item on evaluation research methodology. Let’s have a listen:

Interviewer: Have I got this right? Your new book, The Science of Evaluation: A Realist Manifesto, is about claiming the mantle of science for the methods of ‘realist evaluation’ and ‘realist synthesis’, research strategies that you have written about in your previous books.

Pawson: Yup. Precisely.

Interviewer: But wait a minute, it’s subtitled a ‘manifesto’. How can you be both scientific and a writer of political propaganda?

Pawson: Now you have upset me. There is not a word of political ideology in the whole text. It is, however, one long methodological argument. I’m not too troubled if you want to call that ‘propaganda’, the point being that real scientists argue all the time. The work of one group is constantly checked and challenged by others. Truth emerges from and because of a ‘disputatious community of truth seekers’.

Interviewer: OK. But everyone in the evaluation community knows that the RCT is the gold standard scientific method.

Pawson: Since when? Mainstream science does not use RCTs. What proper scientists do is marshal theories and come up with all manner of ingenious empirical tests of those theories, which go on to refine the original ideas. Other groups add further tests to develop the explanation. This is the account of science you’ll find in Popper’s ‘conjectures and refutations’ and in Campbell’s ‘evolutionary epistemology’. I find it quite incredible that anyone should look elsewhere for a blueprint for science.

Interviewer: But wait a minute, Donald Campbell is the architect of all those beautiful adaptations to RCT methodology, and the Campbell Collaboration privileges RCTs as the prime source of evidence.

Pawson: Well, indeed, Campbell was a genius, but you’ll find he contributed far more than quasi-experimental designs. I’ll dig out a quotation for you. This is Campbell: ‘I am some kind of realist, some kind of critical, hypothetical, corrigible, scientific realist’.

Interviewer: Have you tried telling the people at the Campbell Collaboration that their hero was a realist?

Pawson: As a matter of fact, yes. I proposed a paper at the C2 conference this year. Apparently the conference was so full of other scintillating stuff that they just couldn’t find room for it.

Interviewer: The main body of the book is given over to a discussion of complexity. Could you summarise your argument?

Pawson: The penny has finally dropped that interventions are not ‘treatments’. Interventions are complex processes introduced into complex environments to deal with complex problems. It is impossible to control for every contingency, as the trialists urge. It is impossible to theorise away the problem, as some systems theorists have contrived. All that can be accomplished is for evaluators to explain some of the contingencies of implementation and some of the caveats of context that make for programme effectiveness. The urgent need is to get across to policy makers that no one can tell them whether an intervention will work. This plea to make better use of partial knowledge is much more honest than the simplifications and sound-bites they are used to hearing.

Interviewer: Do you think you are winning the methodological argument?

Pawson: Good grief, no. Methodological advice is just like a social intervention – it only works for certain groups in certain circumstances. I like to think that some significant groups within healthcare evaluation have become more ‘realist’. But the great simplifiers still dominate. My heart sinks at the influence of the Cabinet Office’s Behavioural Insights Team. Not only are they rehashing the Ladybird book of the RCT, they are peddling a huge oversimplification of what it takes to change behaviour. Would you like me to explain?

Interviewer: I’m afraid we have run out of time. No doubt interested parties will find their way to chapter 6…

 

Ray Pawson
Ray is Professor of Social Research Methodology in the Faculty of Education, Social Sciences and Law at the University of Leeds. He has written widely on the philosophy and practice of research, covering quantitative and qualitative methods, and is particularly known for his focus on evaluation methodology and evidence-based policy. Ray has worked as a researcher and evaluator on programme evaluations for various UK government departments. His latest book, The Science of Evaluation: A Realist Manifesto, was published in February 2013.

(The views are the author’s own and do not necessarily represent those of the Alliance for Useful Evidence)