The announcement of the new What Works network, which includes the Education Endowment Foundation as one of the centres of excellence promoting evidence-based policy, is one of the current focal points for discussing both the role and form of evidence within policy and practice. Loic Menzies, Director of LKMco, discusses how his experiences as a secondary school teacher have led him to champion the importance of developing nuanced judgement to decide ‘what works’ alongside the value of evidence-informed practice in the classroom.
Let me tell you a story:
A few years ago I had a challenging Year 7 form group. They were the fourth of five sets and full of clashing personalities: the boisterous, already-hormonal big kids, the small and vulnerable childish ones, and pupils from every conceivable culture, with every special need, and so on. Every day issues came up in their various classes, and I was constantly dealing with fights, tears, detentions and misbehaviour.

Then, one day, we went on a class day out. I can’t quite remember what we actually did (it was a Catholic school, so it was an annual retreat of some kind), but I do remember that it involved getting the tube somewhere, having lunch, playing in the park together and doing something or other with plasticine. Everyone got on well and had a nice time. It felt really special.

The next day we came back and I decided to do something different from the usual literacy focus in form time. We talked about why yesterday had felt so good, what was different about the way we had interacted, and what we had done differently to make sure we got on well. Once we’d identified those things, the kids got into pairs, agreed which change to their behaviour mattered most for each of them, and wrote it down so they could remember it. From then on we referred back to these commitments whenever necessary. Relationships improved dramatically, I spent much less time fire-fighting and we could get on with learning.

Last summer that year group took their GCSEs. Low attainers on entry achieved a Value Added Best 8 of 1075, and the school has an inverse FSM attainment gap. I don’t think that’s because of that trip, but I do think we did a pretty good job with those kids.
My decision to do this with my class was not based on research evidence. The evidence for making plasticine models and playing in the park is somewhat weak. It was based on my judgement of what the issues were for my particular pupils, in that particular class, at that particular time. A year on it might not have been the right thing to do, and with a different class it could have been a waste of time.
This is ABSOLUTELY NOT an argument against research evidence, but it is an argument against a blunt, technocratic, black-box approach that idolises research as something it is not. People love analogies between medicine and education, but these are often flawed: most people will find that paracetamol helps with a headache, whereas far more contextual factors play into what will help someone learn better. Tools like the Pupil Premium Toolkit are invaluable for aggregating findings and providing macro-level conclusions, as well as being a basis for evidence-informed practice. However, aggregating and averaging even the most representative sample or comprehensive meta-analysis cannot tell you which pupils will benefit from something, and when. This nuance is often missed in headline effect-size figures, as is how well an intervention is administered. Most of the time teachers do not teach the average class, so what works best on average may not be the best thing to do given the extra information they have access to.
Just saying ‘what works’ is not enough – developing the nuanced skill to do things well, and the judgement to decide what is right in a particular circumstance, matters just as much.
I wrote this blog after reading the rather excellent “The ‘What Works’ Movement Should Focus More on ‘Why it Works'” by Daniel Stid.
The views are the author’s own and do not necessarily represent those of the Alliance for Useful Evidence.