Michael Sanders, Reader in Public Policy at the Policy Institute at King’s and Executive Director of the What Works Centre for Children’s Social Care
Britain has been very fortunate to be gripped, in a wonkish, nerdy way, by What Works fever for at least the last seven years. The Education Endowment Foundation is now seven years old, and was the first of the “new” what works centres (NICE, the what works centre for health, is much older).
Since its inception, the EEF has transformed education research, practice and policy to the point where the field is barely recognisable.
We’re still being surprised by our findings
In its first year, the King’s College What Works Department has made significant progress towards that same goal in higher education, substantially surpassing expectations – but there is, as always, more to be done.
The one-year point is a good time for reflection, and to consider where we are, and what’s next. Despite many successes, any honest reflection must conclude that the movement – both in higher education and elsewhere – remains, if not in its infancy, certainly in a juvenile phase. A symptom of this immaturity is the fact that we’re so frequently surprised.
If we take a look at the EEF’s impressive track record, the three studies that I talk about the most are:
- giving breakfast to kids on free school meals significantly increases their grades;
- texting parents about what’s going on in school increases their children’s grades; and
- teaching assistants don’t seem to make a difference overall.
These findings are surprising – either because the results are far more powerful than we’d guess from the cost and size of the intervention, or because something that we’d naturally assume would work, doesn’t. These aren’t cherry-picked examples from deep in the EEF back catalogue: school breakfasts are the EEF’s flagship scale-up project. Meanwhile, huge resources have been dedicated to understanding why teaching assistants don’t help, and what can be done to make them more effective. In HE, research that I led with the Somerset Challenge found that giving young people more information about the costs and benefits of going to university made them less likely to want to attend – another surprise.
We need to look at the middle: between cheap nudges and expensive, high-impact interventions
Our research is currently focused on the extremes – things that we pretty much agree work but which are expensive (in HE, this would include things like summer schools), and things that people are sceptical about, but which are cheap (like texting people to get them to join societies).
This makes sense as a first port of call. If you want to build an evidence base (and quickly), cheap interventions like nudges are an easy place to start, and you can run quite a few studies for not a lot of money: for instance, encouraging students to attend events, or develop skills to help them succeed. On the other hand, if you’re going to spend a lot of money on research, and you want to maximise its impact, it makes sense to focus on something which is already widespread and a lot of people think works.
Getting to a mature evidence base
Each of these individual pieces of research has substantial value – we can stop doing things that don’t work and redeploy money elsewhere, and start implementing the nudges that work more widely. Taken together, though, they don’t add up to a mature, evidence-based practice. If we want to get there, we need to start filling in the middle: testing things that a lot of people think have a pretty good shot at working, and which cost a moderate amount.
The impacts here are potentially huge. First, moonshots (very expensive, complicated interventions) and fairly weak nudges are probably less likely to work than more workaday interventions; second, most interventions fall somewhere between incredibly cheap and incredibly expensive.
If we start filling in the gaps, we can start having evidence-based policy, evidence-based practice, and evidence-based innovation – being able to develop new interventions based on a sound understanding of the causal relationships between inputs and outputs. If we do that, we can be surprised a bit less often, and What Works can go from being for wonks to being truly mainstream.