The joys and sorrows of league tables

By Katie Boswell 3 September 2015 3 minute read

So it’s back to school time again, although it’s only a few weeks since anxious students up and down the country were tearing open their exam result envelopes and—if the papers are to be believed—spontaneously leaping with joy. Teachers and school leaders have breathed sighs of relief or groaned in despair, thinking of how the exam results will look in the dreaded school league tables.

This all got me thinking about the social sector’s own league tables, used to rank the impact of different interventions. For example, the College of Policing’s Crime Reduction Toolkit ranks interventions from street lighting (which has a high impact on reducing crime) to Scared Straight programmes (which studies suggest may actually increase crime). Similarly, the Early Intervention Foundation’s Programmes Library rates the impact of a wide range of programmes working with children and families.

League tables provide front-line professionals and commissioners with a powerful rationale for choosing which programmes to implement. The Education Endowment Foundation's (EEF) Teaching and Learning Toolkit gives teachers and school leaders a tangible figure: the number of additional months' progress pupils might be expected to make as a result of an approach being used in a school. Improving the quality of feedback provided to pupils has an average impact of eight months, while school uniforms have an average impact of zero months.

These initiatives are bringing evidence about impact to a wider audience in an accessible format. Almost half of senior leaders in secondary schools said in a recent poll that they use the EEF Teaching and Learning Toolkit to decide how to spend their pupil premium. Such figures should make fans of evidence-based approaches everywhere leap with joy, right?

But the league table approach has its detractors. Critics of school league tables worry that the comforting simplicity of a single number—such as the percentage of pupils gaining A*-Cs at GCSE—can be misleading. Good exam results are not necessarily repeatable from year to year and they are affected by other factors such as students’ background or the quality of exam marking. More fundamentally, the focus on exam results can distort the way that schools work, incentivising them to prioritise pupils on the C/D borderline and pushing out vocational education or extra-curricular activities.

Do these criticisms also apply to the social sector’s league tables? High-impact programmes may not be replicable from area to area and they are affected by a myriad of contextual factors that cannot be reflected in a simple ranking. The publishers of toolkits themselves are careful to highlight these limitations. The Early Intervention Foundation advises commissioners that ‘A programme may look perfect on paper, but this is no guarantee that it will work in a particular time and place’. The Crime Reduction Toolkit summarises the evidence on how and in which circumstances each intervention works.

There are also questions about the quality of the evidence that is used to rank interventions. All three toolkits provide an evidence rating for each programme, but there is a danger that users only look at the impact rating. For example, the evidence base for school uniforms is described as 'very limited', yet that caveat may not stop a school leader from deciding against introducing uniforms simply because the impact rating shows zero months' progress.

More seriously, the league table approach risks prioritising the outcomes that are easiest to measure, rather than those that are most important for people's lives. The Teaching and Learning Toolkit mainly uses traditional measures of educational attainment such as curriculum tests and examinations, and does not systematically record other outcomes such as aspiration, attendance or behaviour. School uniform receives a low ranking, but the toolkit notes that it can be an important component in the development of a school ethos, which may improve attendance and behaviour. That makes it potentially useful even if it does not necessarily lead to better learning.

The opening up of evidence about impact to a wide audience is to be welcomed. However, it needs to be used in a thoughtful and proportionate way. Parents choosing a school for their child will often visit the school, speak with other parents, consider their child's personal needs, and look at factors other than exam results. Commissioners and practitioners should take a similar approach to interpreting programme ratings. The EEF urges school leaders to supplement its rankings with professional judgement, observation and internal data about their school's context. Used in this way, the league table approach has huge potential to drive up impact across the social sector. Then many of us really would be leaping with joy.
