The government’s A-level grading algorithm may be biased against state schools.


In brief:

  • A-level results were published on 13th August 2020. Independent schools saw a 4.7% increase in A* and A grades.  Non-Independent schools saw only a 1.7% increase.
  • That is at least partly because class sizes in Independent schools were often too small for teachers’ grades to be standardised.  And teachers’ grades were in aggregate more generous than standardised grades.
  • But the bias could also be driven by the Government’s “prior attainment” adjustment. Our analysis suggests that state schools which saw relatively poor GCSE performance two years ago may have had their grade distributions revised downwards by too much.

In yesterday’s A-level results, Independent schools saw a 4.7% increase in A* and A grades.  Non-Independent schools saw only a 1.7% increase*:

*Calculated as the average for non-Independent schools, weighted by the total number of centres.

Many have attributed Independent schools’ improved performance to the fact that their class sizes are more often too small for results to be standardised.  This means their teachers’ (more generous) grades were more often used to award results. See here.

But we think the “prior attainment” adjustment in the algorithm could also disadvantage non-Independent schools. This adjusts the grade distribution for a school if the current A-level cohort performed better/worse in their GCSEs than past A-level cohorts.

In principle, that adjustment makes sense.  If the current cohort performed much worse in its GCSEs than past cohorts, it would be unfair to simply use the past cohorts’ distribution of A-level results to grade the current cohort.

The problem is that those adjustments use national averages to convert differences in GCSE results into differences in A-level results.  And our analysis suggests that, when GCSE performance falls, non-Independent schools see a smaller subsequent fall in A-level performance than Independent schools.

Specifically, our modelling suggests that if a non-Independent school fell 100 places in the 2017 GCSE rankings, it subsequently fell only 30 places in the 2019 A-level rankings.  

By contrast, if an Independent school fell 100 places in the 2017 GCSE rankings, it subsequently fell by 41 places in the 2019 A-level rankings.  Both results are statistically significant at the 5% level.

And so, by using a national average to adjust for falls in GCSE performance, the government may have inadvertently penalised some non-Independent schools.  Basically: the algorithm may have assumed that some non-Independent schools’ A-level results would fall by more than they actually would have.
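To make the arithmetic of that penalty concrete, here is a small sketch. The 0.30 and 0.41 slopes are the figures from our modelling above; the 0.35 “national average” slope is a purely illustrative assumption (a blend of the two), since we do not know the exact conversion the algorithm used.

```python
# Illustrative arithmetic only. The 0.30 / 0.41 slopes come from our
# modelling; the 0.35 national-average slope is an assumed blend.
gcse_fall = 100                      # places fallen in the 2017 GCSE rankings

slope_non_independent = 0.30         # observed for non-Independent schools
slope_independent = 0.41             # observed for Independent schools
slope_national = 0.35                # hypothetical national-average conversion

predicted_fall = slope_national * gcse_fall       # what the algorithm would apply
actual_fall = slope_non_independent * gcse_fall   # what the data suggest

print(f"Adjustment applied: {predicted_fall:.0f} places")
print(f"Fall the data suggest: {actual_fall:.0f} places")
print(f"Excess penalty: {predicted_fall - actual_fall:.0f} places")
```

Under these assumptions, a non-Independent school would be marked down by roughly five ranking places more than its own history warrants.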

We went to comprehensive schools, and the calibre of education changed between GCSE and A-level (much smaller class sizes and fewer disruptive students). So a weaker correlation between GCSE and A-level results seems plausible.

But one big caveat: we’ve done this analysis using public data on average A-level and GCSE results. It might not hold using more granular pupil-level data.

So the Government could helpfully explain how the “prior attainment” adjustments would change if they used school-specific GCSE-to-A-level conversions, rather than national averages.


Description of the analysis:

  • We find all schools in England which meet the following criteria: (i) they have A-level results data for each year between 2016 and 2019, (ii) they have GCSE results data for each year between 2014 and 2017, (iii) they have at least 30 pupils in each of those academic years, and (iv) they meet some basic results quality assurance tests (e.g. report at least one result for English/Maths GCSEs).
  • For the full sample of schools (1,385) we then rank them for GCSE and A-level results in each academic year.
  • We then find their average A-level ranking between 2016 and 2018, and their average GCSE ranking between 2014 and 2016.
  • And we then find how their A-level ranking in 2019 (and GCSE ranking in 2017) compared to those averages.
  • For each type of school (Independent and non-Independent) we model the change in 2019 A-level ranking as a function of the change in 2017 GCSE ranking.
  • This shows that non-Independent schools’ A-level results were less sensitive to falls in GCSE performance.
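The regression step above can be sketched as follows. This is not our actual pipeline: the data here are synthetic (with the 0.30 and 0.41 slopes from the text baked in so the recovered fits are recognisable), and the column names are illustrative. It shows only the shape of the final step: fitting the change in 2019 A-level ranking on the change in 2017 GCSE ranking, separately for each school type.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)

# Synthetic stand-in for the school-level panel (the real analysis used
# 1,385 English schools; ~20% Independent is an assumption for illustration).
n = 1385
independent = rng.random(n) < 0.2
gcse_change = rng.normal(0, 100, n)   # change in 2017 GCSE ranking vs 2014-16 average
true_slope = np.where(independent, 0.41, 0.30)  # slopes reported in the text
alevel_change = true_slope * gcse_change + rng.normal(0, 40, n)

df = pd.DataFrame({"independent": independent,
                   "gcse_change": gcse_change,
                   "alevel_change": alevel_change})

# Fit a separate least-squares line for each school type (final step above).
for is_independent, group in df.groupby("independent"):
    slope, intercept = np.polyfit(group["gcse_change"], group["alevel_change"], 1)
    label = "Independent" if is_independent else "non-Independent"
    print(f"{label}: a 100-place GCSE fall implies a "
          f"{100 * slope:.0f}-place A-level fall")
```

With real data the slopes would of course be estimated, not assumed; the point is that a single pooled (national) slope would sit between the two group-specific ones.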

Full code: