Ian Mulheirn is Director of the Social Market Foundation
Choice is currently limited by capacity constraints. New places have not expanded sufficiently, thanks to tight control of admissions numbers by local education authorities (LEAs), restricted capital funding from government for new schools, and the higher birth rate in recent years coupled with continued immigration. In any case, HM Treasury needs to be convinced that taxpayers’ money is being spent efficiently on maintaining spare capacity in schools – which is essential for choice to operate. That is doubtful, especially in straitened times.
But even where they can exercise a real choice, parents may not be using the best available information to identify the best school in their area. Unless parents make informed choices, a market will not deliver better outcomes.
Parents tend to base their decision on a mix of factors, but the proximity of the school to home and its exam results are the most important. When interpreting exam performance, most parents use the proportion of pupils achieving five A*–C GCSEs including English and mathematics – as presented in league tables – to judge how ‘good’ the school is. But is this ‘raw’ measure a sound basis for making such an important decision? Good results may be a reflection of the aptitude of a school’s intake rather than evidence of great teaching.
That’s why the Value Added (VA) score is a better indicator of quality: it measures how pupils perform on average in their eight best GCSE subjects compared with the progress expected of them given their Key Stage 2 results in English and maths. In other words, the measure discounts prior attainment – the product of the child’s aptitude, home environment or previous school – allowing us to isolate how much value their secondary school added. If you’re looking for the school that is most likely to help your child fulfil their potential at exam time, the VA score is what you should focus on.
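A stylised calculation makes the idea concrete. The sketch below is not the DfE’s actual methodology (which uses national point-score tables and median lines); it simply illustrates the principle of comparing each pupil’s best-eight score against the score expected for pupils with the same Key Stage 2 prior attainment. All function names and figures are hypothetical.

```python
# Stylised value-added sketch (NOT the DfE's exact methodology; the band
# labels and point scores here are hypothetical illustrations only).
from collections import defaultdict
from statistics import mean, median

def expected_scores(cohort):
    """Median best-eight GCSE points for each KS2 prior-attainment band,
    computed over the national cohort: a list of (ks2_band, best8_points)."""
    by_band = defaultdict(list)
    for band, points in cohort:
        by_band[band].append(points)
    return {band: median(points) for band, points in by_band.items()}

def school_va(pupils, expected):
    """Mean of (actual - expected) best-eight points for one school's pupils.
    Positive means pupils progressed further than similar pupils nationally."""
    return mean(points - expected[band] for band, points in pupils)
```

Under this toy scheme, a school whose pupils all land exactly on the national median for their band scores zero; a positive VA means above-expected progress regardless of how able the intake was.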
So are the raw scores a good indicator of school quality, as measured by value added? To investigate this, my colleague Nida Broughton and I extracted from the Department for Education performance tables the raw scores and VA scores of all comprehensives in England for which information is available for GCSEs taken in the 2011-2012 academic year. We looked at the correlation between the two measures: if raw scores are a good guide to school quality, we would expect to see a close relationship between the two. The results are presented below.