National Assessment Program – Literacy and Numeracy (NAPLAN) results for 2014 were released recently. I was pleased to see a focus on the ‘high gain’ schools – those that are succeeding in helping their students to move forward at a more rapid rate than expected.
NAPLAN is a snapshot of student performance on one day and under certain conditions. As such, it has some limitations in terms of reliability for individual students. At the whole-school level, however, it provides useful diagnostic information that schools can use to decide where they need to focus their attention.
Inevitably, however, some media outlets and some schools will use this data in ways other than those intended. At its worst, this will be an attempt to make comparisons between schools based on absolute levels of performance. Apart from the fact that NAPLAN authorities do not condone this practice, it can also be highly misleading.
At question here is what makes a ‘good school’ in terms of student achievement. Is it one with high results or one that helps students to improve more than would be expected?
Ideally, I’m sure, parents would like their children’s school to be both of the above, but in reality this is not always the case. The schools with high results are very predictable. They’re the ones serving families of high socio-economic status, whether by good geographic fortune or through recruitment of students. This is true for schools in all sectors – government, Catholic and independent – and also for VCE results.
The measures most often used to represent the quality of teaching and learning in a school are limited and, at times, misleading. The default is absolute levels of achievement and averaged results. Anyone who scans the newspapers after VCE results are released will understand how some schools use those results to make themselves look good by inviting comparison with other, less advantaged schools. It’s both simplistic and divisive. There is a risk that NAPLAN could be used in the same way.
Parents are very interested in those results, along with other aspects of the school. But they should also be interested in what the results don’t show. Teachers understand their students’ learning and needs far better than a NAPLAN test does; after all, they see them and assess them every day. School reports and feedback from teachers should be valued alongside NAPLAN results.
Similarly, VCE results should be viewed in context. Does the school enrol all students or only those with certain levels of academic ability or ability to pay fees? Do the averaged results in a lower performing school mask the capacity of that school to support students to achieve very high results? Are students achieving to their ability?
Parents might reasonably expect more information. Every school, for example, has access to high-level data from the Victorian Curriculum and Assessment Authority that compares VCE achievement against predicted results, and many schools commission their own independent analysis of results.
One of the great difficulties we have when we evaluate educational performance is that measures of ‘value add’ are not always easy to represent. It’s time we developed these further and valued them alongside raw scores and averaged results, particularly given Australia’s concern with the flatlining performance of its higher-ability students and the long tail in our results.
Parents make schooling choices based on a wide range of factors, including results, and we do them a disservice with much of our current representation of those results. We need a better shared understanding of what a ‘good school’ looks like.