The stakes are getting higher

President’s comment 1 May 2012

This week has seen renewed media interest in NAPLAN, following The Courier-Mail’s amalgamation of publicly available school data into an online “schools guide” and the launch of the national “Say no to NAPLAN” campaign run by a group of education consultants, academics and retired teachers.

The two approaches, launched on the same day, could not be more different.

The newspaper has repackaged mostly MySchool data to generate “school report cards”. It claims it has not produced “league tables”, then proceeds to report on the “top-performing” schools and publish comments such as “analysis of the state high school sector shows it struggled in comparison with its private counterparts with only eight making the top 100”. It’s difficult to see why The Courier-Mail has poured its resources into this database. The paper claims it is to support parents in choosing the best schools for their children, but how is it helpful for those parents to use the database to generate a “report card” that compares a primary school in Bundaberg with a high school in Bowen with a special school in Brisbane?

The “Say no to NAPLAN” campaigners are encouraging Australian parents to withdraw their children from testing, highlighting that NAPLAN data leads to simplistic and misleading comparisons of schools and that the high stakes attached to the testing narrow curriculum delivery and place undue emphasis on preparing for the point-in-time test.

ACARA has responded by claiming there is no need for schools to spend time on having students practise for upcoming tests. But schools are now publicly named and shamed on their NAPLAN results, and the Federal Government proposes that the results should influence the allocation of teacher bonus pay, so it is hardly surprising that schools do spend time (that would otherwise be spent on the “rich learning” that ACARA promotes) on ensuring children know exactly how to colour in the circles on the testing material. A student who ticks or circles every correct answer, rather than precisely colouring in the corresponding circles, will score “zero” on NAPLAN; and the implications for schools are becoming serious indeed.

Since NAPLAN testing was introduced in 2008, some things haven’t changed. The test is still one very small part of the education picture, assessing a student’s performance over a few days in only four of the 13 years of formal schooling; it is a point-in-time test of a national curriculum that is not yet embedded; it does little to measure a school’s “improvement” over time unless the same cohort of students is tested over time, which is unlikely in areas with highly transient populations; and there remains no political will to dispense with NAPLAN, because the testing regime imposed on state governments during the Howard era still has significant federal funding tied to it.

What has changed is how NAPLAN data is used to judge schools, principals, teachers and students. Numbers in their raw form do not lie, but interpretations of those numbers can, and do, mislead.