Is New Zealand’s Partnership School Kura Hourua scheme really a policy ‘shambles’?
The great risk in trying something new is that you will fail.
That’s a risk that applies to policy programs too, where even a perception of failure can begin to sound the death knell.
One program under this kind of scrutiny is New Zealand’s Partnership School Kura Hourua (PSKH) scheme. PSKHs are publicly funded free schools, similar to United States charter schools, whose purpose is to provide an alternative to regular state schooling for pupils at high risk of poor outcomes, including Maori, Pasifika and special needs pupils.
Since opening at the beginning of the 2014 school year, the schools have attracted much criticism. The nature of that criticism shows why it is important for social science to inform debates about policy and politics, so that programs can be meaningfully evaluated and decisions about their effectiveness grounded in reasonable evidence.
As has been argued elsewhere on Policy Forum, social science can improve available information to reduce uncertainty, ensuring policymakers, and others, can make informed choices between different options.
To date, public debate about the performance of PSKHs, and their effectiveness at improving pupils’ achievement or engagement at school, has not been informed by social science research. Critics have not even been asking the right questions to work out whether the schools are a success or not.
Critics have focused their attention on the number of pupils who have attended PSKHs, pouncing on fluctuations below schools’ minimum roll numbers because they believe that they show PSKHs are over-funded relative to regular state schools. They also claim roll fluctuations are evidence that PSKHs are a ‘shambles’, arguing the program is failing disadvantaged children. Instead, they want the $27.4 million of government money allocated to PSKHs to be spent on initiatives to ameliorate the effects of poverty and social status on regular state school pupils’ achievement.
These criticisms cannot be justified from what is known about PSKHs’ performance so far. For one, claims about over-funding were calculated by dividing each PSKH’s funding for the entire six-year life of its government contract by its 2014 minimum roll. But as PSKHs’ rolls will increase each year, per-pupil funding will fall and become comparable to that of regular state schools.
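The arithmetic behind this point can be sketched with invented figures. All numbers below are hypothetical, chosen purely for illustration (they are not the actual PSKH contract amounts or roll numbers): dividing a whole six-year contract by the first-year minimum roll makes per-pupil funding look several times higher than the per-pupil-year figure once rolls grow.

```python
# Hypothetical illustration of the funding-per-pupil arithmetic.
# All figures are invented for demonstration only.

contract_funding = 6_000_000   # total six-year contract (hypothetical)
year_one_roll = 50             # minimum roll in year one (hypothetical)
final_year_roll = 200          # roll once fully grown (hypothetical)

# The critics' calculation: the whole contract divided by the first-year roll.
critics_per_pupil = contract_funding / year_one_roll
print(critics_per_pupil)       # 120000.0 per pupil - looks wildly over-funded

# Spread over six years, with the roll growing linearly to the final-year
# figure, the cost per pupil per year is far lower.
rolls = [year_one_roll + (final_year_roll - year_one_roll) * y // 5
         for y in range(6)]    # rolls for years 1..6
pupil_years = sum(rolls)       # total pupil-years funded by the contract
per_pupil_year = contract_funding / pupil_years
print(round(per_pupil_year))   # 8000 per pupil per year
```

The point is not the specific figures, which are invented, but that the two calculations differ by an order of magnitude for the same contract.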
Moreover, complaints about student numbers don’t add up. In the July school roll returns there was an 11-pupil difference from the expected numbers, across five PSKHs. That could hardly be described as a shambles. The October roll returns perhaps raise more questions, with 30 fewer pupils in attendance. However, some of the variance can be explained by PSKHs succeeding at their mission. One school, for example, graduated two of its pupils who had already gained their exit qualification during the year and had been offered service in the New Zealand Army. Keeping these pupils at school would not have helped them further.
Critics should also temper their accusations, acknowledging that PSKHs target pupils who may have major problems with school engagement. This probably accounts for some of the variance.
Instead of putting the program into the dunce’s corner, critics should have been asking how PSKHs are performing relative to the regular state schools that PSKH pupils would have been likely to attend. The roll fluctuations, and claims that PSKHs have failed pupils, do not make sense unless they are put in the context of roll changes at similar schools, and of how pupils may have fared at these schools. Yet little attention has been paid to the contextual issues that might show whether PSKHs are in fact failing.
From what is known about charter schools’ impacts elsewhere, the critics could have also discussed what the relative impacts could be of other interventions to address the underachievement of disadvantaged pupils. Posing these sorts of questions could have helped decision-makers, families and the general public to better judge the value of PSKHs.
In my PhD research, which includes four PSKHs, I am attempting to answer some of these kinds of questions. My research evaluates PSKHs as a kind of educational social entrepreneurship venture. Social entrepreneurs use a business model to provide, or support the provision of, public goods and services to under-served, disadvantaged communities, to improve their welfare by creating opportunities that may not have existed before. I want to see whether educational social entrepreneurs can thereby create value, educational or otherwise, where regular state schools may not.
My findings to date show that a range of indicators, which take into account the ways PSKHs might affect outcomes besides educational ones, is required to properly understand the effects of ventures like PSKHs. To judge PSKHs according to the same standards as regular schools—or according to superficial indicators, such as roll returns—would likely miss important ways in which these ventures may create value for the pupils and communities which they serve.
Social science methods can be used to understand programs better, and to provide information that reduces uncertainty when decisions have to be made about which programs are worth taking a risk on. This is an important way that academic research can, and should, inform the design and evaluation of public policy. As social science methods have not yet been used to evaluate PSKHs, they should not be judged failures: the jury is still out until the evidence is in. The scheme will only be a failure if we know for certain that it is, or if people simply keep assuming it is.