Learning nothing in education

Dr Eric Crampton
The National Business Review
27 April, 2018

The best thing about the National government was its focus on evaluation.

Finance Minister English often reminded us that a large portion of government spending was probably wasted. Figuring out what worked and what did not would let the government shift resources to get the most value out of every dollar spent.

Mr English’s investment approach coupled big data analytics with policy experimentation to find ways of improving the lives of those receiving government services and transfers while also improving the government’s long-term fiscal position.

The approach really was world-leading, thanks in no small part to the excellent data held by Statistics New Zealand linking up administrative records from agencies dealing with social welfare, education, justice, tax, health and more.

That makes the final evaluation report on partnership schools all the more baffling. After reading Martin Jenkins’ report, I am left none the wiser about whether partnership schools were successful.

A government with access to the world’s best microdata, and with a strong stated commitment to evaluation, simply failed to commission or undertake any real assessment of its experiment with partnership schools.

Whether or not the schools actually improved outcomes for the students they taught, an experiment without measurement and evaluation is a failure.


Not rocket science
It is not as though this kind of evaluation work is either rocket science or unprecedented. America’s thousands of charter schools spawned a small academic industry of rigorous assessment. By my rough count, there are about 80 published academic studies of charter schools for every 100 American charter schools – and that counts only those studies published in economics books and journals.

Evaluation in that literature is taken seriously. Students are often admitted to oversubscribed charter schools through lotteries. Researchers can then compare outcomes for lottery winners and those who wanted to attend but could not. In those evaluations, charters do far more to help poor minority kids in urban schools than they do for kids from richer backgrounds. Charters thereby help to narrow the outcomes gap.

Charters also help the system figure out what works and what doesn’t. New entrepreneurial schools try new methods. When that is coupled with rigorous assessment to figure out which of those methods work, even students in public schools can reap the benefits when their schools adopt better methods. And competition from charter schools can drive sharp increases in public school performance.

It is because of the successes reported in that literature that New Zealand trialled partnership schools in the first place.


Absence of guidance
The ACT Party proposed partnership schools as a way of improving outcomes for kids trapped in state schools that were failing them. National’s coalition deal with ACT after the 2011 election allowed partnership schools in areas with more entrenched educational underachievement.

But the coalition deal did not bring with it any rigorous evaluation framework.

In February, I asked the Ministry of Education for the final Martin Jenkins evaluation report on charter schools, along with the terms of reference for the report and any records from any meetings of the ministry’s senior leadership team in which the report was discussed.

The report was released earlier this month. It provides a lot of background data on the students who attend partnership schools, and survey data showing that parents are rather satisfied with their schools. But neither of those really measures how the schools have performed.

Evaluation should have been built into the initiative from day one.

Admission to charter schools should have incorporated a lottery element so that outcomes could have been more fairly compared.

But even absent lotteries, New Zealand’s rich administrative data provides ample opportunities for better evaluation. For every student who switched to a partnership school, an analyst might have found a statistically similar student who stayed with the initial public school – or who switched to a different public school. Outcomes like performance on externally invigilated NCEA standards could have been compared across the groups.

Or, even more simply, outcomes could have been compared for partnership school students and public school students after adjusting for all of the family background characteristics we know matter in educational performance.

But none of that was in the evaluation’s terms of reference. So you might not be surprised that there were no meetings of the ministry’s senior leadership team at which the report was discussed. There was nothing really to learn from the report.

David Seymour was the MP responsible for partnership schools under the previous government. When I asked why there had been no rigorous evaluation of the partnership schools, he was livid. He told me he had desperately wanted proper evaluation, and cited some good evaluation models, but said those efforts had been blocked. He had been told it was not necessary because a better evaluation framework was coming for all schools and so no partnership-specific evaluation was needed. But that framework never came.

The government conducted an educational experiment without learning from it. We do not know whether students at the different partnership schools did better, worse, or about the same as comparable students in traditional schools.

That is even worse than running an experiment and finding out that all of the schools failed. If the schools had failed, and we knew it, we would at least know that much. We might then start teasing out which parts succeeded and which ideas really did not pan out. But we do not know even that. Neither can we celebrate any successes among the schools that did improve outcomes for their students.

Some of that work can still be done. Data that could tell us which schools did well still sits with Statistics New Zealand, though the model will have ended by the time any analyst looks at it.

Since losing office, National has championed partnership schools against Labour’s changes. It is a real shame National did not build the kind of evaluation framework that could have formed the basis for a more successful defence of the schools – or that could have told us that defence was not warranted.

National’s investment approach emphasised the importance of evaluation in figuring out what works. If only they had applied that approach to the partnership schools.