In last night's debut Media Take, I talked to Colmar Brunton's Andrew Robertson and UMR's Gavin White about an issue that has been exercising polling nerds for the past few weeks: the significance of the "undecided" vote in polls this year.
In particular, the Political Scientist blog has proposed that the real story in Fairfax/Ipsos polls -- which seem to have shown a crash in Labour support and a flood towards National -- is the steady downward trend in decided voters. So, perhaps, what's actually happening is uncertainty on the Left about whom to vote for, and steady support for the one big party on the Right.
On the show, Andrew and Gavin agreed with Thomas Lumley at StatsChat that the conclusion is limited by the small number of polls (nine) counted by Political Scientist. Thomas concluded:
We simply don’t have data on what happens when the decided vote goes up — it has been going down over this period — so that can’t be the story. Even if we did have data on the decided vote going up, and even if we stipulated that people are more likely to come to a decision near the election, we still wouldn’t have a clear story. If it’s true that people tend to come to a decision near the election, this means the reason for changes in the undecided vote will be different near an election than far from an election. If the reasons for the changes are different, we can’t have much faith that the relationships between the changes will stay the same.
The data provide weak evidence that Labour has lost support to ‘Undecided’ rather than to National over the past couple of years, which should be encouraging to them. In the current form, the data don’t really provide any evidence for extrapolation to the election.
But Andrew and Gavin also agreed that the fact that media organisations either bury the undecided vote or don't report it at all is a problem.
When undecided voters (who unfortunately aren't all counted the same way by the different polling firms) approach a quarter of the sample, ignoring their existence leaves out a key part of the picture. And apart from anything else, the polling companies (with the exception of Roy Morgan, the black box of New Zealand political polling) all signed up this year to the New Zealand Political Polling Code, which obliges them and their media clients to prominently report the undecideds.
... that polls can actually be designed to try to maximise the number of undecideds.
My view is that non-response is probably the most important source of error for political polls. Part of the problem is that the average person is not obsessed with politics, which makes them harder to survey (because they are less inclined to take part in a poll). By targeting as high a response rate and as low a refusal rate as possible, polls are trying to maximise coverage of non-politically-obsessed people.
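The effect of burying the undecideds is easy to see with hypothetical numbers (these figures are invented for illustration, not drawn from any actual poll). When a quarter of respondents are undecided, the headline percentages are calculated on the decided voters only, which inflates every party's apparent support:

```python
# Hypothetical poll: 1000 respondents, 250 of them (25%) undecided.
total = 1000
undecided = 250
labour_raw = 240  # respondents naming Labour: 24% of everyone polled

# Headline numbers are usually reported as a share of decided voters only.
decided = total - undecided
labour_of_decided = labour_raw / decided * 100

print(f"Labour: {labour_raw / total * 100:.0f}% of all respondents")
print(f"Labour: {labour_of_decided:.0f}% of decided voters (the headline number)")
```

On these made-up numbers, the headline figure (32%) overstates firm support (24%) by a third -- which is why reporting the size of the undecided pool alongside the headline matters.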
On the show, we discussed a number of other dimensions of polling and the media, including the fact that journalists shouldn't hang their hats on decimal-point differences in successive polls.
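A back-of-envelope calculation shows why decimal-point movements are noise. The standard 95% margin-of-error formula below assumes a simple random sample (real polls weight their data, so treat this as a rough sketch, not how any particular firm reports error):

```python
import math

def margin_of_error(p, n):
    """Approximate 95% margin of error (in percentage points) for a
    proportion p observed in a simple random sample of size n."""
    return 1.96 * math.sqrt(p * (1 - p) / n) * 100

# A typical 1000-person poll, with a party polling at 45%:
print(f"+/- {margin_of_error(0.45, 1000):.1f} points")
```

At around plus-or-minus three points, a shift of a few tenths of a point between successive polls tells you nothing at all.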
And I was able to tell the world that Maori Television's news and current affairs division is poised to become a player this year. It will be polling all seven Maori electorates, at least two of which (Te Tai Tokerau and Waiariki) may be critical at a national level.
This is a big step up from the only previous regular insight into those electorates, the Marae Digipoll polls, whose sampling methods (1000 Maori voters nationally, only two thirds of them on the Maori roll) leave them drawing only "indicative" conclusions in the individual Maori electorates, where the samples will often be fewer than 100 respondents.
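The arithmetic behind that "indicative" caveat is straightforward. A sketch, using the sample figures above (the even split across electorates is my simplifying assumption; actual subsamples will vary):

```python
import math

# The national Digipoll sample, spread across the seven Maori electorates.
national_sample = 1000
on_maori_roll = national_sample * 2 / 3   # roughly 667 on the Maori roll
per_electorate = on_maori_roll / 7        # roughly 95 respondents each

# Rough 95% margin of error at p = 0.5 for a subsample that size:
moe = 1.96 * math.sqrt(0.25 / per_electorate) * 100
print(f"~{per_electorate:.0f} respondents per electorate, +/- {moe:.0f} points")
```

A margin of error around ten points is far too wide to call a close electorate race, which is why dedicated electorate-level polling is such a step up.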
You can watch all that from about the 13-minute mark in the on-demand version of Media Take. But, of course, you should watch the whole thing!