The problem is we are seeing 10% differences and that realistically means one or more polls are very wrong.
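For scale, here's a rough sketch of why a 10-point gap can't be waved away as sampling noise. It assumes simple random samples of about 1,000 (typical for NZ public polls) and a 95% confidence level; real polls use quotas and weighting, so treat the numbers as illustrative only.

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p from a simple random sample of n."""
    return z * math.sqrt(p * (1 - p) / n)

# A single poll of ~1,000 has roughly a +/-3.1 point margin at p = 0.5
moe = margin_of_error(0.5, 1000)

# The *difference* between two independent polls of ~1,000 has a wider margin,
# because both samples contribute variance:
diff_moe = 1.96 * math.sqrt(2 * 0.5 * 0.5 / 1000)  # roughly +/-4.4 points
```

Even on the wider margin for a difference between two polls, a 10-point gap is well outside what sampling error alone can explain, which is why something else (methodology, house effects, or genuine volatility) has to be going on.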
It certainly seems that way - as I said I don't buy this argument that it's just about volatility.
I think we might have the unfortunate situation where the two public polls represent the extremes, and the private polls are between the two (remembering that I don't have inside knowledge of that). I don't think, for example, National would have taken the risk of putting out an attack ad if their polling showed them close to governing alone.
@ Russell - thanks for posting the interview with Patrick Gower. He's made some really useful points, and it largely confirms my assumptions. It sounds like they're using the online survey to target demographic groups that are hard to get in phone polls, which isn't quite the same as my preferred option of targeting those who don't have landlines. It's also important to note that demographic groups that are hard to get in phone surveys also tend to be hard to get in online surveys - young males are indeed less likely to participate in landline polls, but they're also harder to get in online polls.
Although there are two polls using the SSI panel, the quotas, weights and questionnaire design will be done by the research company - that's how two surveys using the same panel can provide very different results.
@ Jeremy Andrew - as you probably know, the reason they were looking for under 45s was almost certainly that their over 45 quota for men in your area was already full. That illustrates the point that Patrick Gower (sort of) makes - older quotas tend to fill up first. Therefore when you have a survey spread over several weeks (like the Newshub Reid poll before this one), it's likely that the older sample will have been covered at the beginning of the cycle and the younger participants towards the end. That's why a fairly short survey period is a good idea - although you should be suspicious of surveys completed very quickly because it often means the quotas haven't been very strict.
@Bart Janssen - I essentially agree with your comment that if polls don't reflect the actual result there's a problem with the polls, although it's only fair to point out that there can be last-minute changes, and polls can't take into account things that happen between when fieldwork finishes and election day. Those changes do tend to be small of course, but if a poll shows NZ First (as the classic example) 1% below their final result, that doesn't necessarily mean it was wrong - it could mean that a few voters decided at the last minute that they'd like Winston to stir things up.
@Walter Nicholls - On the other hand, I've found online polls useful for research on sensitive topics - people seem to be more willing to admit to potentially embarrassing things.
@Geoff Leyland - Yes online polls have to be based on big panels, and there are companies like SSI, Research Now and the Online Research Unit who make their livings out of building and maintaining them. They aim to make their panels as representative as possible, and invite people to participate in surveys based on the quotas they need to fill.
@Trevor Nicholls Look how well that turned out for the Australian right wing parties 🤣
@Russell Also with those Maori seat polls, we should remember the fieldwork dates (20 out of 37 days were before Jacinda became Labour leader) and that historically Maori seat polls have been relatively positive for the Maori Party. There was definitely one election where the Marae Digipoll showed them winning all 7 seats, which of course never happened.
Somebody stole my idea from 2014 😉 (using past accuracy to adjust current poll results). See this blog for how that worked out...
I doubt there's even any private polling Tom - it's too expensive. The only time you see legitimate electorate polls these days is in seats like Ohariu, which (until recently) could affect the nationwide result. Robos may exist, but they'll be unreliable as hell.
That's both shocking and not that surprising Trevor. In a similar vein, I recall that one of the resources I used for my 1996 thesis on turnout was a book of NZ election results from 1946 to 1990. The thing that sticks in my mind from that was that there were a lot of seats that hadn't changed hands since at least 1946 - and of course those tend to be the ancestors of seats that have never changed under MMP either.
From what I remember, I think the single best predictor of turnout under FPP was the margin of the seat at the last election. MMP lessens that variation, but it's still likely to be very important.
The Herald poll is by TNS, who used to do the TV3 poll (as CM and then TNS) before Reid. My recollection is that Reid split off from TNS. TNS are a big global company, so in theory they should be using a solid methodology, but the global brands haven't had a great record with NZ elections (e.g. Ipsos and before them Nielsen).
They surely asked a vote question, but I don't see the sense in delaying its release - by tomorrow it will have been more than a week since fieldwork finished, so it's not much use as an indicator. I think either they didn't consider it a credible number or they don't want to be seen as competing with the public polls for some reason. On the other hand, the fact that they haven't used the party vote as a demographic in either of the two articles to date suggests that maybe they don't have it.
Mikaere - what you're describing was probably not a push poll. I don't think they were contacting you to try to get you to change your mind (specifically), they were trying to find out what would change the minds of people LIKE you. I don't think any NZ political party has the money to be able to afford the number of calls that would be needed to do a true push poll, and I feel certain that Len's campaign didn't. Quite possibly, another group of participants in the same survey heard pro-Banks and anti-Brown messages.
Ianmac - yes there can be a feedback loop, and that's a big risk for the Greens in particular right now. If the polls start showing them consistently at 3.5%-4%, then people may see them as dead in the water and they could drop even further. On the other hand, if they poll 4%-6%, then those considering the Greens will know that every vote counts, and that could draw people in. Similarly, when the stories were all about Labour falling in the polls voters would have seen them as no-hopers and looked for other options, but now they're rising in the polls voters see they have a chance again and get on board.
Some might suggest that's a problem with the polls, but I don't think so. In the absence of polls, people would come to their own conclusions based on who they thought was winning or losing. If public polls were banned, then the stories would all be about the old Tauranga Boys High poll of Year 13 students, or about Paul the Octopus, or whatever 'prediction' was out there. Even if they're not perfect, they have some science behind them, so they're a lot better and more reliable than the alternatives.
As Bart Janssen says, there's very clearly a relationship between the closeness of the election and turnout (compare 2005 and 2014 as an illustration). That fits Downs' theory on rational voting, if I recall correctly (although he argued that voting was only rational if your vote was almost certain to decide the result, which is of course exceedingly unlikely).
Point of order Russell - I'm ex UMR NZ, but still do some work for UMR Au (separate company) 😉 You're right though, I'm not privy to the UMR NZ polls any more so what I'm saying is from an 'interested outsider' perspective.
As Andrew says, what people think of as push polling doesn't meet the definition. Push polling is about using a pretend poll to persuade enough voters to switch to your desired position. The sample sizes of NZ polls are too small to be anything other than a drop in the bucket in terms of the overall vote.
What Mikaere is referring to was probably a poll designed to work out which messages are most effective. The pollster probably asked for a vote up front, tested the messages and then asked the vote again - they would then have looked at who changed their minds, and which messages seem to have caused that change. The politician, company or organisation behind the poll then knows which messages to focus on - it's about influencing the views of tens of thousands of people, not the views of a single survey participant.
Although push polls do exist (and are against our Code of Practice), they're only really practical in contests where a very small number of votes can influence very big issues. With conventional methodologies you'd be talking tens of thousands of dollars to influence a few hundred votes - and remembering that not everyone contacted will change their minds, you'd need to push poll far more people than the number of votes you're trying to shift.
It's more feasible with robos I imagine, which is another reason not to trust them.
On your last point Russell, there's no way Reid surveyed all those people just for a preferred PM poll (which I strongly dislike mainly because they're usually skewed to the incumbent, which is what is interesting about the NZ ones at the moment). There must be at least a vote coming, and on the numbers so far it's bad news for the Maori Party.
Two problems with that.
1) Curia doesn't have a 200 line call facility - they're a pretty small agency.
2) Even with 200 interviewers, there's no way in hell you could do 50,000 surveys in an evening, especially not of the length that Russell posted. That's easily a 10 minuter if not more. You need to bear in mind response rates, time 'wasted' talking to people who don't qualify or don't want to participate, incorrect numbers...
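The back-of-envelope arithmetic makes the point. The assumptions below (a four-hour calling evening, ten minutes per completed interview) are my own illustrative figures, and they deliberately ignore every real-world loss - refusals, dialing time, wrong numbers:

```python
# Ceiling on completes: 200 interviewers, one evening, 10-minute surveys
interviewers = 200
evening_minutes = 4 * 60   # assume a 4-hour calling window, e.g. 6pm-10pm
survey_minutes = 10        # the length implied by the questionnaire posted

# Best case: every minute spent on completed interviews, zero overhead
max_completes = interviewers * evening_minutes // survey_minutes
# That's 4,800 - under a tenth of 50,000, before any real-world losses
```

Once you factor in response rates and wasted dialing, the realistic number is a fraction of even that ceiling.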
Besides, if you were going to target bloggers and get them to publicise your questions, why would you target ones like Russell who were likely to post them on left blogs where they were likely to get shredded? I'd be more likely to screen out journos & bloggers than to have them in.
It's not a push poll - they just won't reach enough people for it to be worthwhile and actually the questions aren't all that leading. It is, however, an astounding poll for this stage of the campaign. That, like the wrap-around blue infestation on the Herald and Stuff websites today and the copious billboards suggests that they have an unbelievable amount of money at their disposal. The financial disparities at this election really are stark, and perhaps, more than anything, they explain the left's difficulties in getting cut-through.
It's closer than you might think. Here's my poll of polls, correcting for the difference between the actual vote and the final polls at the last few elections. UMR polling is included in the mix. http://sayit.co.nz/blog/its-crunch-time
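I don't want to claim this is exactly the adjustment used on the blog, but the general idea of correcting for past accuracy can be sketched simply: work out each pollster's average signed error at previous elections, subtract it from their current number, then average. All figures below are made up for illustration.

```python
def house_effect(past_final_polls, actual_results):
    """Average signed error (poll minus actual) for one pollster across past elections."""
    errors = [poll - actual for poll, actual in zip(past_final_polls, actual_results)]
    return sum(errors) / len(errors)

def adjusted_average(current_polls, house_effects):
    """Poll of polls: subtract each pollster's historical bias, then average."""
    adjusted = [current_polls[name] - house_effects[name] for name in current_polls]
    return sum(adjusted) / len(adjusted)

# Hypothetical numbers for a single party over two past elections:
effects = {
    "Pollster A": house_effect([47.0, 48.5], [45.0, 47.0]),  # runs ~1.75 points high
    "Pollster B": house_effect([43.0, 45.5], [45.0, 47.0]),  # runs ~1.75 points low
}
estimate = adjusted_average({"Pollster A": 44.0, "Pollster B": 41.0}, effects)
```

Note how two pollsters three points apart converge once their historical biases are stripped out - which is why a corrected poll of polls can look closer than the raw headline numbers suggest.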