(Disclosure at the outset: Polity’s clients include Justin Lester’s campaign.)
As readers know, Justin Lester won Wellington’s mayoralty over Nick Leggett by 56% to 44% on the final STV count, with turnout up around five points to 45%.
Now that the final results have been released, and the Wellington City Council has tried to claim credit for a five point turnout bump on the basis of their poster design (!), we can look a little at the data to evaluate what happened.
This is not a post gloating about a victory. Being involved with Labour means I’ve been on the other side of election nights plenty of times, mourning losses rather than celebrating wins. All three top-placed Wellington campaigns ran a professional operation, their candidates would have been highly competent mayors, and they should all be proud of their efforts.
But I do want to make four points about modern analytics in this kind of campaign. They’re all about polling.
Fellow anoraks may recall that two internal polls leaked a few weeks before the election. One was from David Farrar’s Curia, and showed the Lester-Leggett race was neck and neck on first preferences, with Leggett winning out on lower preferences. The second, from a firm called Community Engagement, showed Lester well ahead on first preferences and with a solid lead after preference distribution as well.
As we’ve seen from the official results, the election broadly played out in line with the Community Engagement poll’s predictions.
Critically, the Community Engagement poll also asked who voters would prefer if the race came down to a Lester vs Leggett runoff, which ultimately it did. (It asked the same thing for Lester vs Coughlan and Leggett vs Coughlan.)
This is the most important question to ask in a multi-candidate STV election.
The table below shows the raw results of that question, the results of that question for decided voters only, and the final Lester vs Leggett round of the election.
As you can see, the poll was within about two points of the final runoff result. That’s a pretty accurate result.
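The "decided voters only" column is just the raw runoff shares renormalized after dropping the undecideds. A minimal sketch of that arithmetic, using hypothetical numbers rather than the actual poll figures:

```python
def decided_only(shares: dict[str, float]) -> dict[str, float]:
    """Renormalize raw poll shares over decided respondents only:
    drop the undecideds and rescale the rest to sum to 100."""
    decided = {c: s for c, s in shares.items() if c != "Undecided"}
    total = sum(decided.values())
    return {c: round(100 * s / total, 1) for c, s in decided.items()}

# Hypothetical raw runoff shares (illustrative only, not the poll's numbers):
raw = {"Lester": 46.0, "Leggett": 38.0, "Undecided": 16.0}
print(decided_only(raw))  # {'Lester': 54.8, 'Leggett': 45.2}
```

It's that decided-only figure, not the raw one, that is comparable to the final Lester vs Leggett round, since every ballot still in play at that stage expresses a preference between the two.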
Four thoughts on this.
First, some people dismissed the Community Engagement poll because it is run by a person known to be sympathetic to Labour. That's strange logic: it would also imply that all of David Farrar's work for National is inaccurate simply because Farrar likes National. I think Farrar's work is much better than that.
Second, the National Business Review accused the Community Engagement poll of being “bogus,” and more or less suggested the firm was a fiction. So much for the “bogus” part. And on the claim the company is made up, I’d suggest the NBR checks with the Victorian ALP, who use Community Engagement a lot. The NBR really needs to learn to think before repeating and amplifying the delusional ravings of Mr C Slater.
Third, it’s worth noting that the Community Engagement poll used automated touchtone polling known as “robopolling” rather than live operator polling. Some think the cheaper technology involved in robopolling leads to lower quality data and less accurate results. I refer those people to the table above, and also to the fact that many of the top overseas pollsters (e.g. ReachTel in Australia) have recently moved towards robopolling.
Fourth, one advantage of Community Engagement’s poll was that it only surveyed the people likely to vote in the election. Turnout in local body elections is normally under 50%, and the half that votes is not at all like the half that doesn’t. On average, local body voters are a lot older, more female, richer, and whiter than the non-voters.
Because of that, it’s critical to talk to the right people when you’re polling (and also when you’re a political party making voter contacts).
Curia’s poll, on the other hand, surveyed everyone eligible to vote in the election. In a low turnout election, that’s bound to mislead. I think in this case some lower-information folk recognized Leggett’s name from his billboards, bus backs, and so on, told Curia he had their support on the basis of that name recognition, but were never likely to cast a ballot.
The lesson? Likely voter screens are hugely important in lower turnout elections. In fact, as turnout in Parliamentary elections has dipped into the 70s in recent cycles, they’re becoming ever more important in those elections, too.
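The mechanism is easy to see with toy numbers. A rough sketch (entirely hypothetical figures, not drawn from either poll) of how sampling all eligibles rather than likely voters can overstate a candidate whose supporters tend not to turn out:

```python
# Two groups of eligible voters with different candidate preferences
# and very different turnout probabilities (all numbers hypothetical).
groups = [
    # (share of eligibles, support for candidate A, turnout probability)
    (0.5, 0.40, 0.70),  # high-turnout group: older, leans against A
    (0.5, 0.60, 0.25),  # low-turnout group: younger, leans toward A
]

# Poll of all eligibles: each group counts by its population share alone.
all_eligible = sum(pop * support for pop, support, _ in groups)

# Likely-voter poll: each group counts by population share * turnout.
votes_cast = sum(pop * turnout for pop, _, turnout in groups)
likely_voters = sum(
    pop * support * turnout for pop, support, turnout in groups
) / votes_cast

print(f"Candidate A among all eligibles: {all_eligible:.1%}")   # 50.0%
print(f"Candidate A among likely voters: {likely_voters:.1%}")  # 45.3%
```

With these made-up figures, polling everyone shows the race tied while the likely electorate actually breaks nearly 55-45 the other way — the same shape of error I'm suggesting afflicted the all-eligibles sample here.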