SAYit Blog
What do political polls tell us about the election?

It's election year, so we're going to hear a lot about polls.  While it's only a matter of time before politicians trot out that hoary old chestnut about the only poll that counts being the one on election day, by and large polls in New Zealand have been pretty good at picking election results.  Well, I would say that, wouldn't I?

In this blog I'm going to take a look at the final results from the mainstream polls from the 2011 election, along with the results of UMR's own last poll from that campaign.  I'll also look back at some of the past elections to see how the trends stand up over time.

In New Zealand, there are five major media polls, plus a few others (such as ours) that are done privately.  The five major media polls now are:

  • One News Colmar Brunton
  • NZ Herald Digipoll
  • TV3 Reid Research
  • Fairfax Ipsos
  • Roy Morgan

The first four of those, and UMR (along with one of the private polls), are all members of the New Zealand Association of Market Research Organisations, and recently signed up to an agreed set of guidelines on methodology and reporting.  In theory at least they're all much of a muchness, but there will inevitably be differences in the exact questions asked and in how they ensure the survey sample is as representative as possible.  All those surveys have margins of error of between +/- 3.1% and +/- 3.6%.  I'm not privy to exactly how the other companies ensure their samples are representative, and I'm not going to share our exact methods with you - we all jealously safeguard those because they can be points of competitive advantage.
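For readers curious where figures like +/- 3.1% to 3.6% come from: they're the standard 95% margin of error for a simple random sample at 50% support, which works out to samples of roughly 750 to 1,000 respondents.  A quick sketch (the sample sizes here are my illustration, not any company's actual figures):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a simple random sample of size n,
    at an observed proportion p (the worst case is p = 0.5)."""
    return z * math.sqrt(p * (1 - p) / n)

# At 50% support, samples of ~1,000 and ~750 bracket the quoted range:
print(round(margin_of_error(1000) * 100, 1))  # 3.1
print(round(margin_of_error(750) * 100, 1))   # 3.6
```

Note that the margin of error shrinks for smaller proportions - for a party polling around 5% it's closer to +/- 1.4% on a sample of 1,000 - which matters when judging the minor-party numbers below.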

Four of those five polls were around at the 2011 election, the exception being Fairfax (then conducted by Research International).  Although some on the left-wing blogs have been critical of the Fairfax poll on the grounds that it was a long way out in 2011, I think that's manifestly unfair, as Ipsos weren't doing it then.  That's like criticising Cadbury for the taste of a Peanut Slab.  The most we can say about the Fairfax Ipsos poll in 2014 is that we don't know how it stacks up historically.

In terms of the polls above, and indeed ours, it's fair to say that they were all reasonably close.  Every one of them showed National close to governing alone, Labour in the 20s and the Greens over 10%.  While some were clearly better than others, by and large they produced results that were a reasonable indication of what actually happened.

Let's start by looking at National's vote.  In every case, I've taken the company's final published poll:

  • Actual result: 47.3%
  • UMR: 48.6%
  • One News Colmar Brunton: 50.0%
  • Herald Digipoll: 50.9%
  • Roy Morgan: 49.5%
  • TV3 / Reid Research: 50.8%
  • Fairfax / Research International: 54.0%

Now Labour:

  • Actual result: 27.5%
  • UMR: 28.2%
  • One News Colmar Brunton: 28.0%
  • Herald Digipoll: 28.0%
  • Roy Morgan: 23.5%
  • TV3 / Reid Research: 26.0%
  • Fairfax / Research International: 26.0%

And the Greens:

  • Actual result: 11.1%
  • UMR: 12.4%
  • One News Colmar Brunton: 10.0%
  • Herald Digipoll: 11.8%
  • Roy Morgan: 14.5%
  • TV3 / Reid Research: 13.4%
  • Fairfax / Research International: 12.0%

Lastly, the only other party to pass or come close to the threshold, New Zealand First:

  • Actual result: 6.6%
  • UMR: 6.0%
  • One News Colmar Brunton: 4.2%
  • Herald Digipoll: 5.2%
  • Roy Morgan: 6.5%
  • TV3 / Reid Research: 3.1%
  • Fairfax / Research International: 4.0%

I won't go through the final results for the smaller parliamentary parties, but for each of them it's a pretty mixed picture with some polls picking too high and some too low (e.g. the range for ACT was 0.7% to 1.8%, versus an actual result of 1.1%).
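If you want to check my working, the gaps can be tallied directly from the tables above (poll minus actual, so a positive number means the poll had the party too high):

```python
# 2011 actual results and final published polls, from the tables above.
actual = {"National": 47.3, "Labour": 27.5, "Greens": 11.1, "NZ First": 6.6}

final_polls = {
    "UMR":                     {"National": 48.6, "Labour": 28.2, "Greens": 12.4, "NZ First": 6.0},
    "One News Colmar Brunton": {"National": 50.0, "Labour": 28.0, "Greens": 10.0, "NZ First": 4.2},
    "Herald Digipoll":         {"National": 50.9, "Labour": 28.0, "Greens": 11.8, "NZ First": 5.2},
    "Roy Morgan":              {"National": 49.5, "Labour": 23.5, "Greens": 14.5, "NZ First": 6.5},
    "TV3 / Reid Research":     {"National": 50.8, "Labour": 26.0, "Greens": 13.4, "NZ First": 3.1},
    "Fairfax / Research Intl": {"National": 54.0, "Labour": 26.0, "Greens": 12.0, "NZ First": 4.0},
}

# Signed error for each poll and party (positive = poll too high).
for company, poll in final_polls.items():
    errors = {party: round(poll[party] - actual[party], 1) for party in actual}
    print(company, errors)

# Every single poll had National too high; the average gap across the six is:
mean_national = sum(p["National"] for p in final_polls.values()) / len(final_polls) - actual["National"]
print(round(mean_national, 1))  # 3.3
```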

So what can we learn from all of that?  First and foremost, while as I say some polls are closer than others, by and large they provided a reasonable picture of what actually happened. The two big differences for me, however, are:

  • National didn't get enough votes to govern alone, despite all five public polls suggesting that they would (49.5% would almost certainly have been enough for them to govern alone, because of 'wasted' votes cast for parties that didn't get seats in parliament).
  • Only half the polls picked NZ First getting over the threshold, and the three polls that didn't were all out by more than the margin of error. 

It's particularly interesting to note that all six polls listed, including our own, had National too high.  Three of them were out by more than the margin of error.  That's not what we'd expect from probability theory.  There are really only two explanations: either there's a systematic skew towards National in the polls, or National shed votes in the last few days - all of these polls closed at least a few days (though less than a week) before the election, and votes can shift late in a campaign.

You might think that's just a one-off result, but I went back and looked at poll results from every election since 1999.  That gives us a total of 19 final polls from 1999 to 2011 conducted by companies that are still polling.  So how did they do?

  • 16 had National too high, while 3 had them too low.  The most any company had underestimated National's vote by was 2%, while the most a company had overestimated National's vote by was 9%.  One poll has had National's vote above their actual vote by more than the margin of error at three of the last five elections.
  • 5 had Labour too high, while 5 had them too low.
  • 9 had the Greens too high, while 3 had them too low.  That overstates the case a little, because the most any poll has been out for the Greens is 3.4%.
  • 1 had NZ First too high, and 9 had them too low.  The biggest difference was in 2002, when one poll had them 6% too low - mostly the differences are within 2%.

I think it's fair to say from that record that there's a tendency for New Zealand polls to overstate the votes for National and, to a lesser extent, the Greens, and to at least slightly understate the vote for NZ First.  When it comes to interpreting current polls, it doesn't really matter whether that's because of inherent biases in the polls or because National and the Greens' vote tends to drop in the last few days of the campaign while NZ First's picks up - the impact on our interpretation should be the same.

One way of looking at this further is to take the average (mean) error for these four parties across the 19 final polls included in this dataset.  That shows us that the average error is:

  • National: 2.7% too high
  • Labour: 0.7% too high
  • Greens: 1.0% too high
  • NZ First: 1.5% too low.

Counting all mainstream media polls since 2005 (i.e. excluding UMR but including TV3 and Fairfax / Research International polls in 2008 and 2011) leaves 14 polls, and an average error of:

  • National: 2.4% too high
  • Labour: 0.5% too low
  • Greens: 1.5% too high
  • NZ First: 1.1% too low.

These differences didn't really matter at the 2011 election, because the overall result was never really in doubt.  I guess you could argue that there would have been more emphasis on National's potential coalition partners had it been known that they probably weren't going to be able to govern alone, but John Key spent plenty of time on cups of tea etc. anyway, which suggests that National weren't counting their chickens on that score.

It surely does matter in 2014, when at least until recently most of the public polls have shown Labour + Greens within touching distance of National plus its current allies.  I think history suggests that:

  • If the total for Labour + Greens is within about 2% of the total for National and its allies (whichever of ACT, United Future and the Conservatives makes it into parliament), then it's actually pretty much a dead heat.
  • If NZ First gets 4% in most of the mainstream polls, then they'll probably pass the 5% threshold on election day.
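Those rules of thumb amount to subtracting the historical mean errors from the published figures before comparing blocs.  A rough sketch of that adjustment - the "current poll" numbers here are purely hypothetical, while the mean errors are the post-2005 averages listed above:

```python
# Mean signed errors since 2005 (poll minus actual), from the list above.
mean_error = {"National": 2.4, "Labour": -0.5, "Greens": 1.5, "NZ First": -1.1}

def adjust(polled):
    """Subtract the historical mean error from each party's polled figure."""
    return {party: round(share - mean_error.get(party, 0.0), 1)
            for party, share in polled.items()}

# Hypothetical current poll (illustrative numbers only):
polled = {"National": 47.0, "Labour": 30.0, "Greens": 12.0, "NZ First": 4.5}
print(adjust(polled))
# National 44.6, Labour 30.5, Greens 10.5, NZ First 5.6 - NZ First over the
# 5% threshold, and the race much tighter than the raw numbers suggest.
```

This is obviously crude - it treats a historical average as a fixed correction - but it captures the direction of the adjustment history suggests.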