There's a lot of attention on political polls at election time, and a lot of people claiming that the polls are becoming less and less accurate. It's true that polling companies like ours face growing challenges in keeping polls accurate, but it's something we put a lot of effort into. I thought it was worth looking objectively at the public polls over time to see whether they really are becoming less accurate.
Here are the basic tools I'm using for judging accuracy:
1) The final result for every public poll published in New Zealand between 1999 and 2011 (i.e. the last result before election day)
2) The party vote for the four largest parties on average across that time period (National, Labour, Greens and New Zealand First)
3) The difference between what the polls said and what the actual result was.
Now, I'm aware that the Alliance (1999) and ACT (1999 and 2002) have made guest appearances in the top four occasionally, but I thought it simplest to use the same parties for all five elections.
I've measured the difference between the polls and the actual result using what I term 'net error'. That is, the sum of the absolute values of the differences between the polls and the actual result for the parties concerned. If, for example, a poll picks National to get 48% and Labour to get 26%, and the actual result is National 47%, Labour 27%, then the 'net error' for National and Labour is 2% (made up of 1% for National and 1% for Labour). A perfect poll would therefore have a net error of 0%.
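For anyone who wants to check the arithmetic, the net error calculation is simple enough to sketch in a few lines of Python (the party names and figures below are just the worked example from the paragraph above):

```python
def net_error(poll, actual):
    """Sum of absolute differences (in percentage points) between a poll's
    figures and the actual result, over the parties in the poll."""
    return sum(abs(poll[party] - actual[party]) for party in poll)

# Worked example from the text:
poll = {"National": 48, "Labour": 26}
actual = {"National": 47, "Labour": 27}
print(net_error(poll, actual))  # 2
```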
The polls I've used are:
- 1999: One News Colmar Brunton, Herald Digipoll, TV3 Reid Research
- 2002: One News Colmar Brunton, Herald Digipoll, TV3 Reid Research
- 2005: One News Colmar Brunton, Herald Digipoll, TV3 Reid Research, Roy Morgan
- 2008: One News Colmar Brunton, Herald Digipoll, TV3 Reid Research, Fairfax Nielsen
- 2011: One News Colmar Brunton, Herald Digipoll, TV3 Reid Research, Fairfax Research International
(In case anyone looks back in the archives: the TV3 poll was branded differently in the past, but my understanding is that the researchers are the same.)
So let's look at the average net error for the two major parties for each of those elections:
- 1999: 5%
- 2002: 11%
- 2005: 4%
- 2008: 4%
- 2011: 5%
OK, and what about the average combined net error for National, Labour, the Greens and NZ First:
- 1999: 7%
- 2002: 18%
- 2005: 7%
- 2008: 6%
- 2011: 8%
Based on those numbers, you'd be hard pressed to argue that the polls are getting less accurate. If you don't believe me or find that bamboozling, here are the average and actual results for National:
- 1999: Average final poll 32%, actual result 31%
- 2002: Average final poll 25%, actual result 21%
- 2005: Average final poll 40%, actual result 39%
- 2008: Average final poll 47%, actual result 45%
- 2011: Average final poll 51%, actual result 47%
And the same for Labour:
- 1999: Average final poll 38%, actual result 39%
- 2002: Average final poll 48%, actual result 41%
- 2005: Average final poll 41%, actual result 41%
- 2008: Average final poll 34%, actual result 34%
- 2011: Average final poll 26%, actual result 28%
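As a quick sanity check, the sketch below recomputes the gap between the averaged final polls and the actual results using the two tables above (figures transcribed as whole percentage points). Note this is the net error *of the average poll*, not the average of each individual poll's net error, which is what the figures earlier in the post report, so these numbers can come out smaller: individual polls' errors partially cancel when averaged.

```python
# (average final poll, actual result) per election, from the tables above
national = {1999: (32, 31), 2002: (25, 21), 2005: (40, 39),
            2008: (47, 45), 2011: (51, 47)}
labour = {1999: (38, 39), 2002: (48, 41), 2005: (41, 41),
          2008: (34, 34), 2011: (26, 28)}

for year in sorted(national):
    err = (abs(national[year][0] - national[year][1])
           + abs(labour[year][0] - labour[year][1]))
    print(year, f"{err}%")
```

The 2002 outlier shows up just as clearly this way: an 11-point gap across the two major parties alone.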
The analysis I saw used polls from 2005, 2008 and 2011 to argue that the polls were getting less accurate, but I don't think it took a broad enough view. What we actually see, I think, is that the polls struggle in 'fait accompli' elections like 2002 and 2011, and do best in tight elections like 2005. It's probably no accident either that the polls seem more accurate when turnout is high, as it was in 2005.
For more analysis of political polls and election results, you might like to read http://sayit.co.nz/blog/what-political-polls-tell-us. I'll have a look at this year's final polls to see what they suggest about the election tomorrow - stay tuned.