However, one polling company says that its final poll hit the result almost perfectly, but the numbers looked so outlandish that it decided not to publish them.
The CEO of Survation claims that the poll his company conducted on the eve of the election got the result almost bang on.
Here is the poll in question:
To give you an idea of just how close this one was, the Conservatives ended up with 36.9% of the vote; Labour received 30.4%; the Liberal Democrats 7.9%; UKIP 12.6%; and the Greens 3.8%. So the Survation poll was not perfect, but it was far closer than the polling average, which had the Tories and Labour tied on 34% each.
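To put the gap in numbers, here's a rough sketch of how you could score a final poll against the actual result. The actual vote shares and the tied 34-34 average are the figures quoted above; the "illustrative_final_poll" values are hypothetical stand-ins for the kind of eve-of-election numbers Survation describes, not its real data.

```python
# A minimal sketch of scoring a final poll against the actual result, in
# percentage points. The actual shares and the tied 34-34 polling average are
# taken from the figures above; "illustrative_final_poll" is a placeholder for
# an eve-of-election poll of the kind Survation describes, not its real numbers.

ACTUAL_2015 = {"CON": 36.9, "LAB": 30.4, "LD": 7.9, "UKIP": 12.6, "GRN": 3.8}

def mean_absolute_error(poll, actual):
    """Average of |poll share - actual share| over the parties the poll covers."""
    return sum(abs(poll[party] - actual[party]) for party in poll) / len(poll)

polling_average = {"CON": 34.0, "LAB": 34.0}          # the tied final average
illustrative_final_poll = {"CON": 37.0, "LAB": 31.0}  # hypothetical figures

print(mean_absolute_error(polling_average, ACTUAL_2015))          # 3.25 points off per party
print(mean_absolute_error(illustrative_final_poll, ACTUAL_2015))  # 0.35 points off per party
```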
And here's a breakdown of the data Survation shared with writer and political betting expert Mike Smithson:
Survation has shared this data with me showing splits in final 3 days of campaign. Notice how different parties moved pic.twitter.com/BmmwxlRDS5
- Mike Smithson (@MSmithsonPB) May 12, 2015
Survation's CEO, Damian Lyons Lowe, wrote:
We had flagged that we were conducting this poll to the Daily Mirror as something we might share as an interesting check on our online vs our telephone methodology, but the results seemed so "out of line" with all the polling conducted by ourselves and our peers - what poll commentators would term an "outlier" - that I "chickened out" of publishing the figures - something I'm sure I'll always regret.
It certainly looks like the final Survation poll captured a late swing that only the exit poll seems to have picked up on. Lyons Lowe says the poll demonstrates that there needs to be "no internal review of polling methodology for Survation post this General Election result."
That's not what most of the other pollsters think, however. Peter Kellner, president of YouGov, makes it abundantly clear in his reaction to the election result that pollsters need to accept that they were wrong and that lessons must be learned by all of those involved in election forecasting.
As he wrote in a post on the YouGov site:
Indeed, if I wanted, I could construct a defensive barricade against criticisms of YouGov and other pollsters. I could point out that our final poll, showing 34-34%, was only three points out per party, and so within the bounds of normal sampling error, and that we correctly foretold the SNP tsunami in Scotland and Labour's big gains in London.
That, though, would be to evade the truth and insult the readers of this blog. We got the election wrong. So did the other ten polling companies who produced eve-of-election voting intentions: we all said the race was too-close-to-call. Only by admitting that we are all at fault can we start the journey to finding out why.
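Kellner's point about "normal sampling error" is easy to sanity-check. The sketch below computes the conventional 95% margin of error for a 34% vote share at a few typical sample sizes; the sample sizes are illustrative assumptions, not YouGov's actual fieldwork figures.

```python
# Rough check of the conventional 95% margin of error for a single vote share,
# under simple random sampling. Sample sizes are illustrative assumptions.
import math

def margin_of_error(share_pct, n, z=1.96):
    """95% margin of error, in percentage points, for a share of `share_pct`."""
    p = share_pct / 100.0
    return z * math.sqrt(p * (1.0 - p) / n) * 100.0

for n in (1000, 1500, 2000):
    print(n, round(margin_of_error(34.0, n), 1))
# 1000 -> 2.9 points, 1500 -> 2.4 points, 2000 -> 2.1 points.
# A three-point miss per party sits at or just beyond that band, and the misses
# for the two main parties pointed in opposite directions.
```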
So is Survation right to claim that it called this one? That all depends on one crucial question.
The difference between YouGov's and Survation's analyses of the accuracy of their polling methods comes down to a simple disagreement: whether the late swing to the Tories could, and should, have been picked up.
Fundamentally, Lyons Lowe is saying that late polls can measure last-minute shifts, as demonstrated by his company's final poll.
However, on election day YouGov reinterviewed 6,000 people it had polled online earlier in the week and found that, although 5% of people had changed their vote, it made no material difference to the forecast - suggesting that the type of late swing Survation argues for is unlikely.
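The distinction YouGov is drawing is between gross and net switching: 5% of respondents changing their minds only moves the headline figures if the switches mostly run in one direction. The flows in the sketch below are invented for illustration (the article doesn't give YouGov's breakdown), but they show how five points of churn can net out to well under a point per party.

```python
# Gross vs net switching: an illustrative (invented) set of vote flows in which
# 5% of the sample changes party, but the headline shares barely move.

initial = {"CON": 34.0, "LAB": 34.0, "LD": 8.0, "UKIP": 14.0, "GRN": 5.0, "OTH": 5.0}

# (from_party, to_party, share of the whole sample in points) -- illustrative only
flows = [
    ("LAB",  "CON",  1.25),
    ("CON",  "LAB",  1.00),
    ("UKIP", "CON",  0.75),
    ("CON",  "UKIP", 0.50),
    ("LD",   "LAB",  0.75),
    ("LAB",  "GRN",  0.75),
]

final = dict(initial)
for src, dst, pts in flows:
    final[src] -= pts
    final[dst] += pts

gross_churn = sum(pts for _, _, pts in flows)   # 5.0 points of the sample switched
net_shift = {p: round(final[p] - initial[p], 2) for p in initial}
print(gross_churn, net_shift)
# 5.0 {'CON': 0.5, 'LAB': -0.25, 'LD': -0.75, 'UKIP': -0.25, 'GRN': 0.75, 'OTH': 0.0}
```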
In either case, this type of late swing is always going to be a challenge for polling companies, as it suggests that polling done before election day will fail to pick up people's real voting intentions.
The team at Election Forecast wrote:
Here's what we take from all this:
- We need to include even more uncertainty about the national vote on election day relative to the polls.
- Constituency polls may work better when based on the standard generic voting-intention question.
In other words, the most likely outcome of the shock 2015 General Election result is that next time around, pollsters are going to be that much less confident of what their survey data is telling us, right up to the day of the vote.
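In modelling terms, "less confident" means widening the error term assumed to sit between the final polls and the actual national vote. The toy Monte Carlo below is not Election Forecast's model or anyone else's; it simply starts from the tied 34-34 average and asks how plausible a Tory lead as big as the real one (6.5 points) looks as that assumed error grows.

```python
# Toy Monte Carlo: how plausible is a 6.5-point Tory lead, starting from a tied
# 34-34 polling average, as the assumed "polls vs outcome" error is widened?
# The error standard deviations are illustrative, not anyone's published model.
import random

random.seed(2015)

def prob_lead_at_least(margin, poll_con=34.0, poll_lab=34.0, error_sd=1.0, runs=200_000):
    """Share of simulations where the Conservative lead over Labour is >= margin."""
    hits = 0
    for _ in range(runs):
        con = poll_con + random.gauss(0, error_sd)
        lab = poll_lab + random.gauss(0, error_sd)
        if con - lab >= margin:
            hits += 1
    return hits / runs

for sd in (1.0, 2.0, 3.0):
    print(sd, round(prob_lead_at_least(6.5, error_sd=sd), 3))
# sd=1 -> ~0.000, sd=2 -> ~0.011, sd=3 -> ~0.063: the actual result only stops
# looking impossible once the model allows for errors well beyond pure sampling noise.
```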