Election time – do polls ever get it right?
31 October 2019
The first thing to say is that political polls don’t ALWAYS get it wrong. We tend to remember the times they get it wrong because no-one talks about the times when they get it right.
However, over the last few years we’ve seen three notable instances when the political polls got it wrong:
- General Election 2015 – polls suggested a hung parliament
- EU Referendum 2016 – polls suggested a close call but that Remain would win
- US Election 2016 – again, polls suggested a close call but that Hillary Clinton would win
How can this happen when there is so much time, money and effort put into the polling process?
Briefly, here are a few reasons why the poll results don’t always match the actual result on the night.
- A political poll needs to represent the whole population, from the 18-year-olds glued to their phones in Starbucks through to the 82-year-olds watching Coronation Street at home. Should you survey them by phone, by post, by email, or by knocking on their front door? Choose the wrong method and you will miss out a whole chunk of the population.
- Pollsters ask people about their future behaviour. But those asked might not yet have decided, or they might change their mind from Party A to Party B in between telling the pollster of their intentions and actually voting.
- People don’t like being asked about their political leanings. Some feel it’s too personal a question, so they might say they’re undecided when they’re actually voting for Party A. Some might lie outright, saying they’re voting for Party A when they’re really going to vote for Party B.
- It’s impossible to predict how many people will go out to vote. In some areas there might be a history of a 50% turnout, yet on voting day only 25% actually vote. However, all the predictions will have been based on 50% voting.
- There will always be a margin of error. If Party A has a 53% share and Party B has 47%, with a margin of error of 5 percentage points, Party A could receive anything between 48% and 58% of the vote, and Party B anything between 42% and 52%. Those ranges overlap, so either party could come out ahead.
- Who commissioned the poll? Do they have a vested interest in reporting a particular victory?
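As a rough illustration of the margin-of-error point above: under the textbook simple-random-sample approximation (an assumption – real polls use weighting and more complex sample designs), the 95% margin of error for a share p from n respondents is about 1.96 × √(p(1 − p)/n). A minimal Python sketch with made-up numbers:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a poll share p,
    assuming a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Illustrative numbers only: a hypothetical poll of 400 people
# where Party A polls at 53% and Party B at 47%.
n = 400
moe_a = margin_of_error(0.53, n)  # roughly 0.049, i.e. about 5 points
moe_b = margin_of_error(0.47, n)

print(f"Party A: 53% +/- {moe_a * 100:.1f} points")
print(f"Party B: 47% +/- {moe_b * 100:.1f} points")
```

With a sample of around 400, the margin comes out near the ±5 points used in the example above; quadrupling the sample size only halves it, which is why even well-funded polls carry real uncertainty.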
When reading poll data, it’s worth checking areas such as:
- whether the sample is representative
- whether the findings are statistically significant, and
- whether it’s clear who conducted the research.
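On the statistical-significance check: a simple back-of-the-envelope test is to ask whether one party’s lead is bigger than the margin of error on the difference between the two shares. The sketch below uses the standard two-share approximation (a simplification – it treats the poll as a simple random sample and ignores the negative covariance between the two shares):

```python
import math

def lead_is_significant(p_a: float, p_b: float, n: int, z: float = 1.96) -> bool:
    """Rough check: is Party A's lead over Party B larger than an
    approximate 95% margin of error on the difference between the
    two shares? Assumes a simple random sample of size n."""
    # Standard error of the difference, treating the two shares as
    # independent (a simplification).
    se = math.sqrt((p_a * (1 - p_a) + p_b * (1 - p_b)) / n)
    return abs(p_a - p_b) > z * se

print(lead_is_significant(0.53, 0.47, 400))   # → False: a 6-point lead, n=400
print(lead_is_significant(0.53, 0.47, 2000))  # → True: same lead, bigger sample
```

The same headline lead can be meaningless or meaningful depending on the sample size, which is exactly why the sample questions above matter when reading a poll.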
With election fever about to start, it’s going to be a long few weeks of opinion polls. Will the predictions be right this time?