Trump, Brexit and the problem with polls
1 in 5 Britons believe x. 6 out of 10 of us are having y amount of sex. 52% of people are voting for z.
The politicians, the press, the public: we're all obsessed with polls. Every day a different headline.
Some are facetious, some political, but all are presented as gospel truth or used to try to predict the future. In reality, they’re just assertions backed up by nebulous so-called ‘data’ gleaned from polls – which are published without provenance.
Who are these polls asking?
Most people I know have never been asked to participate in a poll, and probably never will. Even if they do, they may well not respond. And if the questions are coming via a landline, most of my generation wouldn’t even be able to respond.
The young generally don’t have landlines. The elderly generally don’t use social media. Polling is already biased, by default, in favour of the type of person who responds to polls. The methods by which you gather data skew the data.
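To make that concrete, here’s a toy simulation of how a landline-only sampling frame skews a result even when the fieldwork itself is flawless. Every number in it is invented purely for illustration:

```python
import random

random.seed(1)

# Invented population: the young back option A more than the old do,
# and the old are far more likely to own a landline.
population = []
for _ in range(100_000):
    age = random.choice(["young", "old"])
    supports_a = random.random() < (0.70 if age == "young" else 0.35)
    has_landline = random.random() < (0.15 if age == "young" else 0.80)
    population.append((supports_a, has_landline))

true_support = sum(s for s, _ in population) / len(population)

# A landline poll can only ever hear from landline owners.
reachable = [s for s, has in population if has]
sample = random.sample(reachable, 1000)
polled_support = sum(sample) / len(sample)

print(f"True support for A: {true_support:.1%}")    # ~52%
print(f"Landline poll says: {polled_support:.1%}")  # ~40%
```

The poll isn’t ‘wrong’ about the people it reached; it simply never hears from everybody else.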
And when it comes to personal subjects like sex, or controversial and highly charged ones like politics, why do we assume that everybody tells the truth? If the questions are being asked face to face, you can safely assume that some don’t.
Time and time again, poll results are shown to be anything from unrepresentative, to misleading, to completely inaccurate. I’m thinking of the 2015 UK general election in particular.
The common assumption among the public, and a media narrative built on a considerable amount of polling data, was that the result would be a very close vote, possibly a hung parliament.
Out of 92 polls, not a single one accurately predicted the clear margin of Conservative victory.
A number of factors were at play. Professor Patrick Sturgis of the University of Southampton headed an inquiry into how this could have happened. His report concluded that the primary cause of the failure to predict the outcome was the unrepresentative nature of the poll samples – and the methods used to recruit them.
But one factor, the ‘shy Tory’ effect, still goes underexplored. The noise of anti-Conservatism at the time, in certain circles, was such that plans to vote Tory were hidden; people were embarrassed, or simply out of step with their social circle.
In the digital age of ubiquitous internet, social media and viral sharing, it’s become increasingly quick and easy to garner high numbers of responses, seemingly lending polls more credibility. But actually, they’ve become more misleading.
The unrepresentative nature of social media is well documented. You are socially connected to those who, by and large, share your views and demographics, leading you all to the same surveys. And if social media are unrepresentative echo-chambers, then polls have become the distorted megaphone.
There's also the potential for self-fulfilling prophecy. Results can dominate the political news cycle, which can end up structuring a narrative. So, depending on which survey you’re using and where you acquired your data, you could be dealing with opposing interpretations of opinion. Presenting polls as hard fact can lead people to confuse opinion with reality.
And however much commentators say they come to their views as a result of independent judgment, that judgment is undeniably affected by polling and what they expect of an outcome.
Does the media report that people believe something because people do believe it? Or does reporting the 'fact' of this belief actually lead people to believe it? It’s an insidious catch-22.
And never was this feedback loop more prominent than during the EU referendum. Neither side was listening to the other because, in general, neither side could actually hear the other across deep social, geographical and political divides, and a lack of common ground in the media.
Just like in 2015, polls lulled us into a false sense of clairvoyance about the result of the referendum. Hot on the heels of the general election, you’d have thought we’d be a little warier this time.
Opinion polls consistently put the Remain and Leave camps almost neck and neck, with Remain holding a slight edge right up until June.
The assumption on all sides was that Remain would win, given that they had all the big ammunition: establishment figures, economists and the now-derided experts.
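It’s worth pausing on what a ‘slight edge’ actually means statistically. Here’s a back-of-the-envelope sketch, using an assumed 52–48 split in a single poll of 1,000 people rather than any specific published poll:

```python
from math import sqrt

# 95% margin of error for a proportion p estimated from n respondents.
p, n = 0.52, 1000
margin = 1.96 * sqrt(p * (1 - p) / n)

print(f"Poll says 52%, margin of error +/- {margin:.1%}")  # +/- ~3.1 points
# The 'leader' on 52% could easily be on 49%: neck and neck with a
# slight edge is a statistical tie, before sampling bias even enters.
```

And that is the best case, assuming a perfectly random sample – which, as argued above, almost never exists.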
The reputation of polls plummeted after the general election, leading to a rise in the popularity of betting markets, which actually got it right (their data are based on consistent factors, as opposed to differing sampling methods). But despite this, we simply chose to forget. After all, this time the bookies were betting against Brexit – in April, Oddschecker put Remain at 70 per cent.
And their failure to predict the outcome of the referendum cost them millions.
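For anyone unfamiliar with how bookmakers’ odds become percentages like that 70 per cent, the conversion is simple. The fractional odds below are assumed examples for illustration, not historical quotes:

```python
def implied_probability(num: int, den: int) -> float:
    """Implied probability of fractional odds num/den (e.g. 4/11)."""
    return den / (num + den)

print(f"4/11 implies {implied_probability(4, 11):.0%}")  # ~73%
print(f"2/1  implies {implied_probability(2, 1):.0%}")   # ~33%
# In a real book the two sides sum to over 100%; the excess is the
# bookmaker's margin ('overround'), so each figure slightly
# overstates the outcome's true chance.
```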
But basing predictions on polling could cost more than money. I'm thinking now of the unforeseen and potentially tragic outcome of the recent Colombian referendum on the peace process with the FARC.
It was an oh-so-delicate accord, years in the making, that President Santos gambled on the table of political kudos, stalked by the spectre of his own personal Boris Johnson, Uribe.
Why did he do it? Perhaps out of greed, perhaps for a shot at a now cripplingly ironic Nobel Peace Prize, or perhaps out of imbecilic overconfidence fuelled by inaccurate polling?
The final polls showed a huge margin of support for the peace deal, as did every preceding one, to some degree or another. Genuine anguish and fury among victims' families was hard to hear above the jubilation and the solemn white-clad handshakes proclaiming peace.
The result sent shockwaves around the watching world's media: just over 50 per cent voted 'no'.
Of course, a referendum is a very different ball game to an election. The polling methodology is completely different, and during a referendum the bias in polling is almost always towards the status quo.
Although, think back to the Scottish independence referendum, when a contest the polls anticipated would be extremely tight ended in a clear and easy victory for staying in the UK.
But in the case of the EU referendum, the media’s overreliance on polls meant we failed to tap into the true visceral reality of people’s grievances. It was a clear example of a lack of due journalistic diligence, proper research and original reporting.
The more we focussed on the latest YouGov or Ipsos MORI findings, the less we got out there and spoke to people.
The six million or so who voted in the referendum, despite never having voted before in their lives, were also never really polled.
Big polling companies, if they're lucky, will reach about 10,000 people. Many of these will be pre-registered, regularly polled, active in politics and up to speed with current affairs. The polling will almost always take place via the internet, or a landline.
But not mobile phones. Most people answer their mobiles – and these are the people we needed to be hearing from. Shockingly, during the general election, companies barely polled mobile phones at all.
Where Nigel Farage and his communications director Andy Wigmore succeeded was in tapping into this unheard demographic. Leave.EU polled mobile phones, used social media with effortless ease, and often reached 50,000 or even 100,000 people at a time.
Let’s face it – polls provide easy headlines and always have done. You can choose your polls to fit the case you want to make: one way or another, they can be manipulated for reasons of propaganda or persuasion. People are like sheep, they say, and are easy to herd into a common opinion. And at its lowest level, it’s a cheap way of getting a reaction.
One particularly ugly example of the misuse of polls in the media is the now-infamous headline in The Sun stating that one in five British Muslims supported those who went to fight in Syria with jihadi groups. After a huge backlash, Ipso ruled that the headline had completely misrepresented the results by obscuring the actual wording of the question.
Yes, we're under pressure to get a headline with impact, but there is a line.
In 2016, a year when it feels like anything can happen, fortune telling on the back of unrepresentative polling is a dangerous game to be playing.
When it comes to an innocent, even facetious poll about the nation’s sexual practices, this may not matter much. But when it comes to predicting the outcomes of major events such as a political vote, our dependence on polls starts to look like a problem.
Of course, you can’t just ignore polling data, but perhaps there’s more justification for reporting on the betting markets as well, which came up trumps during the 2015 election.
Or, you can publish the full details of the poll. Earlier this year, in an exercise of masturbatory meta-journalism, I undertook a 'poll on polls' to establish what factors might make poll results seem more credible. Overwhelmingly, the answer was knowledge of the poll’s methodology itself: demographics, the length of time it was open, the number of respondents, and so on.
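As a sketch of what such disclosure might look like in practice – the structure and every field name here are my own invention, not any publisher’s or pollster’s standard – each reported poll could carry a record like this:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class PollDisclosure:
    """Hypothetical metadata a publisher could print alongside a poll."""
    question_wording: str   # the exact question asked, not the headline
    method: str             # e.g. "online panel", "phone: mobile + landline"
    sample_size: int        # number of respondents
    fieldwork_start: date   # when the poll opened
    fieldwork_end: date     # when it closed
    population: str         # who was eligible, e.g. "GB adults 18+"
    weighting: str          # how raw responses were adjusted
    commissioned_by: str    # who paid for the poll

example = PollDisclosure(
    question_wording="(the verbatim question)",
    method="online panel",
    sample_size=1000,
    fieldwork_start=date(2016, 6, 10),
    fieldwork_end=date(2016, 6, 13),
    population="GB adults 18+",
    weighting="by age, region and past vote",
    commissioned_by="(hypothetical client)",
)
```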
Or, you can shift the focus from polls. If you're spending half your time trying to predict the future, you've only got half the time left to examine the issues, which are far more important.
The latest opinion polls give Clinton an 11-point lead over Trump in the US presidential election, and the world breathes a sigh of relief.
I'm in no way suggesting that the US election is similar to the UK's, or that the Colombian and EU referendums had much at all in common.
What I will say is that humans are doomed to repeat the same mistakes over and over. And nothing is as easily forgettable as the recent past. So let's not get too comfortable just yet.