Only online polling has the tools to correct the polling errors of 2015 – and the future is increasingly mobile
When a whole industry is hit by disaster – as happened to us pollsters last May, when we collectively failed to predict the Conservative election victory – you’d expect an unedifying period of panic. And this time was no exception.
We’ve seen the opportunists, magically revealing previously unpublished polls that had been right all along in an attempt to make a name for themselves; we’ve seen the small-c conservatives, for whom this is an opportunity to slow the pace of technological change; we’ve had the blowhards, calling for opinion polls to be banned; and we’ve even had the absurd spectacle of pollsters blaming ordinary people for lying to them, as if it were not precisely their job to work that kind of thing out.
Today the British Polling Council publishes its report into what went wrong – the moment when we can finally cut through all the hot air. What people want to know is: can we trust opinion polls in the future? At YouGov we believe the answer is yes – we have identified what went wrong and have already set about correcting it. We were the only company to forecast Jeremy Corbyn’s spectacular victory in the Labour leadership race last September. With the London mayoral election and a likely referendum coming this year, we are confident we will be accurate, but we understand that we will be judged on our results.
The headline finding of today’s report chimes with our own internal investigation: the main cause of the error last May was sampling. In other words, the samples of people answering surveys from all the different polling companies did not adequately represent the voting public. They may have looked representative in terms of age, gender, income and so on, but they had too few Tories in them.
In particular, the samples we were measuring contained too many politically engaged people. For most of our work this is not a concern – if you want to find out which toothpaste people prefer, it makes no difference whether they take an interest in politics or not. But for an election poll, the very fact that someone agrees to answer a survey may make them more likely to be political.
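To make that concrete, here is a minimal sketch in Python – every number in it (the age mix, engagement rate, vote shares and response rates) is invented for illustration, not taken from our data. It shows how a sample can match the population exactly on a checked variable like age while still overstating one party, because the trait driving participation is not among the variables being checked.

```python
import random

random.seed(0)

def person():
    young = random.random() < 0.25            # invented age mix: 25% under 35
    engaged = random.random() < 0.30          # 30% politically engaged
    # invented assumption: engaged people lean Labour more often
    labour = random.random() < (0.45 if engaged else 0.28)
    return young, engaged, labour

population = [person() for _ in range(200_000)]

# Participation depends only on engagement here, so the sample still
# looks right on age -- but not on vote intention.
sample = [p for p in population if random.random() < (0.7 if p[1] else 0.2)]

def share(rows, idx):
    return sum(r[idx] for r in rows) / len(rows)

print(f"under-35 share: population {share(population, 0):.1%}, sample {share(sample, 0):.1%}")
print(f"Labour share:   population {share(population, 2):.1%}, sample {share(sample, 2):.1%}")
```

Run it and the age shares agree almost exactly while the Labour share in the sample comes out around five points too high – a skew no amount of demographic weighting can see.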
One concrete consequence: we based our turnout models for the youngest age group (which skews heavily towards Labour) on an unrealistically engaged set of youngsters. In the real world many fewer of them voted, while the over-70s turned out in droves, and this pushed the Conservatives over the finishing line.
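A rough worked example of that turnout effect – all electorate shares, turnout figures and vote shares below are invented for the sketch, not drawn from our models:

```python
# Each group: (share of electorate, turnout assumed by the poll,
# actual turnout, Tory share, Labour share) -- all numbers invented.
groups = {
    "18-24": (0.15, 0.65, 0.43, 0.25, 0.50),
    "70+":   (0.15, 0.65, 0.80, 0.52, 0.24),
    "other": (0.70, 0.66, 0.66, 0.38, 0.33),
}

def tory_lead(use_actual_turnout):
    tory = labour = voters = 0.0
    for share, assumed, actual, t, l in groups.values():
        turnout = actual if use_actual_turnout else assumed
        w = share * turnout      # this group's weight among people who vote
        tory += w * t
        labour += w * l
        voters += w
    return (tory - labour) / voters

print(f"Tory lead on the poll's assumed turnout: {tory_lead(False):+.1%}")  # ~+4%
print(f"Tory lead on actual turnout:             {tory_lead(True):+.1%}")   # ~+6%
```

Even with these mild made-up numbers, getting youth and pensioner turnout wrong moves the headline lead by more than two points – roughly the scale of error that sank the 2015 polls.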
So how do we go about getting a more representative sample? That is more controversial. Attracting people who don’t care about politics to take part in political surveys may sound like an impossible task. Certainly it is an existential threat to the beleaguered telephone pollsters, who simply dial random numbers – the small minority of people who don’t immediately tell them to push off are bound to be more political, and because those respondents are random strangers there is nothing the pollster can do to correct for it.
Meanwhile today’s report points to “random probability sampling” as a potential solution. This method involves selecting a random group of addresses around the country and sending people to knock on the front doors and interview the inhabitants. If they are not at home or are too busy to take part, the researchers come back again and again – up to nine times – until at least 50 per cent of their target group have taken part.
But let’s be clear – this is not a practical solution. First, it is wildly expensive and time-consuming (think six figures and four months for a single survey), and so completely inappropriate for measuring fast-moving sentiment during an election campaign. Worse still, the results are not necessarily better. Today’s report points to a survey that was carried out in this way months after the election, asking people how they remembered voting; it showed a Tory lead close to the eventual margin – but it significantly underestimated the Ukip vote share. There is a well-known tendency for Ukip voters to “misremember” their vote in face-to-face surveys, which explains a large part of that Tory lead. What’s more, Jeremy Corbyn became leader of the Labour party halfway through the fieldwork – so it is not really surprising that people became less likely to remember voting Labour as the research went on. Finally, after all that time and effort, you still can’t be sure whether the 50 per cent of people you did not manage to reach are in some way different from the 50 per cent you did.
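That final objection fits in two lines of arithmetic – the vote shares here are, again, invented: with a 50 per cent response rate, the half you reach speaks for the half you miss only if the two halves happen to vote alike.

```python
# Hypothetical Tory share among the half you reached and the half you didn't
reached, missed = 0.34, 0.42

estimate = reached                        # what the survey reports
truth = 0.5 * reached + 0.5 * missed      # what the full electorate thinks
print(f"survey says {estimate:.0%}, truth is {truth:.0%}")   # 34% vs 38%
```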
No, the future of opinion polling is not going back to door knocking – it must be online, and more specifically mobile. YouGov has an online panel of 600,000 people in the UK who answer regular surveys in exchange for cash and prizes, including plenty of non-political people who do it simply for the money. Because we hold detailed profile data on all of them, we can identify the non-political ones and make sure to include enough of them in future election surveys. Random samples of strangers – by telephone or face to face – can never make this adjustment, and so are vulnerable to effects they can’t even measure.
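As a sketch of what that adjustment can look like – the panel layout, the attention field and the 50/50 target below are hypothetical illustrations, not YouGov’s actual scheme – a profiled panel lets you fix the share of low-attention respondents in advance rather than taking whoever volunteers:

```python
import random

random.seed(2)

# Toy panel: volunteers skew engaged, so only ~30% are low-attention.
panel = [{"id": i, "attention": "low" if random.random() < 0.3 else "high"}
         for i in range(10_000)]

def draw_sample(panel, n, low_target=0.5):
    """Draw n respondents so that low-attention people make up their
    target share of the sample, not whatever share volunteers first."""
    low = [p for p in panel if p["attention"] == "low"]
    high = [p for p in panel if p["attention"] == "high"]
    n_low = round(n * low_target)
    return random.sample(low, n_low) + random.sample(high, n - n_low)

sample = draw_sample(panel, 1_000)
low_share = sum(p["attention"] == "low" for p in sample) / len(sample)
print(f"low-attention share: panel ~30%, sample {low_share:.0%}")
```

The point is not the particular numbers but the capability: a random-digit dialler cannot do this, because it knows nothing about the people it has failed to reach.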
The way to get more “normal” young people taking part – their absence was a large part of the polling error in 2015 – is surely to build a mobile-based experience that makes answering surveys easier and more social, a more natural part of their online lives. We are already doing this. We need to innovate faster to keep pace, not pretend the internet never happened.
You only need to look around to see the future of opinion polling. For the first time in history, the majority of society walks around with a smartphone – a miniature networked computer designed to share information at the touch of a button. It’s a pollster’s dream. For now we need to go the extra mile to include the parts of society that are less plugged in, but within decades we could see instant mass participation in everyday public decisions that will make the polling debacle of 2015 seem like ancient history.