Why online surveys outperform telephone surveys at representing the whole population
When new evidence emerged last week that YouGov’s online panel is better at finding hard-to-reach demographics than phone polls, it was met with surprise in some quarters. We have long known that one of the benefits of online research is that it can represent the entire population better than other methodologies, but false claims have created a narrative among some commentators that phone research was actually better at doing this. It was satisfying finally to debunk that myth, but it is also worth taking the time to explain why this is the case.
The most important thing to understand is that no methodology – phone, online or face-to-face – is capable of reaching the 'hard to reach' when creating a new sample within two or three days. It's impossible. The reason YouGov can produce good samples, instantly, is that they are not 'new' samples but are drawn from a panel that has been painstakingly assembled over many years – a panel of over 800,000 people about whom we know a lot, and which contains plenty of every kind of demographic imaginable. That was YouGov's key innovation when we started sixteen years ago, and it has allowed us to conduct highly accurate research (proven again and again by our commercial, political and entertainment data).
One of the most frequently levelled criticisms of online research is that the samples are self-selecting. The truth, though, is that all methodologies are self-selecting in one way or another. Nobody is ever forced to participate in research – people are only involved if they have chosen to answer. This is where telephone polls have a problem: they rely on respondents being in, available and prepared to do a survey at the moment the pollster calls.
That rules out big groups of the population who are not just unwilling but unable to participate: those who work long hours, particularly in manual jobs where answering the phone during work is not an option; those who are out a lot; and people with young children. Basically, any group of people who at any given moment are unlikely to have the time to talk to a stranger for ten minutes. That is why response rates on phone polls are so low (nearer 5% than 10%) and, importantly, not low in the right proportions.
By contrast, online surveys allow respondents to take the survey at a moment of their choosing – late at night, during a lunch hour, on the bus to work – whenever they want to, but most importantly whenever they are free. The survey is there waiting for them, not a now-or-never choice. You can stop and start: if the baby starts crying, no problem – your answers are stored and you can return to the survey hours later. When you think about it, it is obvious: of course online can reach harder-to-reach groups. Even before you get to the question of willingness, it is a long way ahead on availability.
So online gives you a better starting point of people who could participate. They then have to choose to take part, and online has the advantage here too. With all research, one of the principal incentives is to take part in the debate, to feel that you are contributing. That is a powerful incentive, made more broadly attractive in the context of an online panel such as YouGov's, where you are part of an ongoing debate – but it is also the only possible incentive for phone polls. Online can add other incentives that bring in those who don’t feel any civic desire to participate. The cash payments YouGov offers may be small, but they do attract a different type of person. In addition, YouGov has a lively opinion-sharing community that covers everything from movies to sport, not just politics.
Online methodologies have a better chance of finding the hard-to-reach groups and, once they’ve been found, a better chance of persuading them to participate. Given that, it really should not have been a surprise when the evidence emerged that online can represent the real population better than phone.
The YouGov methodology enables us to build up incredible levels of information about people (it might seem extraordinary, but we hold over 200,000 data variables on how members of our panel lead their lives). This has two big advantages in terms of achieving representative samples: it means we can micro-target the sample that we need, and, because we have so much information already at hand, it means that we can keep questionnaires short.
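To make the micro-targeting idea concrete, here is a minimal sketch of quota-based sampling from a pre-profiled panel. The panel records, demographic cells and quota numbers below are invented for illustration – they are not YouGov's actual variables or methodology – but the principle is the same: because each panellist's demographics are already known, a sample can be drawn so that every cell matches its known population proportion.

```python
import random

# Hypothetical panel: in a real panel each member carries many known
# variables, so quotas can be set on several dimensions at once.
panel = (
    [{"id": i, "age": "18-34", "works_manual": True} for i in range(0, 400)]
    + [{"id": i, "age": "18-34", "works_manual": False} for i in range(400, 1200)]
    + [{"id": i, "age": "35+", "works_manual": True} for i in range(1200, 1800)]
    + [{"id": i, "age": "35+", "works_manual": False} for i in range(1800, 4000)]
)

# Invented quota targets: how many respondents each demographic cell
# should contribute to a 100-person sample, based on (assumed) known
# population proportions.
quotas = {
    ("18-34", True): 10,
    ("18-34", False): 20,
    ("35+", True): 15,
    ("35+", False): 55,
}

def draw_quota_sample(panel, quotas, seed=0):
    """Draw a sample that exactly matches the quota for each cell."""
    rng = random.Random(seed)
    sample = []
    for (age, manual), n in quotas.items():
        cell = [p for p in panel
                if p["age"] == age and p["works_manual"] == manual]
        # rng.sample raises ValueError if a cell has too few members,
        # which is exactly why the panel has to be large and well-stocked.
        sample.extend(rng.sample(cell, n))
    return sample

sample = draw_quota_sample(panel, quotas)
print(len(sample))  # 100
```

The point of the sketch is the precondition, not the code: targeted sampling only works if the panel already contains enough profiled members in every hard-to-reach cell – which is what years of panel-building buy you.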
It is not just having those advantages that matters: you also need to know how to use them. This is where YouGov’s team of top data scientists really excels, and it is their work that puts YouGov top of the pile at producing representative samples. Earlier this month there was independent verification of the quality of our samples:
Pew, the world's leading polling institute, published a study of online surveys with the headline conclusion that 'Vendor Choice Matters'. In their thorough analysis of nine competing survey providers, 'Sample I' (YouGov) dramatically outperformed the others. As they put it: "this top-performing sample was notable in that it employed a relatively elaborate set of adjustments at both the sample selection and weighting stages…it appears that the sample I vendor has developed an effective methodology." For full details of the methodologies we employ to achieve this level of accuracy, see the more detailed explanation by Professor Doug Rivers, YouGov's Chief Scientist.
Making it easier for respondents to participate, making it more beneficial for them to participate, making it more interesting for them to participate, and knowing more about them – all of these factors are crucial to getting a representative sample, and they explain why online research, and YouGov in particular, is better at reaching hard-to-reach demographics than phone research.