Why question wording matters

Peter Kellner, President
October 24, 2011, 8:55 AM GMT+0

This blog would not appear if I wanted an easy life. I am going to provide ammunition for those who don’t like opinion polls, think our numbers mean little and reckon our political culture would be healthier without them.

Of course, that’s not how I view the evidence in the table. I believe it shows the need to read poll results more carefully. In particular, we need to distinguish between top-of-the-head answers to questions asked out of the blue, and how people react once they have thought about the matter in hand.

Here’s what we did. To inaugurate a new partnership between YouGov and the Department of Politics at Cambridge University, we tested how the wording and context of a question can shape responses.

We all know how loaded questions can bias results – 'Do you think Britain should allow the tinpot dictators in Brussels to tell us what to do, or defend its proud history of democracy and national independence?' The purpose of our experiment was to ask alternative, unbiased questions. We chose the BBC licence fee as the subject of our inquiry: do people regard it as good or bad value? We posed the question nine different ways, each to a separate sample of more than 2,000 people, in late August and September.

ATTITUDES TO BBC LICENCE FEE (questions put to separate samples)

| Question | Good % | Neither % | Bad % | Don't know % | Net (good minus bad) |
|---|---|---|---|---|---|
| Overall, do you think the BBC licence fee is good or bad value for money? | 39 | 18 | 39 | 4 | 0 |
| The BBC licence fee costs £145.50 a year. Do you think this is good or bad value for money? | 27 | 17 | 54 | 2 | -27 |
| The cost of the BBC licence works out at £12.13 a month. Do you think this is good or bad value for money? | 37 | 17 | 44 | 2 | -7 |
| The cost of the BBC licence works out at £2.80 a week. Do you think this is good or bad value for money? | 42 | 19 | 37 | 2 | 5 |
| The cost of the BBC licence works out at 40p a day. Do you think this is good or bad value for money? | 44 | 16 | 36 | 3 | 8 |
| A basic Sky TV package (excluding sports and movie channels) costs £19.50 a month. The cost of the BBC licence works out at £12.13 a month. Do you think the BBC licence fee is good or bad value for money? | 36 | 15 | 46 | 3 | -10 |
| A Sky TV package including sports and movie channels costs £55.75 a month. The cost of the BBC licence works out at £12.13 a month. Do you think the BBC licence fee is good or bad value for money? | 43 | 19 | 35 | 2 | 8 |
| [Preceded by a question about satisfaction with BBC services] The cost of the BBC licence works out at £12.13 a month. Do you think this is good or bad value for money? | 42 | 23 | 32 | 3 | |
| [Preceded by a question about satisfaction with BBC services] The cost of the BBC licence works out at 40p a day. Do you think this is good or bad value for money? | 51 | 21 | 26 | 3 | |

The table shows what we found. When we asked the simplest question – 'overall, do you think the BBC licence fee is good or bad value for money?' – we found equal numbers replied 'good' and 'bad'. This yields a net score (good minus bad) of zero. However, when we started adding information, the figures changed. Reminded that the licence fee costs £145.50 a year, people told us by two-to-one that it was bad value – a net score of minus 27. But when we divided the same annual sum into smaller timescales, responses became steadily more positive. The net score is plus 8 when people are told that the licence fee works out at 40p per day.
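
The arithmetic behind the four cost framings, and the net scores above, can be reproduced in a few lines. The sketch below is purely illustrative – it is not YouGov's analysis code, and the round-halves-up convention (so that £145.50 ÷ 12 = £12.125 becomes £12.13) is my assumption about how the published figure was derived:

```python
# Illustrative sketch, not YouGov's analysis code: reproduce the cost
# framings and the net scores quoted above from the published figures.
from decimal import Decimal, ROUND_HALF_UP

ANNUAL_FEE = Decimal("145.50")  # BBC licence fee, pounds per year

def round_money(amount, unit=Decimal("0.01")):
    """Round to the nearest penny, halves up (assumed: 12.125 -> 12.13)."""
    return amount.quantize(unit, rounding=ROUND_HALF_UP)

print(f"per month: £{round_money(ANNUAL_FEE / 12)}")                       # £12.13
print(f"per week:  £{round_money(ANNUAL_FEE / 52)}")                       # £2.80
print(f"per day:   {round_money(ANNUAL_FEE * 100 / 365, Decimal('1'))}p")  # 40p

# Net score = % saying "good value" minus % saying "bad value" (table above).
framings = [
    ("no cost given",  39, 39),
    ("£145.50 a year", 27, 54),
    ("£12.13 a month", 37, 44),
    ("£2.80 a week",   42, 37),
    ("40p a day",      44, 36),
]
for label, good, bad in framings:
    print(f"{label:>15}: net {good - bad:+d}")  # +0, -27, -7, +5, +8
```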

Next we added in the cost of a Sky package, quoting figures (as Sky does) as monthly subscriptions. Again the results varied. When told that a basic Sky package costs £19.50 a month and the BBC licence fee works out at £12.13 a month, people give Sky a positive net score and the BBC a negative one. But when we mention that a premium Sky package (including sports and movies) costs £55.75 a month, the BBC’s net score turns positive – and Sky’s turns negative.
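
Could gaps of this size simply be sampling noise? With separate samples of 2,000 or more, no. Here is a back-of-envelope sketch; it assumes simple random sampling (YouGov's actual samples are weighted, which would alter the exact margin somewhat, though not the conclusion):

```python
# Back-of-envelope check, assuming simple random samples of n = 2,000 each:
# is the gap between two question framings bigger than sampling error?
from math import sqrt

def se_diff(p1, p2, n1=2000, n2=2000):
    """Standard error of the difference between two independent proportions."""
    return sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)

# % saying "good value": 36% with the basic-Sky comparison,
# 43% with the premium-Sky comparison (figures from the table above).
p_basic, p_premium = 0.36, 0.43
gap = p_premium - p_basic
margin = 1.96 * se_diff(p_basic, p_premium)  # ~95% margin for the gap
print(f"gap = {gap:.0%}, margin = ±{margin:.1%}")
# gap = 7%, margin = ±3.0% -> the wording effect is well outside sampling noise.
```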

Finally, we asked a warm-up question – how people feel about the service provided by the BBC. Three-quarters of the public say they are satisfied. Having expressed their satisfaction, people are more likely to say the licence fee is good value. When the satisfaction question is linked to the 40p-a-day cost, the net score soars to plus 26.

In short, responses range from two-to-one saying good value to two-to-one saying bad value. Critics will say this renders such polls meaningless. I disagree. The figures tell us something significant about the way the coalition appeared to have public support when it froze the licence fee last year. One reason ministers succeeded was that the public debate centred on the high-sounding annual fee rather than its more digestible daily equivalent.

The deeper point is this. A typical poll question gives people between two and five answer options. Except for respondents who answer “don’t know”, their views are reduced to a single, simple word or phrase – “doing well”, “oppose”, or whatever. The results frequently arouse media interest. Indeed, we are often commissioned to ask blunt questions in order to generate bold headlines and stark findings: this politician is hated, that cause is doomed.

It’s not that these headlines are wrong, but they are often too crude. A single question, or even a short sequence of questions, will seldom tell us all we need to know. One respondent may be knowledgeable, passionate and certain in her opinions; another misinformed, indifferent and uncertain. Further thought, or the injection of particular facts, will cause one person to change their mind and leave another unmoved. Unless we explore such matters, we are missing half the story.

Recently YouGov asked one of its routine questions about Nick Clegg and found him less unpopular than usual. How come? This time we preceded the question with others about him and his party. I believe this affected his rating. We should not label one result as 'right' and the other 'wrong'; rather, the two sets of figures together tell us something important: when normal folk spend a little time thinking about the Lib Dem leader, some of them start to warm to him. This is just one reason why the trajectory of British politics over the years ahead could surprise us all.

A more sustained example of polling variation took place last winter, ahead of this year’s referendum on the Alternative Vote. Until the final few weeks, YouGov’s standard question included a brief description of AV and asked people whether they wanted to switch to this system or stick with First-Past-The-Post. We found the 'no' camp moving into the lead last September, and never losing it after that.

Other companies asked a simpler question, with no explanation. They showed 'yes' staying ahead until March. I lost count of the number of times 'yes' supporters complained to me about YouGov’s approach. In terms of technical purity, they had a case. By injecting information we altered the way some people responded. Indeed, I am sure that is what happened – and that is precisely what made our polls last winter a more useful guide to the eventual outcome.

Nobody should have been surprised. Other research showed widespread ignorance of how AV worked. This meant that simpler questions, asked months ahead of the referendum, were lousy predictors of what would happen. Think about the process: people who were minding their own business when the phone rang had to give a view on a subject they neither understood nor cared about. Many thought the choice was between the status quo and something vaguely better. As knowledge of AV spread, support evaporated. By adding a brief explanation, our question effectively anticipated this trend.

None of this will surprise our corporate clients who commission market research. They pay for insight, not headlines. They want to understand what makes their customers really tick. At the risk of offending people who dislike applying commercial notions to the great debates about values and ideology, I contend that we should explore how voters make their political choices at least as thoroughly as the way consumers choose their brand of dog food.

This commentary appears in the November issue of Prospect magazine.

See the survey details and full results here