Polling seems easy. Write questions. Make sure the options are balanced and the choices cover the full range of possible answers. Train interviewers to ask the questions as written, and tally the results.
But if you don't also ask those questions in the right order, things can get complicated.
In 1995, CBS News asked for opinions about the leaders of the Republican-controlled Senate; we learned that opinions of Senate Majority Leader Bob Dole could be quite different, depending on whom we had asked about just before we asked about Dole.
Republicans had won control of both houses of Congress the previous November. In a January 1995 poll, 20 percent had a favorable opinion of Dole and 21 percent an unfavorable one. But the next month, positive feelings about Dole shot up to 40 percent, while unfavorable opinions stood at 23 percent.
Yet we had asked the same question in both surveys: "Is your opinion of Bob Dole favorable, not favorable, undecided, or haven't you heard enough about Bob Dole yet to have an opinion?"
Something had changed, but we weren't sure what it was, until we considered question order.
In the January poll, the Bob Dole question followed a series of questions about Bill Clinton; in February, it followed a question about Pat Buchanan, who merited only a 10 percent favorable rating from the public. Even at Clinton's lowest points, after the Republicans' Senate takeover, he was far more favorably viewed than Buchanan. Some people were basing their judgment of Dole on how it contrasted with their judgment of the person we had just asked about. Dole wasn't as popular as Clinton was, but he was a whole lot more popular than Buchanan!
The survey world produces many examples of order affecting results, sometimes even for questions we think tap very basic, strongly held opinions. Take abortion, for example. Responses indicating support for legal abortion in general can change if we first ask about support for legal abortion in specific circumstances. Once people allow for abortions in specific circumstances, they are less willing to say they approve of abortion in general.
Sometimes apparently minor differences in wording, made to accommodate changes in question order due to news events, will create large differences in responses. A telling example occurred in the summer of 1998. On August 17, President Bill Clinton gave a speech in which he finally admitted to a relationship with Monica Lewinsky that was "not appropriate." News survey operations quickly went into the field to determine the impact of the speech. (Even as the scandal grew, Clinton's approval ratings had remained fairly steady.)
During that year, the Gallup Organization had been sounding out public perception of Clinton in the same way it asked about other prominent individuals, using this question: "I'd like to get your overall opinion of some people in the news. As I read each name, please say if you have a favorable or unfavorable opinion of this person or if you have never heard of him or her. What is your overall opinion of Bill Clinton?" On the evening of Clinton's speech admitting responsibility, though, Gallup asked only about the President, boiling the entire question down to "Thinking about Bill Clinton as a person, do you have a favorable or unfavorable opinion of him?" The question immediately followed the question that usually begins news polls: "Do you approve or disapprove of the way … is handling his job as President?"
In a poll taken before the speech (Aug. 10-11), the CNN/USA Today/Gallup Poll reported a 58 percent favorable rating for Clinton, with 40 percent unfavorable. But in the post-speech Gallup Poll, opinion appeared to have nearly reversed: only 40 percent favorable and 48 percent not! It was tempting to think the public had finally turned against the President, once he made his admission.
To its credit, Gallup conducted a split-ballot experiment on the very next day. [I describe split-ballot experiments here]. Randomly selected halves of the sample were each asked the favorability question in only one of the two ways. The original way of asking the question produced results suggesting there had not been a big change: how the questions were introduced had made all the difference. By August of 1998, "Thinking about Bill Clinton as a person" was much less likely to yield a positive assessment than "What is your overall opinion of ….?"
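For readers who like to see the mechanics, the heart of a split-ballot experiment is just random assignment: each respondent hears one of the two wordings by chance, so the two half-samples are statistically interchangeable and any gap between their answers can be attributed to the wording itself. Here is a minimal sketch in Python; the wording labels, the paraphrased question text, and the assignment function are all illustrative, not Gallup's actual procedure:

```python
import random

# Paraphrased wordings from the Gallup example above (the labels are mine).
WORDING_A = ("I'd like to get your overall opinion of some people in the "
             "news. ... What is your overall opinion of Bill Clinton?")
WORDING_B = ("Thinking about Bill Clinton as a person, do you have a "
             "favorable or unfavorable opinion of him?")

def assign_split_ballot(respondent_ids, seed=None):
    """Randomly assign each respondent to wording 'A' or 'B'.

    Because the assignment is random, any difference between the two
    halves' answers can be blamed on the wording, not on who was asked."""
    rng = random.Random(seed)
    return {rid: ("A" if rng.random() < 0.5 else "B")
            for rid in respondent_ids}

# A made-up sample of 1,000 respondent IDs; each half hears one wording.
ballots = assign_split_ballot(range(1000), seed=0)
print(sum(1 for w in ballots.values() if w == "A"))  # close to 500
```

In a real survey the two halves would then answer their assigned question, and the analyst would compare the favorable percentages across the halves, exactly as Gallup did the day after the speech.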
(By the way, there is a great review of the 1998 questions from all the polling organizations in Jeffrey E. Cohen, Presidential Studies Quarterly, Vol. 29, 1999.)
The tricky issues of what-follows-what, and how that affects the answers, aren't simple. The safest way of minimizing any expected order effects is to rotate the order of the questions we ask about people and candidates. In fact, CBS News has been rotating question order when we ask voters whether they have a favorable or unfavorable view of the presidential candidates. We have used identical wording ("Is your opinion of [the candidate] favorable, not favorable, undecided, or haven't you heard enough about [the candidate] yet to have an opinion?"). The last choices ("undecided" and "haven't heard enough") are not universal among polling organizations. Their value is something we uncovered back in 1984, when we realized that respondents often wanted to say they were "undecided" about someone whom they had, in fact, heard about.
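Rotation can be as simple as reshuffling the candidate list independently for each interview, so that no candidate is systematically preceded by the same name. A minimal sketch, assuming a fresh random shuffle per interview (the candidate names come from this column, but the function and its details are illustrative, not CBS News's actual interviewing software):

```python
import random

# Hypothetical candidate list drawn from the column; the wording
# mirrors the CBS News favorability question quoted above.
CANDIDATES = ["Hillary Clinton", "Barack Obama", "John McCain"]

def rotated_questions(candidates, rng):
    """Shuffle the candidate order for one interview, then build the
    question text for each candidate in that order."""
    order = candidates[:]   # copy so the master list is untouched
    rng.shuffle(order)      # fresh random order for this interview
    return [
        f"Is your opinion of {name} favorable, not favorable, undecided, "
        f"or haven't you heard enough about {name} yet to have an opinion?"
        for name in order
    ]

# Each simulated interview gets its own ordering.
rng = random.Random()
for question in rotated_questions(CANDIDATES, rng):
    print(question)
```

Averaged over many interviews, each candidate is asked about first, second, and third roughly equally often, so any contrast effect from the preceding name washes out in the aggregate.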
In our latest CBS News/New York Times Poll, many registered voters do tell us they aren't sure what to think about the remaining candidates. Nearly everyone has heard "enough" about Hillary Clinton and Barack Obama, but 12 percent still haven't heard enough about John McCain. That probably reflects the more intense news coverage of the continuing Democratic battle. And just about one in five voters say they have heard enough but are not yet expressing an opinion. It's the critical independent voters, who will almost certainly decide the November outcome, who are most likely to be waiting to decide what they think about these candidates. And tracking them might not be as easy as simply ordering the questions!
By Kathy Frankovic