If polls that seem to be similar yield different results, you've got to find out why. And this week, we've seen different results from several polls that apparently asked the same "horserace" question: Who's ahead - Clinton or Obama?
Three polls, all taken on the same days, came to different conclusions. The USA Today/Gallup poll, conducted Thursday to Saturday, found Clinton leading Obama 51 percent to 44 percent among "Democrats and Democratic-leaning independents." A Gallup tracking poll found Obama ahead among "Democratic and Democratic-leaning voters." The CBS News/New York Times poll gave Obama a double-digit lead among "Democratic primary voters." Even in that poll, if we look only at registered voters who say they are Democrats, we find yet another result: Clinton 45 percent, Obama 44 percent.
We can attempt to account for these differences in several ways. One is obvious: sampling error. None of the polls has a sample size large enough to overcome appropriate skepticism about its conclusions. The CBS News/Times poll, for example, has an especially large possible sampling error: it includes just 283 "Democratic primary voters," for a possible error of six points either way.
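That six-point figure follows from the standard margin-of-error formula for a proportion at 95 percent confidence. A minimal sketch, assuming the conventional worst case of a 50/50 split (which maximizes the variance):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of the 95% confidence interval for a proportion p
    estimated from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# 283 respondents, worst-case p = 0.5
moe = margin_of_error(283)
print(round(moe * 100))  # about 6 percentage points
```

Real polls also carry design effects from weighting and clustering, so published margins are often a bit larger than this simple-random-sample calculation suggests.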
But sampling error doesn't always explain everything. There also were slight differences in exactly when the polls were conducted. If there was any trend at work in support for one candidate or the other, the poll with the greatest number of interviews conducted on the last day of interviewing might find more of it. For example, the Gallup tracking poll, according to USA Today, had proportionally more interviews conducted on Saturday than the USA Today/Gallup poll did.
There were also differences in how questions were worded. The CBS News/New York Times poll asked: "Who would you like to see the Democratic Party nominate as its presidential candidate in 2008 - Hillary Clinton or Barack Obama?" The Gallup tracking poll asked which candidate "you would be most likely to support for the Democratic nomination for president in 2008."
Also important was where a question was positioned in the questionnaire. Did it come before or after any substantive questions? Did any questions ask respondents to assess the candidates, or to give a favorable or unfavorable view of them, before asking their preference? As I wrote last week, question-order issues can be significant.
But probably the most telling difference was in which groups were asked the question. And on this subject, CBS News and The Times are the outliers - and have been for years.
Unlike Gallup and many other survey organizations, we don't use the answers people give to a question about party identification to decide who should be asked about their preference within a party. For example, most polls ask a question like this: "Generally speaking, do you usually consider yourself a Republican, a Democrat, an Independent, or what?" And those polls use the answers to define the group that matters for analysis. We ask instead about voters' intentions. Starting more than a year before a presidential election, we ask registered voters this question: "Next year, are you more likely to vote in a Democratic presidential primary or caucus, or a Republican primary or caucus, or aren't you likely to vote in a primary or caucus at all?"
In the past, that question has served very well through primary seasons much shorter than this one (when all we need to do is change "next year" to "this year"). Early voting states like Iowa and New Hampshire contain relatively few voters, and usually we know who the nominees will be after the first few big primary days. So when we have to alter the question to include those who may have already voted, relatively few people are affected.
This year, of course, the primary season has been quite different. More people have voted - and voted earlier than ever. And the nomination fight has remained unresolved. So our question sounded more complicated: "Have you already voted in or do you plan to vote in a Democratic primary or caucus this year, or in a Republican primary or caucus, or are you not voting in a primary or caucus at all this year?" That is the question that gave Obama the 50 percent-38 percent lead over Clinton in our poll.
So when it comes to the Democratic "horserace" question, we are in the process of moving away from the definition we have been using all along to measure preference. That's why in the CBS News poll release (PDF), we reported the preferences of voters who identified as Democrats before we talked about "Democratic primary voters" in the aggregate. We've never had a race like this. By May of every other presidential-election year, the nominees have always been clear. This year, our more complicated question has had to be put to more respondents than ever. And we don't really know if people are still responding accurately.
About a quarter of our Democratic primary voters don't call themselves Democrats in the party identification question. And about 16 percent of registered Democratic identifiers aren't Democratic primary voters. The results for the Democratic horserace among all registered voters who call themselves Democrats (45 percent Clinton, 44 percent Obama) do look more (though not entirely) like results from the other polls.
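Those two percentages describe overlapping but distinct groups. A hedged illustration with hypothetical counts (not the actual CBS News sample sizes) of how the two definitions carve up the same respondents:

```python
# Hypothetical counts chosen only to match the article's percentages.
primary_voters = 400      # respondents classified as "Democratic primary voters"
identifiers = 360         # respondents who call themselves Democrats

not_identifying = 0.25    # ~a quarter of primary voters aren't self-identified Democrats
not_voting = 0.16         # ~16% of identifiers aren't Democratic primary voters

# Primary voters who fall outside the party-identification definition:
outside_id = primary_voters * not_identifying
# Identifiers who fall outside the primary-voter definition:
outside_vote = identifiers * not_voting

print(round(outside_id))    # 100 primary voters aren't self-identified Democrats
print(round(outside_vote))  # 58 identifiers aren't primary voters
```

With groups this different in composition, the two definitions can plausibly yield a double-digit Obama lead in one and a near-tie in the other.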
A focus only on the differences in the national horserace question distracts us from the similarities in the poll results. All of the national polls show that perceptions of Obama have been hurt by the statements of his former minister, the Rev. Jeremiah Wright, and there is evidence both for and against a rebound. Fewer people now than before think he would unite the country; fewer now say he shares their values. Among people who think the Wright affair matters (that number varies, and more voters think it will affect other people than think it will affect themselves), the impact is negative.
We often don't look beyond the horserace. But we should.
By Kathy Frankovic