N.H. Polls: What Went Wrong?


By Kathy Frankovic, CBS News director of surveys.

After last week's New Hampshire primary, pollsters found themselves trying to understand "what went wrong." There were calls for investigations, and words like "problematic," "debacle" and "fiasco" were being tossed around. Of course, as one might expect, some of the people tossing those words around hadn't been engaged in pre-primary polling in New Hampshire, and some of their explanations aren't supported by data.

One of those explanations was that white voters over-reported their intention to vote for an African-American candidate. That theory was based on findings from some elections in the 1980s and 1990s in which black candidates ran against white candidates. Those elections, however, tended to occur in racially polarized settings (big cities like New York), and especially in general elections. Voters see things differently when race is a factor, as political scientist Keith Reeves demonstrated in this New Yorker column.

But the factor of race doesn't explain New Hampshire. The Democratic campaign is not racially polarizing now, and hopefully never will be. It is true, however, that when voters tell an interviewer what they are going to do, some of them may try to guess what the interviewer wants to hear and tailor their answer accordingly. The theory is that the respondent (white or black) might not want an interviewer to think they aren't voting for a black candidate. They might think the interviewer will take offense, or believe the respondent to be racist.

Taken to its extreme, this theory predicts that respondents who think they have socially unacceptable opinions -- or situationally unpopular opinions -- simply won't answer a questionnaire. Many of us think we have opinions other Americans might find offensive. Twenty years ago, CBS News and The New York Times asked Americans if they thought any of their opinions would be very offensive to most other Americans; 26 percent said they did. When asked what those opinions were, some did mention their racial opinions, but more cited religion and abortion and feelings about government policies.

In the 2004 elections, younger exit poll interviewers had a difficult time convincing voters, especially older voters, to fill out exit poll questionnaires. The popular belief that younger voters were more likely to be Kerry supporters may have helped explain why the exit polls understated support for George W. Bush. Clearly, at least some of those non-respondents thought that their opinions might be offensive to the interviewer.

There is data that can show whether voters interviewed before the New Hampshire primary responded at different rates depending on candidate preference, and that data can test the "race" theory. The theory would predict that those not voting for Barack Obama would be less likely to complete an interview. In November, Hillary Clinton held a 15-point lead in the CBS News/New York Times New Hampshire poll. We called back the same sample the weekend before the primary. The January response rates for November Obama and Clinton voters were similar: 74 percent for November Obama supporters and 68 percent for November Clinton supporters. Before publication of the results, we adjusted ("post-stratified") the results to account for that small difference in response by previous candidate preference, as is normally done in panel surveys. Correcting for that small difference in response changed little.
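The post-stratification adjustment described above can be sketched in a few lines of arithmetic. This is a minimal illustration, not the actual CBS News procedure: the November sample counts below are hypothetical round numbers, and only the two callback response rates (74 percent and 68 percent) come from the column.

```python
# Minimal sketch of panel post-stratification.
# Hypothetical November sample sizes by candidate preference:
november_counts = {"Obama": 400, "Clinton": 600}

# Callback response rates reported in the column:
response_rates = {"Obama": 0.74, "Clinton": 0.68}

# January respondents, by their November preference:
respondents = {c: november_counts[c] * response_rates[c]
               for c in november_counts}

# Each January respondent gets a weight that restores the November
# preference distribution: groups that responded at lower rates
# receive proportionally larger weights.
weights = {c: november_counts[c] / respondents[c]
           for c in november_counts}

for candidate, w in weights.items():
    print(candidate, round(w, 3))
```

Because the two response rates are close, the weights differ only slightly (about 1.35 versus 1.47), which is why the adjustment "changed little."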

There is a second theory: that pre-election polls didn't account for enthusiasm, that "likely voter" models need to capture it, and that the models applied before the election missed the difference in enthusiasm for the candidates. Well, maybe. The surge of enthusiasm for Obama in Iowa may have led some pollsters to open up their models to allow for voters who might otherwise be excluded, giving those who said they were probably going to vote, and those who may not have voted before, more of an opportunity to be part of the sample. Before Iowa, as reported in the CBS News pre-New Hampshire poll release, there was enormous enthusiasm among Obama supporters, enthusiasm that suggested more than just past voting behavior was at work in getting people to the polls.

But New Hampshire wasn't Iowa. In fact, if any of the candidates benefited from enthusiasm in New Hampshire, it was probably Clinton. Her supporters were 10 points more likely than Obama's to say that their choice was "a lot better" than any of the other Democratic candidates. More than half of her supporters in the CBS News poll had recommended her to others, compared with just four in ten Obama voters.

So this explanation might actually give us some understanding of what happened. But since all the polls were off in the same direction, while methods and practices of pollsters differed greatly, we may be forced to return to the simplest explanation, the one discussed in last week's column. It's the only theory that explains why most Republican pre-primary polls were accurate, while the Democratic estimates were not. Just as in some previous New Hampshire primaries (Democrats in 1984 and Republicans in 1988), that state's 2008 Democratic electorate was "fluid." Voters made decisions that were only temporary -- and were likely to change their decisions right up to the last minute. And women voters -- for whatever reason -- were the ones most ready to do that.
