What's the best way to measure change?
One way, of course, is to survey repeatedly, asking the same questions. That's what tracking polls do. Tracking polls come from the world of politics - campaign managers want to see the impact of advertising and campaign events on a candidate's status.
Early campaign tracking polls told a story journalists wanted to tell. In the fall of 1980, public pollsters stopped interviewing a few days before the election. All of them showed the race between Jimmy Carter and Ronald Reagan was "too close to call." But pollsters for both candidates, who were conducting tracking polls, saw something different: a movement toward Reagan in the last days of the campaign.
That was too good for journalists to ignore. ABC News tracked Democratic voter movement between Walter Mondale and Gary Hart before the 1984 New Hampshire primary; and while their final news release didn't quite predict Hart's 12-point victory there, it certainly showed a distinct trend in Hart's favor between the Iowa caucuses and the New Hampshire primary.
Public pollsters use tracking polls when campaigns seem volatile, such as the periods between primaries, during and immediately after party conventions, and at the end of the general election campaign. In 1992, the Gallup Organization (with CNN and USA Today) tracked opinion every day of the fall campaign. This year they are producing results each day, which they call the "Daily Gallup." (There could be an insiders' joke in that, as the tracking poll is measuring the "horse race.")
But it can sometimes be difficult in a tracking poll to distinguish real change from sampling error. Pollsters and reporters may be tempted to "explain" a small change by pointing to campaign events, but it is usually more likely that the "change" is merely a result of random fluctuations inherent in any survey.
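The point can be illustrated with a small simulation - a sketch, not any actual poll's design, with the sample size and true support level chosen purely for illustration. Even when a candidate's true support never moves, the daily readings drift by a few points on their own:

```python
import random

# Simulate a tracking poll where true support never changes.
# TRUE_SUPPORT and N are illustrative assumptions, not any pollster's design.
TRUE_SUPPORT = 0.50   # candidate's true share, held constant for a month
N = 1000              # interviews in each daily sample

random.seed(2008)

daily_estimates = []
for day in range(30):
    # each respondent independently supports the candidate with probability 0.50
    supporters = sum(1 for _ in range(N) if random.random() < TRUE_SUPPORT)
    daily_estimates.append(100.0 * supporters / N)

# Even with zero real change, the day-to-day readings wander.
spread = max(daily_estimates) - min(daily_estimates)
print(f"high {max(daily_estimates):.1f}%, low {min(daily_estimates):.1f}%, "
      f"spread {spread:.1f} points")
```

With 1,000 interviews a day, the spread between the highest and lowest daily readings over a month is typically several points - movement a reader could easily mistake for a campaign event's effect.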
Mark Blumenthal, at pollster.com, has looked at this year's Daily Gallup and argues that there has not been any real change in Democratic preferences for Hillary Clinton or Barack Obama since Super Tuesday, February 5th - and that much of the movement in the Gallup tracking poll can be explained by sampling error alone. (See the argument here - it has generated much discussion on his blog, and has yielded at least one update.)
Tracking polls aren't very interesting when there isn't much change. By the last two weeks of the 1984 campaign, there was little movement to be seen - Ronald Reagan was far ahead of Walter Mondale. In more recent elections, there have been only marginal changes in the last week. George W. Bush and Al Gore were fighting to a draw in 2000, with polls suggesting only that the race was close, and the Bush lead over John Kerry in tracking polls in the fall of 2004 was always small.
Sometimes there is real change, but it is short-lived. Everyone talks about a "convention bounce," where candidates pick up support after their party's convention. But the candidate whose party holds its convention first almost always loses that bounce by the end of the other party's convention - and sometimes even sooner. Changes that appear after the candidates debate sometimes stabilize (George W. Bush gained on Al Gore in 2000, and held on to most of it) and sometimes don't (Walter Mondale might have cut into Reagan's lead after their first debate, when Reagan's age briefly became a factor in the campaign, but that didn't last long).
This year, the possibility of political change itself became a story. We were conducting a CBS News Poll in mid-March as the controversy grew over Rev. Jeremiah Wright's videotaped sermons, and we saw an increase in Barack Obama's unfavorable ratings. Thirty percent of registered voters said they had an unfavorable view of him, up from 23% in February. But still, more voters were positive about him. And there was a subsequent event that could stimulate change again: Obama's March 17 speech on Rev. Wright and on race in America.
Tracking polls have limitations. CBS News sometimes uses a different way of measuring change - one that looks more closely at which people change their minds, in order to better understand why they changed. It can be more accurate than asking voters to pinpoint the moment when their opinions changed. On March 20, three days after Obama's speech, CBS News called back more than half of those people whom we originally interviewed. That technique - often called a "panel survey" - is one we can use to see who changes, and that can help us figure out why. Such a questionnaire can be very short - demographic questions were asked in the original interview, so they don't need to be asked again - and there is no need to ask if someone's opinion has changed, because you can look at the previous response and see that it has.
We found no sizeable overall change in Obama's rating after his speech, though there was a lot of internal movement. Fifteen percent of those who had a favorable view of Obama before his speech did not maintain it afterwards in the second interview. Twenty-three percent of those whose opinions were unfavorable also changed. Most of those who changed their opinions went into the "undecided" category, suggesting that the speech had given them something new to think about. Forty-two percent of those who had been undecided before the speech said they had an opinion after - but they were almost evenly divided between positive and negative opinions. The impact was greater with some groups than with others - women were more likely than men to move into the "undecided" category after the speech. Obama lost a little ground with independents, and even with Democrats. Republicans had been unfavorable before, and they remained that way.
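The panel's bookkeeping can be sketched as a simple transition-matrix calculation. All of the numbers below - the starting distribution and the switching rates - are hypothetical, chosen only to echo the pattern described above; they are not the actual CBS News panel results:

```python
# Hypothetical panel-survey arithmetic: apply switching rates observed between
# the two interviews to the pre-event distribution of opinion.
# Every figure here is an illustrative assumption, not real CBS News data.

pre = {"favorable": 0.45, "unfavorable": 0.30, "undecided": 0.25}

# rows: opinion in the first interview; columns: opinion in the callback
transitions = {
    "favorable":   {"favorable": 0.85, "unfavorable": 0.03, "undecided": 0.12},
    "unfavorable": {"favorable": 0.05, "unfavorable": 0.77, "undecided": 0.18},
    "undecided":   {"favorable": 0.21, "unfavorable": 0.21, "undecided": 0.58},
}

post = {state: 0.0 for state in pre}
for before, share in pre.items():
    for after, rate in transitions[before].items():
        post[after] += share * rate

for state in post:
    print(f"{state}: {pre[state]:.0%} -> {post[state]:.0%}")
```

Run with these made-up rates, the overall shares barely move even though a sizeable fraction of individual respondents switched categories - exactly the pattern a repeated cross-section would miss and a panel survey reveals.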
Panel surveys are a great tool - different from tracking polls, and very useful after some major event. We've used them following debates and even after wars have begun (tracking which people who had opposed a war beforehand changed their minds once it started), and we will certainly use them again.
By Kathy Frankovic