Exit polls have been around for more than four decades. What do they tell us, and what questions can't they answer?
We know exit polls tell us about the desires of the voters. For example, if we examine the 2004 NEP national exit poll -- the pooled operation of five television networks and the Associated Press -- we find more than just the simple fact that George W. Bush won. Even though the president claimed that election was an "accountability moment" on Iraq, and the voters chose his approach, the polls showed a nation divided, one that might not have the patience or the trust required to stick with the president on the war.
Even on the narrow question of approval of the decision to go to war in Iraq in the first place, voters were closely divided -- 51 percent approved, but 45 percent did not. Assessments of what was taking place on the ground at the time were negative -- 44 percent said things were going well, 52 percent said they were not. And had the war made the U.S. more secure? No, said 52 percent (46 percent said yes).
On none of these questions did more than 52 percent of voters agree. And when voters were asked which issue mattered most in their vote, the 15 percent who chose Iraq mostly didn't vote for Bush -- by nearly three to one (73 percent to 26 percent), they supported John Kerry.
Exit polls give voters themselves a voice in helping us understand what an election means (without exit polls, of course, the winners can interpret election results any way they like).
But people have wanted to use exit polls for other purposes, too, most notably to assess whether or not vote fraud has taken place. Some of the claims made about the 2004 U.S. election are summarized by Mark Blumenthal in multiple postings at pollster.com. But such claims are much more common in developing democracies.
Are exit polls more accurate than the vote? Although it is tempting to use exit poll results as a way of validating election returns, it's not that simple. The same sort of information you need to judge the quality of a regular poll must be applied to exit polls. The hard part sometimes is finding that information. Absent it, we need to remain skeptical about an exit poll; we can't really judge it.
Information about exit poll methodology in the U.S. usually is accessible. In fact, there is already an FAQ about what we can expect for 2008 exit polling. But that's often not true elsewhere.
There are some famous examples of exit polls being used to validate elections: the 2004 Venezuela referendum to recall Hugo Chavez, where an exit poll found a sizable "yes" vote while the official count found the opposite, and the Ukrainian presidential election that same year, where the first round brought charges of fraudulent vote counting.
But detailed information about those exit polls is often unavailable, making those judgments difficult if not impossible. What do we need to know? At the very least, we need to know something about the sample and something about the interviewing.
First, the sample could misrepresent the country. For example, if an exit poll in New York State had too many precincts in New York City, it would probably overstate the Democratic vote (New York City gave 75 percent of its vote to Kerry in 2004). Exit polls in parliamentary systems, like most in Western Europe, purposely over-sample in competitive districts. In trying to project the distribution of seats, they have little need to focus on areas where the outcome is expected to be an easy win for one party or the other, and they disclose publicly what they have done. But consider Ukraine, where the Western portion (near Poland, Slovakia and Hungary) supported candidates like reformer Viktor Yushchenko (of the Orange Revolution) and the Eastern portion (closer to Russia) supported the pro-Russian (and former communist) Viktor Yanukovych. There, it may be difficult to discover whether the sample adequately allocates voters proportionately between the country's two political components.
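The New York example above can be sketched in a few lines. This is a hypothetical illustration, not any pollster's actual weighting method: the 75 percent NYC figure comes from the 2004 result cited above, but the rest-of-state Democratic share and the precinct weights are invented for demonstration.

```python
def estimated_dem_share(nyc_weight):
    """Blend city and rest-of-state Democratic shares by their weight in the sample."""
    nyc_dem = 0.75      # NYC's 2004 Kerry share, as cited in the text
    upstate_dem = 0.45  # hypothetical rest-of-state Democratic share
    return nyc_weight * nyc_dem + (1 - nyc_weight) * upstate_dem

# Suppose (hypothetically) NYC casts 40 percent of the statewide vote,
# but the poll's precinct sample gives it 55 percent of interviews.
true_share = estimated_dem_share(0.40)    # 0.57
biased_share = estimated_dem_share(0.55)  # 0.615
print(round(true_share, 3), round(biased_share, 3))
```

Under these invented numbers, the over-sampled poll overstates the Democratic vote by about 4.5 points before any weighting correction -- which is why knowing how precincts were selected matters.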
Second, the interviewers might affect the results. Are they professional interviewers or volunteers? Volunteers might have a particular political point of view, and might cause unexpected results. This can happen even when the interviewing force is trained professionally. In the 2004 U.S. Election Day exit poll, younger interviewers on average had more refusals and more error than older interviewers: exit poll votes in their precincts were farther off from the tabulated vote. Could volunteer interviewers in Venezuela, many of whom were associated with a group opposed to the Chavez regime, have affected those polling results?
Of course there is much more to know to judge an exit poll. The World Association for Public Opinion Research has been working on guidelines for exit polls that can be applied everywhere, and there is still much to do in this area. But it's important to remember that while exit polls do important things, a flawed exit poll can't correct a flawed vote count.
By Kathy Frankovic