One of the toughest challenges pollsters face is asking about some emerging yet complex public policy issue in just a question or two. Challenging or not, it is a task that faces pollsters more frequently as our clients -- be they news organizations or private interests -- try to get a jump on the latest breaking news. Yet when the subject is complex and obscure, our questions do not gather reports of pre-existing opinions as much as responses to new information conveyed in the questions. Such a process was at work in polls conducted over the last two weeks about recent disclosures of a National Security Agency (NSA) program to gather domestic telephone records, making for quite a bit of partisan trashing of polls from all corners of the blogosphere.
In all of this, most of us overlooked one of the most important lessons of public opinion research, which is to avoid taking the results of any one question at face value, to avoid treating any single question as providing a singular version of "the truth."
On the NSA phone records issue, I definitely include myself among those who jumped to conclusions too quickly. On the morning of Friday, May 12, ABC News and The Washington Post released results of a short set of questions on the NSA program asked the previous evening. Although the interviews had been conducted in just one night, less than 24 hours after the story had broken, using a smaller than usual sample size (502 adults), the results appeared to be one-sided and clear-cut. As the Post's front-page lead put it, "63 percent of Americans said they found the NSA program to be an acceptable way to investigate terrorism, including 44 percent who strongly endorsed the effort."
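For readers curious about what that one-night sample of 502 implies, here is a minimal sketch of the standard 95% sampling margin-of-error calculation for a proportion. It assumes simple random sampling and ignores design effects from weighting, so real-world error would be somewhat larger; the point is simply that sampling error alone runs to roughly plus or minus four and a half points for a sample this size, far smaller than the 22-point spread between the Newsweek and Post/ABC results.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% sampling margin of error for a proportion.

    n -- sample size
    p -- assumed proportion (0.5 is the conservative worst case)
    z -- critical value (1.96 for a 95% confidence level)
    """
    return z * math.sqrt(p * (1 - p) / n)

# The one-night ABC/Post poll interviewed 502 adults.
moe = margin_of_error(502)
print(f"+/- {moe * 100:.1f} points")  # about +/- 4.4 points
```

Since question wording shifted results by roughly 20 points, wording effects here dwarf anything attributable to sampling error.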
Looking for a quick blog item on a busy day, I took the results at face value and immediately leapt to speculation about the potential implications of the still-breaking NSA records story for future trends in President George Bush's job rating.
Not so fast.
A day later, a poll by Newsweek showed a very different reaction to the NSA program, and subsequent surveys conducted by USA Today/Gallup, CNN, CBS News and Fox News found results that fell somewhere in between. Support for the NSA program ranged from 41% on the Newsweek poll to 63% on the Post/ABC poll.
With the release of each new poll -- especially the first few surveys -- the partisan political blogosphere reacted as it often does. Partisans on the left and right attacked poll results that contradicted their own views as biased, flawed or methodologically questionable, yet tended to accept more favorable results at face value. Reacting to the first ABC/Washington Post poll, comments on my own site objected to the emphasis on "investigating terrorism" in the ABC/Post question and the lack of reference to the alleged illegality of the program. They described the poll variously as "utter nonsense" based on a "false dichotomy," "bogus and crafted to skew the results" and "a pathetically obvious attempt by the right-wing media to inoculate President Bush against possible articles of impeachment."
The release of contradictory results by other pollsters provoked similar reactions from conservative blogs. Under the headline "Creatively Crafting Crafty Questions For Fun and Profit," blogger Dafydd noted that the ABC/Post poll question explained the purpose of the NSA program ("an effort to identify possible terrorism suspects") while the Newsweek question did not, an omission he termed "blatant." On Polipundit, blogger "Jayson" dismissed the USA Today/Gallup poll as one "derived largely from stoned couch potatoes who actually speak to pollsters on Friday nights and over weekends" that "had a margin of error of + or - infinity."
Yet for all the invective, bloggers did collectively identify key differences in question wording that do seem to explain much of the variation in results. Review the full text of the questions above -- and in this case you really should, either with the links above or on the more user-friendly summary page provided by The Polling Report -- and a pattern emerges that helps explain the variation. The more a survey emphasized the need to "investigate terrorism" as a rationale for the program, the greater the support. The more it emphasized the potential invasion of privacy, the greater the opposition.
So in a sense, in the debate over the merits of the various polls, everyone is right and everyone is wrong. All of the critiques contain some kernel of truth. The differences in question wording and order certainly influenced the result, and all of the questions probed an initial reaction based on limited awareness of the program itself.
Yet much of this debate rests on a false assumption: that a singular truth exists somewhere (or will eventually exist), perhaps out of reach of any poll, about whether Americans support or oppose the NSA records program. In reality, many Americans will never become fully informed about the NSA program and may never make a truly informed judgment. The "truth" of public opinion in this case is more likely a sometimes conflicting mosaic of more general attitudes that shape the way Americans react to specific political arguments about the NSA program.
If we read a bit beyond the face value of each poll result, we can see evidence of a large number of Americans who feel strongly cross-pressured on issues of protecting privacy and investigating terrorism. The most important point is not that one question or poll is "right" or "scientific" and the other "flawed," but that roughly 20% of Americans offered conflicting assessments of the NSA phone records program depending on the wording of the questions.
The most clear-cut evidence of the underlying cross-pressure comes from two questions with nearly identical wording (Q17 & Q18) asked on last week's CBS News poll. Asked whether, "to reduce the threat of terrorism," they would be willing to allow the government to monitor the communications of "Americans the government is suspicious of," 69% said yes. At the same time, on the nearly identically worded companion question, only 30% would be willing to allow the government to monitor the communications of "ordinary Americans on a regular basis" in order to combat terrorism.
The lesson we could all stand to learn here is that on issues of public policy no single question provides a precise, "scientific" measure of the truth. The most accurate read of public opinion usually comes from comparing the sometimes conflicting results of many different questions on many different polls and understanding the reasons for those differences.