Trolling The Polling

Paul Mirengoff over at Powerline is among those taking issue with the latest CBS News/New York Times poll, particularly regarding the questions asked about the wiretapping issue:
A New York Times/CBS poll purports to find that support for the president's warrantless surveillance program is "mixed." But the poll is bogus. In addition to the usual MSM trick of over-sampling Democrats, including non-voters, etc., the pollsters asked misleading questions that do not reflect the actual nature of the NSA intercept program.
PE has addressed the weighting issue before; you can read the entire post for more, but here's the basic explanation from it:
The issue of weighting in polling is much discussed and debated, and there is plenty out there to consider. Mystery Pollster has some of the most in-depth discussion on all aspects of polling (you can find out all about weighting here). The issue of weighting is, at its root, about adjusting data to reflect the most accurate picture possible. There are plenty of arguments about how to do that, but for our purposes here, let's stick to how CBS News does it.

A CBS polling primer answers the question of whether "our respondents look like the American public":

At the end of our surveys, we find sometimes that we have questioned too many people from one group or another. Older people, for example, tend to be at home to answer the phone more than younger people, so there is often a greater percentage of older people in our surveys than exists in the American public.

When that happens, we take great pains to adjust our data so that it accurately reflects the whole population. That process is called "weighting." We make sure that our final figures match U.S. Census Bureau breakdowns on age, sex, race, education, and region of the country. We also "weight" to adjust for the fact that people who share a phone with others have less chance to be contacted than people who live alone and have their own phones, and that households with more than one telephone number have more chances to be called than households with only one phone number.

So when we add up all the answers to our questions, we know that no one's opinion counts for more than it should. When you see one of our poll results on TV or in the newspaper, you know that it does not show the opinions of only one or two groups of Americans.

So how does a poll end up with a sample that is 35% Democrats, 41% Independents, and 24% Republicans? Kathy Frankovic, Director of Surveys for CBS News, explains that there is no adjusting for party identification in the CBS News polls and that the weighting described above accounts for the shift in party ID. Frankovic notes: "The people who tend to be under-represented in a sample … tend to be younger and tend to be more minorities. So that by assuring that the sample looks like the country, you're probably going to almost always increase the number of Democrats." In short, when the sample is adjusted to match the Census Bureau data, the party ID percentages change.
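The mechanics Frankovic describes can be sketched with a toy example: each respondent gets a weight equal to the ratio of their group's population share to its share of the raw sample, and party ID shifts as a side effect. All numbers below are invented purely for illustration; they are not actual CBS or Census figures.

```python
# A minimal sketch of post-stratification weighting on one variable (age).
# All figures are hypothetical, chosen only to show the mechanism.

sample_counts = {"18-44": 350, "45+": 650}        # raw phone sample (over-samples older people)
population_share = {"18-44": 0.55, "45+": 0.45}   # Census-style population target

n = sum(sample_counts.values())

# Weight for each group = (population share) / (sample share).
weights = {
    group: population_share[group] / (sample_counts[group] / n)
    for group in sample_counts
}

# Hypothetical rate of Democratic identification within each age group:
dem_rate = {"18-44": 0.40, "45+": 0.30}

# Democratic share of the sample, before and after weighting:
unweighted = sum(sample_counts[g] * dem_rate[g] for g in sample_counts) / n
weighted = sum(sample_counts[g] * weights[g] * dem_rate[g] for g in sample_counts) / n

print(f"unweighted Dem share: {unweighted:.1%}")  # 33.5%
print(f"weighted Dem share:   {weighted:.1%}")    # 35.5%
```

No one touched the party ID numbers directly: restoring the younger respondents to their Census proportion raised the Democratic share on its own, which is exactly the effect Frankovic describes.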

That's a lengthy explanation for a very complex issue; check out Mystery Pollster for more. As to the question of wording, the issue may be as much in how the results were portrayed as in the results themselves, since the poll did find that 68% answered "willing" to the following question (as Mirengoff notes): "In order to reduce the threat of terrorism, would you be willing or not willing to allow government agencies to monitor the telephone calls and e-mails of Americans that the government is suspicious of?"