How to Read Political Polls Like a Pro

During political campaigns, polls dominate the news. Here’s a guide to interpreting them rationally.

1. Take polls conducted more than six weeks before an election with a sack, not a grain, of salt. As Alexis de Tocqueville noted in 1840, Americans “find it a tiresome inconvenience to exercise political rights which distract them from industry. When required to elect representatives, to support authority by personal service, or to discuss public business together, they find they have no time.”

Then, as now, many citizens don’t really make up their minds about how they will vote in any but the most publicized races until shortly before the election. So early polls are prone to volatile, top-of-the-head responses; drawing firm conclusions from them is like nailing Jell-O to a wall.

2. Expand the margin of error by at least half, if not double it. The results of scientific polls are often presented as “a snapshot” of public opinion at a particular time, with an exact percentage of the population favoring something or someone and a precise margin of error.

But polling experts caution that the margin of error measures only one reason survey results may differ from true public opinion--sampling error. Further, this error margin rests on a small mountain of assumptions that are never met in the real world. A poll is never a “snapshot.” It’s merely an estimate. Not a photo, but a drawing. And it’s based on probability, not certainty.

When the gap between candidates--or sides of a ballot issue--is equal to or smaller than the margin of error, the poll can't really say who is ahead. In fact, unless the gap is at least twice the margin of error, there is still some chance the race is effectively tied. (That's because each side's estimate carries its own margin of error, so the error on the gap between them is roughly double the reported figure.)
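Here is a minimal sketch of that rule of thumb (the candidate shares and margin of error below are hypothetical, not from any poll cited here):

```python
# A rough rule of thumb from the text: the error on the *gap* between two
# candidates in the same poll is about twice the reported margin of error,
# because both estimates vary.

def lead_is_meaningful(share_a: float, share_b: float, moe: float) -> bool:
    """True only if the gap clearly exceeds the combined sampling error."""
    return abs(share_a - share_b) > 2 * moe

# Hypothetical example: 48% vs. 44% with a +/-3-point margin of error.
print(lead_is_meaningful(48, 44, 3))  # False -- a 4-point gap < 6 points
```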

Because there are so many reasons a poll may be off, average the results of similar polls rather than relying on a single survey whenever possible, suggests Prof. Steven Chaffee, an experienced academic pollster at the University of California, Santa Barbara.
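A back-of-the-envelope sketch of that advice, with made-up numbers:

```python
# Made-up numbers illustrating Chaffee's advice: a simple average of
# several comparable polls dampens any single survey's quirks.
polls = [47, 44, 49, 46]  # one candidate's support in four similar polls
print(f"Average support: {sum(polls) / len(polls):.1f}%")  # 46.5%
```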

3. Consider the source. If those who conduct or pay for the poll have something to gain from the outcome, exercise extreme caution. Results of a candidate’s own polls are notoriously unreliable. Warren Mitofsky, who directed polls for CBS News for 27 years, says news media shouldn’t even report results from partisan polls. Put the most trust in polls conducted by well-known firms and paid for by news media.

“The biggest problem I see is partisan polls being taken as factual information,” cautions Mark Baldassare, senior fellow at the Public Policy Institute of California and president of the polling firm Baldassare Associates. Partisan pollsters will ask a question multiple times, but only report the most favorable response, he explains. And they will use “push polling” in which respondents are read a hypothetical or negative campaign charge and then asked how they feel. “I see it all the time,” he adds.

4. Read any poll not based on a random sample with extreme caution. Many websites and newspapers conduct polls in which visitors are invited to “vote” on a question. Once you vote, you can see the results. Even when thousands have registered their opinions, such polls do not necessarily indicate opinions of the public at large. Only polls in which everyone has an equal chance of being selected permit such a projection.

5. Look at the wording of questions. The Field Poll found that a shorter description of a ballot initiative in the March primary was rejected by a large majority of respondents while a longer description of the same proposition was approved by a large majority. Even subtle differences in wording can have major effects.

6. Beware of sub-group comparisons. The margin of error reported in most news stories applies only to the entire sample, yet reporters often compare the positions of groups within the sample, such as Republicans and Democrats, men and women, Latinos, Anglos, Asian-Americans and African Americans, etc. The smaller the group, the larger its margin of error and the less reliable the comparison.
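A short sketch of how fast that error grows (the sample sizes below are illustrative, not from any cited poll):

```python
# Illustrative sample sizes: the 95% margin of error widens sharply as
# the subgroup being compared shrinks.
import math

for label, n in [("Full sample", 1000),
                 ("One party's voters", 400),
                 ("A smaller subgroup", 150)]:
    moe = 100 * 1.96 * math.sqrt(0.25 / n)  # worst case, p = 0.5
    print(f"{label} (n={n}): +/-{moe:.1f} points")
```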

7. Look at the number polled. If fewer than 600 persons respond to a political poll, it may not merit your attention. The margin of error will be so large that only big differences in opinion can be detected.

Polling is counter-intuitive. Accuracy generally depends on the size of the sample interviewed, not the ratio of the sample to the population at large. A random survey of 1,500 persons can estimate the opinion of hundreds of millions of Americans within a few percentage points of reality. But a similar poll of 300 in a town of 5,000 will have an error margin almost 12 points wide.
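Those figures follow from the standard sampling-error formula; here is a sketch at the usual 95% confidence level and a worst-case 50/50 split (the "almost 12 points wide" above is the full width of the interval, roughly plus or minus 6 points):

```python
# The standard 95% margin of error for a sampled proportion, at the
# worst case p = 0.5. Note it depends on the sample size n, not on the
# size of the population being estimated.
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Margin of error in percentage points at 95% confidence."""
    return 100 * z * math.sqrt(p * (1 - p) / n)

for n in (1500, 600, 300):
    print(f"n = {n:>4}: +/-{margin_of_error(n):.1f} points")
# n = 1500: +/-2.5    n = 600: +/-4.0    n = 300: +/-5.7 (11+ points wide)
```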

8. Mistrust limited disclosure. At a minimum, poll reporting should indicate:

· whether the poll was scientific (i.e., respondents were chosen in a way--usually random sampling--that gave everyone an equal chance of being interviewed),

· the margin of sampling error for each group whose opinion is estimated,

· the number polled,

· who was polled (adults, registered voters, likely voters, etc.),

· how those people were identified (from voter rolls, taking their word, etc.),

· the actual wording of key questions,

· the dates of the poll,

· who conducted the poll,

· who paid for it and any potential conflict of interest with the subject of the poll,

· how the poll was conducted (e.g., telephone interviews using random digit dialing, from voter registration lists, person on the street, etc.), and

· the response rate--those who completed the interview divided by the total of those who completed it, those who refused, and those who couldn’t be reached. (If this ratio dips much below 50%, the poll is not really scientific and margins of error become meaningless; a sketch of this arithmetic follows the list.)
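Here is a minimal sketch of that response-rate arithmetic, using hypothetical counts:

```python
# Hypothetical counts illustrating the response-rate bullet above.
completed = 620      # finished the interview
refused = 410        # reached but declined to participate
not_reached = 270    # never contacted despite repeated attempts

rate = completed / (completed + refused + not_reached)
print(f"Response rate: {rate:.0%}")  # 48% -- just below the 50% rule of thumb
```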

Sometimes the language(s) in which the questions were asked is also relevant. If the opinions of those who speak Spanish or Chinese, for example, are important, note that people not fluent in English may decline to respond to a poll conducted only in English.

For more information about evaluating polls, try one of these websites: the American Association for Public Opinion Research, or the National Council on Public Polls (which offers 20 questions a journalist should ask about poll results).

If you're curious about the accuracy of some recent Bay Area political polls, check our analysis.


-- John McManus