
What If the Problem with Phone Polls is that They are Phone Polls?

What has been an open secret among academic researchers and polling agencies for years is starting to become apparent to the general public: the reliability and predictive value of telephone-based surveys are increasingly questionable.

In the past two weeks a half-dozen independent public opinion polls have clouded Americans’ understanding of the presidential race. A Gallup poll showed Bush with an 8-point lead over Kerry, a New York Times/CBS poll gave Bush a 9-point lead, while Pew and others found a dead heat. As wonks, pundits and spin doctors thrash about trying to account for these wide disparities in presidential poll results, it might be time to take two steps back and re-examine the basic reliability and assumptions of land-line telephone polling methods.

As I read these latest disparate results I first wondered whether poll respondents simply couldn’t differentiate between the two candidates being sold to them; this sort of “brand confusion” in advertising often produces similar results. But the constant bombardment of advertisements about the candidates’ past (in)actions during the Vietnam War has created identifiable brand loyalties, so something else must be going on. There are multiple reasons to wonder whether the once robust reliability of telephone surveys is starting to come undone.

Jimmy Breslin wrote a fine piece this past week codifying what many of us social scientists who critically use and evaluate research methods have been saying and thinking for some time. As Breslin says, “anybody who believes these national political polls are giving you facts is a gullible fool.” While Breslin’s language may strike some as harsh, his reasoning is dead-on.

As a social scientist who uses both quantitative and qualitative research methods, I am not one to dismiss survey research out of hand, but something has gone methodologically awry when polls swing about this wildly. Past telephone-survey presidential polls have, of course, accurately predicted election outcomes, but Americans’ social interactions via telephone may be evolving in ways that render past telephonic sampling techniques unreliable.

We Americans simply don’t answer our phones like we used to. Entire industries are now devoted to helping us not answer the phone. Voicemail, Caller ID, caller-specific rings, cell phones, even email have fundamentally transformed the ways we (don’t) answer the phone when it rings. These and other technological innovations have moved us from the late-20th-century, near-Pavlovian reflex of answering the phone whenever it rang to new levels of screening or ignoring calls, without any sense that we might be missing something important. Under these conditions, pollsters who call are increasingly treated the way any telemarketer or unknown caller would be, and so the people pollsters actually get to talk to are becoming less and less representative of the general public. There may now be something unusual about people who are willing to answer the phone and talk with strangers, and we should be skeptical about generalizing from the results of these surveys. It is possible that the new habit of not answering the phone is evenly distributed throughout the population (which would reduce this sampling confound), but this seems unlikely.
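To see why this matters, consider a small illustrative sketch, written here in Python. Every number in it is invented for the purpose of the example: a 50/50 electorate and answer rates that differ modestly by candidate preference. Nothing in the simulation comes from any actual poll; it simply shows that when willingness to pick up the phone correlates with preference, even an enormous sample lands well off the mark.

```python
# A minimal sketch of the non-response problem described above.
# Every number here is hypothetical, chosen only to illustrate the mechanism.
import random

random.seed(0)

N = 100_000                         # hypothetical electorate
true_support = 0.50                 # candidate A's actual share

population = [random.random() < true_support for _ in range(N)]

# Assume (purely for illustration) that supporters of A are somewhat more
# willing to answer an unknown caller than supporters of B.
answer_rate = {True: 0.35, False: 0.25}

respondents = [prefers_a for prefers_a in population
               if random.random() < answer_rate[prefers_a]]

poll_estimate = sum(respondents) / len(respondents)
print(f"actual support for A: {true_support:.1%}")
print(f"poll's estimate:      {poll_estimate:.1%}  (n = {len(respondents):,})")
```

The point is not the particular numbers but the structure: no amount of additional dialing fixes a sample in which the very act of answering is correlated with the thing being measured.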

Anthropologist Robert Lawless once speculated that many “native informants” were marginal, or “odd,” members of their societies. They were at times so unusual that they were the only ones willing to deal with the oddest of outsiders: anthropologists. The implication, of course, is that if anthropologists’ primary informants are often marginal people, then it can be questionable to generalize from the information collected in interviews with them.

Gathering survey information by telephone can be a bit like calling someone on the phone to tell them their phone isn’t working. Pollsters and those who consume the products of surveys need to remember: the fundamental limiting feature of telephone surveys is that you can’t talk to people who won’t (or can’t) talk to you.

The first time pollsters and politicos got burned by forgetting that you can’t conduct phone interviews with people who don’t answer phones, we got the classic 1948 photograph of Harry Truman triumphantly holding aloft the Chicago Tribune banner headline proclaiming “DEWEY DEFEATS TRUMAN.” The Tribune went to press early, printing a story based on pre-election polls indicating a slam dunk for Dewey. The shortcoming of these polls was the polling agencies’ failure to consider how the people responding to their surveys differed fundamentally from the people who had no opportunity to respond. They hadn’t understood that in post-war America the distribution of wealth and the distribution of telephones were such that those who could afford telephones tended to vote for Dewey, while those too poor to have phones tended to be New Deal Democrats who supported Truman.

Today’s statisticians and pollsters do know that new telephone technologies present serious problems for standard telephone surveys. A Pew Research Center study released last April, pleadingly entitled “Survey Experiment Shows: Polls Face Growing Resistance, But Still Representative,” found that:

“More African-Americans than Whites have caller-ID (73% vs 47%) and a higher percentage of Blacks use it for call screening (34% vs 24%). Young people, ages 18-29, are the group most likely to say they always screen calls with caller-ID (41% say this), compared with only 12% of those aged 65 or older.”

Pew also found that more women than men use features like call blocking (20% vs. 14%). If we can get over the paradoxical fact that these data were collected in phone interviews (and of course the point of this piece is that I’m not sure we can), we can see that those profiled as most prone to answering phone surveys tend to be (more) White, (more) older, and (more) male. Or, if you prefer to think this through in hall-of-mirrors phone-paradox mode: we simply don’t know how many households with Caller ID were called and chose not to answer. Among the households that did answer the phone, those who didn’t use call screening were reported to be more white, male and older. But for all we know there is a whole universe of households with the opposite attributes who used Caller ID to avoid this poll.
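A little arithmetic makes that composition shift concrete. The sketch below uses the screening rates Pew reports for the youngest and oldest groups (41% vs. 12%), but the other ingredients are assumptions I have added for illustration: that the two groups are the same size, and that a household that “always screens” never ends up talking to a pollster. Neither assumption comes from the Pew study.

```python
# Back-of-the-envelope sketch of how differential call screening can shift
# who ends up in a phone sample.  The screening rates are the ones Pew
# reports (41% for ages 18-29, 12% for 65+); the equal group sizes and the
# "screeners never answer" rule are illustrative assumptions, not data.
screen_rate = {"18-29": 0.41, "65+": 0.12}
pop_share = {"18-29": 0.50, "65+": 0.50}     # assumed, for illustration

# Share of each group a pollster can actually reach.
reachable = {group: pop_share[group] * (1 - screen_rate[group])
             for group in pop_share}
total_reachable = sum(reachable.values())

for group in pop_share:
    in_sample = reachable[group] / total_reachable
    print(f"{group}: {pop_share[group]:.0%} of the population, "
          f"{in_sample:.0%} of those a pollster can actually reach")
```

Under those assumptions the over-65 group, half of the hypothetical population, becomes roughly 60 percent of the reachable sample: the kind of skew the Pew figures hint at without being able to measure directly.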

Remarkably, this Pew report concluded that there was no direct evidence that these new technologies were “undermining the reliability of survey research.” The study did grudgingly admit that “it is possible that call screening is even more prevalent in households in the sample where an interview was never obtained.” In other words: we don’t know what’s really going on, because we can only talk to people who aren’t using Caller ID to keep us away from them. Given Pew’s own vested interest in finding high levels of reliability in phone surveys, it is not surprising to find this self-serving conclusion. But while Pew and the rest remain devoted to the convenience of land-line phone surveys, Zogby has increasingly investigated other means of polling Americans because of phone surveys’ decreasing reliability, which in the language of research methodology is measured by the ability to consistently replicate findings.

There is a mixed literature exploring the possibility that poll results themselves shape voting behavior; it simply isn’t clear whether polls act as advertising endorsements of candidates or ballot measures. But in another sense, skewed presidential polls can significantly alter voters’ behavior. Even as telephone-based polls become less reliable predictors, they still broadcast messages of certainty, and predictions that the presidential race is already decided can keep people away from the polls. While the differences between the two presidential candidates may not matter to some, other important national, statewide and local issues can be decided by the low voter turnout that such misleading polls can produce.

We will have to see how well these polls do in predicting the outcome of November’s election. As others before us have well argued, the proof of the pudding is in the eating. But in a world of such saturated advertising, the marketing of pudding can impact what people think they are eating.

DAVID PRICE teaches anthropology at St. Martin’s College in Olympia, Washington. His latest book, Threatening Anthropology: McCarthyism and the FBI’s Surveillance of Activist Anthropologists has just been published by Duke University Press. He can be reached at: dprice@stmartin.edu