July 21, 2015

The challenges of polling when fewer people are available to be polled

Around the world, pollsters have had some high-profile flops lately. In both the U.K. and Israel, pre-election polls earlier this year predicted much tighter races than actually occurred. Last year, Scots voted against independence by a wider-than-expected margin. In the U.S., many pollsters underestimated last year’s Republican midterm wave, and some observers have suggested that polls simply aren’t appropriate tools for studying certain subjects, such as religion.

Cliff Zukin, past president of the American Association for Public Opinion Research and a Rutgers University political science professor, wrote recently that “two trends are driving the increasing unreliability of election and other polling in the United States: the growth of cellphones and the decline in people willing to answer surveys.”

Despite those challenges, social scientists, market researchers, political operatives and others still rely on polls to find out what people are thinking, feeling and doing. But with response rates low and heading lower, how can survey researchers have confidence in their findings? Scott Keeter, director of survey research at Pew Research Center, addresses this and related questions below.

Scott Keeter, director of survey research at Pew Research Center, discusses the challenges facing pollsters.

Do low response rates in and of themselves make a poll unreliable?

The short answer here is “no.” The potential for what pollsters call “nonresponse bias” – the unwelcome situation in which the people we’re not reaching are somehow systematically different from the people we are reaching, thus biasing our poll results – certainly is greater when response rates are low. But the mere existence of low response rates doesn’t tell us anything about whether or not nonresponse bias exists. In fact, numerous studies, including our own, have found that the response rate in and of itself is not a good measure of survey quality, and that thus far, nonresponse bias is a manageable problem.
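The point above, that a low response rate by itself doesn't imply bias, can be illustrated with a toy simulation. This is a hypothetical sketch, not Pew's methodology, and every number in it is invented: bias appears only when willingness to respond is correlated with the opinion being measured.

```python
# Toy simulation: a low response rate alone does not bias an estimate;
# bias appears only when who responds is correlated with the quantity
# being measured. All numbers below are illustrative.
import random

random.seed(42)
population = [random.random() < 0.5 for _ in range(100_000)]  # true "yes" share ~50%

# Case 1: only ~9% respond, but responding is unrelated to opinion.
random_resp = [y for y in population if random.random() < 0.09]
est_random = sum(random_resp) / len(random_resp)

# Case 2: same ~9% overall rate, but "yes" holders respond twice as often.
biased_resp = [y for y in population if random.random() < (0.12 if y else 0.06)]
est_biased = sum(biased_resp) / len(biased_resp)

print(round(est_random, 3))  # close to 0.50 despite the low response rate
print(round(est_biased, 3))  # well above 0.50: nonresponse bias
```

In case 2 the expected estimate is about 0.67 rather than 0.50, even though the overall response rate is identical, which is why pollsters worry about *who* responds, not just how many.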

For example, our 2012 study of nonresponse showed that despite declining response rates, telephone surveys that include landlines and cellphones and are weighted to match the demographic composition of the population (part of standard best practices) continue to provide accurate data on most political, social and economic measures. We documented this by comparing our telephone survey results to various government statistics that are gathered with surveys that have very high response rates. We also used information from two national databases that provide information about everyone in our sample – both respondents and non-respondents – to show that there were relatively small differences between people we interviewed and those we were unable to interview.

But it’s important to note that surveys like ours do have some biases. Better-educated people tend to be more available and willing to do surveys than are those with less education. Nonwhites are somewhat underrepresented. People who are interested in politics are more likely to take surveys that have to do with politics. But most of these biases can be corrected through demographic weighting of the sort that is nearly universally used by pollsters.
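Demographic weighting of the kind described above can be sketched in a few lines. This is a simplified illustration with invented group shares and targets; real survey weighting uses many more variables and benchmarks.

```python
# Minimal sketch of demographic (post-stratification) weighting.
# Sample shares and population targets below are illustrative, not Pew data.

sample_shares = {"college": 0.45, "no_college": 0.55}       # who answered the survey
population_targets = {"college": 0.30, "no_college": 0.70}  # e.g. Census benchmarks

# Each respondent in a group gets weight = target share / sample share,
# so overrepresented groups count less and underrepresented ones count more.
weights = {g: population_targets[g] / sample_shares[g] for g in sample_shares}

# Weighted estimate of some outcome (e.g. share approving of a policy):
approval = {"college": 0.60, "no_college": 0.40}  # illustrative group means
unweighted = sum(sample_shares[g] * approval[g] for g in approval)
weighted = sum(sample_shares[g] * weights[g] * approval[g] for g in approval)

print(round(unweighted, 3))  # 0.49 (tilted toward the college-educated)
print(round(weighted, 3))    # 0.46 (matches the population mix)
```

The weighted group shares equal the population targets by construction, which is why weighting can correct a demographic imbalance but cannot correct a bias, such as civic engagement, that isn't captured by the weighting variables.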

Are some kinds of biases harder to correct than others?

While weighting helps correct the overrepresentation of voters and the politically engaged, it does not eliminate it. This makes it especially important to have accurate ways of determining who is likely to vote in elections, a problem that all political pollsters grapple with.

The one other source of nonresponse bias that seems to persist after we apply demographic weighting is the tendency of survey participants to be significantly more engaged in civic activity than those who do not participate. People who participate in volunteer activities are more likely to agree to take part in surveys than those who don’t. This might lead us to overestimate things like the proportion of U.S. adults who contact elected officials, work with other people to solve community problems, or attend religious services on a weekly basis (though even in surveys with very high response rates, Americans report church-attendance rates that appear to substantially exceed actual attendance). Because of this, we try to be especially cautious in interpreting data about volunteer activity and related concepts. But fortunately, this characteristic of survey participants is not strongly related to most other things we study.

Survey response rates have been falling for many years. Why has this become of particular concern now?

One reason there’s greater public awareness of falling response rates is that we and other researchers have been closely tracking the decline, constantly monitoring its impact and talking publicly about the issue. Our 2012 study of nonresponse documented the downward trend; at that time, we reported that the average response rate was 9%, a figure that’s been widely cited since. There’s also been more discussion lately because of faulty election polls in the U.S. in 2014 and in Britain and Israel this year.

It’s important to keep in mind that even if there is more public discussion about the nonresponse issue now, it’s not a new concern among survey researchers. Scholars were noting the declines in response rates 25 years ago. We conducted our first major study of the impact of survey nonresponse in 1997, when our telephone response rates were 36%.
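Roughly speaking, a response rate of the kind cited above is completed interviews divided by all eligible (or possibly eligible) numbers attempted, in the spirit of AAPOR's response-rate definitions. The counts below are made up for illustration.

```python
# Rough sketch of a simple response-rate calculation (in the spirit of
# AAPOR's RR1). The counts below are invented for illustration.

completes = 900
refusals = 3100
non_contacts = 5500        # never answered, voicemail, etc.
unknown_eligibility = 500  # e.g. always-busy or unresolved numbers

response_rate = completes / (completes + refusals + non_contacts + unknown_eligibility)
print(f"{response_rate:.1%}")  # 9.0%
```

Note that more conservative AAPOR formulas include additional categories in the denominator, so the same fieldwork can yield several different published rates.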

Do we know why fewer people are willing to respond to surveys than in years past?

The downward trend in response rates is driven by several factors. People are harder to contact for a survey now than in the past. That’s a consequence of busier lives and greater mobility, but also technology that makes it easier for people to ignore phone calls coming from unknown telephone numbers. The rising rate of outright refusals is likely driven by growing concerns about privacy and confidentiality, as well as perceptions that surveys are burdensome.

Does Pew Research Center see the same pattern of low/declining response rates in other countries?

Yes indeed. Nonresponse to surveys is growing in many wealthy nations, and for most of the same reasons it’s increasing here in the U.S.

Are low response rates the reason, or at least a big reason, why so many pollsters around the world seem to have missed the mark recently in their pre-election polls?

It’s not at all clear that nonresponse bias is to blame for the recent troubles with election polls, though that’s one possible source of the errors. Equally important may be the methods used to determine who is a likely voter, or how to deal with voters who tell pollsters that they are undecided in the race. The British Polling Council commissioned a review of the polls in the 2015 general election, following the failure of most polls there to forecast the Conservative victory. That review has not yet been completed.

How do response rates compare between calls to a landline phone and calls to a cellphone?

We are obtaining nearly identical response rates on landline phones and cellphones. However, it takes considerably more interviewer time to get a completed interview on a cellphone than a landline phone, because cellphone numbers have to be dialed manually to conform to federal law. In addition, many cellphones are answered by minors, who are ineligible for the vast majority of our surveys. Unlike a landline, we consider a cellphone a personal device and do not attempt to interview anyone other than the person who answers.

In general, how does Pew Research Center attempt to overcome the challenges posed by low response rates in its survey research?

Pew Research Center devotes considerable effort to ensuring that our surveys are representative of the general population. For individual surveys, this involves making numerous callbacks over several days to maximize the chances of reaching respondents, and ensuring that an appropriate share of our sample is interviewed on cellphones. We carefully weight our surveys to match the general population demographically.

Perhaps most importantly, Pew Research Center’s team of methodologists is engaged in ongoing research into improving our existing survey techniques while also looking at alternative ways of measuring the attitudes and behaviors of the public. As society continues to change and technology evolves, the future of social research is likely to involve some combination of surveys and other forms of data collection that don’t involve interviews. In the meantime, we continue to apply the best survey practices we can and endeavor to be as transparent as possible about the quality of our data and how we produce them.

For more information on the methodology behind our research, visit our Methods page.

Topics: Polling, Research Methods

Drew DeSilver is a senior writer at Pew Research Center.

Scott Keeter is director of survey research at Pew Research Center.


10 Comments

  1. Bob, 6 months ago

     As someone who teaches statistics and research methods, I find it absurd to hear someone claim that a response rate around 10% could possibly produce unbiased results. I had no idea that polls had such poor response rates. If only 1 in 10 people volunteer to participate, they are surely different from the other 90%. You cannot reweight your way out of that.
  2. Slide, 9 months ago

     Of course you found that nonresponse doesn’t affect your outcome. Because if it did, it would affect your income. Please stop calling. I am going to ask my congressman to push to put these BS so-called nonprofits on the do-not-call list.
  3. judy, 10 months ago

     I used to get calls for polls; none this year.
  4. Stanley Dubis, 10 months ago

     Excessive phone-call spam is the cause of decreased participation not only in your surveys, but in all surveys. I, like almost everyone else, do not answer any phone call from an unrecognized phone number.

     You should devise a system through which Pew Research readers can be surveyed online if they agree or register to do so. That, combined with those who answer calls, would probably give you the numbers you are looking for.
  5. Dr. Z., 10 months ago

     Wasn’t YouGov involved in all those polling fiascos you noted? Is that just a coincidence or what?
  6. Scott, 10 months ago

     Another reason people aren’t answering surveys, alluded to in your comment about surveys being burdensome but not directly addressed, is that the number of surveys keeps increasing, so we are called more frequently, which becomes very annoying. I used to get survey calls only occasionally; now they are much more frequent. At least I assume they are survey calls, as I no longer answer numbers that I don’t recognize.
  7. dave, 10 months ago

     I get so many spam phone calls that I now let the answering machine answer all calls and only pick up for personal calls. What happened to the “do not call” list?

     1. Scott Keeter, 10 months ago

        The “do not call” list is still around, but it applies only to marketing calls. Legitimate research calls are exempted. That said, the survey contractors who actually do the calling for our polls all have an internal “do not call” list and will add you to it if you request it when they try to reach you.
  8. Dale Piper, 10 months ago

     My wife and I have caller ID. We do not answer calls from “unknown,” 800 numbers, or numbers without identification or that we do not recognize. We would answer a call if it were identified as coming from “Pew Research” (or similar) or from any other reputable polling organization. If you would like a better polling response, it might help if you allowed caller ID to identify you (if you are not already doing that).
  9. Earl Shields, 10 months ago

     I agreed to participate in polling questionnaires in the past, but finally quit because so many of the questions didn’t seem worth responding to. Too many of the questions I considered an insult to my intelligence. It seemed as though the pollster was looking for warm fuzzy strokes rather than thoughtful answers.