by Evans Witt and Jonathan Best, Princeton Survey Research Associates

for the Pew Internet & American Life Project

In mid-2007, Princeton Survey Research Associates conducted a major survey for the Pew Internet Project and the University of Illinois at Urbana-Champaign about how people solve certain kinds of problems.1 Funding for this survey was provided by a federal agency, the Institute of Museum and Library Services.2 All surveys supported by the federal government are reviewed by the White House’s Office of Management and Budget (OMB). In the process of evaluating our survey, OMB officials asked if we could do additional work to see whether non-respondents to the initial poll differed from those who did respond. The following article excerpts the major findings of that additional analysis.

Summary of Findings

In the past two decades, survey researchers have confronted growing numbers of Americans who fail to complete interviews. These levels of non-response raise questions about the representativeness and validity of surveys and the data they provide. The additional analysis produced at OMB’s request had three elements.

First, the interviewing effort on a sub-sample of the telephone numbers generated through computerized Random Digit Dialing (RDD) methods was doubled. RDD is the method pollsters use to generate telephone numbers at random, so that every household with a phone, listed or unlisted, has a chance of falling into a representative sample. The original survey design called for a maximum of 10 calls to each number. We selected 1,500 of the phone numbers that did not yield completed interviews after 10 calls and phoned them up to 10 more times, for as many as 20 calls in all. The results of that extra effort were compared with the results of the standard 10-call effort.
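
As a rough illustration of the idea, not PSRAI's actual sampling system, list-assisted RDD can be sketched as drawing known, working area-code/exchange prefixes and appending random four-digit suffixes, so unlisted numbers are sampled at the same rate as listed ones (the prefixes below are placeholders):

    import random

    # Placeholder area-code/exchange blocks; a real frame is built from
    # telephone exchange data so that only plausible numbers are dialed.
    KNOWN_PREFIXES = ["609-924", "312-555", "415-861"]

    def rdd_sample(n, seed=None):
        """Draw n random telephone numbers within the known prefixes."""
        rng = random.Random(seed)
        return [f"{rng.choice(KNOWN_PREFIXES)}-{rng.randrange(10000):04d}"
                for _ in range(n)]

    print(rdd_sample(3, seed=1))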

Second, the survey results from the base 10-call design were analyzed by segmenting completed interviews according to the amount of effort actually required to obtain them.

Third, the total sample of telephone numbers and the subset that provided completed interviews using the 10-call design were analyzed to determine what kinds of communities are under- and over-represented in completed interviews.

The analysis found:

  • Doubling the interviewing effort to 20 calls produced 84 additional interviews. The results from these extra-effort interviews varied only occasionally and marginally from results in the base study.
  • The extra interviewing effort drew responses from those who are usually harder to reach in surveys: Younger adults, working adults and those with college degrees were a larger share of the extra-effort completed interviews.
  • Analyzing the original survey by the level of effort required to achieve an interview revealed few statistically significant differences. In other words, there were few meaningful differences between those who were reached in the first wave of calls and those who were reached on the ninth or tenth dial of a phone number.
  • Analyzing the complete RDD sample and the “extra-effort” completed interviews by community characteristics showed that interviews are hardest to complete in urban areas and easiest in rural areas. While there appear to be no significant variations across communities by average household income, areas with higher minority populations (both Hispanic and African-American) were less productive for interviewing, paralleling the finding on urban areas.

The Issue of Survey Non-Response

Fewer people respond to surveys now than in the past. These failures to complete interviews result from a variety of factors, but the largest components are non-contact (the failure ever to reach a person at the location or phone number designated as part of the sample) and refusal (the result of active or passive efforts to avoid completing the survey). Thus the question: How do the people who did not complete the survey differ from those who did?

There have been a variety of excellent summaries of the research on non-response and potential bias from non-response in surveys, the latest of which is Public Opinion Quarterly, Special Issue: Non-Response Bias in Household Surveys.3 The POQ special issue includes a report on the most recent major experiment gauging the impact of extra effort to complete telephone interviews.4 A central feature of each of those two experiments was to compare a standard RDD survey completed over a five-day period with an RDD survey using the same questionnaire, the same sample design and the same field house, but a calling period that stretched over more than six months. PSRAI participated in each of these experiments. Based on those models, this experiment was designed with the three elements described in the summary above.

Basic Effort Analysis I: The Impact of Interviewing Effort

A three-category variable was computed to aid in analyzing the amount of effort it took to complete interviews; a sketch of the classification logic follows the list below.

  • “Hard” to reach: Phone numbers that had been dialed six or more times in the original sample and where potential respondents refused to be surveyed were defined as “hard,” meaning the highest level of interviewing effort was needed to complete an interview.
  • “Easy” to reach: Phone numbers that had been called five or fewer times in the original sample and where no potential respondents refused to participate were defined as “easy,” meaning the lowest level of interviewing effort was needed to complete an interview.
  • “Medium” effort required: All other phone numbers were defined as “medium” effort: respondents who either were called six or more times or had refused to participate in an interview, but not both.
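
A minimal sketch of that classification, assuming hypothetical field names ("num_calls" for the number of dials before completion, "refused" for whether any prior refusal was recorded); it illustrates the three definitions above and is not PSRAI's production code:

    def effort_category(num_calls, refused):
        """Classify a completed interview by the effort it required."""
        many_calls = num_calls >= 6
        if many_calls and refused:
            return "hard"    # six or more calls AND a prior refusal
        if not many_calls and not refused:
            return "easy"    # five or fewer calls AND no refusal
        return "medium"      # exactly one of the two conditions

    assert effort_category(3, refused=False) == "easy"
    assert effort_category(7, refused=False) == "medium"
    assert effort_category(8, refused=True) == "hard"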

The table below compares sample demographics of respondents according to the amount of effort it took to complete an interview. It shows that greater effort paid off in reaching younger respondents, particularly those 18-29 years of age. Older respondents (65 years of age or older) were overrepresented among the easy-to-reach compared with their prevalence in the general population; their share drops significantly, and precipitously, among medium- and hard-to-reach respondents.

Whites were most prevalent in the easy group (those who had not refused participation and had been called five or fewer times), in which 69% of the sample was white; respondents in the medium and hard categories were somewhat less likely to be white. More effort thus led to a greater number of completed interviews with African Americans. There was little difference by effort in reaching Hispanics.

[Table: Respondent demographics by level of effort required to complete an interview]

Respondents employed full- or part-time were harder to reach than those who said they were retired: the employed made up a larger share of the medium and hard categories, where the greater effort paid off, while retired persons were a smaller share of those categories than of the easy one (an average of 22% across medium and hard vs. 30% among easy).

Impacts of Effort on Community-Level Demographic Distributions

We also tested to see if more effort helped complete interviews in harder-to-reach communities. Typically in RDD telephone samples, households in heavily populated urban areas are under-represented. These households are also more likely to have minority residents.

The table below compares the three sub-samples in relation to population density and the percentage of a community’s population that is minority. There were no significant differences in completion by level of effort, and no significant variation by effort in reaching respondents in high-density areas.

[Table: Community population density and minority concentration by level of effort]

Impacts of Effort on Substantive Question Response Distributions

Of particular interest to this, or any, research is the possible effect that extra effort might have on substantive results. For this analysis, we selected a subset of questions to investigate. The first questions we analyzed were a series about visiting various local institutions in the past 12 months. There was little variation among the affirmative responses in this series, and none of the differences was statistically significant.

[Table: Visits to local institutions in the past 12 months, by level of effort]

Similarly, in comparing the results of a series of questions about computer and Internet use among respondents in the sub-groups, we observed no statistically significant differences in responses among the effort segments.

[Table: Computer and Internet use, by level of effort]

Finally, we examined differences in a series of questions that asked about ten situations or decisions people might have faced in the past two years. There were statistically significant differences among the three subgroups on just two of the ten items. The toughest-to-reach respondents were more likely than those in the easy and medium categories to have made a decision about schooling or education (46% vs. 40% for medium and 34% for easy). Age and parental status are likely the factors behind these results: younger respondents are typically harder to reach, appear in higher proportions in the higher-effort groups, and are inherently more likely to have recently made decisions about their education.

The greater likelihood that respondents in the easy (48%) and medium (51%) categories had dealt with a serious illness or health condition is attributable to older and retired respondents being easier to contact.

[Table: Situations and decisions faced in the past two years, by level of effort]

The next step was to analyze the results of questions that asked about the sources people use to get information or assistance. Here the results were generally uniform across the effort categories. There were slight but statistically significant differences in the share who used the Internet to find information or assistance in solving their problems: medium- and hard-to-reach respondents were slightly more likely than easy-to-reach respondents to use the Internet (57% for hard/medium vs. 53% for easy). This is consistent with the greater use of the Internet by younger people, who are, in turn, harder to reach.

[Table: Sources used for information or assistance, by level of effort]

Basic Effort Analysis II: Community Demographics and Refusals

The second part of the analysis of potential non-response bias examines the final sample dispositions of numbers dialed, by community, to assess over- or under-representation in the original sample. The disposition categories include completed interviews (completes), a combination of refusals and callbacks (refusals/callbacks), and numbers dialed where no potential respondent was contacted (non-contacts). The table below compares response by region, by community type and by whether those who were sampled live in a Metropolitan Statistical Area (MSA)5.

[Table: Sample dispositions by region, community type and MSA status]

As the table above shows, differences in response appear mainly across the geographic regions represented in the sample.

A higher proportion of interviews was completed with respondents from the Midwest (23%) than from the West (just 15%). Fully one-third of the sample in the West could not be contacted by PSRAI interviewers. Similarly, interviewers were unable to contact a potential respondent for 33% of the sample from urban areas. The West sample itself was significantly skewed toward urban communities, which made up 54% of the region’s total sample.

Income, Minority Density and Response

The table below compares the disposition categories by average household income and by the density of Hispanic and African-American households within the sample blocks used for interviewing. There is very little difference among the income distributions of the three sample segments: between 35% and 40% of each segment came from areas with lower household incomes, and less than 10% of each group came from the highest-income areas.

[Table: Sample dispositions by household income and by Hispanic and African-American density]

Interviews were easier to complete in areas with fewer Hispanic households. Nearly two-thirds of all completed interviews (63%) came from areas with the lowest incidence of Hispanic households, while only about half of the sample in the other groups came from these low-incidence areas (53% of refusals/callbacks and 46% of non-contacts). The same trend is not seen when comparing high- and low-density African-American areas: completes, refusals and non-contacts were distributed about the same across the African-American strata.

Extra-Effort Analysis III: Extra Effort, Extra Calls

To analyze potential non-response bias in the survey results, a sample of 1,500 phone numbers that had been called 10 times and were still live was put into a new project and dialed as many as 10 more times in an attempt to complete an interview. The numbers included in the extra-effort dialing fell into three categories.

  • Non-contacts: Numbers that had yielded no contact with a person in the initial survey. These numbers were some combination of no answer/ busy/ answering machines for all attempts.
  • Refusals: Numbers that had yielded a refusal on one of the first 10 attempts but were not converted to a completed interview.6
  • Break-offs: Numbers where respondents had started an interview but did not finish.

An additional 13,742 calls were made to the extra-effort sample and an additional 84 interviews were completed. This translates into one completed interview for every 164 call attempts.7
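
Those productivity figures follow directly from the counts reported here; a quick arithmetic check, using only the numbers above:

    extra_calls = 13_742      # additional dials to the extra-effort sample
    extra_completes = 84      # interviews completed by the extra effort
    sample_size = 1_500       # phone numbers in the extra-effort sample

    print(extra_calls / extra_completes)  # ~163.6, i.e. one complete per ~164 attempts
    print(extra_completes / sample_size)  # ~0.056, the roughly 6% conversion rate cited below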

It is difficult to complete an interview once a phone number has been dialed 10 times without success. The table below compares the outcome for the extra-effort numbers at the 10th attempt (the columns) with the final outcome after the full extra calling effort (the rows).

[Table: Disposition at the 10th call attempt vs. final disposition after the extra calling effort]

As can be seen from the table, the success rate was low, with only 6% of the extra-effort cases yielding completed interviews. The conversion rate was higher for the break-offs and the callbacks, with 16% and 14% converted respectively. The numbers that were non-contacts for the first 10 attempts had the lowest conversion rate (3%). Seven percent of the refusals were converted after the tenth attempt.

Effect of Extra Effort on Person-Level Demographics8

One way to gauge potential non-response bias is to compare the demographics of respondents interviewed through the extra effort against those interviewed without it. The table below compares basic sample demographics of the two groups. The population parameters are also shown in the table to put the numbers in context.

[Table: Respondent demographics, standard vs. extra-effort samples, with population parameters]

The gender distribution of the two samples was almost identical: 39% and 40% male. As in most telephone surveys, males were under-represented. There was a difference in the age distributions, with the extra-effort sample doing a slightly better job of reaching younger respondents, especially the 30-49 year-old group. The extra effort also yielded a proportion of older respondents (13%) closer to the population parameter than the regular sample did (25%).

Both samples over-represented college graduates and under-represented people with less education, though on the whole the regular sample achieved a slightly better education distribution.

The race/ethnicity of the two samples was comparable. However, the extra effort did pay off in reaching more Hispanic respondents than the standard sample. The extra-effort sample also reached more employed respondents than the standard sample (66% vs. 55%) and did not over-represent retired people as did the standard sample.

Effect of Extra Effort on Selected Substantive Questions

We investigated differences in responses to questions about the information sources people used when they confronted problems, to see if there were any differences between the initial and extra-effort samples. Results for the two samples were very similar, with the exception of Internet use to find information or assistance in solving problems: significantly more people in the extra-effort sample reported using the Internet to get information about or assistance with their recent decision or situation (68% vs. 55%).

[Table: Information sources used, standard vs. extra-effort samples]

A Final Note about Potential Bias

The extra interviewing effort had the expected impact: those who are generally harder to reach were the ones reached with extra effort. Younger adults, working adults and those with college degrees are harder to reach and thus were a larger share of the extra-effort completes. In terms of substantive questions, the impact of the extra effort parallels the demographic differences: those reached with extra effort were more likely to say they had dealt with an education issue (reflecting the younger adults) and less likely to have dealt with a health issue or Medicare (reflecting the larger share of older adults in the normal sample). Even where results differed significantly for the extra-effort group, the impact on overall results was small. Consider that the sample of 1,500 extra-effort numbers yielded only 84 completed interviews. Since only about 3,000 numbers qualified for the extra-effort study, even if all of them had been dialed an additional 10 times, that would have added approximately 170 completes to the main sample of more than 2,000. Even if the extra-effort results differed substantially from the main sample, there simply would not be enough of them to move overall survey results.
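
A back-of-the-envelope calculation makes the point concrete. The percentages below are assumed for illustration, not taken from the study: even if extra-effort respondents differed by a full 10 points on some question, roughly 170 added completes would shift the blended estimate by well under one point.

    main_n, extra_n = 2000, 170        # approximate completes, per the text above
    main_pct, extra_pct = 50.0, 60.0   # assumed answers: a 10-point gap

    blended = (main_n * main_pct + extra_n * extra_pct) / (main_n + extra_n)
    print(blended - main_pct)          # ~0.78 percentage points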


1 Information Searches That Solve Problems, Leigh Estabrook, Evans Witt and Lee Rainie, 12/30/2007. https://www.pewresearch.org/internet/PPF/r/231/report_display.asp

2 OMB Clearance Number 3137-0070, expiration date 06/30/2010.

3 Public Opinion Quarterly, Special Issue: Non-Response Bias in Household Surveys, 2006, Vol. 70, No. 5.

4 Gauging the Impact of Growing Nonresponse on Estimates from a National RDD Telephone Survey, Keeter et al., POQ 70: 759-779. This updated a review done in Consequences of Reducing Nonresponse in a Large National Telephone Survey, Keeter et al., POQ 64: 125-48.

5 MSAs are delineated on the basis of a central urbanized area, a contiguous area of relatively high population density. The counties containing the core urbanized area are known as the central counties of the MSA. Additional surrounding counties (known as outlying counties) can be included in the MSA if these counties have strong social and economic ties to the central counties as measured by commuting and employment. Note that some areas within these outlying counties may actually be rural in nature.

6 Hard refusals were excluded from the extra effort study. These are phone numbers where the potential respondents have refused to cooperate in no uncertain terms.

7 The difference in calling productivity between the two samples is even more pronounced if you consider that many of the non-working phone numbers were identified on the first attempt before the extra effort even started.

8 Unweighted data was used for all comparisons in this report.