An illustration of a sample invitation to complete a Pew Research Center survey via the American Trends Panel.
(Pew Research Center illustration)

At Pew Research Center, we regularly field questions from readers about the methodology behind our polls, including the occasional inquiry from someone who wants to be polled but hasn’t had the opportunity. These kinds of questions are understandable since the average American likely isn’t an expert in survey methodology.

Here, we address some of the most common questions we receive about the nuts and bolts of taking a U.S.-focused Pew Research Center poll. While the Center fields surveys in many other countries, too, this explainer focuses on our U.S. polling since our international methodology can vary considerably from country to country.

How can I sign up to take one of your surveys?

Unfortunately, you can’t. The Center relies on random sampling, which means we invite people to take our surveys by reaching out to them at random rather than simply polling those who ask to be polled. This helps ensure that nearly everyone has a roughly equal chance of being selected. It also increases the likelihood that our survey takers are representative of the broader population we’re trying to study – most commonly, U.S. adults ages 18 and older. Polling only those who want to be polled would not produce a representative sample of U.S. adults, for the simple reason that “volunteers” may differ in key ways from the broader population.
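The core idea of random sampling can be sketched in a few lines. Everything below is hypothetical — a stand-in population of ID numbers and an arbitrary sample size — just to illustrate how a simple random sample gives every member the same chance of selection, unlike a pool of volunteers:

```python
import random

# Hypothetical population: ID numbers standing in for individual adults.
population = list(range(1_000_000))

# Simple random sampling: every member has an equal chance of being
# drawn, so the sample tends to mirror the population as a whole.
rng = random.Random(42)  # seeded only so this sketch is reproducible
sample = rng.sample(population, k=1_000)

# No one appears twice, and no one selected themselves by volunteering.
assert len(set(sample)) == 1_000
```

A volunteer pool, by contrast, is selected by whoever chooses to show up, so nothing guarantees it resembles the population — which is exactly the problem random selection avoids.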

How exactly do you randomly select Americans to take your surveys?

In the not-too-distant past, the Center used the nation’s master list of landline and cellphone numbers to call Americans randomly and invite them to take a survey, an approach known as random-digit dialing. But in an era of robocalls and caller ID, many Americans no longer answer calls from unknown numbers. Response rates to telephone surveys in the U.S. have plummeted in recent decades, making it much more difficult and costly for polling organizations to reach respondents this way.
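Random-digit dialing amounts to generating phone numbers at random within valid number ranges, which reaches unlisted numbers as well as listed ones. The sketch below is a toy version — the area codes are arbitrary examples, and real RDD samplers also screen out unassigned or invalid exchanges:

```python
import random

rng = random.Random(0)  # seeded only so this sketch is reproducible

def random_us_number(area_codes):
    """Toy random-digit dialing: pick an area code, then random
    exchange and line digits. (Real RDD designs work from known
    working number banks and screen out invalid exchanges.)"""
    area = rng.choice(area_codes)
    exchange = rng.randint(200, 999)  # exchanges can't begin with 0 or 1
    line = rng.randint(0, 9999)
    return f"({area}) {exchange}-{line:04d}"

number = random_us_number(["202", "212", "415"])  # hypothetical frame
```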

Fortunately, the widespread adoption of the internet in the U.S. has made online polling an attractive alternative to phone polling. As of early 2021, 93% of U.S. adults say they use the internet, and the Center now does the vast majority of its U.S. polling online. (See below for more on the 7% of adults who don’t use the internet.) But the process of randomly inviting Americans to take our online surveys still takes place offline. That’s because there is no master list of U.S. email addresses the way there is for U.S. phone numbers.

A sample invitation to complete a Pew Research Center survey via the American Trends Panel.
(Nick Bertoni/Pew Research Center)

These days, we mostly recruit people to take our online surveys by mailing them printed invitations – again at random – with the help of a residential address file kept by the U.S. Postal Service. This approach gives everyone living at a U.S. residential address a chance to be surveyed (though it does exclude some people, such as those who are incarcerated or living at a rehabilitation center). We usually include a small amount of cash with each invitation to increase the chances that recipients will notice and respond to it.

It’s important to note that these invitations aren’t just to take one survey. They are invitations to join our American Trends Panel (sometimes referred to as the ATP in our publications), a group of U.S. adults who have agreed to take multiple surveys over time. As of August 2021, the American Trends Panel had more than 10,000 participants, but this number changes fairly regularly as we recruit new members and others stop participating.

How is it possible that I’ve never been randomly invited to take a survey?

It may seem surprising that you’ve never been randomly selected to take a Pew Research Center survey. But keep in mind that the adult population in the U.S. exceeded 258 million as of 2020. Even with a large survey panel like the American Trends Panel, your chances of being invited to join it are exceedingly small – somewhere around 1 in 170,000.
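The back-of-the-envelope arithmetic behind odds like these is just invitations divided by population. The invitation count below is a made-up figure, chosen only to show that a mailing in the low thousands against roughly 258 million adults lands in the neighborhood of 1 in 170,000:

```python
adults = 258_000_000        # U.S. adult population, per the 2020 figure above
invitations = 1_500         # hypothetical size of a single recruitment mailing

odds = adults / invitations  # a "1 in N" chance of receiving an invitation
print(f"about 1 in {odds:,.0f}")  # about 1 in 172,000
```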

Once invited, how do people actually take your surveys?

A sample Pew Research Center survey via the American Trends Panel.

Participants in the American Trends Panel fill out survey questionnaires with whatever internet-enabled device they prefer, whether it’s a smartphone, tablet, laptop computer or desktop computer. The questionnaires are self-administered, meaning that respondents can fill them out whenever it’s convenient for them, so long as it’s within a window of time (usually a week or two) that we communicate to them at the outset of each survey field period.

Most of our U.S. surveys are designed to take respondents a maximum of 15 minutes to complete. We try to be respectful of our respondents’ time and are mindful of the fact that survey data quality can decrease if it takes people too long to answer all of our questions.

We do still conduct U.S. phone surveys with live interviewers from time to time. But this is much more the exception than the rule. Nearly all of our U.S. polling now takes place online, in the manner described above, instead of by phone.

Not everyone in the U.S. uses the internet, so how can your online surveys be nationally representative?

This is a great question, and an important point. Nationally, 7% of U.S. adults don’t use the internet, according to an early 2021 survey (which, for obvious reasons, we conducted by phone). While 7% isn’t an especially large figure, we can’t just ignore or overlook these Americans, so we’ve tried a variety of approaches to include them in our surveys.

In the early days of the American Trends Panel, we surveyed non-internet users by mailing them paper questionnaires with the return postage paid. Since 2016, we’ve provided them with internet-enabled tablets and data plans so they can fill out our questionnaires online, just as regular internet users would.

Do your surveys include people who don’t speak English?

Just as we do with non-internet users, we try hard to include non-English speakers in our polling. All of our questionnaires are available in Spanish as well as English, and 99% of adults in the U.S. speak one or both of these languages well enough to complete a survey.

Of course, people in the U.S. speak many other languages, too, and it remains a challenge to survey the small percentage of people who don’t speak English or Spanish. This is especially difficult when it comes to Asian Americans, a diverse and heavily immigrant population with origins in more than 20 countries. As of 2019, around seven-in-ten Asian Americans ages 5 and older (72%) spoke English proficiently, but 28% did not.

A sample chart showing that Asian Americans are interviewed via English only in Pew Research Center's U.S. surveys.

The Center’s surveys currently only include Asian Americans who speak English (or Spanish). In other words, our surveys can only provide a limited view of the attitudes of the entire Asian American population. This is why you will typically see an asterisk next to the “Asian Americans” label in many of the graphics we publish about racial and ethnic differences in U.S. public attitudes. The asterisk denotes that Asian Americans were interviewed in English only.

If the same people take your surveys over and over again, doesn’t that bias the results?

Theoretically, yes. Asking the same people the same questions may cause them to remember their previous answers and feel pressure to answer consistently over time. Conversely, some participants in online survey panels like the American Trends Panel might change their attitudes or behaviors simply by being exposed to and answering a variety of questions over time. This is known as panel conditioning.

Consider a survey that asks about Senate Majority Leader Chuck Schumer. A survey respondent initially may not be familiar with Schumer; after all, many Americans don’t follow the details of Washington politics all that closely. But if a polling organization asks about Schumer in repeated surveys, it may cause the respondent to seek out more information about Schumer and, in the process, become more politically engaged than they were initially.

Fortunately, a recent Pew Research Center study found little evidence that panel conditioning has changed our survey respondents’ views or behaviors in several areas where such changes might be expected, including in their media consumption habits, the frequency with which they discuss politics, their party affiliation and their voting records. But the study did find a slight increase in voter registration after people joined the American Trends Panel.

Asking the same people to take surveys over a protracted period of time has some advantages, too. For example, as panelists become more comfortable with answering questions in a self-administered online setting, they might report their opinions and behaviors more forthrightly than they otherwise would have, especially compared with a telephone poll in which a live interviewer asks the questions. Repeatedly surveying the same people also allows the Center to examine how their attitudes are or aren’t changing over time, an approach known as longitudinal research. In 2018, for example, the Center published a study looking at how opinions of former President Donald Trump had changed (or, more accurately, had not changed) among those who voted for him in 2016.

How often do your respondents take a survey?

We typically field two or three surveys a month, but not all of our online panel members take each survey. Instead, we often survey smaller groups of people in “waves.” This helps ensure that our respondents don’t get worn out by answering so many questions.

Do you pay people to take your surveys?

Yes. We provide a small incentive for participation, typically ranging from $5 to $20 per survey. Incentives tend to be at the higher end of this range for demographic groups that traditionally respond to surveys at lower rates. Respondents can choose to receive their incentive in the form of a check or a gift code to Amazon.com.

John Gramlich is a senior writer/editor at Pew Research Center.