Introduction

Most probability-based political and social surveys of the general public in the United States continue to be conducted by telephone, and in most of these surveys respondents are interviewed only once. This model of conducting surveys is facing serious challenges as response rates drop, costs increase and newer methods present appealing alternatives.

One alternative that is hardly new, but is attracting renewed attention, is the panel survey. Panel surveys involve repeated interviews with the same pool of respondents over time. Panels have been around for decades, and high-quality, probability-based panels (those recruited through random samples of the population) are an integral part of the federal statistical system. Panels recruited through non-probability methods or “convenience” samples are widely used in market research. But with a few notable exceptions, panels have not been widely used by organizations committed to conducting probability-based political and social surveys on a regular basis. This is changing.

Several features of panels make them an appealing alternative to one-time, “cross-sectional” surveys. The initial costs of recruiting a panel, while potentially large, can be amortized over time because the panel can yield multiple individual surveys at relatively low marginal cost per survey. Over time, far more can be learned about the panelists’ social, demographic and political characteristics than is feasible in a single survey. Relatedly, external databases of information about panelists, such as voter and consumer files, can be integrated with survey responses to yield additional insights. Panels allow for the measurement of individual-level change over time, something that is not possible with cross-sectional surveys. Moreover, panels make it relatively easy to use multiple modes of interviewing. In particular, self-administration via the web or a paper questionnaire has desirable characteristics for measuring many kinds of attitudes and behaviors. And perhaps most obvious, a panel design makes follow-up studies easy to conduct.

With these advantages in mind, Pew Research Center set out in early 2014 to build a probability-based panel — the American Trends Panel — to supplement our traditional method of data collection in the U.S. — the random digit dial (RDD) telephone survey. This report describes the steps taken to build and manage the panel and our experiences with it in 2014.

Panel Models

Probability-based survey panels come in many varieties. Some attempt to be representative of the entire population, while others focus on subgroups such as teens or young adults. Some involve nearly continuous interviewing, while others may collect data once a year or even less frequently. We envisioned creating a panel of randomly selected adults that would represent the U.S. adult population and could be used for social and political surveys similar to those we conduct with cross-sectional samples. Our goal was to collect data from panelists about once per month. We wanted most surveys to be self-administered, rather than interviewer-administered.

One well-known model for this type of panel is GfK’s KnowledgePanel (formerly known as Knowledge Networks’ panel of the same name). Begun in the late 1990s, KnowledgePanel is a large, nationally representative survey panel of over 55,000 panelists. Panelists are now recruited using address-based sampling (ABS), though some existing panelists were previously recruited through landline RDD surveys. Interviewing occurs in both English and Spanish, and some Latino panelists continue to be recruited by telephone in high-density Latino areas. All panelists are surveyed online; those who did not have a computer or internet access at the time of their recruitment were provided with the necessary equipment and access. In addition, GfK has built and managed several smaller custom versions of KnowledgePanel for specific clients.

Two other multi-purpose, non-governmental panels in operation are the RAND American Life Panel at the RAND Corporation and the Understanding America Study at the University of Southern California. The RAND panel has more than 6,000 participants who have been recruited using a variety of sampling methods and sources; the majority of the panel was recruited using ABS and RDD (landline and cell). Panelists are surveyed via the internet and are provided a computer and/or internet access if they need it.

The Understanding America Study has approximately 2,000 panelists who have been recruited using ABS. Panelists are surveyed via the internet and are provided a tablet computer with internet access if they need it.

Gallup maintains a nationally representative panel of approximately 60,000 adults recruited via RDD and ABS. Most panelists participate via the internet, while some are interviewed by mail or phone.

One other panel currently under construction is the AmeriSpeak panel from NORC at the University of Chicago, which draws its sample from NORC’s area-probability National Sample Frame. The panel, with an initial size expected to be 10,000 members, is being recruited using a variety of methods, including email, U.S. mail, telephone and face-to-face contact. Panelists without internet access will usually be interviewed by telephone.

After considering various models, we opted to recruit the American Trends Panel via a large RDD telephone survey conducted in early 2014 on the subject of political polarization. The study had a total sample size of about 10,000, providing a large base for the panel recruitment. All respondents in the telephone survey received a common core of questions about their political values and engagement, along with a comprehensive set of demographic questions, ensuring a good baseline of information about respondents who agreed to join the panel as well as about those who refused. The telephone survey and panel recruitment were funded in part by grants from the William and Flora Hewlett Foundation and the John D. and Catherine T. MacArthur Foundation and the generosity of Don C. and Jeane M. Bertsch.

We decided that the standard mode of interview for panelists with access to the internet would be self-administration on a desktop, laptop, tablet or smartphone. To ensure coverage of individuals who do not have access to the internet or do not want to use it for taking surveys, we decided to build the capability to survey them by mail with a paper questionnaire. Providing computers and internet access was not, in our view, a cost-effective approach to covering this portion of the population. We also have the option to interview this group by telephone, and did so in the first wave of interviewing and in another wave that incorporated a test of interview modes. But concerns about mode-of-interview effects led us to prefer the option of a mail survey for this relatively small portion (12%) of the panel.

Like most of the other national panels, we provided a small incentive for joining the panel ($10 in cash) and for completing each panel survey (detailed below). During 2014, surveys were conducted approximately once per month; in 2015, they will be conducted approximately every two to three months. A second recruitment effort is planned for later in 2015 to dilute the effects of panel conditioning – the possibility that panelists become acclimated to the interview process and survey content and no longer respond in the same ways that they did when first interviewed – and to replenish panel membership because of inevitable attrition over time.

Although we built this panel with the explicit goal of having it serve the research needs of our yearlong study of political polarization, it was also very much an experiment, given our lack of past experience with panel research. Among the questions we wanted to answer were the following:

  • What percentage of RDD respondents who were offered panel membership would join and participate?
  • Does the language used to recruit respondents to the panel affect the proportion of respondents who agree to participate?
  • How well would the demographic and political composition of the panelists match the composition of those who were recruited? How well would the composition of the panelists match the overall U.S. population?
  • How engaged would panelists be? Would most of them take most surveys, or would participation be more intermittent?
  • How serious would attrition be over the course of the year and how would it affect the representativeness of the panel?
  • For the Web portion of the panel, how important would mobile devices be for completing surveys?
  • Would this method of data collection be cost effective for our purposes?

The American Trends Panel was designed by Pew Research Center staff in collaboration with staff at Abt SRBI. Overall direction of the panel is the responsibility of Pew Research Center. Ongoing data collection is conducted and managed by Abt SRBI.