A behind-the-scenes blog about research methods at Pew Research Center.

Testing survey questions ahead of time can help sharpen a poll’s focus

Survey researchers often try to determine how respondents might react to different kinds of questions. This can help ensure that proposed survey questions are as clear as possible and measure what researchers set out to measure.

Pew Research Center frequently tests survey questions ahead of time through qualitative research methods such as focus groups, cognitive interviews, pretesting (often using an online, nonprobability sample), or a combination of these approaches.

In this post, I’ll walk through our preparations for a comparative, cross-national survey we’re planning to field in the United States, United Kingdom, France and Germany later this fall. In this case, we used focus groups and a short survey experiment to test some questions for the upcoming poll. I’ll discuss two of those survey questions here.

Measuring pride — or shame — in a nation’s history

One of the topics in the focus groups — conducted in the U.S. and UK in fall 2019 — was a discussion on what makes people proud to be American or British, as well as what makes them embarrassed.

One interesting observation is that participants pointed to each country’s history as either a source of pride or shame, depending on the political orientation of the focus group. For example, in British focus groups consisting of those who voted to leave the European Union (“Leavers”), participants emphasized their pride in the former British empire and the role that the UK played in spreading democracy and the English language worldwide. But in British focus groups composed of those who voted to remain in the EU (“Remainers”), participants expounded on how the country’s history of colonialism and imperialism made them feel embarrassed, citing examples such as the “mess” in Hong Kong and how the UK bore responsibility for “destabilizing regions” and “making one tribe fight the other.” These respondents also emphasized that Britain’s involvement in the slave trade was problematic, an issue that has since boiled over as protesters toppled a statue of slave trader Edward Colston in Bristol, England.

In the U.S., too, the focus groups were divided over issues of history, slavery and racism. Those composed of Republicans and Republican-leaning independents emphasized their pride in American history and the ways in which the country has been a beacon of hope and a “land of opportunity,” while those composed of Democrats and Democratic leaners highlighted past injustices and unequal opportunity for Black people. Democratic focus groups also singled out the perceived hypocrisy of America as a “melting pot” despite its legacy of slavery and mistreatment of Native Americans, among other injustices.

Based on these divides, we were interested in drafting a survey question that could measure the relative tendency to be proud or ashamed of the history of one’s country. But we wanted to resolve two issues first. One was whether we should provide a middle answer option so people could report being both proud and ashamed of their country’s history. The other was whether the question might inadvertently elicit answers related to each country’s handling of the current COVID-19 pandemic, which was not our intention. (We know from a previous survey that people in the U.S. and UK are divided about their government’s handling of COVID-19.)

To study the first issue, we used an online nonprobability sample to randomly assign people to receive one of two survey questions. We used a nonprobability sample because our pretesting typically focuses on understanding how different question wordings perform — something that can be explored through experiments on nonprobability samples — rather than understanding what percentage of the U.S. population has a particular view. For this experiment, half of the respondents in the online nonprobability sample were given the following binary, forced choice question:

“Which comes closer to your view even if neither is exactly right? There are some times when I am not proud of the U.S./UK OR I am always proud of the U.S./UK, no matter what.”

The other half were given the following three-part question instead:

“Which comes closer to your view, even if neither is exactly right? I’m almost always proud of this country OR I’m almost always ashamed of this country OR I’m often proud of this country, but I’m often ashamed of it as well.”

Since people were randomly assigned to one question or the other, we know that the only differences between the two groups should have been which question they received and not other factors that might affect results.
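For readers curious about the mechanics, the split-ballot assignment described above can be sketched in a few lines of code. This is a hypothetical illustration, not Pew’s actual implementation; the question wordings come from this post, but the sample size and random seed are arbitrary assumptions.

```python
import random

# Hypothetical sketch of a split-ballot experiment (not Pew's actual code).
# Each respondent is independently assigned to one of two question versions,
# so the two groups should differ only by chance.

QUESTIONS = {
    "binary": (
        "Which comes closer to your view even if neither is exactly right? "
        "There are some times when I am not proud of the U.S./UK OR "
        "I am always proud of the U.S./UK, no matter what."
    ),
    "three_part": (
        "Which comes closer to your view, even if neither is exactly right? "
        "I'm almost always proud of this country OR "
        "I'm almost always ashamed of this country OR "
        "I'm often proud of this country, but I'm often ashamed of it as well."
    ),
}

def assign_version(respondent_ids, seed=0):
    """Randomly assign each respondent to one question version."""
    rng = random.Random(seed)  # fixed seed only so the sketch is reproducible
    return {rid: rng.choice(list(QUESTIONS)) for rid in respondent_ids}

# Illustrative sample of 1,000 respondents; each version should receive
# roughly half the sample by chance.
assignments = assign_version(range(1000))
counts = {v: sum(1 for a in assignments.values() if a == v) for v in QUESTIONS}
```

Because assignment depends only on the random draw, any systematic difference in responses between the two groups can be attributed to the question wording itself.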

Results indicated that, in each country, half or more of those who got the second version of the question chose the middle option of both pride and shame. In the U.S., similar shares chose “always proud” in both question formats, whereas in the UK, fewer people chose “always proud” when the middle option was present. Thus, while the two versions of the question produced somewhat similar results for the “always proud” group, the three-part question provided more information, allowing us to more clearly parse those who were “always ashamed” from those who fell into the middle category of being often proud and often ashamed. Regardless of the formulation of the question, Republicans in the U.S. were more likely than Democrats to say they were “always proud,” as were Leavers in the UK when compared with Remainers.

To gain some insight into the second issue flagged — whether respondents might be thinking of their government’s response to the coronavirus outbreak — we decided to explore the cognitive processes behind individual answers by asking respondents to explain what they were thinking in an open-ended question following whichever of the two survey questions they answered.

Examining these open-ended responses, we saw that while COVID-19 did come up, respondents’ feelings of pride and shame often had a longer time horizon. For example, people mentioned the history of the two countries; general freedoms and principles that they attributed to the countries; characteristics of the publics; and more. While some mentioned politics, more emphasis was often placed on the current administration in each country than on the specific handling of the COVID-19 pandemic. In fact, responses felt qualitatively similar to what we heard in the focus groups, suggesting that we were capturing our concept of interest, rather than a short-term sentiment about the pandemic. For example, in focus groups, we heard things like “All you have to do is travel outside the U.S. to realize how good we have it here.” In open-ended responses online, one American participant similarly voiced, “I’ve been to 45 countries — none better.”

In addition to the valuable information we gained from the three-part question in terms of analytically separating out the “always ashamed” group from those who feel more mixed, we also learned from the open-ended answers that some respondents who received the two-part question mentioned that they often felt both proud and ashamed of their country’s history and had difficulty responding. As a result, we decided to field the three-part question instead.

Measuring nostalgia

Another question we asked in the focus groups was about when each country was at its best. Participants named different dates or eras, but four key themes emerged.

First, both Britons and Americans highlighted “simpler” times before the internet era and Facebook, when people had a slower pace of life and didn’t have to be constantly responsive to email or texts. Second, people emphasized times when economic opportunity seemed plentiful and ordinary people felt they could make a good living and improve their quality of life compared with their parents. Third, people touted times when the countries had come together and been less polarized, whether because of a natural disaster (e.g. Hurricane Harvey in the U.S.), a terrorist attack (e.g. 9/11 in the U.S.), cultural events (e.g. royal weddings in the UK) or the Olympics, among others.

A fourth theme also emerged: the sense that each country has never been at its best. People in both countries — and especially those in groups composed of Remainers in the UK or Democrats in the U.S. — highlighted how the “best” times can depend on one’s vantage point. This was underscored by a U.S. respondent who noted that even after World War II, when Americans could celebrate beating the Nazis, civil rights remained limited for some. This sentiment was echoed by many who said that while both countries had made progress, historically there were always people disenfranchised based on race, sexuality or gender.

To get at this concept in our survey, we drafted the following question:

“Thinking about the U.S./UK, which of the following best reflects your view? We were better in the past OR The best years are still ahead.”

We also drafted a second, similar version, to explore whether slightly different wording would change the results:

“Thinking about the U.S./UK, which of the following best reflects your view? Our country’s best years are already behind us OR Our country’s best years are still to come.”

In the UK, people answered the question similarly, regardless of the wording. Around half generally felt the country was better in the past, while the other half said it would be better in the future. In the U.S., however, question wording appears to have mattered. While 45% said in the first question option that “we were better in the past,” a considerably smaller share (34%) chose the corollary in the second question option: “Our country’s best years are already behind us.” In addition, we noticed that the ideological differences we observed in the focus groups — with Leavers in the UK and Republicans in the U.S. more likely to be nostalgic for the past — were either slight, not present, or reversed. In the UK, we saw that on both versions of the question, Remainers were actually more likely than Leavers to say their country was better in the past, though those divides were starker in the first version of the question.
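To give a sense of how one might check whether a wording gap like the 45% vs. 34% split is larger than chance alone would produce, here is a hypothetical two-proportion z-test sketch. The sample sizes per wording arm are illustrative assumptions, not the actual counts from our pretest.

```python
from math import sqrt, erf

# Hypothetical sketch: a two-sided, two-proportion z-test for a wording
# effect. Illustrative only; not Pew's analysis code or sample sizes.

def two_proportion_ztest(x1, n1, x2, n2):
    """Return (z, two-sided p-value) for H0: p1 == p2."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)          # pooled proportion under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF via erf
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Assumed n = 500 respondents per wording arm: 45% vs. 34% choosing the
# "better in the past" option under the two wordings.
z, p = two_proportion_ztest(225, 500, 170, 500)
```

With arms of this (assumed) size, a gap of 11 percentage points is well beyond what random assignment noise would typically generate, which is consistent with wording mattering in the U.S.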

On this question, too, we were concerned that some respondents might think of the COVID-19 pandemic when evaluating whether their country’s best years were in the past or in the future, so we asked the same open-ended question that we asked about national pride or shame. As was the case in the earlier question, while some people did bring up the outbreak, we didn’t find any major cause for concern on this front. But the open-ended answers did help us understand why the ideological differences we observed in the focus groups were more muted in the test survey.

Most notably, people often chose different survey responses while giving the same explanation for their choice — e.g. political polarization, the election of Boris Johnson or Donald Trump or COVID-19. For example, in the U.S., two respondents who were both dissatisfied with the Trump presidency answered the closed-ended question differently. One who said “we were better in the past” explained his reasoning as, “Everything has gone bad since we had Trump as president.” The other said “the best years are still ahead,” but described a similar rationale: “Very many things will change for the positive once we’re unshackled from Trump.” We found many examples like this in the open-ended answers, suggesting that while the question may have captured people’s general predilections, it was not adequately capturing the sense of nostalgia that we were trying to measure. Given this, we decided not to run this question on the survey.

The importance of testing

While organizing focus groups and online pretesting adds quite a bit of time — and expense — to our survey research process, we find it incredibly valuable. For this project, focus groups were particularly valuable for highlighting themes and topics that were related to our key areas of interest but that we had never asked survey questions about before. And the pretesting showed us which of those topics we were successfully asking about in a closed-ended format and which may have slightly missed the mark, allowing us to recalibrate, adjust and create the best possible survey instrument for our larger, nationally representative survey.


Copyright 2022 Pew Research Center