Q/A: How Pew Research tracks public opinion in countries stricken by violence and unrest
The Pew Research Center today released findings from a new survey in Ukraine, conducted as that country grapples with separatist movements, armed conflict and general uncertainty about the nation’s political future. Pew Research is polling in more than 40 countries this year. Fact Tank’s Drew DeSilver sat down with James Bell, Pew Research’s director of international survey research, to discuss how the center designs and implements its surveys in places like Ukraine.
Many people following the news in Ukraine are likely wondering: How was Pew Research able to field a survey successfully under present conditions?
In Ukraine, as in many other countries around the globe, our surveys are conducted face-to-face, rather than by telephone. This means that interviewers are assigned the task of walking through neighborhoods and knocking on doors to randomly select survey participants. Being out on the street can be risky at times, especially in a restive country like Ukraine. I want to stress that Pew Research always puts interviewer safety first. Before launching a survey in a foreign country, we discuss security and other concerns with our lead contractor (in this case, our long-standing partner, Princeton Survey Research Associates International) and the local polling firm that we hire to administer the interviews. If we judge conditions in a country to be too dangerous, we will either delay or cancel the project. We never require interviewers to knowingly risk their personal safety.
Can you provide some examples of countries where Pew Research has either delayed or reconsidered a project due to security fears?
In spring 2013, our survey in Venezuela was delayed for several weeks due to political unrest in the country. Interviewers were unable to safely travel to their assigned neighborhoods, and completed questionnaires could not be securely transported from the provinces to the central office where data were entered into a computer. Fortunately, conditions sufficiently improved so that the local polling firm we worked with could complete the necessary interviews and process the data for delivery to us.
Sometimes the duration or scale of a safety threat requires us to go back to the drawing board and assign new locations to interviewers, so they can complete the survey. An example is Afghanistan, where armed confrontations between the army and Taliban in 2011 required us to substitute a number of randomly selected villages and towns with safer alternates. This was done in a way that preserved the representativeness of the survey. In other countries, such as Pakistan, we know going into a project that we are unlikely to be able to safely interview people in all parts of the country. If you check our methods statements for Pakistan, you will see that we consistently exclude the Federally Administered Tribal Areas and certain other regions due to security concerns. Even with these omissions in coverage, our survey in Pakistan accurately represents the opinions of the overwhelming majority of Pakistanis.
Returning to Ukraine, did Pew Research have to exclude Crimea or other parts of the country due to unrest and violence?
No, we did not. In large part, this was due to the fact that our local partner in Ukraine has a network of experienced interviewers throughout the country. Under normal conditions, at least some of these interviewers would travel between several locations to complete interviews. However, given the current unrest and reports of checkpoints on certain roads manned by either government or opposition forces, we decided that interviewers would only be assigned to randomly selected locations near interviewers’ homes.
In addition to keeping the fieldwork team safe, this approach meant that when a respondent opened their door to an interviewer, they encountered someone who spoke the same language, had a similar accent and generally felt like a neighbor. Interviewers, of course, clearly identified themselves as employees of a professional research firm. But I think the sense of shared local identity between respondent and interviewer helped to reduce possible suspicions of the survey and its motives. Interviewers were aided in this regard by having the option of administering the survey in either Ukrainian or Russian, based on the respondent’s preference.
So, Pew Research faced no interference in conducting its survey in Ukraine?
We encountered one instance where a driver on his way to deliver completed questionnaires to the central office was stopped at a roadblock. Unidentified men seized a packet of questionnaires and burned it on the spot. The driver was not detained and left the scene unharmed. This occurred in the early days of fieldwork, and the lost questionnaires were replaced by interviewers returning to the original neighborhoods and re-administering the survey. The incident certainly put us and the local firm on alert. We immediately emphasized to our local partner that they should cease fieldwork if they sensed a risk to their employees. Given that the roadblock appeared to be a random, isolated event, our partner decided to proceed cautiously with interviewing. In the end, it proved to be the only interference during fieldwork.
Wouldn’t it be easier, not to mention safer, to interview people in Ukraine and other restive countries by phone?
We regularly conduct surveys by telephone in countries like the United Kingdom, France, Germany, Spain and Japan. These countries are characterized by widespread ownership of both landline and mobile phones. They are also countries where we feel confident that we can sample both landline and mobile phone users at a reasonable cost, and without missing key segments of the population. In many countries, including Ukraine, landline phones were never as ubiquitous as in western countries and the advent of mobile phones has not yet translated into what we would consider both an economical and rigorous alternative to face-to-face surveys.
I noticed the Ukraine survey included an oversample of the country’s eastern region as well as Crimea. How do you make decisions about when to oversample?
Whether in Ukraine or elsewhere, it is sometimes important to have survey data that speak reliably about members of subgroups. In national surveys, a standard way of achieving this is to select more people from such subgroups than would typically be done if everyone in the sample had an equal chance of being selected, a technique known in opinion research as an "oversample." Our oversample in eastern Ukraine and Crimea, for instance, gives people in these regions a greater chance of being randomly selected by boosting the number of interviews assigned to those areas of the country.
To make this a bit more concrete, people living in eastern Ukraine and Crimea make up roughly 57% of Ukraine’s population, but 72% of all the respondents we interviewed for this survey. The benefit of the oversample is that when we analyze the attitudes of people within eastern Ukraine or Crimea, we have a larger number of respondents to work with and can identify important subgroup differences, such as those between people in the region who speak only Russian and those who speak only Ukrainian. Our report highlights some of the dramatic ways in which these groups differ in their opinion of the current government in Kyiv and in their views of Russia. However, when we talk about the total share of Ukrainians who hold a certain opinion or report a behavior, respondents in eastern Ukraine and Crimea are weighted “down” to their actual share of the population, 57%.
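The down-weighting described above can be sketched in a few lines. This is only an illustration using the 57%/72% shares quoted in the interview; the region labels, opinion figures, and the simple two-stratum weighting scheme are hypothetical assumptions, not Pew Research's actual weighting procedure.

```python
# Illustrative post-stratification weighting for an oversampled region.
# Only the 57%/72% shares come from the text; all other numbers are made up.

pop_share = {"east_crimea": 0.57, "rest": 0.43}     # true population shares
sample_share = {"east_crimea": 0.72, "rest": 0.28}  # shares among respondents

# Each respondent's weight rescales their region to its true population share.
weights = {r: pop_share[r] / sample_share[r] for r in pop_share}

# Hypothetical share of each region holding some opinion:
opinion = {"east_crimea": 0.40, "rest": 0.60}

# An unweighted national estimate over-counts the oversampled region...
unweighted = sum(sample_share[r] * opinion[r] for r in sample_share)

# ...while the weighted estimate restores each region's actual share,
# since sample_share[r] * weights[r] == pop_share[r].
weighted = sum(sample_share[r] * weights[r] * opinion[r] for r in sample_share)

print(round(unweighted, 3))  # 0.456
print(round(weighted, 3))    # 0.486
```

The weighted figure matches what one would get by mixing the two regional opinions at their true 57/43 population split, which is the point of weighting the oversample back down.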
James Bell is Director of International Survey Research at the Pew Research Center.