News consumers today are confronted with a tangle of statements and assertions that run the gamut from the purely factual to pure opinion. Being able to quickly tell where a news statement fits on that spectrum is key to being an informed reader or viewer. But how good are Americans at distinguishing factual news statements from opinions? A new Pew Research Center report attempts to answer that question. Below, Amy Mitchell, the Center’s director of journalism research, explains how the study was put together and what it found.

This is a different sort of study than the news and knowledge quizzes that the Center has used in the past. Why did you want to address the question of people’s ability to distinguish factual from opinion news statements?


When getting their news these days, Americans need to quickly decide how to understand news-related statements that can come in snippets or with little or no context. At the same time, there’s a growing political divide over which sources Americans turn to for news and which ones they trust. For us, this raises questions about how well the public is equipped to parse through news in the current environment.

And so we studied a basic step in that process: differentiating factual statements – ones that can be proved or disproved with objective evidence – from opinion statements, which are expressions of beliefs or values. Americans’ ability to make this distinction may shape their ability to undertake some of the other tasks being asked of them as news consumers, such as fact-checking or differentiating straight reporting from op-eds.

How did you ensure respondents understood that you were asking them if a statement was “factual,” not whether it was a “fact”?


When people talk about “facts,” they often think about statements that are objectively and unambiguously true. But what we were focusing on in this study was whether news-related statements are factual, meaning they can be proved – or disproved – based on objective evidence. We were interested in assessing a fundamental skill: the ability to differentiate between statements that can be assessed using evidence and those that cannot. In the question itself, we asked respondents to classify statements as factual “whether you think it is accurate or not” to indicate that this wasn’t about accuracy or truth, but about whether a given statement could be proved or disproved.

We conducted extensive pretesting before launching the actual survey to ensure that our respondents understood what we were looking for. We tested variations on the language used in the question instructions, the wording of the response options and the number of response options. All of that testing let us see how each variation performed, and it gave the respondents a chance to provide any feedback they may have had. While the results of these pretests weren’t intended to be representative of the U.S. adult population, they did help us understand the best way to ask this question.

Along with factual and opinion statements, you presented respondents with some “borderline” statements. What are those?

Borderline statements live in a murky space between factual and opinion statements. The borderline statements we included in the study have both factual and opinion elements: They’re factual in that they’re at least somewhat based on objective evidence, but they may also be expressions of values or beliefs, or use vague language that makes them difficult to prove or disprove definitively.

For example, one of our borderline statements was this: “Applying additional scrutiny to Muslim Americans would not reduce terrorism in the U.S.” That’s a prediction, and while there is evidence that can be used to make an argument one way or the other, someone wouldn’t be able to definitively prove or disprove the statement because the outcome of a policy like this in the U.S. is not yet known. We felt it was important to explore how Americans classify this type of statement because, these days, not all news statements are unambiguously factual or opinion.

Did you draw the statements you used in the study from actual news stories or write them yourself?


The statements didn’t come from actual news stories. We wrote our own statements to resemble content you would see in news articles. We wanted to generalize our findings so we wouldn’t be limited to describing how Americans process statements from specific stories or particular outlets. (We did indicate in the questionnaire that the statements didn’t come from specific stories or news outlets. That’s standard practice in academic research when this sort of content is used in surveys and experiments.)

We also used individual statements rather than full articles to more closely resemble the process of scanning through news and having to make quick judgments. The material we used in the factual statements was drawn from a variety of sources, including news organizations, government sources, research organizations and fact-checking entities; all of these statements were accurate. The opinion statements were largely adapted from existing public opinion surveys.

Did you consider including some inaccurate factual statements in your study? If so, why did you decide not to do so?

We wanted to keep the focus of the study on exploring what can be proved or disproved. Including some inaccurate statements among the factual ones and asking respondents to pick them out would have brought the study too close to becoming a knowledge test, which isn’t what we were going for.

Even though the study didn’t include any inaccurate factual statements, there were some cases when respondents thought a factual statement was inaccurate. What do you make of that?

Generally speaking, Americans overwhelmingly consider statements they classify as factual to also be accurate. But in our study that wasn’t always the case. For example, one of our statements was: “Spending on Social Security, Medicare, and Medicaid make up the largest portion of the U.S. federal budget.” A majority (62%) of those who correctly classified that statement as factual also said it was accurate. But roughly four-in-ten (37%) said it was inaccurate. So this study also provides some evidence that Americans can see statements in the news as both factual and inaccurate.

Part of the report discusses how Republicans and Democrats differ in how they classify particular statements depending on whether those statements “favor their side.” What was your basis for deciding whether a particular statement – whether factual or opinion – appealed more to one side or the other?

A statement was considered to appeal to the left or the right if it lent support to political views held by more people on one side of the ideological spectrum than the other. We used various sources to determine the appeal of each statement, including recent polling data, remarks by elected officials and news articles. Overall, what we saw in our findings was that members of each party were more likely to classify a statement as factual when it appealed to their side – and this happened whether the statement was factual or opinion.

What, to you, is the most important or unexpected finding of the study?

One especially salient finding is that the basic task of differentiating between factual and opinion news statements presents something of a challenge to Americans. Most respondents could correctly classify a majority of the statements, but far fewer could classify all of them correctly, and people with certain characteristics did far better at parsing through this content than others. For example, people with high political awareness, those who are very digitally savvy and those who place high levels of trust in the news media were better able than others to accurately classify the statements.

Overall, Americans have some ability to separate what is factual from what is opinion, but the gaps across population groups are cause for caution, especially given what we know about news consumers’ tendency to feel worn out by the amount of news there is these days, and to dip briefly into and out of news rather than engage deeply with it.

Drew DeSilver is a senior writer at Pew Research Center.