DEVIN STEWART: Hi, I'm Devin Stewart here at Carnegie Council in New York City, and today I am speaking with Brendan Nyhan of Dartmouth College. He is also a contributor to The Upshot blog at The New York Times, and he is a co-author of a new study on fake news called "Selective Exposure to Misinformation: Evidence from the consumption of fake news during the 2016 U.S. presidential campaign."
Brendan, thanks so much for speaking with us today.
BRENDAN NYHAN: Sure. Happy to be here.
DEVIN STEWART: Brendan, tell me, first of all, let's define our terms a little bit. Maybe you have a technical definition of fake news. How do you define fake news, and has its definition changed since President Trump has been using it on a daily basis?
BRENDAN NYHAN: Yes, it is important to clarify terms in this debate right from the beginning. I think it is confusing to people when they hear the term, whether people are using fake news to refer to the kinds of bogus stories that were popping up before the 2016 election, which is what we study, or whether they are using it to refer to any kind of false information online, or whether they are using it to refer to news that they do not like, as in the way the president uses the term.
We are using the first definition. We are focusing on websites that frequently publish dubious content overwhelmingly favoring one candidate that popped up in the period prior to the 2016 election.
DEVIN STEWART: How does your study determine whether it is dubious or not?
BRENDAN NYHAN: We rely on a set of identified stories that have been noted by fact checkers as being false or misleading. These are sites that have published two or more such articles. These kinds of websites publish so many articles that the fact checkers could not check them all, and so we are making a judgment that sites that are repeatedly publishing that kind of dubious content, much of what they publish is questionable, not all, of course, but enough to reasonably classify them as belonging to this group of so-called "fake news websites."
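The classification rule described here, flagging sites that published two or more articles rated false or misleading by fact checkers, can be sketched in a few lines. Everything below (the function name, threshold constant, and data shape) is an illustrative assumption, not the study's actual pipeline:

```python
from collections import Counter

# A site enters the "fake news" set once fact checkers have flagged
# at least this many of its articles as false or misleading.
FLAGGED_THRESHOLD = 2

def classify_fake_news_sites(flagged_articles):
    """flagged_articles: iterable of (domain, article_url) pairs for
    articles rated false or misleading by fact checkers. Returns the
    set of domains with at least FLAGGED_THRESHOLD flagged articles."""
    counts = Counter(domain for domain, _url in flagged_articles)
    return {d for d, n in counts.items() if n >= FLAGGED_THRESHOLD}
```

The point of the threshold is exactly the judgment Nyhan describes: fact checkers cannot rate every article, so repeat offenses at the site level stand in for article-by-article verification.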
DEVIN STEWART: The dissemination of fake news, what are the various tactics used to spread false information or disinformation?
BRENDAN NYHAN: There are many different types. For these websites specifically we think the primary tactic was to promote their content on Facebook. Some of them may have had Facebook pages, others may have placed ads, still others might have posted these links into Facebook groups about politics. We cannot trace the dissemination mechanisms of all these articles, but what we can say is that it appears that one of the most important ways that people were getting to them is via Facebook. I am happy to talk about how we draw that inference, but it does seem that Facebook was the single most important distribution mechanism for the type of fake news that we study.
DEVIN STEWART: How did you infer that?
BRENDAN NYHAN: In our study we observe an approximately nationally representative sample of Americans. We both interview them in a survey context, so we know which candidate they support, and we also observe, with their consent, anonymized browsing behavior; we observe the websites that they go to on their desktop or laptop computers.
We infer that Facebook was the originating site for a visit to a fake news website if Facebook appears in the person's browsing history in the 30 seconds prior to that visit. We observe that this happens disproportionately for fake news sites relative to regular news websites, and we do not observe the same asymmetry for Twitter or Google or web-based email platforms. So we infer from that that a number of people seem to be clicking from the Facebook feed, which they viewed in their browser, out to actually visit one of these fake news websites.
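The inference rule just described, counting a fake-news visit as Facebook-referred when facebook.com appears in the 30 seconds before it, can be sketched as follows. The data layout, class, and function names are hypothetical stand-ins for the study's anonymized browsing logs:

```python
from dataclasses import dataclass

REFERRAL_WINDOW_SECONDS = 30  # look-back window from the paper's description

@dataclass
class Visit:
    timestamp: float  # seconds since some epoch
    domain: str

def facebook_referred(visits, fake_news_domains):
    """Given a time-ordered list of Visits, return (referred, total):
    how many visits to fake-news domains had a facebook.com visit
    within the preceding REFERRAL_WINDOW_SECONDS, and how many
    fake-news visits there were overall."""
    referred = 0
    total = 0
    for i, v in enumerate(visits):
        if v.domain not in fake_news_domains:
            continue
        total += 1
        # Scan backwards through earlier visits inside the window.
        for prev in reversed(visits[:i]):
            if v.timestamp - prev.timestamp > REFERRAL_WINDOW_SECONDS:
                break
            if prev.domain == "facebook.com":
                referred += 1
                break
    return referred, total
```

Running the same counter for Twitter, Google, or webmail domains and comparing the resulting rates is what supports the asymmetry claim in the study.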
It is important to note that we cannot observe whether or not people have been exposed to this content inside of Facebook itself. We only observe the URLs of the pages they go to, the address in your browser bar.
Many more people, of course, were exposed to this kind of content within Facebook. What we are able to observe is a deeper kind of interaction: the people who actually clicked out and visited the website itself. That is a smaller group, but it is probably a more meaningful kind of interaction than someone who just passed it by in a news feed.
DEVIN STEWART: Do you have a hypothesis of what the goals are of these campaigns?
BRENDAN NYHAN: I want to be clear that the websites I am speaking about are not necessarily the whole of the misinformation problem online. There are many types of online misinformation. Your listeners may be familiar with Russian-influenced campaigns and other kinds of digital efforts to shape opinion. I am speaking about these websites that popped up before the 2016 election.
As far as we know from the reporting that has been done by Craig Silverman at BuzzFeed and other journalists, they were overwhelmingly oriented toward making money. These were profit-seeking entrepreneurs. They did not seem to have an ideological or partisan goal for the most part.
What we frequently observe is people popping up and trying what seems to work on Facebook, which they then monetize primarily via ads that are being served on the websites themselves. They are members of online ad networks, so they make money every time someone visits those sites. Their goal was simply to figure out what content would generate engagement, and if enough people click out of Facebook to visit the website and they serve ads to those people, then you could generate a pretty healthy revenue stream for what was a very low-cost business. The famous Macedonian teenagers are just one example. All it takes is a few people and a website, and you are potentially making five or six figures or more.
DEVIN STEWART: Putin's assertion that it might not have been state-controlled or state-sanctioned, and it might have been just some clever entrepreneurs has some merit to it, then?
BRENDAN NYHAN: Well, I wouldn't say that. I would say instead that there are different parts of the puzzle that we are putting together. The Russian-influenced operations seem to be largely centered on Facebook itself. They had no profit incentive to drive people to external websites. A lot of what they were doing was organic to social media. So there might be content being shared on Facebook or Twitter under false pretenses by trolls, bots, etc. That is a different part of the problem than these fake news websites that we are describing, which were oriented toward driving traffic to external websites precisely because they wanted to make money.
It is important to keep those ideas separate and to think about both of those aspects of the problem when we consider what we can do about online misinformation. The motives of Russian trolls and incentives of those trolls are very different from the motives of entrepreneurs.
In some ways, these fake news entrepreneurs are easier to deter. If we make Facebook a less hospitable platform for them, it will be less profitable and will likely deter them. There is no reason these Macedonian teenagers cannot go back to whatever it was they were doing before. They were drawn into the market because of the profit-making opportunity. The Russians unfortunately may be harder to deter, and it is not clear that we have taken steps to address that aspect of the problem, but our study is really focused on, again, what appear to be profit-oriented fake news websites.
DEVIN STEWART: Got it. Before we get to the findings of your study, which are counterintuitive in some ways, I believe, do you have a sense of where these entrepreneurs are located?
BRENDAN NYHAN: In the United States and around the world. It does not seem to be specific to the United States. There are certainly cases where people within the United States have been identified as responsible for certain fake news websites, but in other cases we have seen people publishing this content successfully from abroad. One of the aspects of the Internet that is different is that you can publish from anywhere and appear to be here.
It is often surprising how low-quality the content on these websites was. A lot of the material was recycled or plagiarized. In some cases the English was discernibly written by non-native speakers, etc. The quality varied quite a bit, but yes, in the United States and around the world.
You might think that actually the relative profit opportunity is greater for a Macedonian teenager in a country with a lower median income and fewer opportunities than here, so it is relatively more attractive to people abroad potentially than folks in the United States, although we saw examples of both.
DEVIN STEWART: Is there one country that stuck out as being the source of a majority or the lion's share of this news?
BRENDAN NYHAN: We do not have hard data on the origins of all of these sites, so I cannot actually tell you. We only know the sites where reporters have actually tracked down the people responsible for them. In other cases it is not clear. Again, you may have a domain registration that is opaque as to who is behind the website itself, and people may not reveal who is responsible. In some cases there are fictitious personas named as the authors of articles and so forth.
DEVIN STEWART: Okay. So let's get to some of the findings here. First of all, what are the more surprising findings? I got a sense of the coverage of your study that you seem to be skeptical about whether fake news deserves the hysteria that it has been getting in the press. What are the effects of fake news on people's political opinions, and how does that relate to your findings?
BRENDAN NYHAN: Sure. Let me say what the findings are, and then I will explain what I think they can tell us about the effects of fake news.
What we find is that about one in four Americans visited one of these fake news websites. That is our best estimate. Not surprisingly, the kinds of fake news websites that people visited were correlated with their candidate preference. Trump supporters were more likely to visit pro-Trump fake news websites, Clinton supporters were more likely to visit pro-Clinton fake news websites. But there was a clear asymmetry. There was much more pro-Trump fake news than pro-Clinton fake news, consistent with the reporting we have seen since the election. That is also supported in our data.
I think what is most surprising about our findings to people is that we do not observe evidence of a widespread echo chamber. The story that people often tell is that everyone is in digital echo chambers now, they are being exposed to information and news that seems to reinforce their opinions, and that is a very worrisome prospect. I am worried about it as well, but it is important to be clear that the data do not support that kind of story. There are people who do overwhelmingly consume news and information that is consistent with their political point of view, but it is actually a small minority.
What we find specifically in these data is that it is really only the 10 percent of Americans with the most conservative information diets who are consuming a lot of fake news. There is a relatively low level of exposure across the rest of the population. But for this group that already consumes a lot of news that coincides with their political beliefs, they also consumed fake news, in the sense that we define it, that corresponded with those beliefs. That, I think, reframes the question about why fake news is a problem. It changes the question from "Was this misleading people in the middle of the political spectrum who did not know very much about politics and were not sure whom to vote for?" to "Is this inflaming polarization and misleading people who already have strong predispositions to believe in the veracity of dubious political content?" That is the question I think we should focus on.
I am happy to elaborate more on how we might think about what the effects of fake news are, but first it is important to think about who is even being exposed to it. If you are trying to estimate the effect, the first thing you would want to think about is who is actually seeing this information, and then we can think about what effect does it have.
DEVIN STEWART: One in four sounds like a big number. One in four Americans has been exposed?
BRENDAN NYHAN: That's right.
DEVIN STEWART: You are not too alarmed by that 25 percent number?
BRENDAN NYHAN: I am alarmed, but there is a lot of misinformation out there. So it is important to put that number in context. If you think about the misinformation people were exposed to during the 2016 campaign, for instance, I am quite confident they were exposed to much more misinformation from the candidates than they were from fake news websites.
DEVIN STEWART: Got it.
BRENDAN NYHAN: The president of the United States has promoted misinformation at an unprecedented, alarming rate going back to the campaign. He almost certainly exposed people to far more pieces of information that have been debunked than all the fake news websites put together.
That does not mean that this is not an important concern. This is a new type of misinformation and therefore worrisome and worthy of attention. But I do think it is important to calibrate that they are a relatively small portion of all the misinformation people are being exposed to.
In fact, in the paper we quantify the percentage of political articles that people visit and what share of those are made up by fake news, and even for Trump supporters, who are more likely to seek this kind of information out, we find that that number is about 6 percent. So it is still a relatively small percentage of people's overall news and information diet.
DEVIN STEWART: Let's go back to talking about what is the impact of fake news. I talked to some of my colleagues earlier today, and I think a lot of them guessed that people go out and seek news that confirms their political biases. Is that part of your argument and your findings?
BRENDAN NYHAN: It is. We do think that that phenomenon is helping drive exposure. The fact that we see fake news consumption being so correlated with people's political predispositions and their overall information diet suggests a pattern that is consistent with people seeking out news that confirms their prior beliefs. So in terms of the effects that we might see, it is important to be clear there that people's political preferences are driving consumption under that argument, not the other way around.
DEVIN STEWART: Right.
BRENDAN NYHAN: We find it implausible that these people who already have the most conservative information diets in the American public were somehow persuaded not to vote for Hillary Clinton. These are people who were extremely unlikely to vote for Hillary Clinton no matter what.
It is possible this could have affected their decision whether to turn out or not. We cannot evaluate that with the data we had. But to me it is more plausible that exposure to this kind of dubious political content is inflaming polarization and misleading people. I am more worried about those sorts of effects of exposure than I am about changing vote choice.
There is no credible evidence at this point that the fake news websites that I describe had anything close to the effect of changing the 2016 election, as some have suggested. We have a lot of evidence from all different sorts of efforts to persuade people in general election campaigns that it is very hard to do so. Even television ads, which are arguably a higher-impact medium, have very small effects in that context, and I would expect fake news to have similarly small effects.
DEVIN STEWART: You are a political scientist. Can you tell our listeners how people shape their political beliefs in the first place? If it is not from exogenous, external data, how do people pick their political tribe?
BRENDAN NYHAN: There is a short-term version of that and a long-term one. The long-term version is that most people have a partisan predisposition that is a very strong predictor of how they will vote in the next election. The campaign will ultimately "bring people home." People who tend to vote for Democrats will ultimately vote for Democrats; people who tend to vote for Republicans will ultimately vote for Republicans. That includes not just people who identify with the parties but Independents who often lean toward one of the two major parties and behave as if they are partisan in practice. So it is actually a very small percentage of people who are truly Independent voters, who do not lean toward one of the two major parties.
Then when you are thinking about how people vote, the most important factor that we found in political science in presidential elections is the state of the economy. There are more, but that is the most important motive force in changing the presidential vote between elections. We know it is very hard to run for reelection if the economy is poor, and it is very easy to do so when the economy is doing well. That pattern we have observed across a number of elections.
Again, it is important to remember that people are receiving all sorts of messages during a campaign. These messages are just one of the many forms of content and news and information that people received about the 2016 campaign. They had people knocking on their doors, they had conversations with their friends and neighbors, they had television advertising, they had mailings. The list goes on and on. Any particular piece of information has to be put into that larger context. And then you have to remember that the people who pay close enough attention to politics to seek this kind of information out, as we were discussing, often already have very strong predispositions to begin with, and that is why we think it is often quite difficult to change people's minds.
DEVIN STEWART: In some of the articles about your study you have described the people who are exposed to fake news as "a subset" of the American citizenry. What is that subset? What does the typical consumer of fake news look like?
BRENDAN NYHAN: It is these folks who have the most conservative information diets. That is the group where we found fake news consumption to be highest. These are people who follow politics more closely than the average person and tend to seek out news and information from websites that disproportionately cater to conservatives.
The way we are quantifying people's information diets is by examining the proportion of articles shared on Facebook from those websites that are shared by self-identified conservatives versus self-identified liberals. The folks in this group are disproportionately visiting websites that are disproportionately shared by conservatives relative to liberals, so these are websites like Fox and Breitbart, for instance. And the list goes on. Those are not the only ones. But they are, relative to the average person, visiting sites like that much more.
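One way to make this diet-measurement idea concrete: score each site by the fraction of its Facebook shares that come from self-identified conservatives, then average those scores over the sites a person visits. This is a hypothetical sketch of that logic under made-up names, not the paper's exact estimator:

```python
def site_slant(conservative_shares, liberal_shares):
    """Fraction of a site's Facebook shares coming from self-identified
    conservatives: 0 = all-liberal sharers, 1 = all-conservative."""
    total = conservative_shares + liberal_shares
    return conservative_shares / total if total else 0.5  # 0.5 = no signal

def diet_score(visited_domains, slant_by_domain):
    """Average slant over a person's visits (repeat visits count again);
    domains without a slant estimate are skipped."""
    scores = [slant_by_domain[d] for d in visited_domains if d in slant_by_domain]
    return sum(scores) / len(scores) if scores else None
```

Ranking people by such a score and taking the top decile is one simple way to operationalize "the 10 percent of Americans with the most conservative information diets" that the findings refer to.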
This is a group that is small—as we said, 10 percent of the American public—but those folks on either end of the ideological spectrum are often more politically engaged and active. They are more visible in the political news audience because they make up a disproportionate share of it. They consume much more political news than the average person. They are probably more likely to vote not just in general elections but in primaries, more likely to contact their members of Congress, and all different sorts of ways that people participate in and engage with politics.
As a result, the political system may be disproportionately representative of these relatively small subsets of people. So they do matter, and they matter beyond their voting power as a share of the population because of their level of engagement.
DEVIN STEWART: Did you also look at the effectiveness of the actual stories? What makes an effective piece of fake news and what does not?
BRENDAN NYHAN: We did not do a formal content analysis of these articles. My qualitative impression is that they frequently feature some sort of outrage designed to elicit emotion, particularly negative emotion toward the other side. A typical fake news article denounces some outrage or celebrates someone for putting a—in the case of a pro-Trump site—Clinton supporter or a liberal in their place, or exposing them, or revealing something. It is often a combination of dubious information and/or some kind of sweeping denunciation.
This is consistent with the research that has increasingly emerged in recent years about how negative feelings toward the other party play an especially important role in our polarized politics now. It is not that Americans have become more polarized ideologically—a subset has—but what is driving a lot of polarization that we see are these strong negative feelings people have toward the other party, which really have increased in recent years.
DEVIN STEWART: How about solutions? You mention that Facebook is one of the biggest sources driving people to these stories, and since the 2016 elections certainly Mark Zuckerberg has come out in public saying that he understands this is a problem. Do you get a sense that Facebook and other organizations and entities are doing something about this, and if so, are they doing enough?
BRENDAN NYHAN: Facebook has done a lot since the election. They were late to recognize this problem and to address it. They missed it in 2016. I don't think there is any way around that. Since the election, as they have come under greater scrutiny, they started to take the problem much more seriously, and there are a number of people at Facebook working on this problem right now.
It is not an easy problem to solve, though, and we have to be careful in what we ask Facebook to do. I think it is appropriate to be cautious about intervening in political debate on the scale that Facebook now has the ability to do. This is a company that reaches the majority of American adults and often for long periods of time. They have a very significant influence on the news that people see about politics, and if they go too far in how they intervene in the flow of information, it could be worrisome for a democracy.
They already are intervening. Of course the algorithm is not some sort of natural state of affairs, it is a human creation. But as they start to intervene more aggressively to dissuade fake news entrepreneurs, which I think they should do, we have to be careful that they do not go too far in silencing or suppressing legitimate political speech. That is often a complicated line to delineate in practice. Not every case is as clear as the Macedonian teenagers. We have given a private company a great deal of power over how people get their news, and I think that should worry us, and the company deserves the very high levels of scrutiny it is now receiving.
DEVIN STEWART: Finally Brendan, as a scholar, did you find some remaining unanswered questions in this field? What types of research would you recommend that other scholars look to do in the future to understand this phenomenon more thoroughly?
BRENDAN NYHAN: There is a tremendous amount that we need to learn. My study is one of a set of new types of research where we are actually examining people's behavior in the digital sphere rather than relying on their self-reports about what they saw or what they read, and I think that has the potential to open up all sorts of new questions and learn more about how people actually get their news and what news they consume in practice rather than what they tell us in surveys. I think that is a very exciting prospect.
We also have to learn a lot more about how to intervene effectively against fake news. How can we correct stories in a way that reaches the people we hope to reach?
We found in our study, for instance, that fact checks reaching people who were exposed to debunked fake news stories was an extremely rare event, one we never observed in our data. In other words, the people who saw a fake news story that had been debunked by a fact checker never saw the fact-check in question, at least in terms of what we were able to observe in our data. There is a really important targeting problem. Even if we have lots of fact checking, providing those fact checks to the people who need them at the time that they need them turns out to be a very difficult problem.
Finally, I think we have to figure out how to dissuade and deter the fake news entrepreneurs who are clogging and polluting our political debate without silencing legitimate political speech, and that is an important empirical question but also really a philosophical one that we are all going to have to think about as we try to determine what 21st century digital democracy looks like.
DEVIN STEWART: Brendan Nyhan of Dartmouth College is co-author of "Selective Exposure to Misinformation: Evidence from the consumption of fake news during the 2016 U.S. presidential campaign." Brendan, thank you so much for speaking with us today.
BRENDAN NYHAN: Thank you. My pleasure.