Fighting Fake News, with Anya Schiffrin

Sep 5, 2018

"Disinformation, fake news, online propaganda is a problem that has gotten attention all over the world, and we're seeing very divergent responses," says Schiffrin, author of "Bridging the Gap: Rebuilding Citizen Trust in Media." "I think the U.S. is going to do what it always does, which is look for free-market solutions and try lots of small-scale initiatives, and Europe is going to do what it tends to do, which is have more regulation."

DEVIN STEWART: Hi, I'm Devin Stewart here at Carnegie Council in New York City, and today I'm speaking with Anya Schiffrin. She's director of the technology, media, and communications specialization at Columbia University's School of International and Public Affairs (SIPA), so the foreign policy school at Columbia.

Today's theme is rebuilding citizen trust in the media as part of our ongoing Information Warfare project and series of interviews. Earlier today we spoke with one of your colleagues, Andie Tucher. She is a historian at Columbia University. She focuses on the history of journalism and fake news, and she gave us a lot of historical context; fake news is not new, essentially. She gave us a tour of 400 years of history of fake news, at least in the United States.

My understanding is that you're working on a journal article about the response to fake news. Let's start with that. Give us a context of what has been done to fight fake news, and what have you learned from writing your journal article.

ANYA SCHIFFRIN: Sure. Thank you so much for having me, Devin. I'm happy to talk about this subject.

What's so interesting about the conversation today about disinformation, fake news, and media trust is how many of the ideas people have are ideas that have been around for at least a hundred years. When you look at the research on disinformation and propaganda, it really got seriously underway, I would say, after World War I and in the run-up to the Nazi period and World War II, when intellectuals, scholars, and journalists all began to wonder how it is that people fall for propaganda and what causes that.

DEVIN STEWART: It sounds like you're saying that there was a pivot point where people started to focus on this. What made them start to think it was important?

ANYA SCHIFFRIN: Because of World War I and World War II. How could sensible people fall for Hitler? There was so much propaganda during the Boer War and World War I about German atrocities and the raping of Belgian nuns, and there was Stalin and the terrible things that were happening in the Soviet Union, so propaganda became the subject of the day.

Researchers and academics tried to understand why people fall for propaganda. They weren't sure. Is it something to do with the techniques of the propaganda? Is it the personalities of the people who watch it? Is trust correlated with education or intelligence? What is it?

Out of this whole discussion and debate—I won't go too much into the intellectual research on media trust; I have been writing about that—I'll highlight one initiative from the late 1930s, which has many similarities with today. There was a journalist called Clyde R. Miller, who worked at Teachers College at Columbia University as their head of communications, and he was worried about propaganda and disinformation. He got money from a liberal philanthropist, Edward Filene, who had started Filene's Basement, Filene's department store.

DEVIN STEWART: It's like T. J. Maxx. That's great.

ANYA SCHIFFRIN: In Boston. I think it was $10,000, and they started something at Columbia University called the Institute for Propaganda Analysis.

DEVIN STEWART: Is it still around?

ANYA SCHIFFRIN: No. It ended up dying in 1942, and I can explain why.

But what they did was very similar to what people like Claire Wardle are doing today. They came up with easy-to-understand taxonomies—the seven kinds of propaganda, the 11 processes—devices like testimonials, pretending to be just plain folks, glittering generalities, and playing on people's fear.

They really thought—you can read their documents and their papers, which are in the archives, and I've been spending some time looking at them—that the problem was that there was too much information. It's so much like today. In the olden days, there was the town hall, there was the cracker barrel; citizens could get together and solve problems together. Now there's so much information that you have to trust people to make decisions for you. When that happens and you have to make decisions about things you haven't seen yourself, what do you do? They felt that was part of why people were susceptible to propaganda. They thought if we can just teach people how to think clearly, how to be logical, and how to apply common sense, they won't fall for it.

They began all kinds of things. They had something called the "weekly bulletin," where they analyzed the events of the day and talked about the different propaganda techniques being used. They teamed up with Scholastic and had inserts in Scholastic magazines. They had an insert called something like "What makes you think so?" that went to more than a million schoolchildren around the country.

DEVIN STEWART: Does that magazine still exist? I remember that from when I was a kid.

ANYA SCHIFFRIN: No, [the insert does not still exist]. Like I said, they ended up going out of business in 1942.

DEVIN STEWART: Oh, they published Scholastic magazine?

ANYA SCHIFFRIN: Scholastic is still around, but this insert to reach schoolchildren only ran for a few years. They also pioneered an anti-racist, very inclusive curriculum in Springfield, Massachusetts.

The reason I mention them is that so many of the things they believed and were trying are things that people are still trying today to fight disinformation and fake news online.

DEVIN STEWART: At the outset, you talked about these theories—IQ or education or personality or maybe relationships or appeals; there's a great book by Cialdini called Influence: Science and Practice, which is a classic—and then you mentioned the institute at Columbia. It seems like the institute at Columbia decided that falling for propaganda is essentially a matter of media literacy. Was that conclusion correct, or are the original hypotheses still valid? What do you think?

ANYA SCHIFFRIN: This is something that people are still arguing about today. I have made a taxonomy of solutions. There are about five different solutions being proposed today.

There are what I call the "supply-side" solutions, which are aimed at affecting the supply of disinformation. That would be things like regulating it, banning hate speech online, taking it down (which is what Facebook is doing), and changing the algorithms.

Then there's what I call the "demand side," and media literacy would be a classic demand side. Facebook would love to believe that if we just taught people to be more sophisticated, more careful, and more analytical, then they wouldn't pass on all this fake news, and we wouldn't have these terrible effects that we're having at the moment.

DEVIN STEWART: Your term "demand side" can be very loaded in an interesting way because there is demand for fake news, actual fake news. There is the psychological disposition of confirmation bias; you seek out information that makes you feel better because it confirms what you already believe. Is that what you actually mean, or do you just mean the consumption of the media?

ANYA SCHIFFRIN: I think I mean both actually, now that you mention it.

DEVIN STEWART: It's kind of both in a paradoxical way.

ANYA SCHIFFRIN: That's right. I think the problem is that none of the solutions being put out there is that comprehensive. They are all Band-Aid, small-scale fixes. Not one of them is really going to solve the problem.

As for media literacy, I think it's great to try it, just as the Institute for Propaganda Analysis did and just as many people are doing now. The problem is that we don't have conclusive literature or research showing that it works. Many of the people pushing media literacy are forging ahead because they feel like they need to do something.

DEVIN STEWART: And it's intuitive maybe.

ANYA SCHIFFRIN: Yes, and it sounds like a good idea, and I'm sure it can't hurt, but when you look at the literature over the last 50 years, it's not that clear.

The other thing is, of course, that social media platforms like Twitter and Facebook don't want regulation. It's a lot more convenient for them to say, "Well, it's the fault of the consumer." This is the point that Emily Bell makes: blame everyone except yourself. There should be more journalism, or the journalists should do a better job, or they should do more fact checking, and the audience should be more savvy and more sophisticated and not pass this stuff on. So, it's a nice try. I'm not sure it's going to solve the problem.

DEVIN STEWART: You could also argue that it's just a matter of money: people want things that entertain them and confirm their biases, those are the things they're going to click on, and therefore those are the things that are going to make more money.

ANYA SCHIFFRIN: That's right. The whole business model of Facebook is based on that, people spreading things around, and what do people spread? Things that make them angry and outraged. Their whole business model is based on this, unfortunately.

DEVIN STEWART: The same goes for the mainstream media writ large, too. I'm hearing from my journalist friends that, for example, headline writing is essentially clickbait in a lot of places. Did you look at that at all?

ANYA SCHIFFRIN: Obviously, you're absolutely right that journalists are under pressure now to get more attention and to get more shares and more likes, but I'd be really careful about equating an exciting, attractive headline in The New York Times with a piece of scurrilous disinformation on Facebook.

DEVIN STEWART: How do you think we're doing here as a society in addressing fake news? What do you think of American approaches and techniques, and how about the world at large? How are we doing compared to the rest of the world?

ANYA SCHIFFRIN: Disinformation, fake news, online propaganda is a problem that has gotten attention all over the world, and we're seeing very divergent responses. I think the United States is going to do what the United States always does, which is look for free-market solutions and try lots of small-scale initiatives, and then Europe is going to do what it tends to do, which is have more regulation.

The closed societies—the Malaysias, the Turkeys, the Hungarys, and the Polands—are going to have even more repressive legislation.

DEVIN STEWART: Clamp down on the journalists themselves.

ANYA SCHIFFRIN: Many of the digital rights groups in this country are completely against regulation, but I personally like the idea. One of the things I've been trying to do is think about what is low-hanging fruit. What is the kind of regulation that everyone can agree on and that makes sense?

For example, any law that applies offline, maybe we should look at putting it online. If you have to disclose political advertising offline, why don't you have to disclose it online, for example? In the case of Germany, if you're not allowed to deny the Holocaust offline, why should you be allowed to deny it online?

DEVIN STEWART: That would go for a newspaper as well as someone tweeting?

ANYA SCHIFFRIN: Yes. Liability is different in Germany. Until this new law that they just implemented in January, I think liability for a lot of the incendiary speech and hate speech rested with the person who put it out, but now they have placed liability on the platforms, which is something we have not done in the United States. So Facebook will have to pay a fine if there's a pattern of hate speech online.

DEVIN STEWART: Do you think that's the right way to go?

ANYA SCHIFFRIN: There's a lot of criticism of it. I'm not against intermediary liability necessarily, but many of the rights groups are because they feel that it would have a chilling effect.

DEVIN STEWART: Does that essentially mean that Twitter and Facebook are no longer utilities that passively present information that people are communicating through like a telephone, and more like something that's edited like a newspaper?

ANYA SCHIFFRIN: You've really put your finger on it because that's absolutely right. What they've been saying for years is: "We're just platforms, we're not publishers. It's not our fault what people put up. We're just the pipes, essentially."

DEVIN STEWART: We're the phone line or whatever.

ANYA SCHIFFRIN: Yes, whereas other people are saying actually they are utilities, and they should be regulated as such.

The problem, of course, is all the countries in the world where this inflammatory hate speech is circulating on Facebook and Instagram and causing deaths. Mobs are going out and killing people.

I gave a talk last week where I talked about the need for regulation. One of the people in the audience stood up and said, "Aren't you advocating censorship?" The problem is we already have censorship. Facebook and Google are already making decisions all the time about what to push lower down in search results or what not to put in people's feeds. But they're not governed by democratic processes. That's why I'd like to see regulation that is upheld by the courts, that is decided on by our elected officials, and that we can agree on, like disclosure, for example. I think that disclosure could probably have a good effect by discouraging companies or people from doing certain things.

I don't know if you saw the article in The New York Times about how in many states drug companies have to explain why when they raise prices by more than 20 percent. Why don't we have things like that?

If you're microtargeting or you're crossing different databases, which is illegal in many parts of the world, why aren't you disclosing that? Why aren't you actually telling people what you're doing with their data? Why aren't you giving them real choices? It seems to me that a lot of those basic things would at least be a start in helping address this problem.

DEVIN STEWART: Did you follow the Alex Jones case? Some platforms, like Apple's iTunes and YouTube, took Alex Jones' content down. One platform—Twitter, specifically—allowed him to stay on. How did you assess that decision?

ANYA SCHIFFRIN: I think all of these people are in a tough situation at this point because they created this huge problem, they don't fully know how to deal with it, and they are getting all this pressure from the right saying that they're too liberal, so they're bending over backward to show they're not too liberal, but then they're giving voice to all these people whose views really shouldn't be broadly disseminated.

I thought it was really interesting, again looking at Germany, where opinion is protected but false facts aren't. There's no protection for that.

In this country we have a really broad First Amendment, so everything is protected speech here; you can even lie in a political ad, and that's protected because of the First Amendment. But I thought it was really interesting that in lots of other places false information is not protected.

The problem, of course, is once governments get in the business of determining what's false and what's real, it's a slippery slope. I was in Ecuador in March with the Committee to Protect Journalists, and all these regulations that the government had put into place which sounded great on paper ended up being gamed and misused.

One example: right of reply. Sounds great. Everyone should have the right to reply. If there's something wrong in the newspaper or on TV, you get the right to respond. So what started happening in Ecuador—

DEVIN STEWART: Who gets the right to respond?

ANYA SCHIFFRIN: Anybody, any citizen.

DEVIN STEWART: Like a letter to the editor? What do you mean?

ANYA SCHIFFRIN: No, not just a correction, an article of the same size in the same place.

DEVIN STEWART: But it has to be the subject of the original article?

ANYA SCHIFFRIN: No, it can even be another person, like a citizen who is outraged.

Here's what started happening. You're a journalist. You're writing an article about the mayor of some town. You call him up for a comment. He doesn't return your phone calls because he knows that once your article comes out he can call up your editor and say: "There was a mistake. I want my own article on page one within 48 hours."

DEVIN STEWART: How did that work out?

ANYA SCHIFFRIN: Really badly, as you can imagine. It all got gamed.

DEVIN STEWART: They got rid of that or—

ANYA SCHIFFRIN: The new government is scrapping a lot of those laws now. I guess my point is I like the idea of regulation, but I also think as always the devil's in the details, and doing it properly, without abuse, without having a chilling effect, without a censorship effect, is extremely difficult.

DEVIN STEWART: What about your study, Bridging the Gap: Rebuilding Citizen Trust in Media? You surveyed 17 media organizations and their attempts to build trust in the media in their communities. What did you learn from that study?

ANYA SCHIFFRIN: That was a super-interesting study because, again, a lot of journalists are trying to take matters into their own hands. They feel like if they can just spend more time in the community, engage with their audiences, find out what their audiences want, bring them into the newsrooms, and speak publicly about the job of the journalist, people will start to trust the media more. They're getting foundation funding for this.

DEVIN STEWART: Interesting. Sort of community relations building.

ANYA SCHIFFRIN: Totally.

DEVIN STEWART: It's like community policing. Get to know your local cops.

ANYA SCHIFFRIN: That's right. Joy Masters is doing this as well. You go out and say to people, "What is it that you want journalists to do?" And everything they want journalists to do is exactly what journalists do—talk to lots of people, cover local news, be part of the community.

There are a number of organizations around the world that are trying this. There are projects in Kentucky; there are projects in New Orleans, one of which is being led by my former colleague Jesse Hardman; there are groups that are doing fact checking and community engagement in Africa and Latin America.

I think it's all well and good. Only two caveats, and I hate to be so depressing. One is that we have no data showing that it works.

DEVIN STEWART: Except for maybe common sense, but that's not data.

ANYA SCHIFFRIN: That's right. Joy Masters says one of the ways they measure it is whether, two years after starting a program, you can cover controversial subjects without getting trolled or getting a lot of hate mail.

The second thing is these are great ideas and really nice projects, but we don't know if they'll scale, and it's really expensive to do it.

There's also no evidence on what political scientists call specific versus diffuse trust. You can work really hard in one neighborhood and get people to really trust the reporter who's covering the school board, but that doesn't mean they're going to trust the media generally or trust institutions generally. So, nice idea, but I'm not sure it's going to fix this huge problem.

DEVIN STEWART: Could there be, not a backlash, but maybe unintended consequences of community relations? You get to see how the sausage is made, so to speak, and you might actually come away with a worse opinion of your local journalists?

ANYA SCHIFFRIN: One study found precisely that. One of the things a lot of these groups do is make a big deal about corrections. If they make a mistake, they're upfront about it, and they say, "This is what makes us different from other outlets," and people love it because they're being honest about it. One of the studies that I read found that people who trust the media love corrections; it reinforces their trust. People who don't trust the media see a correction and they think: Aha. The journalists make mistakes. Just what I always suspected.

DEVIN STEWART: That's fascinating. How does that disposition come about to begin with? There's this idea that we receive our opinions from our culture, our community, our educators perhaps, and maybe from our colleagues as well; that our environment has a huge impact on what we think about things, our opinions, our viewpoints. How do you think someone gets to be the type of person who distrusts the media in the first place?

ANYA SCHIFFRIN: Right. Again, that's really not clear, and there are a lot of different views.

Whenever there's a problem, journalists tend to feel the solution is more journalism. So the narrative around a lot of this is that local news died. We don't have the small-town papers anymore. They're not covering the school boards, they're not covering the zoning boards, and people don't see themselves represented in the media. They started watching Fox, and the next thing you knew they were listening to all of this inflammatory opinion and stuff that just plain wasn't true, so they don't trust the media anymore. And if we could just improve the quality of the media, we could get back to where we were, which is real people sharing real information and having a common set of facts.

This is a great story, and one of my colleagues at Columbia, Charles Angelucci, along with Julia Cagé, is looking right now at the relationship between the rise of TV and the decline of local news, but we don't actually know whether this story is true. We don't know whether having good-quality local news will inoculate people against propaganda and disinformation.

You could tell a whole other story. You could say these are places that have de-industrialized. There aren't any jobs. People are angry. There's opioid addiction. There isn't proper health care. They don't trust any institution, so why on earth would they trust the media? It may be that what we're dealing with is actually a huge macro problem that doesn't have to do with journalism practices.

DEVIN STEWART: This has been great, Anya. Thank you so much.

Just to conclude, this sounds very complicated. There are psychological aspects to it. There are business and economics aspects. There's politics and law and regulation, a big jumble. How much does this really matter? How much should people worry about the fake news epidemic?

ANYA SCHIFFRIN: I'm frankly terrified, because I think that people falling for propaganda and disinformation, the lack of education, and the resentment and distrust of institutions are already having an enormous effect on our democracy. I think we've got to fix this, and we've got to get it right sooner rather than later.

DEVIN STEWART: People talk about America becoming more and more polarized. It's almost a cliché. Do you think that this is fueling the problem of political polarization, or is it reflective of it?

ANYA SCHIFFRIN: I think probably both. Again, with the research we don't know. It's a chicken-and-egg situation, but clearly it's not good.

I will say the definitive study on the impact of Fox News came out in The American Economic Review, I think in September 2017, and it showed that watching Fox is correlated with, and a cause of, voting Republican. I think at this point we are probably starting to have a sense of some of the effect that media consumption is having on voting.

DEVIN STEWART: Anya Schiffrin from Columbia University, thank you so much for coming by.
