
Fake News and Google with Daniel Sieberg

October 2, 2017

CREDIT: Stuart Rankin (CC)

Podcast music: Blindhead and Mick Lexington.

DEVIN STEWART: Hi. I'm Devin Stewart here at Carnegie Council in New York City, and today I'm speaking with Daniel Sieberg of Google News Lab here in New York.

Daniel, thank you so much for coming.

DANIEL SIEBERG: Good to be with you, Devin. Thanks for having me. 

DEVIN STEWART: For people who are unfamiliar with Google News Lab, what is its mission?

DANIEL SIEBERG: I was one of the co-founders of the team. I have been at Google for about six years. I previously led a group called Google for Media, which was really about empowering journalists to use digital tools for storytelling. Of course, that encompassed a lot of Google products, but it was about thinking holistically about how they could better make this leap into the digital transformation. That was a few years ago.

And then, about three years ago, a number of us came together and thought how could we combine our efforts, scale it, make it perhaps a global opportunity, to lean into more of what publishers and broadcasters want to figure out. The mission of the News Lab is to collaborate with journalists and entrepreneurs to build the future of media.

DEVIN STEWART: How big a team?

DANIEL SIEBERG: We are a relatively small team in terms of the sheer size of Google. We started out at about half a dozen of us, and we are now about 20 people. We've got folks who are in what we call our News Lab Leads, who are based in Singapore, in São Paulo; across Europe in Berlin, in Paris, in London; and we've got some contractors who do training with journalists on these tools in other markets, like in India, in Argentina, in some of these places where there is really an appetite to learn more and to understand these tools in a way that is beneficial to them.

DEVIN STEWART: How does this work in Google where you want to launch a new initiative? How did you all get to know each other, and is there something in common between you and your teammates?

DANIEL SIEBERG: That's a great question. I would say that at least half of our team has a former journalism background. I was a journalist for almost 20 years and worked at a lot of different outlets. Not everybody on the team has that kind of a background, but they have this drive to figure out information in this age, quality information. I think it dovetails with Google's mission, which is to organize the world's information and make it useful and accessible to everyone, quality information, news being of course probably at the top of the list, or near the top of the list.

DEVIN STEWART: Right.

DANIEL SIEBERG: I think all of the people on the team have a big belief in trying to figure that out.

DEVIN STEWART: They are idealists, or what do you mean?

DANIEL SIEBERG: I think there is some amount of idealism that probably pervades the team, some sense of civic duty perhaps. Of course, we all work at a tech company and, starting that initiative, we could have come at it like we are creating a start-up. But a start-up within Google, of course, is a pretty comfy place to be in many ways. We are not suffering some of the same things that a traditional start-up would have to go through.

DEVIN STEWART: How does that work? First of all, you are all in New York City, right?

DANIEL SIEBERG: I am in New York City and a handful of our team is based here, but others are in Mountain View, which is Google's headquarters, and previously we had somebody in DC.

DEVIN STEWART: Got it.

DANIEL SIEBERG: We are a little disparate as a team, coming together with this combination of sort of virtual meetings, strategizing on what would land well, how do we fit in with the rest of the company.

DEVIN STEWART: How does it work? So you were one of the co-founders, you thought of this idea. How did you do a start-up in Google? Did you have to write a proposal?

DANIEL SIEBERG: Yes. Structurally, we had to figure out how were we going to work, how do we get the right buy-in from stakeholders of the company, how do we demonstrate our value in tandem with everything else that is going on. In the early days, there was a lot of just building momentum around what we wanted to do and galvanizing support in the company for it, and demonstrating that what we were going to do had some real value in the community. It was everything from meeting with industry folks and getting their guidance and feedback, to meeting with internal folks at Google and saying, "Does this resonate with you? Do you think this is important?"

Ultimately, we are a team that is not tethered to one particular product. Most teams at Google are either a marketing team for Android or a marketing team for Chromebooks or something very specific. We are a cross-product team, so we try to figure out how everything that Google brings to the table fits within journalism and helps journalists.

It could be some partnerships that we do with different organizations. We have worked with the European Journalism Center to create these events that happen across Europe that are a combination of thought leadership and training. We work with entrepreneurs, so we have a partnership with Matter, which is the biggest venture capitalist for media start-ups in the country.

So it manifests in different ways, but we keep coming back to that umbrella of collaborating with the industry and what is it we can do to help understand their needs and then help to foster that within the company.

DEVIN STEWART: What would you say journalists need around the world? What are they asking you for?

DANIEL SIEBERG: It varies. I have spent some time in Asia talking with our News Lab Lead there. There is a huge interest in learning, just understanding the tools better. Some of them would say, "We just need Google to listen to us. We have some pain points, whether it is around the revenue side or around the storytelling side, the editorial side." They are really eager to have us come in and just help to train them on some of these things.

Whereas in other parts of the world it is a bit more transactional: "What is it that Google is going to do about certain products?"

We have the Digital News Initiative, which is a little sub-part of our team and we kind of fit within that. But the Digital News Initiative was announced—gosh, I want to say almost three years ago, around the same time that we were developing. The Digital News Initiative really focuses primarily on Europe, for example, and it has an innovation fund as part of it, where different newsrooms can apply for funding for projects that they are doing. The training effort that we have fits into that.

And then there is a product piece, where it is just hearing feedback from newsrooms saying, "Why is it that YouTube works this way? We would like it to work this way in terms of how it is embedded at our site, how the Google AdSense works"—all of these things. We are not the only team that has these conversations, but we try to be a bit of a conduit between Google and the industry.

DEVIN STEWART: Where does it fit in the corporate structure? We visited Google Ideas downtown in New York City, which was fascinating, and we got the impression that Google Ideas was an initiative launched by Google that is pretty much philanthropic in its attitude toward profits. In other words, they are not really responsible for necessarily bringing in a profit.

How is your team sustained?

DANIEL SIEBERG: I would say that we have a similar model, in that we do not look at metrics every month that are tied to a number of click-throughs or revenue or something like that. I would say that, frankly, that is a bit of a luxury for our team, we are not bound to that, in a similar way that at Google Ideas—which is now Jigsaw—they have a similar way of approaching what they do. In terms of how we fit together—

DEVIN STEWART: Do you guys work together?

DANIEL SIEBERG: Yes. For example, a couple of the big initiatives that Jigsaw has been working on—one is Project Shield, which is about helping protect especially smaller newsrooms in countries where free speech is under threat, giving them protection from denial-of-service (DoS) attacks. It is like technical support for them to be able to withstand these types of attempts to shut them down.

We have these touchpoints into newsrooms so we can help, from a relationship standpoint, bring them together. Jigsaw is focused on a lot of things at a really macro level, and I think that where we help to make it more tangible is on the ground with specific newsrooms.

DEVIN STEWART: Aside from organizing all the world's information for it to be able to be used, do you all, when you are talking in the hallway or at the water cooler or whatever, talk about a moral value that guides all this? Is it complete access to as much information as possible? And, if so, to what end? What is motivating and animating this?

DANIEL SIEBERG: I think there is a bit of a moral compass or an ethical compass for a lot of the people who are part of the News Lab. As I say, some of us are former journalists, so we believe strongly in the First Amendment and wanting to ensure that people have access to whatever information they prefer to get. We try not to have a bias towards one type of content or another, or one newsroom or another. We try to be holistic in that way.

I went through journalism school almost 20 years ago, and I remember I had an ethics professor who said, "Everybody brings a bias, whether you want to or not, even as a journalist. You can't help it. You just have to be perhaps a little bit more aware of it and factor that into whatever content you are creating."

I would be lying if I said that all of us did not have some sort of moral or ethical background in all of this, or motivation to make this work.

DEVIN STEWART: It is definitely good to be self-aware.

DANIEL SIEBERG: Yes.

DEVIN STEWART: Now, fake news—that is the thing everyone wants to know about, right? How do you identify it, what are you doing about it, and how much of a threat is it to the average citizen?

DANIEL SIEBERG: I think it is certainly the topic du jour for lots of newsrooms, and understandably so.

DEVIN STEWART: Does it deserve the attention it is getting?

DANIEL SIEBERG: I think it does, and I would say we think it does. There are a number of things that Google is currently doing and some things that will be announced soon.

One of them is the Fact Check schema. For example, if you are a newsroom and you adopt this schema, or coding, for your site, when you publish content that appears in Google News, it will carry this fact-check label, if you will. It is a little bit like a nutrition label. It is a newsroom's way of saying, "We have fact-checked this and we want to stand behind this and say that this is something that"—

DEVIN STEWART: You get what, like a letter grade or something?

DANIEL SIEBERG: It is literally fact-checked as a tag.

DEVIN STEWART: Like a thumbs-up, it has been checked?

DANIEL SIEBERG: Yes, a little bit.

DEVIN STEWART: Verified kind of thing?

DANIEL SIEBERG: Yes. It is a voluntary thing that newsrooms can opt into. This is not required. This is not meant to have an effect on ranking or anything like that. It is much more a signal that people can see "All right, this newsroom really cares about fact-checking and they want to demonstrate that to their audience."
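The "Fact Check schema" Sieberg describes corresponds, per schema.org's public documentation, to the ClaimReview structured-data type, which newsrooms embed as JSON-LD on a fact-check page. A minimal sketch of that markup, with hypothetical newsroom, URL, and claim values standing in for real ones:

```python
import json

# A minimal sketch of schema.org ClaimReview markup, the structured-data
# type behind fact-check labels. All field values below are hypothetical;
# the property names follow the published schema.org/ClaimReview type.
claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    # Hypothetical URL of the newsroom's fact-check article.
    "url": "https://example-newsroom.org/fact-checks/example-claim",
    "claimReviewed": "Example claim being checked",
    "itemReviewed": {
        "@type": "Claim",
        "author": {"@type": "Organization", "name": "Example Source"},
    },
    # The newsroom publishing and standing behind the fact check.
    "author": {"@type": "Organization", "name": "Example Newsroom"},
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": 1,
        "bestRating": 5,
        # Human-readable verdict shown alongside the label.
        "alternateName": "False",
    },
}

# Newsrooms ship this inside a <script type="application/ld+json"> tag
# on the article page; crawlers read it as a machine-readable label.
markup = json.dumps(claim_review, indent=2)
print(markup)
```

Consistent with the opt-in model described above, this markup is a signal newsrooms volunteer, not something that by itself changes ranking.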

There is certainly a debate to be had about what types of signals matter to people, and it is different for different audiences. Some people really care about brand; if content comes from a brand they know and trust, maybe having a fact-check tag is not necessary, because they already trust that brand and that is all they need. For a smaller newsroom, or one that is not as familiar, it could be helpful for an audience member to be able to say, "Ah, I see that they care about this and they are indicating that to me." But I think it is different for everybody.

DEVIN STEWART: If I am interpreting you, it sounds like you are very gently saying that there are some people out there who just do not care if it is factual or not, they just want to consume the stuff they want to consume?

DANIEL SIEBERG: I think we have to be honest about that, that there ultimately are people who maybe are less concerned about it or do not seek it out as much. News consumption behavior is unique to each person. What we want to do is to work with publishers and broadcasters and figure out the right ways to indicate to people that this is verified content, this is trusted content.

But I think we live in an era where so much is debated, and you could argue that something is fact-checked to the nth degree, but if it goes against somebody's personal beliefs and their worldview, it can be harder for them to stomach. So I think part of it is us putting ourselves in the shoes of other people who are consuming this and different points of view.

DEVIN STEWART: We had Tom Nichols here in this room. He wrote a great book called The Death of Expertise. It talks about the cultural phenomenon of—he links it to pervasive narcissism, also "We have all the information we need, thanks to the Internet, so why do we need experts to tell us what is right or wrong?"

Do you see this emergence of fake news to be a growing problem? And if so, why? And to what degree is it a problem at all?

DANIEL SIEBERG: Well, I would say that, from what I have heard from publishers, fake news is not new per se, it is just that we are hearing about it more in light of, of course, what happened last year with the election and with the news about it today. Publishers and broadcasters have always been thinking about this and slightly concerned about it, but now it has just been amplified.

Is it potentially going to get worse? I don't know that I am the person to authoritatively say that. I think that we as a company want to try to ensure that it doesn't. We want this to be a very collaborative kind of a thing. This is not meant to be Google, or any other company, saying, "This is exactly how people should consume news" or "We think this is the best way to do it."

But at some point decisions have to be made, and we are acutely aware of our responsibility in all of this as a platform for news where people consume their content.

DEVIN STEWART: What about the criticism from, I would say, conservative voices who say that, "Well, these are a bunch of East Coast liberals who are working at Google, or other elite places, and what gives them the right to be the ones to determine what is fake and what is not?" How do you respond to that type of thing?

DANIEL SIEBERG: I think it is a fair comment. I think that my response would be that we are working very closely with the industry and we want to make sure that whatever we put out there lands the right way so it does not feel like it is Google's point of view, or "Google says that this is what it should be," that it is coming more from the industry.

In terms of people's political leanings or something, I think it might surprise people to know just how tirelessly people at Google work to try to have as much of a 360-degree view of all of this content as we can. It is not easy, I would say—and this is not "Oh, woe is Google" in any way, because this is our responsibility—but I think that, from my experience, people really do try to understand the entire landscape. Whatever your political leaning is, we want to ensure that it is quality and accurate information for people.

DEVIN STEWART: Let's talk about some of the global trends. Russia and China are often pointed out as maybe the most innovative countries in terms of having their governments be involved with generating media misperceptions, propaganda, changing the way people think about global politics or local politics.

I think, for example, there is a big story about RT, which figured out that, instead of being an agent promoting Russian soft power, it was actually much more effective at covering local stories in the countries where it is based, for example in the United States.

What do you see going on with China and Russia? I actually was in a conference recently that pointed at China as the most dangerous country in the world for the future of democracy, for these very trends that I am talking about. Should we be alarmed about these two countries, and what do you see happening?

DANIEL SIEBERG: I think alarmed may be one way to describe it; certainly concerned and aware. I am not as close to some of what the team at Google is figuring out with countries like China and Russia, but they are certainly plugged into that and trying to figure out how to best manage it. These countries are, as we know, where a lot of the so-called fake news has been coming from, with Russia especially in the news.

But it is complicated. We are talking about geopolitical stuff that is not easy to just figure out overnight. So I think there are a lot of people who are in the mix trying to figure this out.

Our team, to be clear, we have somebody who is based in Singapore and we have some presence in Asia. There is a whole other Google team that figures out what is happening across Asia. So I would say that we are a part of figuring all of that out. But, as you say, this is an area where we need to pay a lot of attention.

DEVIN STEWART: Would you agree with the perception that Russia and China are exceptional at putting resources and research into this field?

DANIEL SIEBERG: I would say that it seems that way anecdotally to me. I do not have any hard evidence other than what I read and what I have seen. But certainly, they have a motivation to do it for a number of reasons—they want to disrupt Western democracy and they feel like this is the best way to do it.

What I keep hearing is that, yes, it affected one political party in the United States this time, but that three years from now, or next year, or whenever, it could be affecting all sorts of parties. The argument that I have heard is that it is not just about one political leaning or the other; it is about unsettling the foundation that we have as a democracy, and any amount of confusion or stirring up of people's angst and anger helps their cause, whether it favors one person or one party or the other.

DEVIN STEWART: It seems like everything these days is "weaponized," in American lexicon, the weaponization of news or information. Is that getting a little too hysterical, or is that a correct way to describe it?

DANIEL SIEBERG: Perhaps we are course-correcting for an era when that was not part of the discourse for a long time. People did not talk about fake news before the election, certainly not to the level that we are now. So maybe for some people it feels like this huge thing, with a lot of effort and attention being put into it, but maybe that is just course-correcting for the fact that it was not like that before, and maybe it will even out over time.

DEVIN STEWART: A pendulum?

DANIEL SIEBERG: A little bit of a pendulum perhaps, yes.

DEVIN STEWART: You started this program at Google, so you must believe in it. I really thank you for coming here today, Daniel.

If you could just give us a sense of, if this is not addressed, what is at stake here? What do you see for the near future?

DANIEL SIEBERG: There is a lot at stake, and I think that is clear from the number of companies that are involved. Obviously, it is not just Google; it is Facebook, it is Twitter, it is wherever people consume information. I think there is some amount of debate and criticism about what exactly should be done and how we should do it. Speaking on behalf of Google, there are people for whom this is their job every day, just figuring this out.

There is no silver bullet for any of this, and I think it is going to be a combination of what we do as a tech company and what the industry does. In talking with newsrooms and publishers, I think they are aware of that fact. They need to own their share of responsibility in how to handle this. But certainly we have a sizable amount, if not an equal amount, of the figuring out to do alongside them.

DEVIN STEWART: One last question, Daniel. What about for the many, many people out there who just have a certain taste for sensational, crazy stories? Is there anything that you can do engaging the public to change their preferences?

DANIEL SIEBERG: That is such a good question, Devin. I don't know. I started my career at CNN.com—well, I was a print reporter at the Vancouver Sun even before that—and when I came to CNN.com I was the editor of the technology section, and I was in the morning meetings, and we would always try to figure out what was the priority content that we wanted to put out for people. You know, the most important stories were not always the most-read stories, right? It is just sort of the nature of the beast. The sensationalist stuff often does get more traction. I think a little bit of that is just human nature, and we are battling that as much as we are battling people's Internet behavior or their political leanings or whatever it is.

I think, as an audience, we sometimes gravitate to the Kardashianesque kinds of stories versus something about foreign policy in Afghanistan. I think that is almost a commentary on just as a society what types of news do we care about.

I think newsrooms, and anybody who cares about quality information, are trying to figure out how to help people understand those stories better, make them more appealing, more relevant to their lives, whatever it takes. But it is not an easy challenge. I will admit, if I were to let somebody look at my news consumption behavior over the past month, I do not know that I would want to trumpet it to everybody, because I think we all fall into this "Oh, I kind of want to read that story, but the other stuff is really important." There is a little bit of a "you've got to have your vegetables along with your dessert" kind of thing.

DEVIN STEWART: Of course.

Daniel Sieberg from Google News Lab, thank you very much for coming today.

DANIEL SIEBERG: My pleasure. Thanks, Devin.

