Toward a Human-Centric Approach to Cybersecurity, with Ronald Deibert

Jan 29, 2019

Discussions around cybersecurity often focus on the security and sovereignty of states, not individuals, says Professor Ronald Deibert, founder and director of the University of Toronto's Citizen Lab. If you start from a "human-centric perspective," it could lead to policies focused on peace, prosperity, and human rights. How can we work toward this approach?

ADAM READ-BROWN: Hello, and welcome to another episode in our Ethics & International Affairs (EIA) interview series, sponsored by the Carnegie Council. My name is Adam Read-Brown, and I'm the managing editor of Ethics & International Affairs, the Council's quarterly peer-reviewed journal, published by Cambridge University Press.

With me today is Professor Ronald Deibert, here to discuss his work on cybersecurity, human rights, and global cybergovernance. He is a professor of political science and the director of the Citizen Lab at the Munk School of Global Affairs and Public Policy at the University of Toronto.

Professor Deibert is also the author of the book Black Code: Surveillance, Privacy, and the Dark Side of the Internet, in addition to numerous other publications. His recent essay for Ethics & International Affairs, called "Toward a Human-Centric Approach to Cybersecurity," appears as part of a larger roundtable collection on "Competing Visions for Cyberspace" that was put together by Duncan Hollis and Tim Maurer. That collection appears in the Winter 2018 issue of the journal.

Welcome, Professor Deibert. Thanks very much for joining us.

RONALD DEIBERT: Thanks for having me. Appreciate it.

ADAM READ-BROWN: As I mentioned just there in your bio, you're the director of the Citizen Lab out of the University of Toronto. Before we get into some of the topics from your essay for the journal, could you start by telling us a little bit about the lab and the work you do there? Are there any current projects that you're excited about?

RONALD DEIBERT: Sure. I'm the founder and director of the Citizen Lab. The Citizen Lab does research on digital security issues that arise out of human rights concerns. In other words, we don't cover the entire spectrum of digital or cybersecurity issues. It's a broad topic; there's something in the news almost every day about it. What we do is focus our lens on topics that raise human rights concerns, like barriers to free expression, Internet censorship, surveillance, and targeted espionage against civil society.

The signature of the Citizen Lab is our mixture of methods. While I'm a political scientist and international relations person by background, a lot of the people who work at the Citizen Lab come from different disciplines, especially computer science and engineering science but also law and area studies.

I think we're most well-known for the technical parts of the work that we do. I realized many years ago—perhaps it was my one great idea; it's often said that people have one—that the very powerful technical methods that come from computer science and engineering science could be appropriated and used to shed light and gather evidence on what's going on "beneath the surface," so to speak, of the world around us, especially with respect to security-related issues. So we do a lot of investigative work, digging up evidence on state controls around the Internet, such as Internet censorship, and on companies that supply governments with the technology they use to undertake espionage and targeted attacks.

We're not an advocacy group. We see ourselves as fulfilling an academic function, that is, to do evidence-based research and raise awareness about problems that we see as needing addressing in this area.

We've been called a kind of CSI of human rights, and although I don't watch that show I can understand why people say that.

ADAM READ-BROWN: That's great, CSI of human rights.

Obviously, some of the work that you're talking about there and the insights from that work are reflected in the essay that you wrote for EIA called "Toward a Human-Centric Approach to Cybersecurity." Just to get our terminology straight at the outset, what are people generally talking about when they use the term "cybersecurity," particularly in an international affairs context? What's the prevailing understanding of what constitutes cybersecurity?

RONALD DEIBERT: That I think is a very open question. It means many different things to different people. It's described in certain ways and practiced in other ways as well.

I think there is a prevailing set of practices that conform, if you're looking at this through the lens of international relations theory, mostly to a kind of realist or realpolitik approach to world affairs in which sovereign states' interests are privileged, and there's a kind of military-centric approach to the topic. Even in Western liberal democracies you hear a lot of lip service paid to principles and values.

The fact of the matter is that when it comes to cybersecurity the approach that has been taken has really been dominated by this realist, national security-centric way of looking at the problem. And frankly, I think that is an issue. I think that's a concern for those of us who care about human rights, which is why I've tried to elaborate this human-centric alternative.

ADAM READ-BROWN: You mentioned the lip service paid by Western democracies, and in the essay you mention that the approach, though, this sovereign state and militaristic approach, is really most compatible with authoritarian and illiberal practices, which seems a bit concerning and maybe a little chilling.

RONALD DEIBERT: Yes, it definitely is. This is where the work of the Citizen Lab has, I think, helped inform my perspective on what's going on. We spent well over a decade tracking the growth of information controls, in the Global South especially but really globally, and I think everyone recognizes today that the world is unfortunately sliding into authoritarianism. The number of autocratic regimes is growing, and authoritarian practices generally are on the rise.

In many countries policymakers are using the discourse, the rhetoric, of cybersecurity to implement some pretty draconian controls. Really we're talking about controls that limit access to certain types of information, that end up being deployed in ways that stifle or neutralize civil society, for example through the type of targeted espionage we've been tracking at the Citizen Lab, and that privilege secrecy and national security agencies, which tend to be already shielded from public scrutiny.

I think in the West, among the liberal democracies, you see a parallel to this going on. Maybe it is even the case that the way in which we've approached the issue of cybersecurity has created a kind of model for authoritarian and illiberal regimes, which is unfortunate and maybe even unintentional. I think, for example, of the impact of the Edward Snowden disclosures, which I believe had the unintended consequence of presenting to policymakers a model of how to control information that maybe they hadn't quite fully understood, and actually helped dramatically increase the market for cybersecurity products and services coming from the private sector that have now been deployed for authoritarian ends.

ADAM READ-BROWN: And so, as you already alluded to, in your piece and at the Citizen Lab more broadly you're pushing for a reconceptualization of this idea of cybersecurity toward a "human-centric approach," as you put it. Can you tell us what that approach entails, and is it wholly in opposition to this current narrative, or is there overlap?

RONALD DEIBERT: That's an interesting question. I think there is definitely overlap to some degree.

First of all, I would say it's not often that I get an opportunity to write something like this and reconnect with my international relations and political theory roots, but what I call this human-centric approach is very much derived from a larger tradition of human security.

At the heart of that way of thinking about security is a concern for the security of the individual, of the citizen, if you will, and putting that front and center of the model. I think the most frustrating thing for me to see these days is how that's so often overlooked, especially in liberal democratic countries. We talk about human rights, and we talk about liberal values, but when it comes to how we actually practice or implement cybersecurity, it's as if we forget about it, and it has become something that begins and ends with the security of the sovereign state.

So rather than the individual, the object of security, if you will, and everything that goes along with that being the starting point of what it is that we want to secure, the starting point appears to be the state, the territory of the state, the critical infrastructure of the state, and maybe secondarily the private sector within a certain territorial jurisdiction. When you approach it that way it creates an international dynamic, a very competitive one, maybe even a zero-sum game.

If you start from a human-centric perspective, take for example how you would think about the security of global networks like the Internet: they are seen much differently. The network is something that is very much supportive of human rights, like access to information, freedom of assembly, freedom of association, and other rights. From a national security-centric perspective, by contrast, networks are to be secured only within a particular territorial jurisdiction, and in fact networks in other territories might be the object of manipulation and subversion. When you start from a different starting point you end up with different policies.

It's interesting that you bring up the overlap, because it is very much around the role of the state. There's an important sense in which governments have an important role to play in both perspectives. A human-centric approach is not looking to eliminate sovereign states.

In fact, that would be counterproductive. We're not looking to eliminate law enforcement, or even intelligence agencies for that matter. Law enforcement especially is needed because in order to protect rights you need to enforce the rule of law, and that requires a government, a representative government. So in both senses there is overlap around the important role that a government plays in securing cyberspace.

ADAM READ-BROWN: At an intuitive level it is important, of course, to protect the security of individuals and to protect human rights. Is there a reason to privilege the cybersecurity of the individual over the cybersecurity of the sovereign state? Or is it perhaps not a zero-sum game? Is one more important, or is the individual simply overlooked in how we should be thinking about cybersecurity? It doesn't sound like you're saying that states shouldn't protect themselves.

RONALD DEIBERT: No. I think it's a matter of emphasis, and yet it's very important to start with first principles. This is why I find the theorizing around what's referred to in the international relations literature as "securitization theory" so important. At the heart of that theory is the question of what it is that you're securing, what the object of security is. As funny as it seems, that basic question is often overlooked, and yet how you define it matters for the policies that flow from it, including what's considered to be a threat.

So if your starting point is that the security of the United States comes first and foremost, as opposed to the security of human beings regardless of territorial jurisdiction, you end up with a different conceptualization of what the most important threat is. From that perspective it might be other governments.

Whereas if you start from a human-centric perspective, what you're looking to do ultimately is provide for the security of individuals to enjoy to the fullest extent possible the exercise of human rights and all that goes into a notion of human security—peace, prosperity, and so forth. And that leads to a different set of policies. Again, it's a matter of nuance and emphasis, because in order to enjoy human rights you first and foremost need to have a sphere in which you live that is secure and provides a degree of protection for you as a human being. But the liberal project out of which the human-centric approach arises in the first place is to hopefully gradually expand those principles internationally in terms of a web of interdependencies, if you will.

I think, even stepping back further—we don't want to get too far up in the clouds here, but in the long term the ulterior motive, if you will, for me is that we have to be thinking about the planet as a whole, because we obviously have some existential threats that face the species, and if we're going to address them properly I think at some point we have to recognize that the sovereign state system we live in now is for the most part ill-suited to solving those problems. And so we have to start moving in a direction that transcends it, and I think that, again, is where this comes from.

ADAM READ-BROWN: So, drilling down, coming back from the clouds maybe, although that is all terribly important, to get to some of the specific aspects of what you outline as human security: one of the things you discuss in the article is the idea of data stewardship, which I think resonates with anyone who is using social media today, this idea that individual users today emit both more data and more sensitive data than ever before, and that it's continuing to grow.

You say that as a society we should start understanding privacy as an element of human security, which, to me anyway, seems appealing and perhaps intuitive. But with what seems like everyone racing to make their lives public and consumable on social media, what hope is there of actually making significant progress toward this aspect of human security?

RONALD DEIBERT: That's a good question, and it's hard not to be pessimistic these days when you look around and notice the ways in which people are so willingly turning their digital lives inside-out and effectively engaging in a kind of auto-surveillance.

We used to worry about surveillance by some third party or an Orwellian state, but in fact most of the surveillance today occurs wittingly, or mostly wittingly, and it's undertaken by ourselves as we consent to being surveilled by companies as part of the business model that is predominant today, this idea that in exchange for free services we consent to having everything we do monitored and tracked and analyzed and then repurposed for targeted advertising.

By the way, I write about this topic in an article called "The Road to Digital Freedom: Three Painful Truths about Social Media." That's the lead essay in this month's Journal of Democracy. I don't know if it's uncouth to mention a competitor journal on your podcast.

ADAM READ-BROWN: Spread the wealth. Check out both essays.

RONALD DEIBERT: But I think in terms of whether there is any hope, from one perspective it looks like a very steep challenge, because in spite of the lack of popularity of specific platforms—for example, there's a lot of unease about Facebook right now—the fact of the matter is that the business model upon which all of these social media platforms depend continues to rise in popularity worldwide. The "surveillance capitalism," if you want to use that term, that's at the heart of it is, I believe, extending almost exponentially across sectors of the economy.

So how do we turn it back? Well, there are some promising developments. I would point, as I do in these articles, to the General Data Protection Regulation in Europe. Recently that regime led to a huge fine being levied on Google for not respecting Europe's privacy laws. As I point out in the article, if we believe in this idea of privacy as a security principle, we need to actually empower regulators so that they can take action with real consequences against companies.

Looming in the background here, though, is China. Of course, in China there's a much different model being presented, wholesale surveillance and surveillance capitalism, but linked very much to social control through the Social Credit System they're attempting to create. That model is formidable because China is now the world's largest country in terms of Internet-connected people, and it's a model that many autocrats will find attractive. So we need to work fast to counter that model.

ADAM READ-BROWN: Is regulation the primary means by which you see that happening? As you say, for corporations what currently exists is a business model, and for governments there is the motivation of keeping backdoors in platforms so that they can do whatever monitoring they want for security purposes—you can put that in scare quotes if you want.

Is it simply regulation? Are there ways to motivate these actors, perhaps particularly in the private sector, to shift their behavior, or is the profit motive too great?

RONALD DEIBERT: That's a good question. I think the way that we need to look at this is to first of all recognize that there is no one single lever here, that regulation is an important part of it, but it's only one component.

I do believe that in order to properly address what we're talking about here, by which I mean this personal data surveillance economy that is at the root of the problem, one that presents threats to privacy but is also now linked to some pretty disturbing social and political trends around the spread of prejudice, ignorance, disinformation, tribalism, and so on, we have to undertake a wholesale change in our way of life, and that means that individuals have to think differently about how they consume information.

And we need to think about maybe some kind of restraint or conservation. In the same way we hope one day to think about the natural environment, we should treat our communications environment with a bit more restraint and conservation. That's what I mean also by this notion of stewardship. We have to think about it as something that we don't just give away, that we take ownership over, that it's a common-pool resource that we're creating together, that we have an obligation to tend to.

So, more than regulation. It requires almost a new civic virtue that has to be cultivated among consumers, and I recognize obviously that's a huge uphill battle, but it's something that I think is nonetheless important.

Lastly, I would say we may need to find alternative technological platforms that don't give away the benefits of what we've created but do away with some of the problematic aspects, especially around the surveillance that's going on. Some people, when they look at the situation as it is today, say, "Oh, we should just unplug, disconnect." You hear a lot of that with respect to Facebook right now. I understand the motivation behind it, but I ultimately think that's counterproductive to the long-term goals I talked about when we were up in the clouds. How are we going to manage the planet? We need a wired world where we can all get together and discuss and debate ideas.

It's just that the economic model on which it's based is really problematic, and the security component, which is what we started out talking about, is also counterproductive. So we need to rectify that, and that may come through technological innovation, new types of social networking that don't depend on this exchange around personal surveillance.

ADAM READ-BROWN: Having heard you say all that, I'm curious. In this reframing, which has been the subject of our discussion, a shift in the way that we define security and the way that we approach the object of who's being secured, we've talked a bit about sovereign states and corporations and the reasons that each of these types of actors has perhaps been resistant to a reframing of this issue.

We've been dancing around what I've been seeing as this other big obstacle, and you sort of just said it: the public, which may unfavorably be viewed as apathetic when it comes to these issues. So my question is, what do we actually do? You can't regulate public feeling or public views. What more is there that we would need to see or to do?

Over the last few years it seems that there has been one revelation after another about major social media companies, whether it's Facebook or other platforms, cooperating with governments to compromise user data in one way or another. The needle doesn't really seem to have shifted, or perhaps that's my own perception. Maybe it actually has.

But how does one go about actually shifting the way that the users themselves think about these issues, not just the people in power?

RONALD DEIBERT: That is a really good question. My view is that it is a huge challenge, but it's not insurmountable. I think it points to some very basic things that people have talked about in this context and many prior contexts.

For example, the important role of research on the one hand and education on the other, especially public education and learning at lower levels. There needs, in my opinion, to be much greater emphasis on the kind of civic responsibilities and philosophical orientation toward data and information that we just spoke about.

The problem is, of course, that the humanities, the arts, the social sciences that would be required as part of this emphasis are being eviscerated right now. Then, when it comes to research and awareness raising, same deal. There's a lot more research being done that's really in the service of the machinery that we're describing than is critical or contrary to it.

So that needs to be corrected. It's not about regulating the public, as you said, shaking an apathetic public awake by forcing or legislating them in some manner. Instead, it's going to have to be something where an emphasis is put on creating a certain culture and instilling certain values in the population so that a different attitude is taken toward the environment we've created.

ADAM READ-BROWN: Hearing you say this, it sounds like in grade school, along with civics lessons on how to be a good in-real-life citizen, we need social studies lessons on how to be a good online citizen.

RONALD DEIBERT: Absolutely. When I think about this topic, I think about John Dewey and The Public and Its Problems. If you go back and take a look at that classic work, even though it had nothing to do with the current context per se, I think it really helps; it's very illuminating in terms of thinking about how important education is to this idea of the public sphere. I think that's something we could learn from today.

ADAM READ-BROWN: Having studied these issues, do you see any shift, whether from particular revelations—Facebook's alleged role in the U.S. presidential election or any of the others? Has there been a public shift in any way that has started to change the conversation, or are we still at a starting point here?

RONALD DEIBERT: I think there have been important shifts from the episode that you're mentioning, the role of Facebook in the 2016 election, and not just Facebook but social media as a whole, and also the data analytics companies, alongside a lot of really good investigative reporting by organizations like ProPublica and The New York Times. All of that, I think, has opened up a conversation that has been a long time coming.

You mentioned my book Black Code. That was published about five years ago, and the topic we're discussing now was very much front and center then. I can remember when it came out, those who read it were taken aback by some of the topics, but now they're a bit more a part of the conversation. People are beginning to understand that applications on their devices have tangible personal consequences and that they're vacuuming up personal details.

As we mentioned before, there is this Facebook disconnect campaign. While I question its utility in light of the challenges we face, I understand the motivation behind it, and I think that sentiment could easily be molded into something more comprehensive. I see important elements out there right now, and a climate where that could be seized and reoriented toward some of the principles we're discussing.

ADAM READ-BROWN: Are you perhaps cautiously optimistic about the prospects for some of this? It sounds like maybe you are, but I won't put words in your mouth. How do you wake up in the morning and think about this?

RONALD DEIBERT: The work of the Citizen Lab really focuses on the "dark side," for lack of a better term. Our job is to highlight wrongdoing and spotlight illicit activities or programs and projects that violate human rights. We're constantly doing this inventory every day of bad things. You get inured to it after a while, and normally when I give talks I may not even recognize how bleakly I'm presenting things until audience members come up afterward and say, "I want to throw my phone in the ocean after listening to that presentation," because they're so discouraged.

That's not the aim. The aim is to recognize what are the challenges in order that we can take action. I wouldn't be doing what I'm doing if I weren't optimistic in the long run.

And after all, what else are we going to do? Resign ourselves to it? Just do nothing?

I look at China, and I think, This is the most formidable problem when it comes to the topic we're addressing right now. It's very large, almost insurmountable, but what else are we going to do? Resign ourselves to it? I don't want to do that. I'm a tenured professor, and I have the luxury of being able to do the work that I do and speak out about this for as long as I can, and I hope others will join me.

ADAM READ-BROWN: And you haven't thrown your phone in the ocean just yet. I think we're talking on it right now.

RONALD DEIBERT: Yes. I have a secure phone, though, that I use for work.

ADAM READ-BROWN: Fair enough. On that note, I'm sorry to say I think our time is about wrapped up, so we'll stop there. But I want to thank you again for joining us, Professor Deibert. It has been a real pleasure.

RONALD DEIBERT: Thanks so much. Appreciate it.

ADAM READ-BROWN: Yes. This has been a great conversation. Once again, I'm Adam Read-Brown. I've been speaking, as I just mentioned, with Professor Ronald Deibert. His essay, "Toward a Human-Centric Approach to Cybersecurity," appears in the Winter 2018 issue of Ethics & International Affairs as part of the roundtable, "Competing Visions for Cyberspace." That essay, as well as much more, is available online at www.ethicsandinternationalaffairs.org.

We also invite you to follow us on Twitter @eiajournal. Thanks very much for joining us, and thank you, Professor Deibert, for this great discussion.

RONALD DEIBERT: Thanks very much.
