ALEX WOODSON: Welcome to Global Ethics Weekly. I'm Alex Woodson from Carnegie Council in New York City.
This week’s podcast is with Carnegie Council Senior Fellow Arthur Holland Michel. He is the founder of the Center for the Study of the Drone at Bard College and was its co-director from 2012-2020.
Arthur and I spoke about surveillance, privacy, and ethics. We first discussed aerial surveillance in Baltimore, which he wrote about in his book published last year, Eyes in the Sky: The Secret Rise of Gorgon Stare and How It Will Watch Us All. And then we discussed these issues in the context of COVID-19. We touched on location tracking, drones, and we looked ahead to what could happen with these technologies in a post-pandemic future.
For more from Arthur, including his talk from last June about Eyes in the Sky, you can go to carnegiecouncil.org. You can also find more podcasts about the pandemic and surveillance technology including last week’s talk with the Future of Privacy Forum's Brenda Leong on facial recognition.
We’re also hosting a Zoom webinar on Wednesday, April 22 on many of these same issues. Carnegie Council President Joel Rosenthal will be speaking with ETH Zurich’s Effy Vayena and Johns Hopkins' Jeffrey Kahn in a discussion titled "Health Data, Privacy, & Surveillance: How Will the Lockdowns End?"
For now, calling in from outside Barcelona, Spain, here’s my talk with Arthur Holland Michel.
Arthur, thank you so much for taking this call today. It's great to speak with you.
ARTHUR MICHEL: Thanks so much. I appreciate it.
ALEX WOODSON: I thought we would start with what's happening in Baltimore. You came last June to Carnegie Council to talk about your book Eyes in the Sky: The Secret Rise of Gorgon Stare and How It Will Watch Us All. I encourage all our listeners to go back and listen to that podcast or watch the video or read the transcript. I'll post a link to it.
Stuff has been happening since then, so I thought maybe you could give us a quick rundown of what happened in Baltimore with this surveillance program and what has been developing since.
ARTHUR MICHEL: Basically, where we left things in the Carnegie Council talk that was based on the book was that in 2016, the Baltimore Police Department had run a secret experimental aerial surveillance program over the city for several months using a wide-area aerial surveillance technology that had originally been developed secretly by the Pentagon and the Central Intelligence Agency (CIA) to track insurgents in Iraq and later Afghanistan. The police department in Baltimore was using it to watch a vast area of the city to try to provide investigative leads. At the time Baltimore was in a crisis with its level of homicides and violent crime.
The problem, though, was that this project was done in secret. Not even the mayor knew about it. Once the program was revealed in August 2016, it was shut down very quickly.
The update is that since then the company, Persistent Surveillance Systems, has been lobbying the city to relaunch this experimental aerial surveillance program and has succeeded in those efforts. The Baltimore Police Department approved a new program just before the holidays last year, and just a few weeks ago, after a couple of town hall meetings—one of which was held on Facebook Live because of the lockdown in Maryland—the Baltimore City Board of Estimates approved the program, so it is going ahead.
This would be a several-month program that would cost about $3.7 million that actually wouldn't be paid for by the City of Baltimore. It would be paid for by an outside foundation known as the Arnold Foundation. It would once again watch about 90 percent of the city for days on end.
The latest update here is that just a few days ago the American Civil Liberties Union (ACLU), representing a number of local Baltimore community organizations, filed a suit against the Baltimore Police Department, asking for an injunction to pause the program before it starts. That is a significant and unprecedented legal challenge. This is all of course happening in the context of the COVID-19 pandemic, so it's not receiving a lot of attention, but this case could have significant ramifications for aerial surveillance more broadly in the United States.
ALEX WOODSON: What exactly are the grounds that the ACLU is suing on?
ARTHUR MICHEL: There are basically two parts: one part is that the aerial surveillance program is a violation of the Fourth Amendment, and the second part is that it would impinge upon the plaintiffs' First Amendment rights.
With regard to the Fourth Amendment, which grants us protection from unreasonable search of our property and any invasions on our privacy without a warrant, the basis is that this technology will be able to watch the entire city at once. From an investigative standpoint the idea there is that, say there is a murder somewhere in the city, and on the ground the police have no investigative leads. Because the camera has been watching from above the analysts can actually rewind to the exact place and time where that murder took place and see all the people and vehicles that were nearby at the time of the incident, track them back to home addresses, and also track them forward in time, the idea being to identify those individuals and follow up with house visits. These are breadcrumbs that will, in theory, aid in any investigation.
As the Baltimore Police Department would have it, that is completely legal under existing U.S. privacy law because we're allowed to collect images from above. If you have been on an airplane, you have likely taken a photo out of the window with your phone if it's a pretty view. That is not a violation of anybody's privacy. That's public space. It's like taking a photograph when you're in a public park. On that basis, aerial surveillance has been used widely for several decades in the United States, and indeed there were several court cases a number of decades ago that established that law enforcement does indeed have this right to conduct warrantless searches through aerial surveillance.
But the contention of the ACLU suit is that this technology goes above and beyond. It is not just a regular camera that you or I might point out of the window of an airplane. It is able to view the entire city at once, and crucially it is able to map the locations of all of the city's residents over an extended period of time. There was a recent Supreme Court case, Carpenter v. United States, that found that when it comes to cellphone location data, accessing a person's location over an extended period of time did constitute what is called a "search" under the Fourth Amendment, and so you need a warrant to do so.
The other part is the First Amendment part, and this is where the plaintiffs come in. These are community organizers who spend a lot of time in the communities that are probably going to be much of the focus of these aerial investigative techniques, and the worry is that it will impinge upon their right under the First Amendment to association.
These groups attend lots of community meetings in areas with high crime rates, interact with individuals who may have had interactions in physical space with places that are subject to investigation, and as a result, because of the presence of this aerial surveillance technology, this will have a "chilling effect." That is to say that they will not feel like they can just freely associate with these groups because of course they are worried that as a result of those associations they will be seen by the camera and will get caught up in these investigations into very serious crimes. This suit is fresh off the press, so to speak, so it very much remains to be seen how it will be handled from here.
ALEX WOODSON: We will definitely be following that over the coming months.
As you said, they decided to use this new surveillance program late last year before the holidays, so obviously this is before the pandemic. But just as you're speaking, all these issues that come up because of the pandemic come to mind. You hear about surveillance increasing because of the pandemic, contact tracing, proximity tracing. How do you think about these issues now as opposed to a couple of months ago, before there was this quarantine, before there were public health issues tied to leaving your apartment or not?
ARTHUR MICHEL: It's interesting. In the suit the ACLU claims that, because of the COVID-19 pandemic and the resulting lockdown in Baltimore, it actually doesn't make any sense for the city to be conducting aerial surveillance at the moment because there's nothing to look at. This technology can't see into people's homes or into their apartments, where right now they're spending the vast majority of their time, and so it will be in a sense a wasted resource.
That being said, you could imagine that this aerial surveillance technology could be quite effective for at least lockdown enforcement. Everyone is supposed to be in their homes. Anytime you see a vehicle on the street you can very easily track it to ensure that it is there for a reason, that it is either an essential worker, or it's someone who is only traveling between their home and a medical center or a gas station or somewhere to get food. It's interesting in that regard, and it's very much unclear—all the more so as a result of the suit—whether they are actually going to deploy during the pandemic or whether, as with so many other things, the lockdown is just going to put a big pause on the program.
Your question of course goes to another issue, which is the broader issue of surveillance in the context of the COVID-19 pandemic. As a result of the pandemic, you have seen a tremendous amount of interest in previously largely overlooked surveillance technologies that could give a really strong read on people's locations over an extended period of time. Not only is there renewed attention, but it's also attention in a new context. We're not thinking about this just in terms of law enforcement or in terms of marketing, which had been the two main ways that location data had been used in the past, but now in terms of public health.
Of course there are significant questions around that related not only to privacy, which is an obvious one, but also effectiveness. This is an urgent situation, and are these location-tracking measures the best way to really achieve the goals that are front and center in the minds of all public health authorities? That is very much at the moment an open-ended question.
These technologies are tremendously powerful, but they have not as yet been tested in a public health context, and I think that is one of the points that you haven't heard so much in the growing public dialogue around these technologies. This is all very new, and when you're seeing a new application of a technology, particularly in a context with extremely high stakes—we're talking about a global pandemic here after all; we're not talking about an isolated pandemic that could serve as a test case. You're talking about national policies adopting these technologies across vast areas.
That raises all sorts of questions that go beyond the privacy element and are more to do with whether we want to deploy these resources in this way. To what extent are major policy decisions with significant policy ramifications being made using this data? That's part of where we need to be really careful, aside from the privacy stuff, which I'm happy to talk about as well.
ALEX WOODSON: I definitely want to get into some of these larger issues, but you said "previously overlooked" technologies are being used or being thought about. What exactly are those technologies?
ARTHUR MICHEL: There are two parts there. One is the location-tracking stuff. It has been very well-known and well-documented in recent years that smartphones generate a tremendous amount of location data, generally through apps that then sell that data to this very murky marketplace of third-party vendors. The New York Times Privacy Project obviously did a big exposé on the volume of the data that exists and also how granular it can be, that is, how much you can do with it. You can identify people just based on a few location pings.
This data is all in theory "anonymized," but of course if you know that this same cellphone was at this house and then at this place of work and then at this restaurant, that's three location pings, and that alone can give you a very strong lock on who actually owns that phone, so the anonymity part falls away very quickly.
It was very well-publicized in that respect, but it wasn't so much publicized as a tool in the hands of civil agencies. In fact, just before the outbreak there were a couple of stories, one in The Wall Street Journal and another in Protocol, about how law enforcement agencies had indeed started using these cellphone location records provided by these mostly marketing companies for law enforcement activities. That was a new step. The original use of this data was to serve us with detailed ads based on our likely interests, but now this information resides in the hands of governments.
Those stories, which should have been much bigger stories, had a shorter life in the spotlight because they were quickly overtaken by all these other things requiring urgent attention in the context of the pandemic. I say that they were previously overlooked in that regard. They were previously overlooked as tools for either law enforcement or public health.
The other piece of technology that I put within that category is drones, which we all know about. We are all very well familiar with them and know they are being used for aerial cinematography and, in some cases, by law enforcement for aerial surveillance and inspections, but now in the context of COVID-19 there are all sorts of discussions about: Could drones be used for enforcing the lockdown? Could drones be used for disinfecting large areas? Could drones be used for remotely detecting high temperatures on individuals in public spaces or as a sort of quick narrow-down in a public diagnosis sense?
That again is a totally novel application of the technology, something that did not have widespread attention, even though there was probably some discussion of it previously. And again, just like the location tracking, there's an untested quality to it. There are not reams of historical data about the efficacy of drones for detecting fevers in individuals. With regard to this technology in this context, it seems like we're very much flying by the seat of our pants. It's a little hard to know what will stick and what won't.
It's also a little hard to know, especially in the case of drones, how much of it is driven just by hype. Drones are very good at generating headlines, and if a police agency, say, advertises that it is using drones in this way, that gives off the impression that it is very high-tech, that it is at the cutting edge, and that it is putting the full extent of technology into the fight against this pandemic, and that can often generate a quite positive response.
There again you have to treat it with a degree of caution. It's very much unclear in both of these cases—the location tracking and also with the drones—whether this is the best tool for the job or whether this is really just a case of a technology existing, there being some sense that it could be very useful, and we give it a shot because we are, after all, in a crisis and there will be many lessons learned at the end of it all.
ALEX WOODSON: Another issue from speaking with other experts in this field about increased surveillance and less privacy during the pandemic is that, even if it proves to be effective, these methods could continue on after the pandemic and become part of the surveillance infrastructure of Spain or New York City or any other places that have had a real issue. Is that something that concerns you as well? What are some ways that we can make sure that these are temporary measures if we actually do need them?
ARTHUR MICHEL: From the government-use perspective there is a fairly simple answer: you put sunset clauses into the authorizations for these programs, assuming and hoping that there are authorizations in the first place and that this isn't just a government program that finds its justification in existing regulatory frameworks, which is the part that would be very worrying: a government claiming that the use of location tracking on such a broad scale is perfectly legitimate, and so it'll do so for this and any other exigent circumstances that arise.
We're seeing growing calls for sunset clauses in these authorizations, for a notion of having just a temporary authority because we are in an emergency situation. I feel like that's probably, from a legal perspective, the best option that is available.
But there is a subtler and perhaps more insidious dimension to all of this, which is the notion that the use of these technologies in these inarguably beneficial ways could have a lasting effect on the dialogue, the public discourse around whether on balance these technologies are good for society; whether these are technologies that should be embraced. There is, I think, a legitimate concern to be aired here that these use cases could be used as a counterweight to arguments about the very real dangers that these technologies pose to civil liberties and privacy. I think that is—not to be cynical—one of the motivations when you see organizations that operate these technologies jump to offer these technologies for benign applications.
To put it bluntly, there's a very strong PR element there. That is by no means to dismiss any of these companies' genuine desire to help with what's going on, but even if that's the case, even if their desire to help is entirely genuine—and we have every reason to believe that it is—there will be lasting effects here.
We have to be sure that we can compartmentalize the uses of these technologies in such a way that, yes, we had an exigent circumstance and that required us to deploy all the tools that were available to us, but that does not mean that every law enforcement agency in the country should have these tools and authorities available to them all the time or on a short-notice basis because other exigent circumstances might arise. When that becomes the tone of the discussion, then you are really in danger of allowing abuses to happen under the cover of a legitimacy that has been established through this one narrow use case. This is why you see so many companies that provide aerial surveillance for law enforcement talking in their marketing about disaster response before they even get to the law enforcement applications because disaster response is something we can all agree to, but that's really not where we should be focusing our efforts when it comes to ensuring that there are privacy protections.
That's the part that I think we need to keep an eye on, that the technology doesn't get a strong foothold in our systems because it has this one potentially beneficial use. That goes far beyond sunset clauses. It comes down to continually raising these uncomfortable questions, which shouldn't be assumed to be intended to undermine efforts to use the technology for good.
ALEX WOODSON: This goes to something that I know you have been thinking about, which is technology being ahead of privacy law. I think that's what we saw in Baltimore in 2016, where there weren't laws in place to corral the surveillance technology. It seems like that's so much more of an issue now because, as you said, this is worldwide. There is definitely technology that I'm not thinking about; is there technology just lurking out there that almost no one is thinking about? There are a lot of different ways that you can think about this issue.
ARTHUR MICHEL: The worrying thing is that, if a technology has not been outed by some heroic reporter usually who has gotten an anonymous tip or has been combing through obscure public records, we don't actually have any way to know about it. The default of a lot of the companies that operate in this space, and also of the users who operate in this space, is to avoid public scrutiny at all costs because it makes their job a lot more difficult.
One of the companies that has been known to provide location data information to federal law enforcement agencies, a company called Babel Street, actually has a user agreement where the law enforcement agency agrees not to disclose its use of this technology, even in court proceedings. So even if an individual is charged with a crime as a result of these investigative techniques, the government is not allowed to disclose that it used those techniques in gathering that information.
That's the crux of the issue. Why is it that these companies are able to have those user agreements where the technology is not disclosed? How is it that law enforcement agencies are able to deploy these technologies without announcing their intentions to the world? It's not because they haven't done the legal analysis. They have done the legal analysis, and what they found in the legal analysis is that the law says nothing about these technologies.
What is our response when these technologies come to light? One day it might be facial recognition, the next day it might be aerial surveillance, the third day it might be cellphone data, the next day it might be something that combines all of those. Who knows? Our response is to call for a ban often or to call for fit-for-purpose regulations that will rein in the use of this technology and prevent potential abuses. That will potentially be very useful for that technology and for preventing the specific abuses that could arise from that technology's use, but it's not going to do anything about those technologies that are still operating in the shadows, that are still operating in secret.
I've seen this happen again and again and again. To me it raises the question as to whether there needs to be a way to actually create principles that can be more broadly applied and can ensure that technologies cannot be legally deployed because they are omitted from the law, in the same way—and this isn't a perfect parallel—that if a company makes a new type of food, and it's a superfood and it's amazing, and they want to sell it because it can do all sorts of incredible things for your health, but they perhaps have some internal questions about whether it's really good for you in the long term, they need to get Food and Drug Administration approval. It would be horrible to think that they could just put that in Whole Foods and people buy it.
Companies that make surveillance technology don't have that same regulatory or indeed even moral obligation. Somewhere there needs to be a broader discussion about, not the technologies that are out in the open already, but the technologies that are already—and likely in the future to be—operating outside of the frameworks that we've created.
I should say one other thing about all this, which is, you may say, "Well, we have privacy protections in the United States, we have a Constitution. Surely that is enough to cover potential abuses by technologies that don't even exist yet." The answer to that is actually no, because new technologies create new forms of privacy that we hadn't even thought about even a couple of years ago.
Take smartphone location data as an example. A few years ago we would not have thought that our mere presence in three different locations—our home, our work, and our place of worship—was a "private" space. That is just a bizarre and abstract way to think about privacy. The existence of the ability to collect that information created that private space, and suddenly this is something that we need to protect. And if you look to the law, the law isn't going to say anything to protect you in that regard because it doesn't even have that conception of privacy.
So in that regard there is possibly—and again, this is something that I'm just starting to think through—a way to create principles that account for the omissions, rather than just respond in a case-by-case basis to those technologies that we know about because eventually we will have facial recognition controls and we will have controls on location data and we will have controls on drones and aerial surveillance and everything else. But we'll still be having this same broader conversation five years, ten years down the line, and ultimately, because the technology always moves ahead of the law, abuses will always slip through in that lag, and that's the part that we have to worry about. That's the gap that we have to try to close.
ALEX WOODSON: Just before we started recording, you said you were supposed to be in Prague right now talking about some of these same issues and working with other experts and researchers and trying to actually make some of these principles a reality, enshrine them in some kind of official way. It's too bad and maybe a little ironic that these talks have to wait six months when they're urgently needed right now.
ARTHUR MICHEL: Absolutely. And why are we talking about principles now when we perhaps weren't talking about principles in this way a few years ago? It's because we have entered an unprecedented time of technological evolution in the space of surveillance. I call it a "Cambrian explosion" in surveillance technology because the diversity of technologies that is emerging, and the pace at which that diversity is growing, is truly unprecedented. In the early 20th century there were probably decades between new surveillance innovations, and that gave the courts and civil society time to respond.
But we don't have that luxury anymore. The fact that the congress in Prague that you mentioned has been delayed by six months is a huge blow because there are going to be all sorts of technological developments, in many cases spurred by the COVID-19 pandemic, that will raise those questions. So time is very much of the essence, and I'm very much looking forward to that congress when it does happen, hopefully in the fall.
ALEX WOODSON: Just so our listeners know, it's the First International Congress for the Governance of AI (ICGAI). It is supposed to be in Prague as we're speaking on Thursday, April 16. It has been postponed to October 20–22, and hopefully those dates will still hold.
For the last question, you're in Spain right now. You were in New York before. Maybe not even specifically related to the pandemic, how are these conversations different in Europe as opposed to New York?
ARTHUR MICHEL: Europe arguably is a little ahead of the United States with regard to some dimensions of the privacy debate because it has the General Data Protection Regulation (GDPR) that puts strict controls on the dissemination and use of personally identifiable digital information.
But putting the GDPR element aside for a second, I think that in Spain particularly an element of the dialogue that has been interesting is that it wasn't so long ago that Spain's government was an authoritarian dictatorship, and encounters between the police and regular citizens were very different from how they're supposed to operate in an open liberal democracy, and that is recent memory. A lot of what has been happening with the enforcement of the lockdown is that some of those still very raw memories are being brought up, and there are questions about the extent to which these authorizations are compatible with the kind of country that Spain wants to be now, which is a liberal open democracy, and to what extent you need to be sensitive to the fact that police power in a liberal democracy must always be subject to controls, no matter how exigent the circumstances.
In the United States that consideration is perhaps a little more abstracted just because there's a different cultural context. Of course, there is a whole dialogue around law enforcement's relationship with the public in the United States that is very recent, much more recent than Franco's dictatorship, and indeed an ongoing one. My sense is that I haven't seen that be front-and-center in the dialogue in The New York Times, whereas in Spain that is certainly looming over all of this.
It raises all sorts of interesting questions. Certainly Spain has been very, very strict with its enforcement, and that seems to fall in contrast with some parts of the United States. But again, it's very early days. I think some of the interesting differences will also come out in the follow-up, countries talking about how they want to move forward, which authorities they think should remain in place, which authorities should be sunsetted, and what will be the long-lasting changes. The differences in the dialogues in that regard will, I think, more strongly reflect some of these cultural and historical differences between, say, the United States and Europe.
ALEX WOODSON: We will be following that up too.
Thank you so much, Arthur. This has been great.
ARTHUR MICHEL: Thanks so much, Alex.