Six Questions for New Surveillance Tech, with Arthur Holland Michel

Oct 18, 2022 7 min watch

For Global Ethics Day, Senior Fellow Arthur Holland Michel looks at issues connected to emerging technology and privacy.

How will these systems be used? How can we make sure they’re being used transparently? Who is accountable when accidents happen? “The best way to address these questions is to have an honest, inclusive discourse where everyone has a voice,” says Michel.

Michel is also a member of the Artificial Intelligence & Equality Initiative's Board of Advisors.

For more on Global Ethics Day, please click here.

These days, we are witnessing the emergence of a wide variety of new surveillance technologies. Things like facial recognition, drones, location databases, and data fusion. It’s all happening so fast that it can be hard to know where to start when thinking about and discussing each of these technologies’ ethical implications.

However, though these machines come in many different forms and do many different things, they all raise some of the same ethical concerns. So for Global Ethics Day 2022, I wanted to highlight six key ethical questions that are common to all emerging surveillance technologies.

First, does the technology actually work?
It's easy to think that these technologies are all super powerful. But sometimes—indeed, oftentimes—new surveillance technologies don't prove to be as effective or reliable as one imagines they'll be. A poorly performing surveillance system might be more likely to cause harm (say, by misidentifying a suspect in a crime), and its limited or inconsistent benefits won't outweigh its costs to privacy and freedom. Therefore, understanding a technology's real-world effectiveness is important for preventing unintended harm, as well as for deciding whether or not the technology should even be used in the first place.

Second, is it fair?
There's ample evidence to show that new surveillance technologies are disproportionately used against—and cause disproportionate harm to—non-white populations, as well as socially and economically marginalized groups. This has been a consistent pattern in the past, and it is likely to continue in the future. Therefore, new surveillance technologies require a robust assessment to determine their impact across different segments of society and, as needed, rules to forestall anticipated inequities.

Third, how will it be used?
Even when a new surveillance technology is at first used only for a seemingly noble purpose (say, finding people who have been kidnapped), that doesn't mean it will only ever be used that way. New surveillance technologies often end up being used in ways that go far beyond their original purpose, for tasks that raise serious ethical concerns, such as identifying protesters who are exercising their freedom of speech. Therefore, when a new surveillance technology emerges, it is helpful to consider not just the ethical balance of its stated purpose but also the ethical implications of all the other ways it might hypothetically be used.

Fourth, what happens to the data?
New surveillance technologies tend to generate large amounts of detailed digital data. In the absence of clear standards for how—and for how long—surveillance data are stored and secured, as well as rules for how the data can and cannot be used, the data may be exploited for privacy intrusions that have nothing to do with the original reason that these data were collected.

Fifth, will it be accountable?
Even the most reliable, equitable surveillance technologies can fail, and there is always the chance that they will be intentionally used in ways that overstep ethical bounds. When this happens, a clear, transparent, standardized process of accountability is important for ensuring that those who were affected have recourse to justice and that those who are responsible for the harm face appropriate consequences. This will also help to dissuade authorities from using surveillance technologies in unethical ways, or in ways that go beyond their original stated purpose.

And finally, is it being rolled out and used transparently?
It is impossible to address any of these other questions if we don’t know about the surveillance technologies used in our communities. And yet, unfortunately, new surveillance technologies are often deployed in the dark. This makes it impossible to have any kind of public scrutiny or discourse that addresses the other questions I’ve just mentioned. Transparency is also, in itself, an ethical principle. Privacy scholars and existing case law tend to agree that you have the right to know about new surveillance technologies that are used directly against you or your community as part of an investigation, or that may collect data about you incidentally in the course of their use. And transparency is a good way of keeping our overwatchers accountable.

There are, of course, other questions. But this is a good place to start. The best way to address these questions is to have an honest, inclusive discourse where everyone has a voice. So I invite you all to share your own questions, concerns, and personal experiences using the hashtag #GlobalEthicsDay. Thanks for watching!
