From the Russian invasion of Ukraine to the hearings on the January 6 insurrection to the U.S. Supreme Court's decisions on guns and abortion, the news has been full of stories to get enraged about. But by focusing our rage on one issue or another, we risk missing how these stories are all connected. We should step back to see the bigger picture, with a historical context that stretches into the past and looks to the future.

Since the Second World War, the world has made slow but significant progress in a range of areas, from broadening respect for human rights to making inroads into fighting poverty and inequalities. Having taken two steps forward, however, we are now unmistakably in the midst of a step back.

Russia's conduct in Ukraine in particular has dealt a devastating blow to hard-won norms governing armed conflict. Its disregard for international humanitarian law (the laws of war) has dramatically set back this important cause. The second- and third-order effects are likely to be at least as devastating: the weaponization of Ukrainian grain, for example, has exacerbated starvation in Africa, with inevitable socially destabilizing consequences such as forced migration.

Comparably far-reaching and destructive impacts are sure to follow the U.S. Supreme Court's short-sighted decisions, which secure gains for a narrow ideology at the expense of respect for the rule of law. Meanwhile, the revelations about how close the U.S. came to a coup in 2021 will reverberate within the country and beyond, weakening global confidence in the resilience of democratic norms.

In these and other developments, I believe we are seeing a bigger picture characterized by three trends: a retrenchment of human rights, an exacerbation of structural inequalities, and the further entrenchment of existing power structures.

Looking to the future, we should all be concerned about how these three trends will influence the development of emerging technologies. The most obvious example again comes from Ukraine, as the imperative of fighting the invasion rides roughshod over ethical concerns about the development of lethal autonomous weapon systems.

The Franco-British Brimstone One, for example, may be the most autonomous weapon yet deployed on a battlefield: its operator defines a search area, within which the weapon identifies and destroys vehicles such as tanks. Despite widespread calls for a ban on developing certain lethal autonomous weapons, the states parties to the Convention on Certain Conventional Weapons (CCW) have spent years failing to reach consensus.

In the U.S., the spiraling problem of gun violence is giving impetus to ethically questionable innovations. In June, nine of the twelve members of the Axon ethics advisory board resigned in protest when the company behind the Taser responded to the Uvalde school shooting by announcing plans to develop a drone capable of surveilling public places and Tasering any person it identified as an active shooter. After the resignations, Axon halted development of the technology.

As the ethics advisors understood, the fallibility of surveillance algorithms raises the risk of mistakenly targeting innocent people. And once such technologies are accepted for a seemingly narrow use case, such as school shootings, the door is opened to wider applications in which the potential downsides increasingly outweigh the benefits.

In the long run, all forms of digital surveillance—from the social credit systems of authoritarian states to the surveillance capitalism models of the U.S. tech titans—are used to entrench power by inculcating fear.

Where Eleanor Roosevelt once championed the Universal Declaration of Human Rights, establishing America's reputation as a beacon for individual rights and freedoms, U.S. tech companies are now at the forefront of limiting any digital right to control one's own data. As more of life is lived digitally, digital rights become a central component of human rights.

Russia's reversion to old-style imperial colonialist aggression, meanwhile, has obvious parallels in the emerging phenomenon of digital colonialism, in which tech companies make countries and their peoples subservient to those who control their data rather than their land.

Do we have the will to respond to our current step back by slogging on to make another two steps forward in the decades to come? I believe the answer depends on whether we fixate on individual issues or can appreciate how they all fit into the bigger picture. The shared need to address climate change offers one prospect for collective action. A fundamental shift in outlook is required: from obsession with individual issues to a universal consciousness that places collective well-being at the forefront of both national policies and international affairs.
