Mapping AI & Equality, Part 5: The AI (De-)Legitimation & Distribution Crises

Feb 17, 2022

In mapping AI and equality, it is helpful to develop broad categories that highlight distinct trajectories showing how AI impacts people and their relationships with each other, and with our shared environment. This series of blog posts aims to provide fodder for further thought and reflection.

Humanity has reached an inflection point in its history. The converging crises of climate change, pandemics, structural inequality, and destabilizing technologies all contribute to what the philosopher Jürgen Habermas called a “legitimation crisis,” in which citizens lose faith in their governments’ ability to solve their problems (Habermas, Jürgen, 1975. Legitimation Crisis. Boston: Beacon Press). The instability in the international order is palpable, and it is being exploited by cynical or aspiring authoritarian leaders.

In 1930, the British economist John Maynard Keynes coined the term “technological unemployment” to capture the longstanding Luddite fear that each new technology would destroy more jobs than it creates. Over the past 200 years, nearly all technological advances, from the Industrial Revolution to breakthroughs in agriculture, manufacturing, chemistry, and healthcare, have created many more jobs than they destroyed. However, these jobs are rarely distributed in such a way that those whose occupations are most likely to be decimated benefit directly. The digital economy creates many new high-end jobs that require advanced skills, but this has not yet been matched by investments in civic education to build and expand the necessary talent pool. As a result, the worst-off parts of the world see little of the improvement.

A distribution crisis has ensued, in which productivity gains increasingly go to the owners of capital: those of us able to invest in financial instruments. For example, those who invested in the tech sector during the pandemic saw dramatic growth in the value of their portfolios, while hundreds of millions of people lost their jobs. Leading firms supplying the digital services that facilitated new ways of working virtually expanded rapidly. According to Microsoft CEO Satya Nadella, goals his company had expected to reach over two years were achieved in two spring months of 2020.

Governments have increasingly fallen under a cult of innovation, in which innovation in and of itself is perceived as good and should not be tampered with through regulatory constraints. Whenever legislators or government agencies look at ways to rein in technology companies or social media platforms, for example, they are told that they do not understand technology and that imposing constraints would undermine innovation and essential productivity gains. In some instances, governments also worry that regulation will get in the way of strategic objectives rooted in national security concerns. Corporations are thus let off the hook for the societal costs of their innovations. If the purveyors of disruptive technologies reap only the rewards and bear no responsibility for social costs, structural inequalities will deepen. This is not a healthy situation.

There has always been a pacing problem: a lag between the implementation of a new technology and the speed at which ethical and legal oversight is put in place. As David Collingridge noted in 1980 (The Social Control of Technology, New York: St. Martin's Press; London: Pinter), the development of a technology can most easily be shaped early on. Unfortunately, early in its development we seldom fully anticipate a technology’s societal impact. By the time we do recognize the undesired societal consequences of adopting a technology, it is likely to be so entrenched in the political and economic milieu that it becomes difficult to alter. "When change is easy, the need for it cannot be foreseen; when the need for change is apparent, change has become expensive, difficult, and time-consuming." (Ibid) Technologies are being deployed rapidly, and our ability to tame them remains poor.

In other words, the digital economy is exacerbating inequality, and governments have failed to tame its excesses or effectively address the harms done to those who have lost their jobs, need retraining, or are subject to algorithmic bias. As Christina Colclough says, “We could be demanding of companies, when they invest in disruptive technologies that they are also obliged to invest in their people, in their reskilling and upskilling, and in their career paths.” The digital economy has diminished opportunities, and will likely continue to present challenges, for those who have not yet been meaningfully (or safely) integrated into it.


Anja Kaspersen is a Senior Fellow at the Carnegie Council for Ethics in International Affairs. She is the former Director of the United Nations Office for Disarmament Affairs in Geneva and Deputy Secretary-General of the Conference on Disarmament. Previously, she was head of strategic engagement and new technologies at the International Committee of the Red Cross (ICRC).

Wendell Wallach is a consultant, ethicist, and scholar at Yale University’s Interdisciplinary Center for Bioethics. He is also a scholar with the Lincoln Center for Applied Ethics, a fellow at the Institute for Ethics and Emerging Technologies, and a senior advisor to The Hastings Center.
