The Techno-Military-Industrial-Academic Complex

Feb 18, 2022

This article originally appeared on Fortune.com

The Harvard Strike in the spring of 1969 emerged out of what we students perceived as the university’s complicity in the Vietnam War. After Harvard President Nathan Pusey called in police to forcibly remove students who had taken over the central administration building, the ensuing strike received widespread support from both the student body and faculty.

One of our main grievances was how the university had been captured by the military-industrial complex, whose increasingly detrimental power President Dwight Eisenhower warned about in his 1961 farewell address. By the time of the Harvard Strike, that complex had bequeathed to us a manufactured war in Vietnam, for which young men were being required to give up their lives on the battlefield in the name of “our country, right or wrong!”

As a graduate student, I played a small but not insignificant role in the Harvard Strike, as well as in the nationwide strike the following spring when the war in Vietnam was expanded into Cambodia. The university’s complicity in the Vietnam War became a template for strikes and protests on almost nine hundred campuses. These strikes intensified after four young people were killed by the Ohio National Guard during a protest on the campus of Kent State University.

Universities were perceived as channeling manpower through a system of rewards that helped discern who had the correct values and goals and was therefore worthy of being advanced into leading positions within government and industry. To us, there was no doubt that the military-industrial complex had morphed into a military-industrial-academic complex.

This past December, a disturbance erupted in my soul as I read about Yale’s acceptance of a $15.3 million gift from former Google Chairman Eric Schmidt. I have been an affiliated scholar at Yale since 2001. The gift to the Yale Jackson Institute for Global Affairs is to establish the Schmidt Program on Artificial Intelligence, Emerging Technologies, and National Power.

Today, the tech industry and tech money already play an outsized role in setting the agenda for contemporary universities. This gift seems to be a significant step in establishing a techno-military-industrial-academic complex.

At the time I read about this gift, I was already formulating a critique of the Henry Kissinger, Eric Schmidt, and Daniel Huttenlocher book entitled The Age of A.I.: And Our Human Future. To my mind, this book, along with a report from the National Security Commission on Artificial Intelligence that Schmidt chaired, is fomenting the weaponization of A.I. and ratcheting up a new Cold War between the U.S. and China. Meanwhile, Schmidt is busily working to garner defense contracts for Silicon Valley firms in which he has a vested interest. For all the legitimate critiques of China, it is not at all clear to me that the Chinese want a Cold War. They would rather dominate the world economy.

Furthermore, The Age of A.I. promotes the narrative that A.I. systems will inevitably be much smarter than humans, that we will not be able to understand how they arrive at their decisions, and that we should therefore be prepared to surrender to judgments made by A.I. As an A.I. ethicist, I view this abrogation of responsibility to machines as the height of human immorality.

Smarter-than-human A.I. and the need to militarize A.I. are being proselytized as inevitable. Make no mistake, the current narrative dominating the tech revolution is in fact fashioned by corporations and the tech wealthy and in their interests, presuming they get the rest of us to buy into the metaverse, targeted advertising, cryptocurrencies, and the weaponization of A.I. But we must remember that nothing technological is inevitable, except that human will and intention make it so.

When I spoke to Ted Wittenstein, the executive director of International Security Studies at Yale, he was rightfully proud of the grant from Eric and Wendy Schmidt, which elevated the stature of the Jackson Institute. In a Zoom call, he assured me that critical perspectives would be needed to fulfill the grant’s goal. This would ensure a minimum of academic integrity. We’ll see. Commonly, critical voices are co-opted and marginalized. And in naming the program “A.I., Emerging Technologies, and National Power”–instead of International Cooperation, International Security, or even National Security–its ideological orientation was made quite clear.

More important, at a time when universities and scholars are hard-pressed to get programs in the humanities funded, leading tech firms and the tech wealthy stand ready to fund research that serves their interests. Universities such as MIT receive millions of dollars to fund programs that help build a pipeline to those students most likely to contribute to a burgeoning tech economy. With the newly emerging alliance between technology and the defense establishment, university capture by the tech industry is nearing completion.


Wendell Wallach is a consultant, ethicist, and scholar at Yale University’s Interdisciplinary Center for Bioethics. He is also a scholar with the Lincoln Center for Applied Ethics, a fellow at the Institute for Ethics & Emerging Technology, and a senior advisor to The Hastings Center.
