The Ethical Abyss: A Tech Ecosystem Reliant on Conflict, with Professor Elke Schwarz

Aug 26, 2025 | 60 min listen

The race to develop the latest AI-enabled military technology is often justified as essential to preserving democracy. Yet, this “virtuous” messaging deployed by tech CEOs and venture capitalists is driving the creation of a new highly militarized tech ecosystem—one which relies on perpetual conflict to test, iterate, and improve weapons systems.

Elke Schwarz, professor at Queen Mary University of London, joins the Values & Interests podcast to unpack the virtue signaling of today’s techno-military-industrial-complex, society’s failure to cultivate ethical thought in an increasingly computational world, and the resulting dehumanization of civilians in conflicts in places such as Gaza and Ukraine.


KEVIN MALONEY: Today on Values & Interests I am joined by Elke Schwarz. As a professor at Queen Mary University of London, Elke examines ethical questions at the intersection of technology and conflict. We discuss the factors to consider when interrogating your own value system in a world that increasingly feels as if it is defined by technology and seemingly perpetual conflict. In the latter half of the discussion we explore the growing role that venture capital (VC) firms and Silicon Valley companies are playing in the military-industrial complex, and we close with an assessment of the moral and human costs of the ongoing conflict in Gaza.

As always, be sure to subscribe to Carnegie Council on YouTube and wherever you get your podcasts.

Welcome, everybody, to another episode of the Values & Interests podcast here at Carnegie Council. Everybody heard my introduction to Professor Schwarz, or Elke, as we will be on a first-name basis today.

What we do, Elke, on the Values & Interests podcast is first try to understand the value system of the guest we have brought on, beyond their professional expertise. I would like to start there. Maybe you could tell our listeners about your own personal background, your values formation, and how one chooses to interrogate these very specific values questions as a career beyond one’s personal life. If we can get a little bit of that framing, then we will dig into the geopolitical business of the day.

ELKE SCHWARZ: Thank you so much for having me on the podcast. It is a wonderful question to open with, an interesting question. Being an ethicist or dealing with ethics is my profession. I rarely go back and think: Well, where did it all come from? What are the foundations of my own ways of thinking ethically? It is important because it forces me to examine my intuitions and also be accountable for the things I hold to be true or clear or certain and which always of course affect others.

It is also a challenging question because it is not ever one thing. It is not monocausal. There is always a certain dynamism and complexity at work informing one’s values, and that is the case for me too. It is a mix of, I would say, experiences, what we learn through others, and what we learn about the world and the people in it as we go on and inhabit various roles, positions, and relationships.

With that, and I am thinking here about Alasdair MacIntyre’s essay on moral responsibility—to interrogate one’s own moral position in various roles because sometimes there are conflicting values attached to certain roles or values that are in tension depending on these roles, contexts, or relationships that we might inhabit—it is a complex question to answer, I think.

It struck me perhaps early on that it seems possible that moral perspectives can change—and here I will keep it with Hannah Arendt, who made that observation in the context of totalitarianism—so it is all the more important to cultivate moral and ethical sensibilities.

My own value framework comes from my parents, as it does for many of us as we start off thinking about right and wrong, what to do, and what not to do. So it is a mix of my parents’ own empathy, their kindness, their acting in the world, and their broader moral compass, I would say, about how to be with one’s self but also within the world. That is then paired with various experiences which prompted deliberation and which sometimes were jarring and made me think, Is this right; is this wrong?, so deliberations about values and relations.

Maybe a third strand was a growing understanding that we as humans cannot exist purely as individuals. In our thriving we are embedded in a social context, so values always relate to others, the wellbeing of others, and how we treat others. It is always reciprocal in some ways. The foundation of this moral responsibility is always in relation to others.

This is an early way of thinking about this. My parents showed me empathy and integrity; my interest in the backbone of morality in relation to violence comes perhaps from experiences of witnessing violence and sensing the impact of conflict; and my ways of thinking about acting well come from understanding the political consequences of thinking of oneself purely as an individual rather than as part of a whole society. That is the broader framework through which I work. Ethics is a practice that needs to be cultivated, that is relational, and that is never really finished, and often it is a fine balancing act that can easily be thrown off-course.

KEVIN MALONEY: There is a lot there that goes directly to the heart of what we do here at the Council. I think one of the things I am struggling with a lot as an American right now is that there has always been this fetishization of individuality from an American perspective, but there seems to be a doubling down on that right now and a de-prioritization of looking beyond your own values, of using ethics to understand someone else’s value system. There is a shrinking space right now, and you are seeing that manifest in multiple ways, whether it is income inequality, hate crimes, or fill in the blank.

It is interesting when I think about my own values formation from the perspective of growing up in the United States in the 1990s. Things I was quite sure of as a 20-year-old I am not so sure about now as a 36-year-old. This is not a bashing of the United States; it is just an interesting point about how one’s values and interests, personally and at an institutional or even state level, will evolve over time, and it is up to the individual to interrogate that and be aware of it.

ELKE SCHWARZ: That is also very interesting to me, and I am thinking a lot about how the significance of experience factors into this. Even as we grow older through the decades we change our perspectives on things as we learn more, and that makes it very difficult sometimes, as you get older, to understand how younger people think, and likewise younger people lack experience but feel quite sure about how they see things. With different kinds of experience comes sometimes a different perspective. That is not to say that values are relative in any kind of way; it is more to highlight the fact that we must be kind with one another in understanding that there is a lot we do not understand, so cultivating a certain generosity in dealing with one another is part of that ethical conduct.

KEVIN MALONEY: Yes, certainly. We constantly talk here about the traps of this moral relativism approach or moral absolutism, different concepts but related in many ways especially in the world of politics. As you said, you evolve and grow older and wiser, but maybe then you become disconnected from other people’s perspectives. It is this individual ethics spectrum that you constantly have to be aware of and live on and grapple with.

I have had a lot of conversations recently on this podcast about what it means to be an ethical individual beyond a moral individual. I can distill the conversations most recently as saying that if you feel like you are a moral individual it seems like that value system is in place and is not moving or is not being interrogated. The few philosophers I have spoken with recently seem to think it is the willingness to question, the willingness to evolve and interrogate that is the differentiating factor between an ethical life and maybe what you might view as a moral or principled life.

How do you react to that framing or that distillation of my recent interviews?

ELKE SCHWARZ: I am firmly onboard with considering reflexivity a fundamental cornerstone of moral responsibility. Alasdair MacIntyre has a wonderful essay about the moral responsibility we have to interrogate how we act, and that is a core moral responsibility that should not be eroded. Certain structures in society of course make it harder to engage in that reflexivity, but I think it is a crucial dimension of being a moral person or an ethical person.

Here I am of two minds. In some ways I think the distinction between being moral or ethical is less clear cut than it is sometimes made out to be. Very often the distinction is that we have ethical frameworks, rules, and principles, and we have moral philosophy and moral reasoning and somehow they are related but also distinct, and I am not entirely sure if it helps to differentiate between the two because if you go through the various texts and schools of ethical thought and ethical thinking, it is undecided, let’s say, to what degree the two are the same or whether we should make a stark distinction between the two. The wonderful woman who supervised my Ph.D. said that she does not make a distinction because it just gets us into a way of compartmentalizing or differentiating that is not helpful ultimately.

KEVIN MALONEY: You have opened the can of worms in terms of trying to create messaging here, which is a big part of my job at the Council. I remember being a year into the job and trying to silo these principles, values, ethics, and morality, and I came to a moment where it was like: “If we’re all going toward the same thing, do the semantics necessarily matter?” To some people it matters a great amount, but I think we can get in our own way sometimes in being overly attached to definitions that do not have the practical efficacy on the other side of things.

ELKE SCHWARZ: I agree with that. I think there is a case to be made for differentiation, but we always have to be very clear as to why: What is the task of this or what is the utility, not to be all utilitarian about this, or what is it we are doing here as a fundamental question? Sometimes it just turns out to be a fairly academic exercise.

KEVIN MALONEY: I think we could talk about this for a long time. As a professor of theory in this space, I am sure you have spent days talking about this over your lifetime, but I am going to pivot so that we do not lose the geopolitics audience for part of this podcast.

I want to move into a more serious space right now in terms of your area of expertise, the intersection of ethics and technology, specifically the military, and what is going on right now in the implementation of new technologies into that space.

I want to split the conversation into two areas, the first looking at the evolution of this techno military-industrial complex over the past few years within the United States and then pivot to the consequences of that on a global scale and from specific conflicts, and we will go into that.

You wrote a great piece for The Conversation back in January, where you really dug into the changing economics around the U.S.’s military-industrial complex, specifically from a Silicon Valley and venture capital perspective. Can you give that framing and the consequences of those changes that are happening right now for our audience?

ELKE SCHWARZ: It struck me that the advent of military technology companies has changed certain dynamics in terms of military practices and priorities, and I had that suspicion for a long time. I started thinking and writing about this in 2018 and 2019.

In particular I was struck by the language that became more pervasive in the military space, the language of agility, language I had associated with software and computational technologies and specifically the industries attached to them: talk of revolutions, agility, iteration, doing things faster and at a much larger scale. Those kinds of words seemed to become more pervasive, and with that came a prioritization of speed and scale.

Then I was interested in the political economy of that. When we wish to interrogate the ethical substrate or what happens to ethics in the military space, the ethics of war, or the practices of war, I think a more holistic picture is required and part of that is the political economy, so who benefits from what kind of future?

It struck me that venture capital, a very specific form of financing technology companies, and private equity financing, is becoming more prominent in this space, and then I got interested in what kind of dynamics are intrinsic to venture capital itself. Very similarly it is driven, just like artificial intelligence itself or digital technology as we see it manifest now, by speed and scale.

Scaling up fast is important in the venture capital environment in order to make high returns. Venture capital, unlike other forms of financing, private equity, or investing, is high-risk/high-reward, meaning you run a greater risk: Not everything you invest in will pan out, but a small percentage, perhaps 10 percent of your investments, will scale up so big that everything else is mitigated, if you will, and you make enormous returns.

That is the idea and has worked like that with venture capital for digital technology in the civilian space over the last two decades. That logic and dynamic started entering the military space, and with that we saw a different language and a different ethos.

Add to this the Ukraine-Russia conflict and various other conflicts which prioritized the development of digital-type technology for and in warfare, specifically artificial intelligence (AI), and a new picture emerges. It is always problematic in my book to have publicly traded weapons or defense companies because the logic is that in order to give your shareholders a return on investment there has to be a certain level of conflict so that you can at least test or sell your products, so there is a tension between peace and the military industry.

That is amplified with venture capital in the mix, specifically since, together with this notion of iteration and agility, comes an environment in which live conflicts are actually needed in order to test new weapons systems, iterate on them, and improve them. You kick off an entire ecosystem which needs conflict in order for the business to develop, and so on.

With that, adjacent businesses spring up. I was reading about another venture capital fund which organizes boot camps for military startups to come to Ukraine and get their startup technologies to the frontlines to be tested in a live context, so you create an ecosystem that actually depends on conflict, and undoing that is difficult when we know about the dynamics of business, development, sustainability, and the mandate of growth in particular.

KEVIN MALONEY: Thank you for raising the ecosystem question. I think it is a powerful point to make beyond the political and business consequences because it goes directly to an individual and their lives. If there is a greater amount of conflict in the world, naturally you understand that you will be, let’s say—you fill in the blank. I think that is a powerful narrative that we need to underline a bit more.

One of the things in the article that was also very interesting was the changing dynamics between the government and military and these new actors in the VC space. Traditionally you had these large defense contractors; it was not really an open space. The power dynamic was that you had to get into the State Department, you had to get contracts, and there was a door, and with that you had controls. Naturally those institutions were part of a broader open society, a democracy, albeit extremely flawed, but at least there were checks and balances inherently built into the system. Now you are talking about that dynamic switching a bit, where it is the VCs saying, “We need this technology, and you need to adopt this,” etc. Could you expand on that? It was a point that was quite troubling for me.

ELKE SCHWARZ: The relationship between government and the defense industry has long been fraught. What is commonly known as the “revolving door,” whereby defense companies bring in former government officials to give credibility to the defense company and its product, is a well-established practice, as is extensive lobbying, so doing lobby work and inviting government officials and policymakers to change the narrative and change the ethos.

But again, with the advent of certain very financially strong venture capital companies, that has accelerated and been amplified. The amount of money spent on lobbying and on bringing in former government officials is enormous, and this revolving door, by which government officials come to the companies but also people from the companies get placed in government positions, is incredibly active.

This is paired with a fast-moving environment in terms of the technological development. The technologies are advanced; they don’t advance themselves. Very often we say, “Ooh, AI is developing so quickly.” It does not develop itself. It is being advanced at a pace at which many lawmakers and policymakers cannot develop the knowledge themselves about the pitfalls, challenges, dangers, or limitations of the technology. You seemingly need that expertise from the technology companies, so you create a shift in narrative by which a different mindset is created that, “This is now all about software, software will win our wars, and we have the expertise to do that.” That leaves governments always on the back foot.

Add to that, in Silicon Valley and military technologies with AI there is a certain “popular allure,” I want to say. It is not a dusty industry; it is dynamic. It is seen as at the forefront, slightly avant-garde. It has crafted and branded itself as being young, dynamic, and a little bit crazy. With that in the mix there is an incredible cocktail: a financial appeal to policymakers who are brought into the fold and an appeal for technology engineers to be brought into government that is almost irresistible.

I see it in the United Kingdom a little bit. I feel like the Labour government interestingly is very much falling for the ideas and drive that the military technology companies offer without activating critical faculties and saying: “Okay, wait. Do we really need that? What are the consequences of that? What are the risks and dangers?” Rather, everything becomes enrolled in a mandate of speed, scale, and innovation as a key or master value. It is actually a shift in value systems where speed is a key value, innovation is a key value, scale is a key value, sometimes to the detriment of other values.

KEVIN MALONEY: One of the things I focus on here at the Council is interrogating narratives, specifically from a values perspective. You rarely get this Bond-villain scenario where somebody is playing the part. Everybody is trying to gain the moral high ground across geopolitics and across business. We think about this in terms of “moral masking.”

From a case-study perspective it has been very concerning for me to see. When certain people were in office it was the liberal international order, software is going to cure cancer, and boom, boom, boom, and now we are seeing quotes about “defending Western society” and putting it into a binary versus China: “You’re either with us defending the West and liberal ideals or you are against us.” As I said, it creates this binary.

What are the narratives you are seeing deployed effectively right now by Silicon Valley and the military-industrial complex? People like Alex Karp and Palmer Luckey come to mind. I think there is a spectrum around sophistication. Some people want to just say, “We are creating this tool to do damage,” and other ones are mixing in this nationalistic identity cocktail, which I find much more insidious. I would love to hear from you about “the best people” deploying this right now and your thoughts on counter-narratives in that space that might be effective.

ELKE SCHWARZ: That is such an interesting question. I am glad you asked this because the language deployed in the pursuit of business development is so interesting to me. Again, I see it reverberating in the United Kingdom in ways that were not present in the last five years or so.

There is an interesting duality going on in this public positioning of military technology companies. One might interpret it as duplicity rather than duality, but I would say it is consciously deployed. The current line of marketing or argument amongst many of the new military technology companies is that the development of AI-enabled systems, including weapons systems, targeting systems, and autonomous weapons systems, is morally necessary and ethically defensible because it is a way to uphold the values that we or the “West”—whatever that might be—hold dear, and often that is couched sometimes around freedom but very often democracy.

A number of companies build their ethos, justification, and marketing materials quite overtly around the claim or the promise to either safeguard democracy, rebuild the “arsenal of democracy,” or protect democracy with AI drones and autonomous weapons systems. Of course this is not how democracy works or how democracy is safeguarded or upheld. You cannot do this with weapons; you can only do this with political work, with communication, with agreement and disagreement, and so on, but it has utility.

It is completely performative in the sense that democracy usually fares very poorly in the context of warfare, where exceptionalism and crisis decisions often disallow democratic oversight and deliberation. There is a clear and obvious tension there, but it is also a conscious packaging of tools of harm and destruction in terms of hard-won values. As a model it is quite effective. It must be because it is very pervasive.

I am not necessarily saying, with some caveats, that those running these companies try to consciously pull the wool over our eyes and pretend to uphold certain values when they know their products are in tension with these values, but maybe they don’t think further than, Well, in order sometimes to not have war, you need weapons. That could be a ground-level way of reasoning about this. I would suggest that the signaling of values and virtues in that sense is performative and effective as a marketing tool, and people do know that.

Then you have some companies, which you highlighted, that deal much more starkly in producing what seems to be almost a cognitive dissonance: On one hand they publish and produce texts and discussions that clearly take the ethics and laws of war seriously, so they gesture to moral and ethical values as being fundamental to their business model, while also cultivating CEOs with very unsavory and sometimes patently unethical or ethically transgressive views as part of their marketing campaigns. If you don’t want the views of your CEO to be public, you don’t send them on talk shows and TED Talks, so it is part of their marketing campaign or strategy.

I think this duality or duplicity serves to pull the rug from underneath any viable critique because you can always gesture toward the publicly available good intentions: “Yes, we have thought about it. Have you met our civil liberties team? We are taking this very, very seriously.” Generally it is part of the broader cognitive dissonance that runs quite starkly through our digital AI-infused ecology.

I listen to many of the CEOs who have produced technologies that turn out to be quite harmful, whether it is algorithms promoting violent or harmful material, sit there and say, “Well, this is really awful,” while knowing that they have produced this environment. I think there is a strategy at work that we have to be mindful of and that we must not fall prey to by getting complete whiplash engaging with one or the other.

KEVIN MALONEY: I was criticizing binaries before, but I tend to try to tackle this through a good-faith/bad-faith assessment. These things are not monolithic companies: There is a CEO at the top; there is a board of directors. I fully believe good people can work at certain companies who are trying to do good. It comes down to moral courage and ethical leadership. I think we are seeing a lack of that right now. I think we are seeing too many people who are willing to be these moral chameleons when it serves them.

Somebody said to me the other day: “It’s very clear that certain politicians have a strong value set, but they just lack principles.” The principles are performative to an extent and then they change.

We are not necessarily going to be able to interrogate every person’s soul and their individuality, but there is a lot of value I think in just letting people know that this is happening, that just because you speak about the values of democracy does not mean that is an unassailable position. It is ironic that people use it in that way. The more we can interrogate and push people who work for companies that are plugged into the government, et cetera, the more it has a domino effect in a good way. It can be a bit overwhelming right now with the kinetic nature of these discussions and what they lead to.

ELKE SCHWARZ: When I think about this I do want to sometimes come back to this point that MacIntyre raises, that it is not sufficient to say: “I am working in a company for a CEO who I know will discuss things publicly that could be interpreted as a violation of international humanitarian law, and I am not okay with that, but I still work for the company because we also do good things and we protect data.”

One has a responsibility to interrogate how the various moral frameworks and value systems in private life cohere with that of your professional role and whether there are tensions. That is one’s own moral responsibility, to be reflexive about that. I think that is really important.

I understand it is difficult—if somebody has a great job, they probably make a lot of money, they have responsibilities, perhaps a family—to say: “I’m not going to do that; I’m going to find another job.” But that is how the problem starts.

If nobody says that—we are not in 2018 anymore, when employees at Google were staging a walkout, which was quite effective, because Google had a contract with the Department of Defense for an AI tool, and employees were unhappy about that. Times have changed, but they can also change again.

KEVIN MALONEY: It is very difficult. You think about your responsibilities and your own value system and what you are prioritizing in terms of, like you said, maybe having a family. I think we have seen especially from a U.S. perspective over the past ten years red lines that continually were crossed, and it is a permission structure, so then it becomes “ethically tenable” to do this. As the society shifts and as norms around behavior shift, you can swallow it, you can make the case to yourself, and you can sleep at night.

I do not have the prescription for that. What I do think is very effective is to have a community of people you can stand shoulder-to-shoulder with to think deeply in a good-faith way, like you said in terms of the Google walkout, because being the “lone virtuous person on the ethical island,” not only are you not going to make a lot of change occur, it is going to be very lonely and difficult. I do very strongly believe in good-faith community. That does not mean 100 percent values alignment; it just means that you are going to be there for somebody in good faith to help them tackle these ethical issues.

ELKE SCHWARZ: That’s right. I think that is important, and that is one thing that is at the essence of political power as well, to create this power together. In lieu of having enormous financial means, which most of us don’t, I think having the possibility to act together, to act in concert, and stand in solidarity is important.

It has limits too. I am with you that it is a bare minimum that we need, but we probably need more. It would be helpful to have big financial means to start a counter-marketing campaign.

KEVIN MALONEY: Before we transition into the use of these technologies currently in conflicts on the battlefield, I want to sum up our conversation in a quick example.

I am not the first person to say this, but this is classic branding and marketing that is also very personally offensive to me in that all of these Silicon Valley companies, whether it is Anduril or Palantir, are basically stealing from The Lord of the Rings. As somebody who loves those books and will be reading them to my son as basically a moral lesson throughout his early years, the irony goes back to this democracy conversation we had.

The story in the book is that two friends through love and empathy conquer the military-industrial complex. I cannot tell nowadays, politically, if people understand the irony in that or if they are doing it as some type of joke. I am just putting that out there and maybe will get your reaction, but from a personal perspective it infuriates me.

ELKE SCHWARZ: I can understand that and also I sympathize, but I am not very well versed with The Lord of the Rings trilogy.

I have been thinking about this a lot, and together with Neil Renic I wanted to write a paper on the ways in which The Lord of the Rings ideas, tropes, names, and some of the darker aspects of the story are actually becoming a justification for violence, for the infliction of harm. It is becoming almost like a military scripture because it allows for seeing the world in extremely stark terms: There is good and evil, there is the light and the dark, there is always a looming threat, and there is always crisis.

Of course there are wonderful days in the Shire, I’m sure, but the heart of it is danger, threat, and the crisis of a crumbling world, and that facilitates a crisis narrative and exceptionalism. If that serves as a way to view the world, then we are in for a bumpy ride because it means paving the way for large-scale war, and we don’t want that, because a handful of people read The Lord of the Rings texts very differently than you read them. It goes to show how a different perspective can inform how you read and understand texts and then take them forward.

I was struck that The Lord of the Rings trilogy serves as scripture for so many right-wing endeavors as well. Giorgia Meloni is a big, big fan. There has been a movement in the Italian right wing from the 1970s onward where they have little Hobbit-type getaways as political activities.

KEVIN MALONEY: I did not know that. You are destroying my worldview right now.

ELKE SCHWARZ: I am so sorry. I have more terrible examples, but I will hold off.

KEVIN MALONEY: I won’t go down The Lord of the Rings rabbit hole, but, yes, there are definitely some problematic framings in the story in terms of West versus East, but I have always read it through the friendship angle. Maybe that is another narrative battle to fight.

ELKE SCHWARZ: That says good things about you.

KEVIN MALONEY: Let’s pivot now to the more serious conversation in terms of the consequences of this perpetual violence ecosystem that we talked about. We are seeing this play out in Gaza; we are seeing this play out in Ukraine. This is not some conference room, this is not some military conference. These are children dying.

We have been thinking about this at the Council for a while now, which from an ethics perspective is this gap between principle and practice. Principles can’t be this thing you take off of a dusty shelf, look at, and feel better about yourself or think about as a checklist. The pitch for years has been that these tools will allow us to do more but do it more accurately and minimize civilian casualties, but in practice we are seeing this horrifying case study play out in real time right now.

I want to get your thoughts on the situation, how these things are being deployed, and where we find ourselves right now.

ELKE SCHWARZ: I often find myself almost speechless about the extent to which the global community is allowing such horrors to unfold before our very eyes because nobody can say we don’t know.

There are many ongoing conflicts—there are not just two—that we decide not to pay that much attention to, or generally the conversations don’t revolve around them. We often refer back to these two because these are the ones where new technologies are being tested, trialed, and used, and they are very different types of conflicts. You have one, in Ukraine, in which the use of AI technologies probably accelerated global investment in AI infrastructures, AI drones, and autonomous weapons systems. They are hailed as: “This is the pinnacle of innovation, and look at how well these technologies are serving the Ukrainians in their struggle against an oppressive aggressor.”

That is true to some degree. There are new technologies, there is innovation, and there is drone technology that has helped Ukraine fend off a rather powerful adversary to the extent that they have, but we must also not forget that this was mostly in lieu of having other types of weapons, and Ukrainians are often valorized in the way they defend themselves.

Then you have Gaza, where we know AI systems also play a role in enabling large-scale destruction and target selection at a scale and speed that were otherwise not known, and we see with our very own eyes that the entire strip is demolished and there is suffering beyond comprehension.

Neither conflict seems to give us robust evidence or empirical substance for the claims made by some military technology companies that wars can be won swiftly with AI weapons systems or that indeed these are tools of great precision, superior in every way, so there is again a tension between that which was promised, hailed, or claimed and what we see unfolding before our very eyes. In fact the contrary seems to be the case, and these two horrendous wars have dragged on in an ever more cruel and dehumanizing way, especially the war in Gaza.

It seems as though there isn’t any kind of substance to the claim that these are better technologies; rather the contrary. As we have often seen in recent wars and especially those in which algorithmic types of technologies have played a role, the tactical always seems to supersede the broader strategic considerations about ending wars and specifically also about finding peace. Rather, what is foregrounded is an ongoing tactical action plan more than a strategic plan toward peace because you need politics for that and not technology. You need a different kind of approach to that.

I also find deeply troubling our collective impotence to even process the great degrees of inhumanity and suffering, especially with respect to Gaza. I think we are in the middle of a type of nihilism or an unfolding of nihilistic types of wars, which is dreadful.

KEVIN MALONEY: I think back to what used to basically shake the United States to its core from an ethical perspective. You would see a single photo from Darfur do that, and now there is this irony where we have this view into things like never before. I don’t know what it is doing, but it is not resulting in action. It is beyond horrendous, as you said, this nihilism that is creeping over us.

This goes to some of the points you made in a recent article as well about the efficacy of norms and international law being hollow when societies themselves don’t feel they need to support those things. You talked in your article—I believe written with Neil—about the centrality of ethics, that people need to be willing to grapple with those things and to prioritize certain values in order to support the institutions and the laws. There just seems to be a gaping hole there now. I think you wrote that back in 2024.

I don’t know where we go from here or if those things are even salvageable based on the human cost on a day-in and day-out basis. I would love your updated take on that in light of what is happening in Gaza right now.

ELKE SCHWARZ: The take is even more bleak than it was last year. In some ways I am not surprised because we have not in the last two or three decades, or perhaps even since the 1950s and a growing prioritization of computational technologies, cultivated a way to practice ethical relations or ethical thought. Something else has happened in its place.

In an environment where digital technology or computational technology dominates, I get the sense that, having not cultivated this reflexivity or the importance of ethics or morality vis-à-vis others, only perhaps vis-à-vis one’s self, there is a certain helplessness in the face of what ethics is or how it works. I do think that is perhaps itself a consequence of this computational condition, in which things are either on or off, zero or one, a problem or a solution: an attachment to computational binary structures within which ethics just does not fit, because ethics is not something you can solve this way or that way or through principles or certain actions. Yet this computational environment is an environment in which we resort to principles, structures, and crutches.

This is exactly where ethical principles or ideas get morphed into checklists and tick boxes rather than prompts to deliberate and understand one’s own responsibility in the face of an unsolvable problem or dilemma. This technological, technified thinking cannot accommodate that, certainly not when it is informed by the logic of speed, scale, and action. I do think the technological environment in which we find ourselves informs how we are able to think.

What I have been seeing in the last 12 months if not two years are two specific trends that were articulated in my conversations with others, whether military personnel, policymakers, or academics. The first one is that ethics is basically seen as a hindrance to efficient or effective action. So often I have encountered this question, asked by all kinds of audiences: “Isn’t ethics a luxury in warfare? Can we afford to have ethical concerns in the face of an unethical enemy?”

That is very often accompanied by some mental gymnastics as to whether the things we traditionally hold as ethically important, or perhaps even incontrovertible, let’s say the protection of innocents and civilians in war, are indeed ethical, or whether it should be the opposite. This then often manifests in statements that deem it appropriate to expand the justifiable circle of targets or even advocate large-scale violence.

KEVIN MALONEY: There is this multilevel and impersonal justification equation happening. It is like: “I have my value system, and because at least I have a value system that means I can justify anything from a moral perspective.”

I am also interested—I talked about this in an interview with a moral philosopher from Johns Hopkins—in this de-prioritization of agency, this cloak where you feel like you have agency but are not really interrogating your ethical structure, and at the same time that is being used to take agency from other people. I think you are seeing this in the Gaza conflict right now, where you have this thin moral justification for what’s happening, therefore anything is acceptable.

This has happened in conflicts throughout history, but it is supercharged right now by the ability to target and inflict harm at this speed and scale, and we are seeing the consequences of that. As you said, Gaza has been flattened with thousands of civilians dead.

It is tough because I feel like we are trapped in this cycle where there is not a way out, but that does not mean we should stop having the conversation. We cannot do that. I think a lot of times people want solutions. I have been struggling a lot with what feels like a perfect storm for people who care about these things.

ELKE SCHWARZ: I think in some ways we are all very distracted and always running out of time. Again, I think this is part of the technological condition in which we find ourselves, and I don’t think it is something that has just happened in the last ten years; I think it is something that has shaped our subjectivity toward functionality and technicality, so that we read everything we do in life, even our politics and ethics through a technical lens and through the lens of functionality, and I think that is a significant contributor.

I come back to Günther Anders, a philosopher of technology in the 1940s, 1950s, and 1960s. He wrote interesting observations about the relationship between technological mindsets and harm. He wrote a public letter to Klaus Eichmann, the son of Adolf Eichmann, in 1964, and posed the question—he was engaging in a deep analysis of how this monstrosity could have happened: Who is complicit? Are we all complicit? Is it just Adolf Eichmann who is complicit?—how and when can this monstrosity occur that he names as the institutional and almost conveyor-belt-like extermination of human beings and the fact that there were leaders and henchmen who were not just condoning but facilitating these activities, so different levels of complicity, if you will.

He landed on one specific element that enabled the monstrous, which was that we had become creatures of a technologized world, one of unmitigated and perpetual technologization of everything that we do, and we were kind of enrolled in a functional technical logic that did not allow us to feel responsibility with others, to develop ourselves not as functional products but as humans with a capacity to expand our understanding of the implications of our actions with the tools that we have. We just kind of fit within the logic of the technology rather than making the technology fit with what we needed in order to work well with one another.

I think that is an interesting and important insight because technology is not ever just a tool. It shapes how we think, how we situate ourselves, what we prioritize, and what we can see and choose not to see.

KEVIN MALONEY: This goes directly to what is happening from a U.S. domestic perspective in a society that was deeply pluralist, albeit with many faults. You should actually see us at the local level in the United States, at like the city council level. There still is that deep pluralism, a neighbor approach to getting things done, but we are seeing that decay, I would posit to a large extent because of technology, and it scales up to the geopolitical level: the ability to basically reach out and take somebody’s life at the flip of a switch. If you don’t take the pluralistic approach of seeing people as equal to you in their rights or you don’t want to empathize with their position, it becomes very easy, as you said, to take an industrialized approach to war or other-ing people, et cetera.

Certainly using words like “inflection point” is quite reductive, but I don’t have the right word. It feels like we are at some sort of inflection point from a geopolitical perspective but really from a moral perspective.

ELKE SCHWARZ: I agree. I think it is high time we realize that we depend on one another and that the fast-paced advancement of dehumanization is not going to produce the futures that we want. That is important. Hannah Arendt asked the question I think early on in The Human Condition: “What are we doing?”

I want to ask that question: What are we doing, and what are we doing this for? What are the kinds of futures that we want? Anecdotally, at an individual level, nobody seems to want what is happening at this point, or let’s say very few seem to want it. How do we re-shift that balance to create a better understanding that we have to deal with other humans and that we had better develop strategies to deal with other humans as humans, not as things, not as objects, but as humans?

We learn through friction. There will always be friction, but we learn through friction, and we have to learn how to understand one another through friction that does not necessarily result in the elimination of somebody else.

I think this object-oriented world perspective, this technologically oriented perspective, as Anders said, is the foundation for a totalitarian type of approach, and that is still in place. We have not done away with that. A small scratch at the surface can allow that to resurface, and I think this is where we are right now.

KEVIN MALONEY: We are heading in a more positive direction in terms of the conversation, which is good. I want to close by giving you the floor. This is an open question, but you have mentioned a lot of philosophers, some quite famous and some I have read through. For our listeners to understand this moment not from an academic perspective but an applied perspective, things they can use in their own lives, what would be the one book, or who would be the one person, about whom you would say: “Go investigate, and this might build out your ethical toolkit a little bit more”?

ELKE SCHWARZ: Yes. I am going to say three people.

KEVIN MALONEY: I was going to say a laundry list is okay, the more the better.

ELKE SCHWARZ: The philosopher I find the most interesting at the moment, or have for a long time, is Günther Anders. This may not help your listeners get very far because most of his works are still only in German, but they are increasingly being translated into English, and I would urge everyone to read them. His work leans on very similar ideas to those Hannah Arendt offers in hers, which I would also highly recommend because she is a wonderful philosopher to help us understand ourselves as political beings in the world, how we are unique but how we have to work with one another in order to change and affect the world in positive ways. She talks not just about death—most philosophers are obsessed with death—but about natality, the things we can birth, the things we can give rise to. That is wonderful.

Günther Anders happened to have been her first husband, so there is obviously a relationship here, but Günther Anders is an interesting philosopher of technology whose work is so insightful in light of our present condition and is very prescient. I would strongly recommend reading those two. I think I will leave it at that.

At the moment I am also reading a lot of Norbert Wiener, a mathematician but also a philosopher, who worked specifically on weapons technologies and was an interesting thinker who put the brakes on the attitude of, “Well, let’s just do whatever we can do.”

He says: “Everything we do as mathematicians, philosophers, technicians, and as engineers is of the highest moral order because we affect one another, and we have to take again that responsibility of what we send out in the world. Just because we can do it, just because we know how to do it, does not mean we should do it.”

I think these three spoke to each other at a time when cybernetics was very young, budding, and nascent, and given that our present condition is very much cybernetically inflected or inflected with computational technologies, I find these three quite insightful still for today.

KEVIN MALONEY: Two of the three were married at one point, so that is very impressive. I would have liked to have been part of some of those dinner conversations. I can only imagine actually. I am not sure if those would have been the best dinners.

Elke, thank you so much for joining us. I appreciate it.

ELKE SCHWARZ: Thanks so much for having me. It was a pleasure.

Carnegie Council for Ethics in International Affairs is an independent and nonpartisan nonprofit. The views expressed within this podcast are those of the speakers and do not necessarily reflect the position of Carnegie Council.
