Langdon Ogburn presents at Carnegie Council's sixth annual Student Research Conference.

Drones and Warfare, with R. Langdon Ogburn

May 18, 2020

On May 8, 2020, R. Langdon Ogburn's presentation on "Drones and Warfare" was selected as the winning entry at Carnegie Council's sixth annual Student Research Conference. Afterwards, Carnegie Council intern Richard Anar, who helped organize the conference, conducted this email interview with him about his research.

RICHARD ANAR: What was the topic of the research you presented at Carnegie Council?

R. LANGDON OGBURN: The research I presented at this year's conference was based on work that I have conducted throughout my undergraduate studies on military ethics. I focused on the challenges that new technologies of war pose to traditional just war theory and the international law of armed conflict. In my presentation, I examined the moral significance of the conditions under which drones operate and how these conditions call into question the traditional standards of just war theory. While my research was primarily philosophical in nature, I tried to ground it in pragmatic suggestions about what the international community and individual states can do to conduct ethical drone strikes and how new international norms can be created as war technology continues to advance.

RICHARD ANAR: Why did you select that topic?

R. LANGDON OGBURN: Being a philosophy major at West Point means that I spend a lot of time reading, thinking, and talking about military ethics—a topic that I find incredibly important and intellectually valuable as both an academic and a future officer in the United States Army. Militaries throughout the world are tasked with protecting the citizens they represent. This task, however, requires them to think deeply about the hardest ethical question: when and how it is permissible to take life. While I believe that traditional military ethics and the international law of armed conflict have answered such questions well in the past, I saw politicians and media pundits calling for modern war technology, most specifically drones, to be held to a new, more stringent standard than the weapons of previous conflicts. I intuitively agreed with this idea that drones are morally different, and I wanted to find justification for my belief.

RICHARD ANAR: What was the process of doing the research?

R. LANGDON OGBURN: To understand why I believed the traditional standards of just war theory and the law of armed conflict do not adequately apply to drones, I began by examining works by theorists like Michael Walzer and Gabriella Blum, which helped me understand the ethical justification behind the status quo. One of the first pieces I found in my research on modern theory was an argument against the moral difference of drones by theorist Daniel Statman, who, I believe, effectively rebutted early claims for their ethical difference. I then returned to those earlier works to examine their justifications and to see whether Statman had represented them fairly. Finally, I researched statistics about drone strikes and drones themselves, as well as testimonies from those who have witnessed drone strikes and described what it is like to live under the threat of their use. Together, these sources helped me understand the conditions under which drone strikes are conducted and enabled me to form a response to Statman's argument.

RICHARD ANAR: What were your findings?

R. LANGDON OGBURN: My finding was that the ultimate undoing of the early arguments for the ethical difference of drones was that they tried to identify single conditions of drone strikes that suggest this difference. Statman was able to effectively counter each of these arguments by showing that each condition was not exclusive to drones but was shared by traditional war technology as well. Drones, he argued, therefore should not be held to a different standard. In response, I drew from the ideas of the early critics and the firsthand accounts of drone strikes to form a list of three conditions of drone strikes: extreme precision, power disparity, and knowledge disparity. While traditional tools of war may share one or two of these conditions, I argue that none of them share all three. Furthermore, I found that these conditions create an ethical situation that challenges the traditional idea that each side in a war may indiscriminately kill enemy combatants. Rather, the moral situation of drones resembles the traditional exceptions to that principle within just war theory. This entails that drone strikes must be re-imagined if they are to be conducted ethically.

While my research focused on one principle of just war theory and one modern tool of war, I believe it points to a broader idea: the increased use and efficiency of technology in modern warfare requires just war theory to be rethought so that it remains applicable to the conflicts of tomorrow. New conclusions must then be codified into international law to ensure that future wars are fought ethically.

RICHARD ANAR: What do you plan to do in the future regarding this research?

R. LANGDON OGBURN: In the immediate future, I hope to continue researching the ethical implications of new war technology as I finish my undergraduate studies, and to encourage my peers to do the same. After becoming an officer, I hope to be a voice for ethical considerations in the employment of these modern weapons and to help the U.S. become an example to the international community of how to morally use new tools of war.

RICHARD ANAR: What were your impressions of the first online Carnegie Council student conference?

R. LANGDON OGBURN: Participating in this conference was something I had wanted to do, so I was originally concerned that it would be canceled. I was very excited to hear that Carnegie Council would still hold the conference virtually, and it ended up being executed very well. It was incredibly interesting to hear students from all over the world present their thought-provoking research. It was encouraging to meet so many future leaders working on important issues with global impact, such as cybersecurity, conflict resolution, and environmentalism. Despite extreme circumstances, Carnegie Council still enabled us to come together and learn from one another, and I would encourage any student interested in such an event to apply.

RICHARD ANAR: What was your personal experience like presenting at the virtual conference?

R. LANGDON OGBURN: Despite not being able to have an in-person conference, all of the participants had the opportunity to watch each other's presentations throughout the day on Zoom. I was very fortunate to be able to listen to each of my peers' research and learned a great deal from every one of them. As there was a break after each of the three Zoom sessions, I used the time to practice presenting to a different member of my family (to the chagrin of a few of them). As the final session started, I became incredibly excited to present my research. At the beginning of my presentation I felt a little strange talking to my computer, but this awkwardness soon melted away as I got into my ideas and focused on trying to convince those watching of my argument. At the end of my presentation, I got to answer several valuable questions from the panel of judges that pushed me to think critically about my research and its implications. After the presentations, all the presenters returned for the concluding remarks and award ceremony. At that point we were able to congratulate each other on our presentations and spend a few moments together despite being in locations all over the world. This entire experience was incredibly valuable for me, and I am thankful to Carnegie Council for adapting to current events in a way that still enabled the participants to present their research and learn from one another.
