The Future of American Warfighting: Lessons of the Contemporary Battlefield

Feb 27, 2014

What are the ethical and legal questions raised by unmanned aerial vehicles, drones, and surveillance? How do they affect combatants, decision-makers, and civilians? An expert panel explores these crucial issues.


CARL COLBY: Thank you for coming. My name is Carl Colby. I'm one of the hosts for this evening's event, along with David Johnson, who's not here, who is a founder of Act 4 Entertainment. I directed and he produced The Man Nobody Knew, the film I made about my father, which played theatrically a couple of years ago. We're now in active development on a film on unmanned aerial vehicles [UAVs], drones, and surveillance. Some of the same questions that we asked in the earlier film—the moral, ethical, and legal questions surrounding covert action—we would now address to UAVs, drones, and surveillance.

We thought this was a good way to have an event at which you can hear various opinions about these issues, particularly from a panel as distinguished as this one. So thank you for coming.

I'd like to thank Joel Rosenthal, who is here as well. Thank you, Joel, for your hospitality.

Let me introduce the panel.

In the middle is Col. Patrick J. Mahaney Jr., former commander of the Asymmetric Warfare Group, 7th Special Forces Group, Joint Special Operations Command, the Special Warfare Center and School, and the Combined Forces Special Operations Component Command Afghanistan. He is currently a military fellow at the Council on Foreign Relations.

To his right is Ben FitzGerald, senior fellow and director of the technology and national security program at the Center for a New American Security [CNAS]. He is the former managing director of Noetic. He is also Australian.

To my immediate right here is Noah Shachtman. He is the former executive editor of news at Foreign Policy magazine, and he is currently the executive editor of The Daily Beast.

Thank you very much.

Remarks and Discussion

NOAH SHACHTMAN: I just moved over to The Daily Beast, which can be a bit of a tabloidy operation, delightfully so. So if I was going to put a tabloid headline on what I'd like to talk about, it would be something like "How the U.S. Lost the War of 2023."

It would go something like this. Folks like the guys in the back of this room or the guy to my right [Mahaney] are sent over to some delightful corner of the world and asked to take on an irregular army, an irregular foe, kind of like what happened in Afghanistan or Iraq, except this time the irregular foe comes a little bit better-equipped.

They've got drones of their own—maybe some that they bought over the Internet, maybe some they made, maybe some they printed out on a 3D printer. They've got pretty much near-unbreakable communications, because they've got advanced cryptography on their phones, advanced encryption. They've got satellite and overhead imagery from Google Earth and Google Maps. They know each other's locations at any time, so they can perform advanced infantry tactics. And they've been training on Call of Duty 12, so they are able to execute those tactics in a pretty sophisticated fashion.

The U.S. forces, on the other hand—unfortunately, there has been a little slowdown in defense spending—are using the same crap they did in 2012 or 2013.

I don't think you need to be a student of history or a military historian to know that the fights in Afghanistan and Iraq were pretty tough, even when the United States had this incredible technological advantage over its foes. What happens when there is something like near parity in technology? Because, as we all know, the technological drivers in the world today are not the Pentagon anymore—maybe in some very specialized way the NSA [National Security Agency], perhaps—but it's really Google and Apple and the like that are driving technology innovation. When that happens, it's not just a small, select group of troops that benefits; it's everybody, good and bad, American and foreign.

Sort of my canonical example of that is the Xbox One, the new Xbox. Has anybody checked out the new Xbox? Well, not only does it play great games, but it comes with this amazing suite of sensors. Basically, it comes with a camera that works in day or night. It's got an active infrared camera so it can see you in the dark. It also comes with a number of biometric features, ways to identify you by your body parts. It will check your heart rate, it will track your skeletal motion, and it also does face recognition. In other words, any insurgent group in the world can get a really sweet biometric perimeter for like $395.

BEN FITZGERALD: Actually, the Koreans have those on the DMZ [Korean Demilitarized Zone] now, the actual Xbox Ones. It's not official, but someone put one on there and it worked.

NOAH SHACHTMAN: Right. So fast-forward just a couple of years and that's not just going to be a one-off on the DMZ. That kind of technology is going to be everywhere.

I've raised this issue at a bunch of military forums here in the United States. The answer is always like, "Hmmmm." So I'm hoping to get a slightly more involved answer from my colleagues, put you guys on the spot—not just in terms of what technological innovations there are, but also in terms of use of forces and training forces. Because in the end—here's a little giveaway of my own tabloidy headline—training and people and logistics trump technology nine times out of ten.

PATRICK MAHANEY: I'll pick up on that with a very unmilitary comment on what he said. That vision of the future is something that many of us have been looking at, because if you follow the trajectory of where we have been—I'll run you through a few technologies and a few concepts that are out there—you can see, it's not too hard to figure out, that that's where we are going.

I'm going to start my comments with basically focusing from here back: In other words, how did we get to where we are?

If there's one theme that I would hope that people take away from my comments, it's that essentially we have seen in this very human endeavor called warfare—and it is a human endeavor—that technology has helped to empower lower levels of command.

Everybody focuses on the higher levels. You've got the ability for the executive branch to reach out through drones and touch people in third countries, with which we may or may not be at war, particularly when we're not. But really, there is a story behind the story that is really quite remarkable and is, fortunately I think, starting to come out.

Essentially, we are powering down to troops in the combat zone, which is very near and dear to my heart, and what we're seeing is a very dramatic increase in the effectiveness and the survivability of our troops.

To put this in a quick perspective, we've been in this war for 12 years. This is my third war. I was in this one seven times, in my case in Afghanistan.

One of the things that has always struck me, for all the difficulties that Noah pointed out: if you make a quick numerical comparison with Vietnam—the 12 years we've been in Afghanistan, the eight-ish that we spent in Iraq—we lost about 6,700 killed, each one tragic; it's very difficult. But in Vietnam we lost 58,000 dead in about seven and a half years, about 47,000 of them killed in combat.

But to just make a quick comparison, we had two wars going on simultaneously. We can talk about it as one war—fine—but really, it was two distinct theaters of combat.

The enemy was and is a learning-adaptive enemy that has been able to enable their operations through the types of technologies that Noah talked about. And yet, we have always been able to get that one bit ahead.

Now, it has been painful, and I could speak all day about how painful it has been. But focusing on how we got here and what strikes people's attention, I'll cover a few points.

First of all, there's offensive and defensive portions of this. Normally, people want to talk about the offensive strike part. There have been some excellent articles that came out in Foreign Affairs, for example, with General McChrystal about a year ago, coming out and talking about some key enabling technologies. I'll touch on those very briefly.

But there's a defensive part of this, too, that helped enable the survivability efforts.

I have to address—and I think it's worth discussing a little bit later—how did we get there?

A lot of this started with Special Operations forces, without a doubt, particularly the adaptability stuff and the ability to quickly integrate new technologies and get them into the hands of war fighters rapidly. Imperfectly, clearly, but it starts with Special Operations forces, through organizations like the Combating Terrorism Technical Support Office and others reaching out to other key organizations, especially once the counter-IED [improvised explosive device] fight took off.

So JIEDDO, which is the Joint IED Defeat Organization—which former Deputy Secretary of Defense Ash Carter just wrote an article in the last issue of Foreign Affairs on—organizations like that stood up, got empowered as of 2006, and started really moving forward very rapidly. Units like the one I commanded most recently, the Asymmetric Warfare Group, worked to get that into the hands of war fighters—not just on the Special Operations side, but also on the conventional force side, in fact most significantly on the conventional force side. And then there are others, the Rapid Equipping Force and others.

Now, the question for us going forward is: Are we going to be able to take the lessons learned from this adaptability and move forward so that we don't end up like the force you described in 2023, where we're stuck in 2012, 2013, 2014, and yet everybody else is out there?

A very important point also is the private/public partnership there, because most of the enabling technologies were not from the Department of Defense. They were commercial, off-the-shelf technologies or technologies that were designed or redesigned, enhanced, in cooperation with the Department of Defense. Extraordinarily important.

Okay, what are they? The main things I would hit on are:

    • First of all, night operations, night vision goggles, the ability to see at night, was huge.

    • Global positioning system [GPS], the ability to know where you are. I can't overstate the importance of that.

    • Then, obviously, what we call ISR, intelligence, surveillance, and reconnaissance. It's not just drones, but drones are a big thing, and no doubt we are going to talk quite a bit about that.

    • And then, of course, this umbrella of information technology (IT): taking all that information and fusing it into something usable that doesn't overwhelm the commanders, who are human at the end of the day, so that we're able to get a sense of what we're up against and how we're doing. The bad news is that with the stuff we have now, they can get it too, and they are.

I'll skip the drone part a little bit to hit on something that people rarely talk about. There have been key technological innovations in what we call close target reconnaissance, meaning the ability to get very close to a target, usually with a person. The iPhones, the smartphones that we have—just the technology embedded in that camera, recording devices, all that sort of thing—this has sometimes been completely missed, and yet it's really quite usable, and not just by U.S. forces. In the old days, with very sensitive, spy-type technology, we had to be very careful about putting it into the hands of some third-country national who might do something with it. Now it's ubiquitous. So that has been huge.

Then there's what we call site exploitation, which is essentially biometrics and forensics, leading towards what we call identity dominance, ID dominance. We don't dominate anything necessarily, but the ability to know who we're up against—what we used to call, when I was a commander of a Special Operations task force in Afghanistan in 2007, CSI Kandahar.

An IED goes off. It's now a crime scene. It's treated like a crime scene. So did we have to reach out to law enforcement communities? Of course we did, because they're the experts in that.

And the ability to reach out and pull in biometrics and forensics-type things is something that has been completely missed, in my opinion, in the majority of the dialogue.

And then finally, protection, the defensive part of this. One of the main lifesavers out there—and there are many—is the use of electronic warfare at the sub-tactical level, if I could just use that term. The tactical level is usually battalions and companies—we're talking about several hundred soldiers, for example. Here we're talking about the level of a nine-man squad, a 12-man A-team from the Special Forces, a 13-man Marine squad.

Every vehicle has a jammer on it to jam the remote controls of IEDs. That was something that only ships had in the past; maybe a battalion would have electronic warfare capability. But now every vehicle has one. We took something that was only for large-scale operational, maybe strategic-level, units and dropped it right down to where every corporal has to know how to use that system and de-conflict it with his own radio communication signals. This is massive.

And finally, robotics—we can talk about that later in the question-and-answer—the small unmanned ground vehicles: the use of them for counter-IED purposes, but also for urban operations, for reconnaissance, and for subterranean operations, which is something we're seeing a significant rise in.

BEN FITZGERALD: I'm going to talk a little bit about the future.

I think what Pat has just talked through is how, from a U.S. warfighting perspective, when we get our minds around a problem or an opportunity and we get some consensus about what needs to be done, it's quite breathtaking the level to which we can be innovative, that we can apply technology and all of the other human stuff to support that, to actually increase our combat effectiveness.

The challenge that we've got going from today into the future is that we don't have a clear sense of what those problems or opportunities are, and we are being pulled in different directions. So there's a bunch of different technology trends, which I'll talk to a little bit later, that are diffusing and converging in ways that are disruptive for our existing organizations; there's an increasing range of threats; and oh, by the way, we've got declining budgets. This is a recipe for the services, the Department of Defense, the United States government, and Congress to do a whole bunch of nothing and to waste a lot of money doing it. That's challenging.

If we had a clear understanding of who that threat was going to be in 2023, that would be a piece of cake; we could address that problem. But we're not sure what the problem is going to be.

As we look at some of these technologies—given that we're here in collaboration with the Center for the Study of the Drone, I thought I'd give a UAV example, to pull out some of the issues that come there. If we look at UAVs, which are often in the news and we all have some understanding of, let's unpack it a little bit.

The current zeitgeist discussion is talking about killer robots. But it's really not that simple. From my perspective, what makes UAVs contentious is that we are using them to undertake direct action, counterterrorism, against non-uniformed enemies or irregular actors, often not in war zones, with the potential for civilian casualties. That's a pretty contentious policy. It's not actually an issue of the UAV.

Just to test that out, how many of you are familiar with the work that the UAVs did in Libya? [Show of hands]

Not that many. They were used primarily to hit tanks and clearly designated enemy targets. Not contentious at all.

Similarly—and I don't have any security clearances, because if I did, this would probably not be a very smart thing to say—a lot of the strikes that we see in Yemen, which are highly controversial, it seems unlikely to me that all of those are from UAVs. There's probably some manned jets that are involved there.

So we need to be able to parse out what's a policy issue, what's a strategy or an ethical issue, and what's an issue of the technology.

But given that that has occurred, we are having this debate now about killer robots—even though they're not actually killer robots; they're remotely piloted aircraft, which is what the Air Force calls it. That would be a rare and really good thing, having a talk about the technology before it arrives on the battlefield, which almost never happens. It usually gets deployed and then we go, "Actually, I wish we hadn't done that." But we're not having that conversation. We're having a confused conversation about a little bit of the technology, a little bit of current policy, and the sense is we shouldn't do this stuff.

So even if we start debating the right issues and don't get distracted with this counterterrorism stuff, how do we think about all of these problems?

We've got issues of human accountability. Where do you put the human in the loop? Do you put the human in the loop on the trigger? On developing the target? Is the human represented in the code? How does that work out? How do we adjudicate how we think all of those things through?

As we're thinking about that, there is significant pushback against UAVs and autonomous systems from within the Department of Defense. Just for people who don't hang around with the Department of Defense, you should never view it as a single, monolithic organization. It's usually in competition with itself. So when you see sclerotic or schizophrenic behavior, it's because it's not one entity. Otherwise it would be totally different.

So if you look at current budget cuts, you'll see that autonomous programs and UAVs are taking a disproportionate hit relative to other programs. It's not just that a lot of people in the broader community don't want to see them come forward; that pushback is also happening inside the Department of Defense.

My prediction here is that the thing that will tip the balance in favor of autonomy will be an operational need. We'll find that the United States Air Force won't be able to establish air superiority somewhere, or other people will start using autonomous systems in large numbers in anger, or, similar to UAVs today, we won't have any other tools in the toolbox and that's what we'll need to use. That's how it will move forward. The organizations themselves will just keep this as something of a hobby horse.

Even as we think about that, assuming that we figure all of those issues out, within the Department of Defense, what's the future for autonomous systems? Is it big systems, like the X-47B, or the future of those, where they're talking about F-14 Tomcat-size autonomous systems that can fight just like a fighter jet but without a person in it? Or are we talking about swarms of micro-UAVs, thousands of small things? That hasn't been decided. So we don't have the concepts down, we don't have the technologies down, even aside from whether we want to do this at all or not.

At the same time, private sector and other nations' investments are going to essentially make this a done deal. This thing is going to happen at some point in the future. It's just going to be a question of when.

That's really because autonomous systems represent an opportunity for our adversaries or our competitors to make our current Air Force advantage irrelevant or incredibly, incredibly expensive. This is something I have written about at the Center for a New American Security.

When you've got the F-22, which is absolutely the most capable air-to-air combat fighter in the world today, it costs $187 million. Can you really afford to lose one of those? If an adversary can put up a swarm of 200 dumb UAVs that cost $1,000 each, what's going to happen? So there's a cost asymmetry issue there.

And so, despite the fact that all of that is happening, as other actors are bringing these technologies on board, they're still very immature and we don't know how to think about them.

I'm not sure if any of you guys saw a few months ago that the Chinese flew a UAV directly towards the Senkaku Islands. China has been investing in UAVs. It's cool. They want to seem like us in some ways. So, I mean, hey, everyone's doing it. But they haven't really thought through the command and control of that or the international norms.

So they wanted to send a message. They sent a UAV directly towards the Senkaku Islands. It did not enter Japanese air space. The Japanese scrambled two manned F-15s to intercept it, escorted it out. No shots were fired. That was fine.

The Japanese immediately said, "As a matter of policy, we will shoot down any UAV that enters our air space."

The Chinese responded and said, "If anyone shoots down any of our UAVs, that will be an act of war, the same as shooting down a manned aircraft."

We probably shouldn't be figuring this stuff out on the precipice of a conflict. But that's how it's going to happen.

So for me there are all of these sets of issues. Everything that I've just talked about now is just about unmanned systems in the air.

If we look at other things that are coming about, we've got robotics—autonomy on the ground, undersea, under the ground; we've got cyber security; we've got information writ large—big data, mapping, intelligence, signals intelligence; we've got directed energy weapons systems—lasers, if we can sort out the power issues associated with them; we've got digital manufacturing—3D printers and a bunch of other stuff that comes with that; metamaterials, nanomaterials—the actual stuff that we make stuff with; and beyond that we've got human performance modification and synthetic biology. That's just the technology bit. Then we've got diverging geostrategic interests.

Basically I'm saying this is complicated. I'm not sure if I'm allowed to say that. I said it. So this stuff's complicated, and we're at a point where we can't figure out which of those futures is going to arrive. In some ways, we're describing parallel universes. We don't know which one is going to get instantiated.

From the United States government's perspective, if we could figure out what the one threat was, that's not a problem. The issue, as the sole superpower that we are right now, is how we invest in mitigating potentially high-risk threats while understanding that, for every near-peer competitor we worry about, we'll probably end up deploying people like Pat in the future before we send out a swarm of unmanned, autonomous systems from space-based assets with lasers and all sorts of other things. Figuring that out is significantly difficult.

The key thing for me, as someone who works on technology, is that what you need to look at is everything around the technology itself. This is something that I think Noah and Pat have both hinted at already. It's one thing to look at how fast it is, how big it is, how cool it is. But all of the human bits are the most consequential. My advice for the Carnegie Council: continue to be focused on ethics. I think that is the challenge for our time on this issue.

Ultimately, that's the thing that we neglect. Even though it's significantly less expensive to invest in ideas—to look at the values, the concepts of operation, and all that sort of stuff—than to spend billions of dollars on science and technology, research and development, or capability development, we don't do it.

Usually, I describe this as a Michael Bay problem. We always invest in special effects technology to make our movie better instead of in the script. So we need a better script. For me that's the key issue. We are going to be spending a lot of money. We better get the script right. We haven't necessarily done that well in the past.

NOAH SHACHTMAN: Pat, I want to talk to you first. Obviously, the work your fellow members of Special Operations forces have undertaken in the last three or four years has been in the headlines a lot, weirdly for a supposedly secretive bunch. So I wonder, are those kinds of raids in the middle of the night, on Captain Phillips' ship, Osama bin Laden's hideout—do you see that as kind of the future of warfare, or at least for the next few years? It sure seems like whenever there's a problem in the world now the answer is to send Special Operations forces.

PATRICK MAHANEY: Right. Special Operations have become very, very capable. Normally, the public things you hear about are from the SEALs, by the way. I'm an Army Special Forces guy, not a SEAL.

We usually hold onto the mantra of being quiet professionals. One of the reasons for that is there's a lot of things that we do quietly that are not just kicking in doors quietly, that sort of thing; it's working by, with, and through host-nation personnel in a variety of ways.

So we're not going to get away from the baseline Special Operations mission, special warfare, which is to operate through local people. That's why we have to speak foreign languages, we have to know cultures, we have to do that.

NOAH SHACHTMAN: What does "operating through local people" mean?

PATRICK MAHANEY: The best example lately, which I'm certainly very proud of, as is our community, is the Afghan local police program. If you take a look at Afghanistan—and the war is in its 12th year—what was the local security mechanism that could provide some measure of security that would mean something to the Afghan people? What was that?

There were some efforts here and there. There was always a fear, of course, that these efforts would turn into militias, that they would fall under the sway of warlords, etc., etc.—legitimate concerns.

But at the end of the day, if you're in a remote village or in a small town pretty much in the middle-of-nowhere Afghanistan, who's going to defend you? Your own people, obviously. Well, how do you stand up a security force like that in a way that would make sense?

The way this was done is the Afghan local police program. We picked the most kinetic (meaning violent) areas from 2006–2007 through 2008–2009, and worked through the politics of this—that's absolutely critical.

Then, by 2010—it was specifically beginning August 16, 2010, was when we were authorized to do this—we created this program with the buy-in of the Afghan government, which essentially said that we could create these forces that were local police.

It's not a radical concept. That's how the West was won here in America. That's how we built our country. We've got the NYPD here and we've got state police. The separation of powers, what did that address? Center and periphery balance.

To wrap that part up, you've got the national army and national police force. The national police were ineffective, largely speaking. The army, not bad actually, and they got better and better, particularly their commandos.

But who's going to be out there protecting people in their homes and villages? Who's going to stand up to the Taliban?

So we literally went to areas that were Taliban in some cases—whatever that means; that's a separate discussion—and they bought into this program. They had buy-in for the whole thing. So the by, with, and through means something like that.

These models have worked in Colombia, Peru, the Philippines, many other parts of the world.

NOAH SHACHTMAN: So it's Wyatt Earp.

PATRICK MAHANEY: It is enabling locals to take care of themselves. But they must have somebody who is brave enough to stand up and be a leader. If it's Wyatt Earp and he's the guy who's going to do that, so be it. And, very importantly, he has to tie together the political elements there—to have a shura, the council, which is usually what we have, and get agreement from each group. This is not tribal. Everybody says this is a tribal solution. It's not; it's a local solution. Many parts of Afghanistan are mixed—ethnically, religiously, and in many other ways. But they have coexisted for hundreds of years in many cases.

So how do you get that political piece done? You get that first, and then you get that other individual to set it up. Again, we're back to talking about the human piece—that's where the conflict is.

Can you empower that element by having an embedded Special Operations team that has access to the drones and to the precision strike capability and to the Medevac, the medical evacuation, capability? Of course. So you fuse the two together.

Again, this is higher-level, doctorate-level work, but that is ultimately what worked.

NOAH SHACHTMAN: Just to follow up—allegedly U.S. forces are pulling out of Afghanistan this year. Do you see that kind of work continuing in Afghanistan beyond that, or is it all going to turn to Ben's drones?

PATRICK MAHANEY: I think that Afghanistan, like many other parts of the developing world, will continue to have significant security problems, without a doubt.

I am a little more sanguine about the future of Afghanistan—I spent a lot of time there—because ultimately there are people who are willing to stand up for their country. The Taliban has about a 10 percent approval rating in these polls that are taken.

NOAH SHACHTMAN: Better than Congress. [Laughter]

PATRICK MAHANEY: And so I think that the question is going to be how the central government balances it out. If anything is going to work, it's not going to be a Kabul-driven solution forcing whatever the Kabul solutions are down on the local people. It's going to be a balance between the two. That balance is traditionally an Afghan balance, by the way.

The question is: Why did it take until 2009–2010 to get this, when many of us knew that this is what worked? That's a whole other story, which I would not speak about on-camera anyway.

BEN FITZGERALD: There's an interesting thing linking that up when thinking about future warfare, which is: Is future combat going to be more about people, or is it going to be more about states? That breaks out further. When you get into big state-on-state conflicts, we start seeing a lot of air power and sea power coming in, and for those big technologies and platforms, people equip the weapons; whereas in the type of conflict that Pat is expert in, technology supports the human. These are very different ways of thinking about things. There's still technology in both, and we are going to have to figure out how we support both, because we are going to have to prepare for both of those contingencies.

What is likely to occur is we make all these big-dollar investments for the state-on-state stuff and then we deploy guys in the middle to work below that higher level of conflict.

NOAH SHACHTMAN: In some ways that makes sense, because you want to invest in that higher-level stuff to deter that really bad World War III thing from ever happening.

BEN FITZGERALD: Exactly right. So if we don't figure out all those big defenses, we may have a defense-of-the-homeland problem, which this country has not had to think about for a long time, and I don't think will have to think about for a long time either. But then when you go expeditionary, how do you maintain success?

PATRICK MAHANEY: If I could briefly throw in there, the two are not mutually exclusive either. When you start with the Predator, the drone that everybody knows about: on the one hand, it went up bigger, meaning larger Predators that actually dropped bombs, big bombs, with tremendous loiter time; the Global Hawk, which goes way up in the atmosphere, is a great example of that scaling up. Okay, so it went high end.

But on the lower end, in the hands of soldiers at the corporal level, at the lower tactical level, we have hand-thrown drones that are meant for visual surveillance, for example, reconnaissance.

There is a point here where this is being weaponized as well, because of the trend towards miniaturization—we're not talking nano stuff here; we're talking stuff that you could touch and feel and use. With these miniature drones in particular—it's on the Internet; it's called the LMAMS, the Lethal Miniature Aerial Munition System—the drone itself is a weapon, which sounds scary.

If you think that it's controlled by itself or some robot or there's a computer program, it's not. The purpose of this thing is essentially, to put it in soldier terms, it's a replacement for a mortar tube. A mortar tube, if you're aware of it, you drop a dumb bomb down a tube and it shoots out and it hits what it's going to hit. Once it's shot, that's it.

Well, these LMAMS, for example, are designed to be waved off at the last minute, so if a little child comes running out, you just hit the kill switch, meaning you shut down the system.

NOAH SHACHTMAN: It's a good kill switch.

PATRICK MAHANEY: It's a good kill switch, not the bad one. You shut down the system so it's inert. You could wave it off. You fly it around. And, very importantly, it's in the hands of the people who most need it. So do they have to call in a plane with a 500-pound bomb? No. Basically, this thing is a flying shotgun shell. Very, very precise in what it's going to be able to do.

My point in this is as we started with this thing called Predator, sure we went up, and there's a state-on-state piece of that. But when you talk at the ultra-tactical level in the middle of nowhere, or in a city, where you have to be very precise because of concerns about civilian casualties and everything else associated with it, you've got that too. So the technological piece enhances both.

BEN FITZGERALD: Yes, I totally agree. And thinking about the ethics of this gets very interesting as well, because it does sound scary, it's like you have a micro-robot that can blow people up. It's like a kamikaze. That sounds terrible. At the same time, it's far more precise. There are all of these benefits. So weighing these things out is very challenging to do, especially in a charged environment. So we need to find better ways of having those conversations, because often it doesn't move forward.

PATRICK MAHANEY: Think of the conversation we had here two weeks ago in this very room on rules of engagement—excellent discussion, by the way.

Discussions of this type began with land mines, the dumbest of dumb things, which are planted in the middle of nowhere and blow children's legs off, or cows' legs off, and maybe once in a while your enemy's. But the people who truly suffer from them, overwhelmingly, throughout the world, are civilians.

So we went from a completely dumb system that's extraordinarily dangerous—and, by the way, mines move in the ground, if you didn't know that. You lay out a minefield, then the rains come and the mud shifts, and you might not have the same field anymore.

We're moving to the polar opposite side of this to extraordinary precision with humans in the loop. It's just down to the lowest level, where the person who's actually involved in the fight can be involved in this. I think that's really worth noting, and I don't hear much talk about that.

NOAH SHACHTMAN: Maybe just because I'm a Marvel Comics fan, but I'd like to talk about your parallel universes. Take a guess—which one is actually going to happen first?

BEN FITZGERALD: What I think will happen is we're going to spend a lot of money on big systems and then we're not going to use them. We're going to do more fighting in cities, and it's going to get ugly and messy. Fighting in cities and doing stuff in and around people does not necessarily mean counterinsurgency. You can have straight-up kinetic, conventional, ugly fights in cities.

And it's going to be really, really complicated, because the technology there isn't just technology that we bring; it's going to be a very sophisticated environment. So it's not like fighting in the hills in Afghanistan, although that's complex in its own right. Even if you look at Libya or Syria, you've got guys out there—whether they are freedom fighters or insurgents or terrorists, who knows?—who are building all sorts of very interesting weapons, including their own armored vehicles improvised on vehicle chassis. No windows. They have cameras on the outside and an Xbox controller on the inside. This would not deal well with a main battle tank, like an Abrams. But these guys are driving around, under armor, protected, and they can put weapons on top. That's a very, very different proposition to Afghanistan. And they're doing that with nothing.

So if you go into a more advanced place, that's just a really, really ugly place to fight. I think that that's the most logical thing to see.


QUESTION: At the risk of simplification, if we're really talking about the future of American warfighting on the top level, isn't cyber warfare truly what we should be focusing and investing in, both offensively and defensively? Are boots on the ground and drones, with boots on the ground armed to the teeth with high-tech weapons, almost passé, except maybe in failed states?

BEN FITZGERALD: We are absolutely investing in all of the cyber stuff right now. The question is: What's the life of that going to look like?

I recently ran a poll asking people about future technology investments. Everyone said that the most significant investment should be in unmanned systems and cyber security. If you look at the current budget lines, only cyber is getting that attention.

My personal view is I think that the cyber thing is a little overblown at the minute. Not to say it's not critically important, but I think that we are going to see that there is going to be a lot of focus on that over the next five to ten years, and then other things are going to start coming online.

My strategic and operational answer is that there is a limit to how much pressure you can apply via cyber means. At a certain point, you've heard all of us say that warfare is a human endeavor. People are going to need to talk to people directly. It can be hard to message in cyberspace. In the short term, we are not necessarily sure what the messaging looks like, and you can't necessarily apply the pressure that you need.

One quick other thing on that. I was called by a reporter just as we were looking to potentially go into Syria. Someone asked me, "Why are we not using cyber tools instead of using Tomahawks?"

I was like: a) even if we could launch a cyber strike in there, it's not necessarily going to change anyone's behavior. It might switch off some critical infrastructure, but it's not going to stop people going out and shooting other people.

Also, we wouldn't necessarily know what message we had sent; the Syrians may not know, and the international community may not. They could have said, "Well, the Americans didn't care enough to send a real weapon. No one died." The Syrians could say, "Well, actually it was just a power outage." So it's really unclear what would occur. Whereas, based on 20 or 30 years of precedent, everyone knows what a Tomahawk means. That's not to say that cyber security is not important, but that's not going to be the only type of conflict that I think will be important.

NOAH SHACHTMAN: It also might not be an either/or question.


NOAH SHACHTMAN: First of all, just on cyber strikes or whatever by itself, just remember that squirrels have taken out more power stations than any malware has.

Secondly, you can imagine those two things happening at once, right? You can imagine trying to hack somebody's radio at the same time that you are also trying to whack them over the head with LMAMs. So those things could happen in concert, and trust me that people are training to do those things in concert.

I think we're at an age when we're kind of immature about our tools. So we're staring at our phones—I'll speak for myself; I am staring at my phone all the time. But we are going to learn to be more sophisticated about our tools. I think it's going to feel a little less central than it does at the moment. So I think it will be an element but not the element.

PATRICK MAHANEY: I agree with the previous comments, definitely.

I will address what we are doing about it—how we even conceive of what you correctly pointed out is a massive challenge for us. The use of lethal force ultimately exists in the real physical world, and warfare is ultimately what the military specializes in; but there is such a thing as cyber warfare, and it has taken on new forms. So how do we conceive of this?

There is, I think, a very healthy debate going on right now within the ground forces—Army, Marine Corps, and SOCOM [United States Special Operations Command]—which is pushing, particularly on the SOCOM and Army sides, for what's called the seventh warfighting function. Without geeking you out on what that means, there are six—

BEN FITZGERALD: We can't geek out?

PATRICK MAHANEY: No, sorry—six warfighting functions of mission command, fires, protection, support, sustainment, that sort of thing.

But the seventh warfighting function is all the stuff we can't quite understand and we always get wrong. We don't even know what to name it yet. In fact, most guys, like me, just call it "the seventh warfighting function."

Here's the names that have been kicked around:

It started with Special Operations. That quickly turned into—correctly, I think—Special Operations–Conventional Force Interoperability, meaning how we all work together. Which quickly turned into Human Domain/Human Dimension: we went into Iraq, took down the country in 2003, and then suddenly we had a big mess on our hands. Did anybody figure that all those people—the "human terrain" is the term that's used—the people who are there, might not be jiggy with it, might have a problem with what happened, particularly given the fractures within that society? So the idea is that we have to look at the human element right there. And then, added to that now, if that wasn't complicated enough, is the idea of cyber.

The unifying theme through all of this appears to be influence. I don't know how it's going to turn out, but I am loving this debate, because it should have happened, in my opinion, about 20–25 years ago, this particular piece.

So how do we influence? Influence can be compellence; it could be convincing. But ultimately what are all the elements that go into influencing people? That's where we are with those various elements.

QUESTION: I may be ill-informed, but I don't think our doctrine has changed lately, and we are still postured for two theaters of war. I guess the question is—it's the parallel-universe thing—how do you define a theater, and should there not be sub-theaters in terms of symmetrical versus asymmetrical war?

Then, the second part is: going to 2023 and the war then, aren't we kind of lagging behind in terms of just the practical aspect of funding this through the budgeting process, where we have this great inertia and problems with things like the F-35 and that sort of issue?

PATRICK MAHANEY: I'll start off by saying if the sequestration cuts kick in, together with other cuts, personnel and otherwise, that are going in, we are not going to be able to handle two conflicts.

The term "conflict" is too broad. The term that has been used—it's really not doctrine; it's national policy—is two major theater wars (MTWs is what they are called) and then there would be some sort of low-intensity conflict thrown in there. That was a big thing in the late 1980s and 1990s.

There is no way, with the cuts coming in, that we are going to be able to handle two major theater wars, which are defined as really state-on-state. With the rise of these non-state actors, criminal groups, elements like the Taliban, lone wolf super-empowered individuals who have access to all the technology that Noah and Ben spoke about—we're going to have our hands full.

In terms of how you define a theater of war, we tend to do it right now geographically, and within that geographical area there are a number of concerns, usually in the physical world.

When it comes to cyber and other key support elements, that concept is very broadly expanded already. For example, Afghanistan is an easy one to talk about. The cyber stuff is not being done in Afghanistan. The intelligence support is not there; it's done from outside. Heck, they're flying the drones from Nevada! So the theater of war is greatly expanded with that concept.

Likewise with allies. EUCOM, European Command, had a big voice, because of NATO, in what was going on in Afghanistan and before that for a while in Iraq. So was your theater just strictly the geographical boundaries of Afghanistan, or Central Command more broadly, the Middle East? The answer is no.

But did our structures allow for that sort of very dynamic and highly complex, meaning interconnected, world? The answer is absolutely not.

The doctrine now is coming out that I think starts to capture that. But we're not there yet. But there are some seriously good people working on that right now.

BEN FITZGERALD: I think, just quickly, the point about theater is a really good one. I completely agree with Pat's policy and doctrinal statements.

I was doing some war games a couple of years ago where we were looking at the future of combat in littoral cities, cities by the sea. We were looking at the Tamil Tigers as part of that, figuring out how they did business. They were a very impressive group.

What was their battle space? They lived and worked on an island. They sourced air power from Europe; they actually bought Cessnas in pieces and built their own planes. They built submarines there as well. But they had their propaganda arm in London. They did most of their fund-raising in the UK. So how do you go about fighting one of those? What's their center of gravity? It's not complicated in terms of doctrine; it's complicated in terms of just intellectually how to sort it out.

QUESTION: Thank you for a very compelling presentation. My name is Eddie Mandhry. I'm a Carnegie New Leader here.

My first question is to you, Ben, around the Geneva Conventions and their relevance. There is an increasing conversation around how non-state actors now have access to technology that they can deploy to the United States. What happens when countries that are against the United States deploy this technology? Do we need an additional protocol to the Geneva Conventions?

And then, secondly, on Max Weber's thesis that states have the monopoly of violence. We have a situation where now non-state actors can deploy technology that is off-the-shelf, as you said, Patrick. What are the implications for how we think about warfare now? Can we envision a situation where non-state actors in one country attack non-state actors in a different country? Do we need to think differently about how warfare is conducted?

BEN FITZGERALD: Yes, we need to think differently about how warfare is conducted. I don't have a good answer on what to do. I don't think there's an easy fix. We can't just add an additional subsection in the Geneva Conventions.

This is where people start talking about the Peace of Westphalia and its current relevance. That's a big issue to get into in my 60-second answer.

I don't think that there is an easy fix. More than that, we don't even have an easy handle on the problem itself. So what is actually required—and again, this comes back to some of my prepared remarks—the most useful work that people can do, especially people in think tanks and academia, is actually figuring out all of the values, the policies, the ethics, and the legal stuff that comes around it, because there is no simple answer that comes out of there. It becomes a case-by-case determination. Even if you look at Afghanistan: we had a non-state actor operating within a state that perpetrated an attack against the United States of America, and therefore we invaded Afghanistan.

Actually, it reminds me of—did you guys see The Onion headline that came out maybe a month after 9/11? They put together this thing that said, "United States Commits $20 Billion to Building a Government in Afghanistan to Then Destroy." It was all prescient.

That's the long way of saying there isn't an easy answer. We need to look at all those protocols. We need to look at how all that sort of stuff happens.

The democratization of technology is a very good thing in many ways, but when it comes to super-empowered individuals, all of our legal constructs are not set up to address that.

PATRICK MAHANEY: The key issue is legitimacy. All of our activities have got to be done in a way that is perceived as legitimate—legitimacy is about perceptions. One of the surest ways for us to maintain the proper perception of legitimacy is to maintain good faith and trust with such things as the Geneva Conventions, clearly. However, the real world is pulling us in exactly the opposite direction, and it gets pretty darn messy.

Issues such as lawfare—warfare and law, together lawfare—that is something we face routinely.

False allegations of civilian casualties, for example—one just came out recently—are common in Afghanistan, absolutely common, for a number of reasons, and that's a separate discussion. When real civilian casualties occur, we have to address it immediately and take responsibility for it. I am absolutely in favor of that. We must do that because of legitimacy.

But relooking at the laws under which we conduct our activities I think is appropriate. However, ultimately, just speaking as an American officer, we know that the American people are not going to allow us to descend into savagery, even if we're against a savage foe.

If two non-state actors, as you brought out, are fighting each other—like some of these drug gangs in Mexico, or many other scenarios I could come up with, such as Sendero Luminoso and the MRTA [Túpac Amaru Revolutionary Movement] in Peru, who were fighting each other, which was great for the Peruvian government, because they were about to take down the Peruvian government at the time—that is what it is. But they are not going to follow the rules anyway.

But it's critical that we don't descend into pure brutality. We can't. So that has to be governed by codes. We think our codes are pretty good, but there are elements that I do think need to at least be reviewed, even if we come up with the same outcome.

BEN FITZGERALD: Yes. And I think, in terms of what is the American advantage in future warfare if everyone has access to advanced technologies, I think that the values piece actually counts. We should do it because it's the right thing to do.

As we've said, I'm Australian. I have chosen to live in America. I like living in America. A lot of that is about ideals. America is an idea as much as it is a geographical entity.

Cleaving to that I think will set us in good stead. So it's the right thing to do, but I think also strategically it's the smart thing to do.

QUESTION: I'm Amelia Wolf from the Council on Foreign Relations.

Historically, the evolution of warfighting has involved a simultaneous evolution of counter-tactics and counter-technologies. I was wondering if you could just shed some light—it's not something people talk about too much—on what you think the future of counter-tactics looks like for drones, or possibly for joint operations?

PATRICK MAHANEY: On the future piece—first of all, warfare is an art. Start with the premise that it's not a science; it's an art. Everything that at one moment is offensive can quickly become defensive; it's a yin/yang balance, without again geeking people out on that. But that's just the nature of what it is.

Whenever you have a strength, such as the United States has in, say, air power, somebody is going to come up with some way to defeat it. If not, they get killed, they get wiped out, they won't survive. So anybody who is going to stand up against us or any other developed power is always going to develop an asymmetric approach, which simply means figuring out how to not get zapped by the other guy's strengths and then how to strike at their weaknesses. Sometimes those weaknesses are political, they're not military per se. And sometimes the two are fused, such as inflicting casualties that play in the press, that sort of thing.

But in the case of drones, what are the weak points of drones? Well, for one thing, they don't fly by wire; they are controlled by radio signal, ultimately, and sometimes the signals are bounced off satellites and sometimes they are handheld radios that people can make in their garage. So you don't necessarily want to defeat the drone; you want to defeat the system.

It could be figuring out how to interrupt the electronic warfare piece that I spoke about earlier, which interrupted the remote-controlled IED piece. That's a pretty simple concept, but it is very complicated. So you can always come up with that.

The swarming technique is one of the most effective counters I've seen: against these great, expensive drones of ours, you send bunches of cheap drones—less expensive drones that dive in and hit, that sort of thing.

I can't give you an answer for each and every scenario, but you figure out how to defeat the system itself. And it is not always a kinetic solution, meaning sending a bullet through it or blowing it up. It's often disrupting a link. That may go into cyber, as the gentleman had brought up, if there is a way to disrupt it through cyber means.

NOAH SHACHTMAN: Just riffing off of that for a second, I think the really quick answer is our military has become hyper-dependent on GPS. That's basic. Although drones are controlled by radio or by satellite, a lot of it is setting "go to this GPS coordinate," "go to that GPS coordinate," "go to the next one."

Similarly, a lot of communications, what have you, are all tied into GPS, how we target things.

The small, teeny, tiny problem is you can disrupt a GPS signal for like—depending on how big the effect you want—50 bucks, 100 bucks, 1,000 bucks. It's not super-hard.

DARPA [Defense Advanced Research Projects Agency], which is the super-forward-thinking arm of the military's R&D [research and development], is trying to come up with ways to do all those things without GPS. They're having a super-hard time. The answer is you can't do one thing; you've got to do a lot of things, you've got to try a lot of different tactics, and you've got to try a lot of different technologies at once. So if I was Dr. Evil, what I would invest in is I'd go long on GPS jamming.

BEN FITZGERALD: Kinetic knockdown of satellites is the other one. That's an inexpensive way to take out a lot of GPS stuff.

NOAH SHACHTMAN: Sure, start a little orbital debris, have a party.

BEN FITZGERALD: That's right. Which is why if the Chinese want to invest in carriers and sixth-generation aircraft, keep going, boys. That is hard. It's expensive. When they start kinetic knockdown of satellites, that makes me feel very uncomfortable—not that I think that we're about to go to war with China. I think there are lots of reasons for that not to occur.

So I think it's a great question. Warfare is always about adaptation. There was a great quote—you probably know who said this—that there are two types of warfare: asymmetric warfare and dumb warfare. So you'll either be successful by doing things that the enemy doesn't want you to do, or you'll lose.

Right now at the national level, we are still structured to adapt, I would argue, on Cold War cycles. So we build things like major platforms over 20 or 30 years. What we've seen in Iraq and Afghanistan is that we've got smart enemies who will adapt in months, and we can spend a lot of money, and they're always inside our OODA loop—our Observe, Orient, Decide, and Act. So they're inside our huddle in football terms. We are going to need to figure out how to do that.

I would say that our military forces over the last 12 years, I think absolutely with the Army on the leading edge of this, and the Marine Corps, have shown that they are highly adaptive organizations.

Our bureaucracies have largely not changed over the last 12 years. That is going to have to happen because this continuing rapid adaptation is going to be the signature of any future conflict.

The next threat is not going to be an IED, but it is going to be based on adaptive capability development. Again, when I was doing some war games with Dave Kilcullen looking at urban combat, we said that the enemy will have an adaptive capability-development capability, and that will be critically important.

So in the same way that we saw people spinning out IEDs, doing different things so that we couldn't jam, we couldn't roll, we couldn't do whatever it was, in the future people are going to have drones that have been doing that, or weapons systems we haven't even thought of. So we're going to have to figure out how to adapt with that, and it's going to be a problem.

CARL COLBY: I have a question to anyone. I went to a lecture a month or so ago, and it was by a woman who is the head of the National Geospatial-Intelligence Agency. Someone mentioned that there is now mapping that is so accurate that a 25-square-kilometer area can be looked upon with extreme accuracy with the topography and with 3D modeling.

Then she just happened to say, "Well, that's just the beginning." She said, "It's when you triangulate and use all the cell phone communications and everything."

It made me think—you mentioned the Tamil Tigers, or even the Taliban in a certain area, but take something like the Tamil Tigers, or the groups in Peru. You'd almost think that, with the surveillance and listening capabilities we have, we could investigate them like a criminal gang—you can nearly identify every member of a particular criminal gang, couldn't you? Is that where we're going? We're not even talking about taking action yet, but from a standoff position almost getting to know the entire criminal entity, the composition of the entity itself. Is that too far-fetched?

PATRICK MAHANEY: I think we're there. We have the concept. Back in the old days, before the Internet and computers and the rest, you would layer overlays—acetates, they were called. You would literally draw: here's their communications node, here's the way they move, here's people who are like us, people like them. It was literally layers and layers of information. Now it's all digitized, of course.

You've got the big debate—I probably shouldn't even mention it—between Palantir, the intel system from the commercial side, and a system called DCGS-A [Distributed Common Ground System-Army]. It's a long acronym.

BEN FITZGERALD: And it's not getting very much funding anymore.

PATRICK MAHANEY: I won't even touch that one.

But the point is, both systems—I won't get into which is better—seek to layer that information. Once you layer it, the analysts can take a good look at things and say, "Okay, what are we really looking at? What are the depths?" The term we used to use, which is still used, is call chaining: who speaks to whom, and how—not just literally with a telephone, like the old FBI wiretaps from the Mafia days, but who's communicating with whom across the board.

Does this start to sound something like the NSA controversy right now? Yes. But we are talking about in the conflict zone, so I'm not going to touch the policy issues with the rest of it. But in terms of capability, that capability exists now to analyze the data and then do what you will with it.

Sometimes it's just to understand who's with whom. So you don't have to maybe put a bullet in somebody; you could seek to influence this person in some nonlethal, nonkinetic way because you want to get this person over here. And if this other person that deals with them is a really bad actor, then you kill him. But that understanding, that is where we are right now. It's a very, very complex piece.
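The "call chaining" analysis described above—starting from one known person and mapping out who communicates with whom—is, at its core, a graph traversal over contact records. This is a purely illustrative Python toy under that assumption; the records, names, and hop limit are all made up, and the real systems being debated are vastly more sophisticated:

```python
from collections import defaultdict, deque

# Toy "call chaining": given contact records (caller, callee), find
# everyone reachable from a starting person within a hop limit.
# All names and records here are hypothetical.

def build_graph(records):
    """Turn (caller, callee) pairs into an undirected contact graph."""
    graph = defaultdict(set)
    for caller, callee in records:
        graph[caller].add(callee)
        graph[callee].add(caller)  # treat any contact as a two-way link
    return graph

def chain(graph, start, max_hops):
    """Breadth-first search out to max_hops contacts from start.
    Returns {person: hop distance from start}."""
    seen = {start: 0}
    queue = deque([start])
    while queue:
        person = queue.popleft()
        if seen[person] == max_hops:
            continue  # don't expand beyond the hop limit
        for contact in graph[person]:
            if contact not in seen:
                seen[contact] = seen[person] + 1
                queue.append(contact)
    return seen

records = [("A", "B"), ("B", "C"), ("C", "D"), ("E", "F")]
g = build_graph(records)
print(chain(g, "A", 2))  # A's network out to two hops
```

With the toy records above, a two-hop chain from "A" picks up "B" and "C" but not "D" (three hops away) or the disconnected "E"/"F" pair—which is the analyst's basic question of who sits inside a given circle of contacts.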

BEN FITZGERALD: There are lots of things we can do. I completely agree with Pat. I think probably implicit in Pat's comments there is we need to be careful with that stuff. It's a very powerful capability and it gives us all sorts of operational and tactical advantages. But it doesn't guarantee strategic victory, and the other thing that it doesn't do is it does not provide this omniscient understanding of the battle space.

Some very unfortunate concepts came out in the 1980s and 1990s that manifested themselves in things like the Future Combat Systems program, where we said that we would have information dominance over the battle space; we'd have complete situational awareness; fog and friction would no longer exist because we'd have all this analysis. So we would be able to stand off and launch precision munitions in.

That's not going to be the future. Even though we are going to have more ISR [intelligence, surveillance, and reconnaissance] than we've ever had in the past, warfare is going to be more lethal, more confusing, more complex, and we need to make these investments just to try to manage that complexity.

NOAH SHACHTMAN: You heard Tish Long speak. First of all, the NGA, the National Geospatial-Intelligence Agency, is having a lot of trouble just getting the mapping part right.

Secondly, I think that people can counter all that intelligence gathering. So I think for the first time tonight I'll disagree with Pat just a little bit. I think, no matter what the intelligence-collection capabilities are, you are still going to need old-school snitches if you really want to find out what's going on.

PATRICK MAHANEY: The human intelligence piece. And also the ability to spoof. Just like you can spoof by lying on the human side, you can spoof with false traffic—the old tricks from World War II, when you make false radio transmissions that are picked up and you believe them. You can spoof anything.

Again, I'm not saying there's not a human in the loop piece here. But the ability to layer metadata and analyze it and get something out of it may not be the right answer. But you can do quite a bit with it. That capability exists.

NOAH SHACHTMAN: Look, with all this NSA metadata stuff, they now say that the grand total of active terrorist plots that they stopped with that is one, or maybe two, and that's with basically having access to the call records of everybody.

PATRICK MAHANEY: On the homeland side.

NOAH SHACHTMAN: Yes, on the homeland side.

PATRICK MAHANEY: On the combat side it's a different answer.

NOAH SHACHTMAN: Right. But I'm just saying it's still a lot of information to get a relatively small bang out of. That's just a long way of saying that I think the human side is always going to be really important.

QUESTION: It seems like through the 20th century and up to now, one of the constraints on going to war has been the cost of war, in terms of money, lives, reputation, etc. With all these new systems making war cheaper and more precise, what's going to replace cost as a constraint? What's going to stop us from wanting to project power constantly in this lethal way?

BEN FITZGERALD: It's a great question. There are lots of dynamics there.

There is a guy by the name of Tom Mahnken, whose work I recommend to you. He makes an argument that over the last, let's say, 20 or 30 years, from a U.S. warfighting perspective, we have been in an era of inexpensive power projection. He means that in strategic terms, not necessarily in dollar terms.

His argument is that with proliferating technologies, especially A2AD (anti-access/area-denial) technologies, that's raising the cost of power projection significantly. So if everyone has access to technology, and the same technology, you are not going to have dominance from one side. So it could actually create some sort of parity, which in itself creates a barrier.

The other thing that I would say there is that—we talked about cost-imposing strategies. My snide answer there is the most effective cost-imposing strategy I've seen is the one the United States government runs against itself. If someone else can figure that out, we're in a lot of trouble. But I think that is going to be another thing. If the United States keeps investing in boutique, expensive, high-end capabilities, we are not going to be able to use those against inexpensive technologies.

So I think that there are a number of reasons why advanced technology could put a damper on projecting capability. Outside of that, I think that when you start taking dollars and human lives out of the calculation, that creates incentives to probably undertake more conflicts. Then it is going to come back to strategy and we are going to have to get better at that.

PATRICK MAHANEY: For the United States it's very expensive. This machine that has developed over time wasn't cheap, and it will not be cheap going forward.

Now, there is the commercial, off-the-shelf piece and there is innovation, and it becomes less expensive. But that usually benefits the other guy, the non-state actor who can now get the Xbox cameras and all the other things that we were talking about. For the United States it's still expensive.

But I don't think that's the real issue. I think that is a true defining element from the industrial era. I think right now, particularly with the interconnectivity of people, it's legitimacy. The true constraint is political legitimacy. It's people saying, "This ain't right, this is not the right thing to do," and it enters into a debate that happens much more quickly and—this is perhaps arguable—in a more informed way, with the masses of data that are out there and the influences that come into this. So I think the limiting factor is legitimacy.

NOAH SHACHTMAN: And also you can't necessarily figure you can do it in some remote corner of the world and that it will never show up.

PATRICK MAHANEY: It will be a movie about the SEALs in no time flat. That's the reality.

But the camera I spoke about earlier—close-target reconnaissance is now a use for totally unclassified technology. Everybody can do close-target reconnaissance. Everybody can report on anything, as we well know. So the cost-imposing strategy that works politically against us is extraordinary, if so much as one U.S. troop misbehaves.

Now, does that apply to other nation-states? It depends on the nation-state. It depends on the group. In some cases, they couldn't care less. But for us it's quite significant.

QUESTION: Do you think the debate is sometimes exaggerated? You mentioned before that Yahoo and Google are the sources of innovation. Google and Yahoo are here in Silicon Valley. We know very few other big companies. But, actually, 67 of the biggest computer and software companies in the world are from the United States. How can we assume that others are going to fare way better in a revolution that apparently favors the United States?

NOAH SHACHTMAN: Even if the companies are based in the United States, their technologies are everywhere. Apple is a U.S. company, but where are the iPhones built? They are built in China. They are sort of U.S. in a lot of ways in name only. A lot of them don't even pay taxes here, which I guess is a different issue. Their technologies go global.

Also, that's assuming that's how it is going to be from here on out. I'm not sure. There are a lot of really interesting Chinese technology companies, for example. Israel, for its size, has an amazing number of technology firms. So just because it feels like it's all in Silicon Valley, or the headquarters are in Silicon Valley at the moment, I don't think that's necessarily where the benefits will lie entirely.

Remember, the United States is like number 28 in broadband adoption. Just because the Internet was invented here, we don't really use it as well as other countries do.

We finally got smartphones and some higher-speed cell bandwidth, but we were years and years and years behind Europe and Asia on that. So just because the companies are here, I don't think means a whole lot.

BEN FITZGERALD: I think that's right.

The other thing I would talk to there is comparative advantage. I did a project for the Pentagon with Peter Singer, who is a great guy who has written a lot about this and many other topics. [Editor's note: Check out Peter Singer's 2009 Carnegie talk, "Wired for War: The Robotics Revolution and Conflict in the 21st Century."]

As we were looking at game-changing technology, which was the topic of the project, one of the key findings was that there is significant additional marginal utility for the weaker actor in terms of game-changing technology. So if we develop a new UAV, that's nice. If Hezbollah develops a new UAV, that's much more useful for them than it is for us. So we're not just looking at the parity; we're looking at the asymmetric advantage that people will get from that.

I would also make the case of how far advanced or how much more capable does the United States need to be today to be able to go to somewhere like Iraq? We still have those problems.

So even if we are whatever percentage it is—50 percent, 1,000 percent—more capable than our nearest competitor, how much of that is useful and how will we maintain that?

I don't see a future where we are going to have another state that has all of the stuff that we have. That's not how we are going to lose. We are going to lose in one specific area where someone has developed specific capability, whether that's in a city or a specific strike, or wherever it is.

QUESTION: [off-microphone – inaudible]

BEN FITZGERALD: The question was about deterrence and what's the deterrent value.

The deterrence question is challenging. There are a lot of people who like to look for nuclear parallels in cybersecurity. Probably three or four years ago, a scary number of people in the Pentagon were asking, "What's mutually assured destruction in the cyber domain?"

I don't think that we have figured that out. We are going to have to look at other ways of establishing deterrence, and it's not going to be purely technological. So I think that politics will continue to play a critical role in this.

I am concerned, however—and I think it's a useful point—when we start looking at advanced manufacturing technologies, we are seeing now an ability to manufacture highly advanced technologies at scale without a large population or a large workforce.

So if you look at—again, we'll use the China example—there are many political and economic reasons that China does not want to go to war with us and we don't want to go to war with China. They are part of an international system and we have many ways of dealing with them.

If you look at North Korea or, let's say, Myanmar, although that is changing a little bit—if Myanmar from 10 years ago was able to build a significant unmanned fleet and significant cyber capability, how do you deter them? What are the levers that you can pull against them? We have fewer of them.

So I think in the interim we can still use existing tools of statecraft. My concern is the rogue actors of the future will be more capable than they are now, and in a non-nuclear way. We have some ways of dealing with nuclear stuff that we can't apply in other areas yet.

PATRICK MAHANEY: Ultimately, in the world that I normally live in, the only real deterrent—and I'm not talking China, Russia; that's sort of the classic old school—it's we can reach out, we can find you where you are, we can identify you, and we can kill you. That has to be credible. Do we have the intention of doing that if certain conditions are met? Do we have the capability of doing it? Yes. So that's the credible threat that we maintain.

What that looks like vis-à-vis non-state actors or drug gangs in Central America, I don't think we've really figured out. And I don't think there will be a standard model; all the factors we've discussed are going to come into play. So I think it's certainly worth further study, but a dicey proposition at best in the asymmetric, irregular world.

NOAH SHACHTMAN: On that uplifting note, please give our panelists a hand. Thanks for coming.
