The Space Force’s second-in-command told Air Force Academy cadets attending a leadership conference that the U.S. will need machines to make decisions that kill—and that confronting the inherent ethical dilemmas “can’t wait.”
Vice Chief of Space Operations Gen. David D. Thompson brought up lethal autonomous weapons systems during a question-and-answer panel conversation that also featured three other senior leaders: Air Force Chief of Staff Gen. Charles Q. Brown Jr. and the Chief Master Sergeants of the Air Force and Space Force, JoAnne S. Bass and Roger A. Towberman.
The four senior leaders answered questions posed by cadets and local attendees of the Air Force Academy’s National Character and Leadership Symposium on Feb. 25. The two-day symposium’s 2022 theme was “Ethics and Respect for Human Dignity.”
Admitting he’d gotten a “sneak peek” at a question about ethics in the context of hypersonic weapons, Thompson took the opportunity to talk about a type of weapon bristling with even more ethical dilemmas—lethal autonomous weapon systems, often referred to by critics of the concept as “killer robots.”
Those weren’t Thompson’s words. He did, however, convey a sense of urgency about needing such weapons while also predicting, on the hypersonics side, a period of strategic instability akin to that of the early Cold War.
In terms of hypersonic weapons—those able to fly at five times the speed of sound or faster—Thompson said they’re ethically “not that much different than things that we’ve done in the past. It’s a tremendous operational and technical challenge. We need to make sure that they’re part of our arsenal. We need to develop defenses against them. And we will.”
He suspects the instability will come with adding a nuclear component.
“When you couple hypersonic weapons with nuclear weapons, it’s tremendously unstable in a strategic sense,” Thompson said. “And we have to understand [how] to deal, again, with a period of strategic instability they might produce—like we frankly saw in the nation back in the early days of the Cold War.”
Thompson then segued into the subject of lethal autonomous weapons—those expected to rely on artificial intelligence. His remarks followed the United Nations’ failure in December 2021 to make headway toward a treaty that would ban such weapons.
Their inevitability comes down to “the speed of war—how quickly things are going to have to happen in the future,” Thompson told the cadets. “We’re going to have to have machines that make decisions—like Chief Towberman talked about—that kill people.” (Towberman had talked about the ethics of killing more broadly.)
“And we can’t wait,” Thompson continued. “We cannot let technology drive that, and we can’t wait until it’s thrust upon us to think through and understand how we have to deal with that ethically—when, how, and should we let machines make decisions to kill people. And we have to deal with it because that’s exactly where our adversaries are going.”