US Military Not ‘Building Killer Robots in the Basement’: Pentagon Official

‘Our commitment to international humanitarian law is ironclad,’ Horowitz says.
A Modular Advanced Armed Robotic System (MARS) armed robot is seen as the Defense Department holds its 'DOD Lab Day' at the Pentagon in Washington on May 14, 2015. (Mandel Ngan/AFP/Getty Images)
Andrew Thornebrooke

The Pentagon is updating a key guideline for how it uses artificial intelligence (AI) because of misperceptions about how the department intends to use the technology, according to one military official.

The department has updated its directive, “Autonomy in Weapon Systems,” because of “a lot of confusion” about how the Pentagon hopes to use AI, according to Deputy Assistant Defense Secretary Michael Horowitz. That ambiguity had led many outside the Pentagon to believe the military was “maybe building killer robots in the basement,” he said.

At the same time, some senior Pentagon leaders believed the document prohibited the use of fully autonomous lethal systems.

Neither is true, according to Mr. Horowitz, who said the Pentagon intends to follow international law in the development of lethal autonomous systems.

“Just to be clear about that, the directive does not prohibit the development of any systems,” Mr. Horowitz said during a Jan. 9 talk at the Center for Strategic and International Studies think tank.

Rather than banning or promoting the development of killer robots, Mr. Horowitz said, the directive mandates a review process in which “certain types of autonomous weapon systems” must be screened by the most senior Pentagon officials.

Such reviews are likely to become increasingly common as the department moves forward with new initiatives in AI and autonomous weapons, such as the Replicator Initiative, which was announced in August 2023.

That program seeks to enhance U.S. military capabilities by fielding thousands of cheap, autonomous, lethal drones and other systems to counter the numerical advantage of the Chinese military.

“Replicator itself is about a process,” Mr. Horowitz said. “It’s about figuring out how ... we can field at speed and scale key capabilities that we view as important given the national defense strategy.”

Mr. Horowitz said a key issue in developing autonomous lethal systems is the rapid pace of innovation in software as opposed to hardware. Much of the department’s work, he said, is focused on the systems that weapons operate on, rather than the weapons themselves.

The effort, he added, is part of a suite of research and development priorities including AI and autonomy, directed energy weapons, and hypersonics.

‘Future of War’

Autonomy will play a “critical role” in the “future of war,” and the United States will need to “accelerate adoption” of AI-driven and autonomous technologies as it confronts the “pacing challenge” of China, which is developing lethal autonomous systems of its own, Mr. Horowitz said.

“I think the adoption capacity in terms of the department is improving, but we have more work to do, frankly, as we’ve been very public in stating,” he said.

“We realized that a lot of what we were doing was trying to figure out how the future force could more effectively incorporate these kinds of technologies in a safe and responsible way.”

To that end, senior military leadership has openly acknowledged its ambition to remake the armed forces into a largely robotic force.

Former Joint Chiefs of Staff Chairman Gen. Mark Milley said that the world’s most powerful nations would likely rely on mostly robotic militaries within the next decade.

“Over the next 10 to 15 years, you’ll see large portions of advanced countries’ militaries become robotic,” Gen. Milley said during a talk with Defense One last year.

“If you add robotics with artificial intelligence and precision munitions and the ability to see at range, you’ve got the mix of a real fundamental change.”

“That’s coming. Those changes, that technology ... we are looking at inside of 10 years.”

As such, Mr. Horowitz said, it’s vital that the Defense Department “make clear what is and isn’t allowed,” and uphold a “commitment to responsible behavior,” as it develops lethal autonomous systems.

“Our commitment to international humanitarian law is ironclad,” he said. “All weapons systems that we field, we believe can comply with international humanitarian law.”

Andrew Thornebrooke
National Security Correspondent
Andrew Thornebrooke is a national security correspondent for The Epoch Times covering China-related issues with a focus on defense, military affairs, and national security. He holds a master's in military history from Norwich University.