Autonomous weaponry is a crucial element of the U.S. military’s shift toward AI
By Johnny Franks, Warrior Editorial Fellow
The U.S. military’s integration of AI and autonomous weapons systems is a dynamic and intricate process, marked by significant technological advances and strategic gains. That progress demands careful attention to evolving policies, ethical implications, and strategic objectives, underscoring the delicate balance between embracing cutting-edge technology and confronting the profound ethical and policy challenges it presents.
On the question of tangible policy for future developments, an evolving framework of guidelines and principles already exists, reflecting a commitment to responsible and ethically grounded use of AI. The DoD’s AI Adoption Strategy, released in November 2023, provides a roadmap for integrating AI to maintain strategic and tactical superiority. The document is a detailed blueprint for AI deployment, emphasizing agility, rapid adoption, and a comprehensive focus on operational dimensions ranging from battlespace awareness to enterprise operations. The strategy is a mosaic of objectives and principles, underpinned by a commitment to responsible AI development and safety protocols. The DoD also recently updated Directive 3000.09, which governs the development and use of autonomous and semi-autonomous weapons systems, though one official “stressed that the update to the directive is ‘a clarification, not a major change.’”
Johnny Franks holds an MA in U.S. Foreign Policy & National Security from American University and a BA in Diplomacy & World Affairs from Occidental College. With a specific interest in geopolitical security and military technology, Johnny has focused his research and analysis primarily on the Russia-Ukraine conflict from 2014 onwards. As part of his MA coursework, Johnny contributed to developing an Arctic defense strategy in partnership with the U.S. Department of Defense.