Stark on Appropriate Human Judgement: U.S. Policy on Lethal Autonomous Weapons Systems and The Law of Armed Conflict

David Micah Stark (United States Air Force Academy) has posted “Appropriate Human Judgement: U.S. Policy on Lethal Autonomous Weapons Systems and The Law of Armed Conflict” on SSRN. Here is the abstract:

This paper examines the role of artificial intelligence (AI) and command responsibility on the modern battlefield. As nations become more reliant on AI and other autonomous systems, the employment of Lethal Autonomous Weapons Systems (LAWS) raises new concerns for international law, particularly regarding distinction and proportionality. This paper analyzes the role of command responsibility in the U.S. Armed Forces as a case study for the implementation of LAWS on the battlefield. Focusing on DODI 3000.09 and its requirement for “appropriate levels of human judgement,” this paper examines what constitutes human judgement and the obligations under the Law of Armed Conflict that this standard imposes on commanders. Examining the current definitions, we find that the language requires an unrealistically high approval authority for LAWS policy to be implemented effectively. This paper therefore proposes that approval authority for LAWS be delegated to lower levels of command and to doctrine-setting teams, depending on the type of LAWS, while simultaneously increasing the requisite training of those teams to include a more substantial education in the Law of Armed Conflict, enabling them to make informed decisions. By creating a stratified system of approval based on multiple factors, the U.S. and the DoD can more efficiently harness this emerging technology while maintaining adherence to the laws of war. This article does not address the morality of employing LAWS.