Compliance with international humanitarian law (IHL) is recognized as a critical benchmark for assessing the acceptability of autonomous weapon systems (AWS). However, in certain key respects, it remains either debated or underexplored how, and to what extent, existing IHL rules limit the development and use of AWS.
This report aims to help states form and express their views on the legal provisions that already do, or should, govern the development and use of AWS, particularly with respect to the required type and degree of human–machine interaction. It maps (a) what limits IHL already places on the development and use of AWS; (b) what IHL requires of those who use AWS in order to satisfy their legal obligations, whether those obligations fall on a state, an individual or both; and (c) threshold questions concerning the type and degree of human–machine interaction required for IHL compliance.
In its findings and recommendations, the report does not prejudge the policy response through which AWS should be regulated. Instead, it aims to provide an analytical framework for states and experts to assess how the normative and operational framework governing the development and use of AWS may need to be clarified and developed further.
1. Introduction
2. An overview of the limits on autonomous weapon systems under international humanitarian law
3. Key issues concerning the rules on weapons, means and methods of warfare
4. Key issues concerning legal reviews and legal advice
5. Key issues concerning frameworks for state responsibility and individual criminal responsibility
6. Key findings and recommendations