by Sahar Haroon, Research Associate, RSIL
Autonomous weapon systems (AWS) may be understood as belonging to two distinct categories: semi-autonomous and fully autonomous. The difference between the two lies in the degree of control a human operator retains over the system. Semi-autonomous weapons are those which remain under human control in the performance of their critical functions, while fully autonomous weapons would be capable of carrying out their entire functioning without human involvement. Semi-autonomous weapons already form part of the arsenals of many States — for instance, sentry guns deployed along borders and missile-defence systems such as the Israeli ‘Iron Dome’ — but the same is not true of the latter category. Be that as it may, the possibility that fully autonomous robots will be developed for military or policing purposes cannot be ignored.
Commentators rely on different criteria to define AWS, which has contributed to the absence of a universally accepted definition. The International Committee of the Red Cross (ICRC), however, defines an AWS as a “system with autonomy in its critical functions”, i.e. one that “would be able to detect, track, select and attack (e.g. fire at) a target without direct, in the sense of spatially, temporally or causally proximate, human intervention.”
Multiple issues surrounding AWS need to be addressed and have been the subject of much deliberation. First and foremost is the question of their legality. Another extensively discussed issue is whether it is ethical to hand over critical decisions, such as selecting and engaging targets, to weapons while taking humans ‘out of the loop’. A final crucial problem is the determination of accountability for the unlawful use of a weapon that does not involve a human operator. This article addresses the first of these questions; the latter two are reserved for an upcoming publication.
The legality of AWS may be assessed from the perspective of both International Humanitarian Law (IHL, or the law of war) and International Human Rights Law (IHRL). Though these are two distinct branches of law, there is admittedly some overlap between them. IHL primarily regulates the conduct of hostilities, while also imposing obligations whose fulfilment requires measures to be taken in peacetime to ensure compliance during an armed conflict. IHRL, on the other hand, guarantees certain fundamental rights as non-derogable, applying uniformly in peacetime, in situations of protracted armed violence, and in armed hostilities between two or more States. This article is limited to highlighting certain key aspects of the debate under IHL.
To address the scope of the legality of AWS under IHL, it is first necessary to assess whether these weapon systems are lawful under this legal regime. IHL classifies weapons as either lawful or unlawful. Weapons are unlawful in their very nature when they are incapable of adhering to the principles of IHL: for instance, when they are unable to distinguish between protected persons and legitimate military targets, can only be employed in an indiscriminate manner, or violate any other rule of international law by which a State is bound. Examples include all weapons specifically prohibited under IHL, such as biological weapons and anti-personnel landmines. Other weapons are lawful because their nature allows them to comply with IHL obligations; such lawful weapons may nevertheless be used in an unlawful way. In that case, it is not the weapon itself which is prohibited under IHL but rather its use in a particular manner which violates the law.
Determining whether a weapon is lawful or unlawful is an obligation that States must undertake during the “study, development, acquisition or adoption of a new weapon, means or method of warfare.” This obligation arises from both customary and treaty law. IHL imposes certain constraints on States developing or acquiring new means and methods of warfare. Among other things, it prohibits those which cause unnecessary suffering or superfluous injury. It is also prohibited to use weapons that would severely damage the natural environment, or that violate the fundamental principles of IHL, such as distinction, proportionality and military necessity.
Proponents of AWS highlight the advantages of such weapons: for instance, their capability to enhance precision in attacks, thereby minimizing collateral damage, or the fact that they would remain objective in tense situations, unlike their human counterparts.
However, there are multiple disadvantages to the development and adoption of AWS. First, precision in attacks depends on the weapon system’s ability to distinguish not just between objects and humans, but also between civilians and combatants and between civilian objects and military objectives — a distinction that is not absolutely definitive and thus cannot be programmed into a system inherently incapable of ‘deciding’ for itself. Decisions concerning distinction, proportionality, precautions in attack and military necessity, especially in the case of dual-use or dual-nature objects, are beyond a machine’s ability and will always require human input in order to comply with IHL. Therefore, though the development of AWS may not be unlawful per se, they are capable of being used unlawfully — all the more so if humans are taken ‘out of the loop’.
Secondly, their development will trigger an arms race, if it has not already done so. Finally, they will lead to an increase in asymmetric warfare, i.e. situations where one party to an armed conflict possesses advanced military technology while the other, unable to afford or develop such advancements, relies on unacceptable means or methods to gain an advantage over its militarily stronger adversary. These unacceptable methods include blending into the civilian population, in violation of the obligation upon combatants to distinguish themselves from civilians, as well as taking hostages or using civilians as human shields. Although the technologically advanced party is not legally bound to refrain from pursuing technology that escalates asymmetric warfare, this dynamic remains a persuasive factor in understanding that AWS would contribute to greater destruction and increased warfare.
Tyler D. Evans, “At War with the Robots: Autonomous Weapon Systems and the Martens Clause”, Hofstra Law Review, Vol. 41, 2014, pp. 702-703
Yaakov Katz, “Air Force to Get Two New Iron Dome Batteries”, Jerusalem Post, 29 July 2012, available at: http://www.jpost.com/Defense/Article.aspx?id=279256 (all internet sources accessed in June and July 2017)
Michael N. Schmitt, “Autonomous Weapon Systems and International Humanitarian Law: A Reply to the Critics”, Harvard National Security Journal Features, 2013, p. 3
ICRC, Views of the International Committee of the Red Cross on Autonomous Weapon Systems, Geneva, Report, 11 April 2016, p. 1
Maya Brehm, “Defending the Boundary: Constraints and Restraints on the Use of Autonomous Weapon Systems under International Humanitarian and Human Rights Law”, Academy Briefing No. 9, University of Geneva, p. 14
Human Rights Watch, “Losing Humanity: The Case against Killer Robots”, 19 November 2012
Maya Brehm, “Defending the Boundary: Constraints and Restraints on the Use of Autonomous Weapon Systems under International Humanitarian and Human Rights Law”, Academy Briefing No. 9, University of Geneva
See Article 48 of Protocol Additional (I) to the Geneva Conventions of 12 August 1949, and Relating to the Protection of Victims of International Armed Conflicts, 1125 UNTS 3, 8 June 1977 (entered into force 7 December 1978) [hereinafter AP I]
See Article 51(4) of AP I
See Article 36 of AP I
Convention on the Prohibition of the Development, Production and Stockpiling of Bacteriological (Biological) and Toxin Weapons and on Their Destruction, 10 April 1972, 1015 UNTS 163 (entered into force 26 March 1975)
Convention on the Prohibition of Anti-Personnel Mines, 3 December 1997, 2056 UNTS 211 (entered into force 1 March 1999)
Article 36 of AP I
ICRC, A Guide to the Legal Review of New Weapons, Means and Methods of Warfare: Measures to Implement Article 36 of Additional Protocol I of 1977, November 2006
See Article 35(1) of AP I
Article 35(2) of AP I
Article 35(3) of AP I
See Articles 48, 50, 51 of AP I
See Articles 51(5) and 57 of AP I
St Petersburg Declaration Renouncing the Use, in Time of War, of Explosive Projectiles Under 400 Grammes Weight, 11 December 1868, 138 CTS 297 (entered into force 11 December 1868)
Christopher P. Toscano, “Friends of Humans: An Argument for Developing Autonomous Weapons Systems”, Journal of National Security Law and Policy, Vol. 8, No. 1, 2015, pp. 61-65
Christopher P. Toscano, “Friends of Humans: An Argument for Developing Autonomous Weapons Systems”, Journal of National Security Law and Policy, Vol. 8, No. 1, 2015, pp. 52-57
Noel E. Sharkey, “The Evitability of Autonomous Robot Warfare”, International Review of the Red Cross, Vol. 94, No. 886, 2012, pp. 788-789
See Article 52(2) of AP I
Marco Sassoli, “Autonomous Weapons – Potential Advantages for the Respect of International Humanitarian Law”, Professionals in Humanitarian Assistance and Protection, 2 March 2013, pp. 3-5, available at: https://phap.org/system/files/article_pdf/Sassoli-AutonomousWeapons.pdf
Human Rights Watch, “Losing Humanity: The Case against Killer Robots”, 19 November 2012, pp. 30-39
Marco Sassoli, “Autonomous Weapons – Potential Advantages for the Respect of International Humanitarian Law”, Professionals in Humanitarian Assistance and Protection, 2 March 2013, p. 2, available at: https://phap.org/system/files/article_pdf/Sassoli-AutonomousWeapons.pdf
ICRC, “International Humanitarian Law and the Challenges of Contemporary Armed Conflicts”, document prepared by the ICRC for the 30th International Conference of the Red Cross and Red Crescent, 26-30 November 2007, available at: https://casebook.icrc.org/case-study/icrc-ihl-and-challenges-contemporary-armed-conflicts#chapter2