How well is the law of armed conflict suited to dealing with the challenges posed by new means and methods of warfare?

Joshua Prior

Introduction 

All warfare is governed by international humanitarian law (IHL), otherwise known as the law of armed conflict (LOAC).[1] This body of law developed out of “a desire among civilized nations to prevent unnecessary suffering and destruction while not impeding the effective waging of war.”[2] In practice, this involves balancing military necessity with humanitarian interests.[3] Throughout history, however, the means and methods used to pursue the ends of war have been subject to technological advancement. This “rapid evolution of military technology”[4] poses new challenges to the current law in terms of its interpretation and general applicability. This article examines two fundamental principles of the LOAC, distinction and proportionality, and their suitability in dealing with the challenges posed by the methods of autonomous weapon systems (AWS) and the means of cyber operations.

Autonomous weapon systems 

The suitability of the LOAC in dealing with the challenges posed by AWS centers around the capabilities of the technology in question. As the focus of this discussion is contemporary examples, weapon systems which are fully autonomous fall outside the scope of this paper. Weapons of this nature “generally do not exist today”,[5] although they are currently under development by states.[6] This paper adopts the International Committee of the Red Cross (ICRC) definition of an AWS as “any weapon system with autonomy in its critical functions—that is, a weapon system that can select and attack targets without human intervention.”[7] It therefore follows that the autonomous functions are only enabled following human activation.[8] The LOAC is a human-centric body of law; this paper therefore argues that the current LOAC will be suited to dealing with the challenges of automation so long as a human remains “in the loop”, maintaining some form of control or involvement in the decision-making process.

Proportionality 

Codified in Additional Protocol I to the Geneva Conventions (AP1),[9] proportionality requires a calculation to be undertaken. This calculation weighs the expected collateral damage and incidental loss of life against the anticipated military advantage of any given attack.[10] Human Rights Watch (HRW) raises concerns about humans being too far removed from the loop in this calculation process, contending that the principle of proportionality rests heavily upon the human judgement involved in making such decisions.[11] Proportionality “requires more than a balancing of quantitative data, and a robot could not be programmed to duplicate the psychological processes in human judgment that are necessary to assess proportionality”.[12] Moreover, such decisions are highly contextual: the unpredictable landscape of warfare requires constant adaptation, and these decisions draw on a range of quantitative as well as qualitative factors.

Quantitatively, in relation to collateral damage, AWS can make use of “collateral damage estimate methodology”,[13] processing objective data in order to assess accurately the collateral damage of an attack. Qualitatively, an AWS must adhere to the international courts’ standard of the “Rendulic Rule”,[14] which evaluates proportionality by focusing on the knowledge available to the commander at the time of the attack.[15] HRW contends that it is “highly unlikely that a robot could be pre-programmed to handle the infinite number of scenarios it might face”.[16] Furthermore, Sharkey submits that “the number of such circumstances that could occur simultaneously in military encounters is vast and could cause chaotic robot behaviour with deadly consequences”.[17] However, Schmitt questions this approach, asking whether it is “appropriate to ask more of machines than the humans whom the law of proportionality was designed to address”.[18] Although it is true that a machine alone cannot consider every possible event on the battlefield, the same can be said of humans. Even if such weapons eventually perform at a higher level than their human counterparts,[19] the Rendulic Rule should not be applied as a standard of perfection. Whether the Rendulic Rule can suitably deal with this challenge is nonetheless uncertain, proportionality being an “inherently subjective determination that will be resolved on a case-by-case basis”.[20]

On the other hand, Schmitt illustrates the importance of the human remaining in the loop for the purposes of the LOAC. He contends that until fully autonomous weapon systems are developed, humans “will continue to make the subjective determinations resident in the law of armed conflict”.[21] As the LOAC is centered around the actions of humans, AWS pose no new challenges in this respect: the decision to use lethal force remains with the human and is thus governed akin to existing methods of warfare. Scharre has advocated the use of such “hybrid systems”, as they are able to “leverage the precision and reliability of automation without sacrificing the robustness and flexibility of human intelligence”.[22] In this reality, contrary to the HRW position, he calls for clarification over which “decisions we believe still require uniquely human judgment”.[23] This would clearly outline the remaining responsibility of humans. Furthermore, Anderson argues that it is this that could lead to a more “humane” way of conducting hostilities.[24] With the primary purpose of the LOAC being to promote humanitarian interests, such systems seem well suited to its demands.

On balance, Kastan contends that the fact that AWS are being used “with little critique” suggests that “they can be operated in compliance with the proportionality requirement”.[25] A degree of autonomy has been employed in executing military tasks since as far back as the 1970s,[26] and no legal issues have been raised thus far. Although such weaponry has been used primarily in defensive contexts against other machines, or in environments where civilian risk is very small,[27] this demonstrates the LOAC’s capability to deal with the challenges posed to date. Moving forward, the latency between the current LOAC and yet-to-be-developed fully autonomous systems allows the law time to develop and adapt before such weapons reach the battlefield.[28] Nevertheless, IHL must act fast: technological change is rapid and unpredictable, and attempts at regulation have proven problematic, “akin to trying to pin down a moving target”.[29]

Cyber operations and data

Cyber operations and data pose significant challenges to the current treaty-based body of the LOAC. Although such treaties should be interpreted “in accordance with the ordinary meaning to be given to their terms,” said interpretations must be made in “context and in the light of [their] object and purpose.”[30] The point of contention arises when interpreting matters that did not exist when Additional Protocol I (AP1) was first drafted.[31] The consequence is that the LOAC inadequately protects civilian objects. The challenges identified include a lack of clarity regarding the scope of the term “object” in light of data, and the failure to adequately account for attacks of a non-kinetic nature. This paper adopts the “emerging orthodoxy”[32] of the Tallinn Manual to demonstrate that the LOAC in its current form is not suited to dealing with the challenges posed by data.[33] The consequences run counter to the purpose of the LOAC, as civilians may be exposed to harm in the context of an armed conflict, an unsatisfactory result from a humanitarian perspective.[34] Thus, a more progressive interpretation will be examined which may serve to overcome this barrier.

Distinction 

The principle of distinction, codified in Article 48 of the AP1, requires participants in hostilities to “at all times distinguish between the civilian population and combatants and between civilian objects and military objectives and accordingly direct their operations only against military objectives.”[35] The International Court of Justice (ICJ) has deemed this one of the “cardinal principles”[36] of customary international law and, as Schmitt argues, it “incontrovertibly applies to cyber operations conducted during an armed conflict.”[37]

However, the application of this principle is dependent “on an attack having occurred and that an attack is an action during [an] armed conflict that is violent in nature.”[38] On a strict interpretation of Article 48, “the term ‘acts of violence’ connotes physical force.”[39] Schmitt surpasses this stumbling block through an analogy with chemical and biological weapons. He submits that “it is not the violence of the act that constitutes the condition precedent to limiting the occurrence of an attack, but the violence of the ensuing result.”[40] Even though biological and chemical weapons are by their nature non-kinetic, their use ultimately produces harmful, even lethal, consequences. Cyber operations can likewise “generate such consequences even though they launch no physical force themselves.”[41] Accepting this position, the discussion turns to whether data resident in computers may be the target of such an attack. This requires identifying data as an “object” under the LOAC, a key point, as civilian objects are those which do not constitute military objectives.

The International Group of Experts (IGE) behind the Tallinn Manual rejects the notion that data falls within the scope of the LOAC. Supplementing Article 52(2) of the AP1, the Tallinn Manual sets out the criteria for an object: “Military objectives may include computers, computer networks, and cyber infrastructure.”[42] According to the IGE, the majority position was that “data is intangible” and therefore does not fall within the “ordinary meaning of the term object.”[43] This view is predicated upon the ICRC commentary, which characterises an object as “visible and tangible.”[44] Under this interpretation, the LOAC will only encompass data when its destruction “entails the loss of functionality of physical infrastructure carrying the data in question.”[45] It must be noted that this interpretation is neither binding nor endorsed by states. For this reason, an alternative, more progressive interpretation will be considered, which could bring data within the scope of the LOAC.

Mačák advances the argument that the LOAC may be interpreted broadly to encompass data as an object. In the Navigational Rights case, the ICJ held that “if parties choose a generic term in a treaty entered into for a very long period, they should be presumed to have intended that such a term is to have an evolving meaning.”[46] Mačák submits that, because the AP1 is indeterminate in duration and the term “object” a generic one, the favourable approach should therefore be evolutive in nature.[47] This evolutive approach finds support in international law, namely the Nuclear Weapons advisory opinion[48] and the Targeted Killings case.[49] With regard to the latter, the court held that “if the reality changes, the interpretation of previously developed rules must also evolve.”[50] The reality, as indicated by the attacks on Georgian cyber infrastructure,[51] is that warfare has extended to cyberspace. As data forms a substantial part of modern military operations, a broader interpretation that includes it within the scope of the LOAC may be achieved in this manner.

Conclusion 

Both of these contemporary examples illustrate the outdatedness of the LOAC. This failure to regulate hostilities risks leaving states to determine their own standards of practice, which may prove detrimental when less technologically advanced states incur significant disadvantages on the battlefield or their civilians become the targets of advanced cyber-attacks. Yet calls for new treaties for both AWS[52] and cyber[53] have been criticised as unnecessary and are, in any event, unlikely given their political implications.[54] Moving forward, the only viable solution appears to be revision. However, this illustrates one of the LOAC’s biggest weaknesses: its lack of universal ratification. States such as the USA are not party to either of the Additional Protocols, and this fundamentally undermines the LOAC’s ability to govern these means and methods. As long as this remains the case, the LOAC, no matter how adaptable to new challenges, will remain ineffective in dealing with contemporary means and methods of warfare.


[1] James Foy, Autonomous Weapons Systems Taking the Human Out of International Humanitarian Law, 23 DAL. J. LEGAL STUD. 47, 53 (2014).

[2] B.O.G. Nwanolue, Ojukwu Uche Grace and Victor Chidubem Iwuoha, Military Operations Associated with Internal Security and Special Rules for Opening Fire in Armed Conflicts, International Journal of Asian Social Science, Vol. 2, No. 7, 1152.

[3] Michael N. Schmitt, Military Necessity and Humanity in International Humanitarian Law: Preserving the Delicate Balance, 50 Va. J. Int’l L. 795 (2010) 804. 

[4] Legality of the Threat or Use of Nuclear Weapons, Advisory Opinion, 1996 I.C.J. 226, ¶ 78 (July 8).

[5] Paul Scharre and Michael C. Horowitz, An Introduction to Autonomy in Weapon Systems, Working Paper, February 2015, p. 3, available at https://s3.amazonaws.com/files.cnas.org/documents/Ethical-Autonomy-Working-Paper_021015_v02.pdf?mtime=20160906082257 , accessed 11 March 2019.

[6] Ibid. 

[7] Neil Davison, A legal perspective: Autonomous weapon systems under international humanitarian law, p6, Available at https://www.icrc.org/en/document/autonomous-weapon-systems-under-international-humanitarian-law , Accessed 15 March 2019. 

[8] Ibid.

[9] Protocol Additional to the Geneva Conventions of 12 August 1949 and relating to the Protection of Victims of International Armed Conflicts (Protocol I), 8 June 1977.

[10] Jeffrey S. Thurnher, “No One at the Controls: Legal Implications of Fully Autonomous Targeting,” Joint Force Quarterly 67 (National Defense University Press), October 2012.

[11] Human Rights Watch, LOSING HUMANITY: The Case against Killer Robots (2012), p32. 

[12] Ibid, 33.

[13] Michael N. Schmitt and Jeffrey S. Thurnher, ‘“Out of the Loop”: Autonomous Weapon Systems and the Law of Armed Conflict’ (2013) 4 Harvard National Security Journal 231, p254.

[14] See Eric Talbot Jensen, Essay, Unexpected Consequences from Knock-on Effects: A Different Standard for Computer Network Operations? 18 AM. U. INT’L L. REV. 1145, 1181-83 (2003).

[15] Prosecutor v. Galic, Case No. IT-98-29-T, Judgment and Opinion, 58 (Int’l Crim. Trib. for the Former Yugoslavia Dec. 5, 2003).

[16] (n 11).

[17] Noel Sharkey, “Automated Killers and the Computing Profession,” Computer, vol. 40, issue 11 (2007), p.122.

[18] (n 13) p257.

[19] William Marra and Sonia McNeil, “Understanding “The Loop’: Regulating the Next Generation of War Machines,” 36 Harvard Journal of Law and Public Policy 3 (2013).

[20] Air Force Judge Advocate General’s Department, “Air Force Operations and the Law: A Guide for Air and Space Forces” first edition, 2002.    http://web.law.und.edu/Class/militarylaw/web_assets/pdf/AF%20Ops%20&%20Law.pdf , p.27, accessed 15 March 2019.  

[21] (n 13) p266. 

[22] Paul Scharre, Centaur Warfighting: The False Choice of Humans vs. Automation, 30 Temp. Int’l & Comp. L.J. 151 (2016).

[23] (n 22).

[24] Kenneth Anderson & Matthew Waxman, Law and Ethics for Autonomous Weapon Systems: Why a Ban Won’t Work and How the Laws of War Can, HOOVER INST. (Apr. 9,2013).

[25] Benjamin Kastan, Autonomous Weapons Systems: A Coming Legal “Singularity”?, 2013 J.L. TECH. & POL’Y 45, 62 (2013).

[26] Metodi Hadji-Janev and Kiril Hristovski, Beyond the Fog: Autonomous Weapon Systems in the Context of the International Law of Armed Conflicts, 57 Jurimetrics J. 325–340 (2017).

[27] Kenneth Anderson; Daniel Reisner; Matthew Waxman, Adapting the Law of Armed Conflict to Autonomous Weapon Systems, 90 Int’l L. Stud. Ser. US Naval War Col. (2014).

[28] See (n 22). 

[29] Maziar Homayounnejad, Some Thoughts on Negotiating a Treaty on Autonomous Weapon Systems, Available at http://opiniojuris.org/2018/01/03/33402/ , Accessed on 9 March 2019. 

[30] Vienna Convention on the Law of Treaties art. 31(1), May 23, 1969, 1155 U.N.T.S. 331.

[31] MN Schmitt, ‘Cyber Operations and the Jus in Bello: Key Issues’ (2011) 87 Int’l L Stud 89, 93. 

[32] K Mačák, ‘Military Objectives 2.0: The Case for Interpreting Computer Data as Objects under International Humanitarian Law’ (2015) 48 Israel Law Review 55, 56. 

[33] MN Schmitt (ed), Tallinn Manual 2.0 on the International Law Applicable to Cyber Operations (2nd edn, CUP 2017).

[34] David Francis, ‘The Coming Cyber Attack that Could Ruin Your Life’, The Fiscal Times, 11 March 2013, available at http://www.thefiscaltimes.com/Articles/2013/03/11/The-Coming-Cyber-Attack-that-Could-Ruin-Your-Life  , Accessed 18 March 2019.

[35] Protocol Additional to the Geneva Conventions of 12 August 1949, and Relating to the Protection of Victims of International Armed Conflicts art. 48, June 8, 1977, 1125 U.N.T.S. 3.

[36] Legality of the Threat or Use of Nuclear Weapons, Advisory Opinion, 1996 I.C.J. 226, ¶ 78 (July 8).

[37] (n 31) p91.

[38] Ibid. 

[39] Ibid. 

[40] Ibid, p94. 

[41] Ibid. 

[42] (n 33) p125.

[43] MN Schmitt, ‘The Notion of ‘Objects’ during Cyber Operations: A Riposte in Defence of Interpretive and Applicative Precision’ (2015) 48 Israel Law Review 81, p86. 

[44] Yves Sandoz, Christophe Swinarski and Bruno Zimmermann (eds), Commentary on the Additional Protocols of 8 June 1977 to the Geneva Conventions of 12 August 1949 (ICRC 1987) para 2008.

[45] (n 33) 127, para 5.

[46] Dispute regarding Navigational and Related Rights (Costa Rica v Nicaragua) Judgment [2009] ICJ Rep 213, [66]. 

[47] (n 32) p70.

[48] (n 4). 

[49] HCJ 769/02 Public Committee Against Torture in Israel and Palestinian Society for the Protection of Human Rights and the Environment v Israel and Others ILDC 597 (IL 2006) [2006].

[50] Ibid, para 28. 

[51] Eneken Tikk, Kadri Kaska and Liis Vihul, International Cyber Incidents: Legal Considerations (2010) 63–90.

[52] Metodi Hadji-Janev and Kiril Hristovski, Beyond the Fog: Autonomous Weapon Systems in the Context of the International Law of Armed Conflicts, 57 Jurimetrics J. 325–340 (2017).

[53] Peter Pascucci, Distinction and Proportionality in Cyberwar: Virtual Problems with a Real Solution, JAGC, U.S. Navy, Vol 26, issue 2. 

[54] Ibid. 
