
Meaghan_Brockhoff_Dissertation


Autonomous  Weapons,  are  they  Legal?     a1630695    


Table of Contents Abstract 2 Introduction 3 Chapter 1. What is an Autonomous Weapon? 6

Research Relevance 6 What is an Autonomous Weapon? 10

The Military Decision Making Process 10 Autonomy Incorrectly Defined 11

Automation 11 Autonomy Correctly Defined 13

Chapter 2. Humans vs Autonomous Machines 16 The Balance Between War and Humanity 16 Human Weakness and Proponents of Autonomous Weapons 17 Humanity – an Inherent Aspect of IHL 19 Combatant Human Emotion 19 Public Emotion 22 Additional Concerns 23 Robotic Behaviour in Humans that Leads to IHL Violations 24 Chapter 3. Targeting Law 27

Distinction 27 Customary Law 27 The Codification of Distinction 27 Combatants and Civilians 28 Direct Part in Hostilities 29 Dual Use Objects 31 Cultural Property 33 Hors de Combat 34

Proportionality 35 Customary Law 35 Proportionality Codified 37 The Challenge of Proportionality 37 Why will Autonomous Weapons Struggle with Proportionality? 40 Proposed Solutions 42

Chapter 4. Accountability 44 Autonomous Weapons and State Responsibility 45 Autonomous Weapons and Joint Criminal Enterprise (JCE) 47 Private Corporations and Accountability 50 Where to From Here? 54 Conclusion 56 Bibliography 58


Abstract

Autonomous weapons are the future of warfare, and under Australia's treaty obligations and customary law new weapons must be reviewed in order to determine their compliance with the law.

Weapons are considered autonomous when they are capable of independent decision-making and operate with humans outside the OODA loop. As a result, autonomous weapons will struggle to comply with International Humanitarian Law (IHL).

The purpose of IHL is to humanise warfare by balancing military necessity with the requirements of humanity. One of the greatest restraints on war has been the reluctance of humans to engage in mass killing. Autonomous weaponry dangerously removes this fundamental restraint.

The principle of distinction protects civilians and combatants. Compliance requires

subjective elements such as intent and mercy, which are inherently human.

Proportionality restricts the incidental loss of human life: attacks must not be excessive in relation to the direct military advantage anticipated. The principle requires subjective human input to keep incidental civilian casualties within acceptable limits.

Additionally, the independent decision-making of autonomous weapons makes accountability problematic; however, there are various modes of liability applicable to those who intentionally use autonomous weapons to violate IHL.


Introduction

Governments around the world are pouring millions of dollars into the research and development of autonomous weapons. It is imperative that the legality of such weapons is discussed before they are released onto the battlefield, because their use will violate International Humanitarian Law (IHL). Whilst individuals and States can be held to account for such violations, the purpose of IHL is to balance humanitarian concerns with the needs of State militaries during times of armed conflict, and autonomous weapons will distort this balance.

Autonomous weapons are distinguished from other weaponry because they can think for themselves and do not require humans to help them perform various missions. Proponents argue they can process more information more quickly and are not subject to human bias and error. This may well be the case, but what autonomous robots cannot do, and what IHL inherently requires, is the provision of humanity.

The Macquarie Dictionary1 defines humanity as 'the condition or quality of being human; human nature, the quality of being humane; kindness; benevolence'. Humanity is a restraining factor on war: our compassion, kind-heartedness and empathy for our fellow man not only make killing the enemy difficult for soldiers, they also make going to war an unpopular choice for democratic governments. Interestingly, IHL is most likely to be violated when the enemy has been dehumanised, and to an autonomous weapon the enemy cannot be anything but dehumanised. These issues have raised warning bells amongst the international community.

1 (Macquarie Dictionary Publishers Pty Ltd, 5th ed, 2009).


Decisions must be made during war, and autonomous weapons will process the data required to make decisions much faster than a human ever could. However, some decisions inherently require a subjective human element. The principle of distinction at its simplest demands that civilians are distinguished from combatants; even in the unlikely scenario where a robot can make the requisite distinction, by its very nature it lacks the humanity to make the best choice. The principle of proportionality requires that incidental loss of civilian life be proportionate to the direct and concrete military advantage. This is not a quantitative formula that can be pre-programmed; again, a subjective human element is required. The cost of human life cannot be pre-programmed, the advantage is unknown, and so are the alternative courses of action. War is not pre-choreographed.

IHL is not equipped to cope with the removal of humans from the battlefield; it is too dependent on human elements. The removal of humans would mean that additional law is required.

In the event that governments ignore warnings such as this dissertation and autonomous weaponry finds its way into the military, States will not escape accountability. State responsibility could be invoked for violations of IHL by autonomous weapons. Additionally, individual combatants working as a group could be held accountable via the modes of liability. The modes describe the ways in which perpetrators may be held to account for the various roles they play in the commission of substantive crimes. The mens rea of the principal perpetrator is not required for modes such as Joint Criminal Enterprise, ensuring that the use of an autonomous weapon in the commission of a crime will not prevent the orchestrators or masterminds from being held to account.


Unfortunately, an accountability gap exists for corporations that create and sell such weaponry.


Chapter One. What is an Autonomous Weapon?

This chapter will begin with an overview of the international obligations States have in developing autonomous weapons and why research such as this is important. The chapter will continue with an explanation of what constitutes an autonomous weapon. To understand autonomy, the human decision-making process used in military operations must first be understood, as some engineers are trying to replicate that process in robots. This chapter will then move on to discuss the autonomy spectrum, distinguishing between automated systems and autonomous systems.2

Research Relevance

The scientific community is split on the possibility of fully autonomous weapons with human-like cognitive function.3 Despite this uncertainty, the United States is investing heavily in autonomous weapon development; in mid-2000 the United States Senate Armed Services Committee added $246.3 million to speed the development of unmanned weapons systems,4 and some of this research undoubtedly includes autonomous weapons. The United States is not alone: many other nations, for example the United Kingdom, China and Germany, are investing significant amounts into autonomous weapons research.5

2 A word of thanks must be extended to Ms Fiorina Piantedosi for guiding me through the technical complexities and enabling me to have a sufficient understanding of autonomy to write this dissertation.
3 Human Rights Watch, 'Losing Humanity: The Case against Killer Robots' (2012) International Human Rights Clinic 1, 29.
4 Thomas K Adams, 'Warfare and the Decline of Human Decisionmaking' (2001) 31(4) Parameters: US Army War College 57, 57.
5 Michael C Horowitz, 'The Looming Robotics Gap: Why America's Global Dominance in Military Technology is Starting to Crumble' (5 May 2014) Foreign Policy <http://www.foreignpolicy.com/articles/2014/05/05/the_looming_robotics_gap_us_military_technology_dominance>.


Due to information barriers such as classified information, it is impossible to know with any certainty the current level of autonomous weapon development, although Honda has revealed a new helper robot6 with independent thought capacity, suggesting that autonomous weapon development may be further advanced than has been disclosed. This dissertation is not about the feasibility of autonomous weapons development. Instead this paper takes the stance that such development is possible and focuses on the notion that,

'[I]f we wish to allow systems to make independent decisions without human intervention, some considerable work will be required to show how such systems will operate legally'.7

Additionally,

'[T]he international community urgently needs to address the legal, political, ethical and moral implications of the development of lethal robotic technologies'.8

The United Nations has recently heeded this call and is to debate the use of killer robots.9

States that are party to the Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts (Protocol I), opened for signature 8 June 1977, 1125 UNTS 3 (entered into force 7 December 1978) ('Additional Protocol I'), and that are working on the development of autonomous weapons, have an international legal obligation to review the legality of their progress.

6 Tim Moynihan, 'Watch the Astounding Dexterity of Honda's Dancing Humanoid Robot' (19 April 2014) Wired <http://www.wired.com/2014/04/honda-asimo/>.
7 Ministry of Defence, 'The UK Approach to Unmanned Aircraft Systems' (Joint Doctrine Note 2/11, The Development, Concepts and Doctrine Centre, 30 March 2011) 5-2.
8 Patrick Worsnip, 'UN Official Calls for Study of Ethics, Legality of Unmanned Weapons', Washington Post (Washington, USA), 24 October 2010.
9 BBC, 'Killer Robots to be Debated at UN', BBC News Technology, 9 May 2014 <http://www.bbc.com/news/technology-27343076>.

Article 36 states:

'In the study, development, acquisition or adoption of a new weapon, means or method of warfare, a High Contracting Party is under an obligation to determine whether its employment would, in some or all circumstances, be prohibited by this Protocol or by any other rule of international law applicable to the High Contracting Party.'10

Incidentally, of the States that are not party to Additional Protocol I,11 some view the review process as customary law whilst others deem it to be best practice.12

Autonomous robots are certainly at the conception stage. This is the point at which the International Committee of the Red Cross (ICRC) believes the obligation outlined in article 3613 begins, and it continues as the weapon moves into development.14

Component review is insufficient. The Judge Advocate General's Office argued a new Predator drone did not require review because, when the drone was used as a surveillance tool, it underwent and passed the review. The Hellfire missile with which the Predator drone was now armed had also passed review. However,

'an existing weapon that is modified in a way that alters its function, or a weapon that has already passed a legal review but that is subsequently modified…'

must be reviewed again.15 When reviewing unarmed robotics for military use, for example a surveillance drone, the review should consider the potential of the robot to carry weaponry.16

10 Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts (Protocol I), opened for signature 8 June 1977, 1125 UNTS 3 (entered into force 7 December 1978).
11 Ibid.
12 Human Rights Watch, above n 3, 21.
13 Article 36, Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts (Protocol I), opened for signature 8 June 1977, 1125 UNTS 3 (entered into force 7 December 1978).
14 Kathleen Lawand, Robin Coupland and Peter Herby, A Guide to the Legal Review of New Weapons, Means and Methods of Warfare: Measures to Implement Article 36 of Additional Protocol I of 1977 (International Committee of the Red Cross, November 2006) 1, 23 <http://www.icrc.org/eng/assets/files/other/icrc_002_0902.pdf>.

The purpose of the review is to determine whether the weapon is prohibited under international law.17 To date there are no treaties which ban autonomous weapons. However, some components may be banned: for example, a drone could not carry cluster munitions,18 nor could a robot lay mines, as that may offend treaties.19 Customary law should also be considered, as it has the capacity to bind States to the obligation of review even if they are not parties to Additional Protocol I,20 which is largely considered to be customary.

15 Ibid.
16 Human Rights Watch, above n 3, 23.
17 Ibid 24.
18 Convention on Cluster Munitions, opened for signature 3 December 2008, 2688 UNTS (entered into force 1 August 2010). Providing the State has either ratified this convention or it is deemed customary, which it does not appear to be.
19 Ibid.
20 Article 36, Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts (Protocol I), opened for signature 8 June 1977, 1125 UNTS 3 (entered into force 7 December 1978).


What is an Autonomous Weapon?

The Military Decision Making Process.

The military continues to use the 'Observe, Orient, Decide, Act' loop (OODA loop) to describe human decision-making in the combat operations process.21 Admittedly it is oversimplified, and much of the process is performed subconsciously by humans. When humans make decisions, the first task is observation of the world around them: data is gathered pertaining to their environment. The next step is orientation, in which the data is interpreted. Following orientation is the decision, working out how to act by weighing up the potential courses of action based on the collected and interpreted data, and then finally the action.22
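The four-stage loop described above can be sketched in code. The following is an illustrative sketch only: the function bodies and the toy 'contact' data are invented for demonstration and do not represent any real targeting system.

```python
# Illustrative sketch of the OODA cycle; all data and logic are hypothetical.

def observe(environment):
    """Observe: gather raw data about the surroundings."""
    return {"contacts": environment.get("contacts", [])}

def orient(data):
    """Orient: interpret the gathered data."""
    return [c for c in data["contacts"] if c.get("hostile")]

def decide(threats):
    """Decide: weigh the possible courses of action."""
    return "engage" if threats else "hold"

def act(decision):
    """Act: carry out the chosen course of action."""
    return decision

def ooda_cycle(environment):
    """Run one pass through the four stages."""
    return act(decide(orient(observe(environment))))

print(ooda_cycle({"contacts": [{"id": 1, "hostile": False}]}))  # hold
```

Even this toy version shows why discussions speak of humans 'in', 'on' or 'out of' the loop: a human may supply any one of the four stages, or none of them.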

Technological advances have meant that in some instances a machine may contribute to or replace parts of the OODA loop; indeed, machines create and develop more information than humans are truly capable of absorbing. Discussions about autonomous weapons often refer to humans in, on or out of 'the loop'. It is the OODA loop to which they refer.

The OODA decision-making loop provides the basis for leading engineers working on autonomy, such as Thomas Sheridan.23 Sheridan proposes a model that follows the OODA loop in the development of autonomous weapons. The loop would consist of Information Acquisition, Information Analysis, Decision Selection and Action Implementation. This effort by developers is clearly designed to emulate the human decision-making process.24

The discussion must now move to precisely what constitutes an autonomous weapon.

21 William C Marra and Sonia K McNeil, 'Understanding "The Loop": Regulating the Next Generation of War Machines' (2013) 36(3) Harvard Journal of Law and Public Policy 1139, 1146.
22 Ibid 1144.
23 Ibid 1145.
24 Ibid 1146.


Autonomy Incorrectly Defined

The United States Department of Defence has defined an autonomous weapon system as:

'A weapon system that, once activated, can select and engage targets without further intervention by a human operator. This includes human-supervised autonomous weapon systems that are designed to allow human operators to override operation of the weapon system, but can select and engage targets without further human input after activation.'25

This definition is fundamentally flawed as it fails to identify the way in which the weapon selects and engages targets. Weapons that are autonomous and weapons that run via automation both have the ability to select and engage targets without a human in the loop.26 It is the process, or the degree of autonomy afforded to a weapon, that is of greatest concern to IHL, rather than the ability of a weapon to perform various tasks with humans out of the loop.

Automation

Understanding autonomy begins with automation. Automated systems fundamentally differ from autonomous systems because they are unable to make independent decisions; instead they work on an 'if/then' sequence. A simplified example is a robot that can be programmed so that 'if' the green cube27 falls on the floor, 'then' it picks it up. In context, Israel's Iron Dome operating near the Gaza border works on such a program: 'if' the Iron Dome's radar identifies short-range rockets or 155mm artillery shells, 'then' it fires its Tamir interceptor missiles in response.28 Similarly, 'if' the Sea Whiz detects an incoming anti-ship missile, 'then' it will engage and terminate the anti-ship missile.29

25 Michael Schmitt and Jeffrey S Thurnher, 'Out of the Loop: Autonomous Weapon Systems and the Law of Armed Conflict' (2012-2013) 4 Harvard National Security Journal 213, 235.
26 Marra, above n 21, 1150.
27 Famous programmer Karl Sims used the possession of a green cube to enable his evolving creatures to compete. See Karl Sims, Evolved Virtual Creatures, YouTube <http://www.youtube.com/watch?v=F0OHycypSG8>.

There is no reason for IHL to be concerned by weaponry working via automated programmes. Such weaponry can process information more quickly and accurately than the human brain; indeed, such weaponry should be embraced.
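The 'if/then' sequencing described above amounts to a fixed lookup from stimulus to response. The sketch below is purely illustrative; the threat labels and response names are hypothetical and only loosely echo the Iron Dome example.

```python
# Hypothetical 'if/then' rule table for an automated defensive system.
RULES = {
    "short_range_rocket": "launch_interceptor",
    "artillery_shell": "launch_interceptor",
}

def automated_response(detected_threat):
    # 'if' the radar classification matches a rule, 'then' act on it;
    # anything outside the pre-programmed table yields no action at all.
    return RULES.get(detected_threat, "no_action")

print(automated_response("short_range_rocket"))  # launch_interceptor
print(automated_response("civilian_aircraft"))   # no_action
```

The system never decides anything: every response was fixed by a human at programming time, which is why such weaponry raises no special concern for IHL.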

On the 12th of April 1999, North Atlantic Treaty Organisation (NATO) forces legally identified a bridge in the Grdelica Gorge as a legitimate military objective.30 NATO pilots engaged the target and fired; upon firing, the radar used by the pilots identified a civilian passenger train emerging onto the bridge.31 The pilot was not able to stop the bomb at this stage. Had the bomb been fitted with a hypothetical 'if' vehicle movement is detected, 'then' do not detonate function, civilian lives would have been saved, enabling greater adherence to IHL by NATO. In this particular incident the civilian passenger train was hit by two bombs due to human limitations that automated systems, or 'if/then' programming, could overcome.

Automated systems are limited by their data capability: a weapon system may encounter an 'if' but take too long, or jam up, trying to find the 'then'. Increasingly, systems are expected to perform tasks of greater complexity, and the limitations of automation have led developers towards autonomous weapons. It is this development that should be of great concern to IHL.

28 Human Rights Watch, above n 3, 10.
29 Alton Parrish, 'Gee Whiz! Sea Whiz: 4500 Rounds Per Minute For Close In Naval Fighting' (10 September 2012) Before It's News <http://beforeitsnews.com/science-and-technology/2012/09/gee-whiz-sea-whiz-4500-rounds-per-minute-for-close-in-naval-fighting-2467644.html>.
30 ICTY, Final Report to the Prosecutor by the Committee Established to Review the NATO Bombing Campaign Against the Federal Republic of Yugoslavia (14 May 1999) International Criminal Tribunal for the former Yugoslavia <http://www.icty.org/sid/10052#pagetop>.
31 Ibid.


Autonomy Correctly Defined

What then is an autonomous weapon? It is important to note that autonomy sits at one end of a scale; at the other end is artificial intelligence. Perhaps the most defining factor illustrating autonomy is the capacity of the machine to establish and pursue its own goals, thus attempting to emulate human processes rather than simply replicating them via automation.32 In short, an autonomous robot has independent decision-making capabilities rather than a simple 'if/then' sequence. Determining where a particular weapon sits on the autonomy scale begins with asking three questions.

The first question is how often the weapon requires human intervention to complete the given mission.33 The higher the level of independence the machine has, the greater its level of autonomy. Consider the robot picking up the green cube: if the cube is knocked under the table rather than falling straight down as normal, can the robot move independently, without human intervention, to retrieve it? If so, it is displaying a relatively high level of autonomy in this respect.

The second question asks how much tolerance the machine has for environmental uncertainty, or how adaptable the machine is to different environments.34 How would our robot respond if we placed the green cube in the next room? A robot higher up the autonomy scale may work out that it cannot pass through walls and must locate an open door to retrieve the green cube; less autonomous robots may simply lose the green cube at this stage, and without human intervention the mission would fail. Those with a high level of autonomy may learn that a door enables passage, and the next time the green cube is lost the robot will seek a door, having learnt that this is the best way to pursue the goal of picking up the green cube.

32 Marra, above n 21, 1150.
33 Ibid 1152.
34 Ibid 1153.

The third question asks how assertive the robot is.35 Does the machine have the capacity to change its operating plan to complete the mission without human intervention? Will the robot use whatever means it deems necessary to achieve the end result designated by humans? Returning to our robot, we have told it that keeping the green cube on the table is the mission. Our robot identifies that another robot knocks the green cube off the table and moves it from room to room. A particularly assertive robot will attack and destroy the robot moving the cube, thus completing the mission of maintaining the green cube on the table. A more chilling example: the robot could identify that it was a human, not another robot, and use lethal force against that human without emotion or care. To a robot there is no emotional distinction between a living being and another robot; the green cube is just as important as the life of a child, because a robot does not have the emotion to differentiate between a green cube and a life.

Robots may become a threat to civilians as they increase in mobility and range.36 South Korea employs a sentry robot that has heat and motion sensors. The robot can detect people from up to two miles away and can act without human intervention.37 The weapon carries a 12.7mm (.50 calibre) machine gun with a kill zone of about 1 to 1.5 kilometres.38 Despite the lack of human intervention, this is still advanced automated technology: 'if' I detect a human in this range, 'then' I fire.

35 Ibid 1154.
36 Human Rights Watch, above n 3, 13.
37 Ibid 14.
38 Ibid 15.


Clearly, a large aspect of robotic development centres on robots learning from their experiences;39 for example, our robot may have learnt that the door is the best way to pass between rooms. Such learning ability enables a robot to respond to novel situations not previously envisaged by its developer.40 The concern for IHL is the level of predictability regarding what the robot will learn. A fully predictable robot is simply an automated robot. Autonomous weapons are not at a point where they are ready to hunt and kill the enemy with humans out of the loop.41 Commentators such as Michael Schmitt and Jeffrey Thurnher suggest that, whilst malfunction is certainly feasible as with any technology, the idea that robots could go rogue should be left to Hollywood.42 Other academics are not so optimistic, suggesting that, '[u]nfortunately, a full awareness of the risks from autonomous robots may be impossible',43 because,

'[w]hile some systems might merely enact pre-programmed moral rules or principles, autonomous robotic agents might be capable of formulating their own moral principles, duties, and reasons, and thus make their own moral choices in the fullest sense of moral autonomy'.44

The capacity of autonomous weapons to make the decision to execute lethal force is certainly fraught with legal and ethical issues, and as will be evidenced in the next chapter, IHL was not designed for the use of autonomous weapons, as the law relies on distinctly human traits.

39 Gary E Marchant et al, 'International Governance of Autonomous Military Robots' (2011) 12 The Columbia Science and Technology Law Review 272, 275-284.
40 Ibid.
41 Schmitt, above n 25, 241.
42 Ibid 242.
43 Marchant et al, above n 39.
44 Peter Asaro, 'How Just Could a Robot War Be?' (2008) Proceedings of the 2008 Conference on Current Issues in Computing and Philosophy 50, 52.


Chapter Two. Humans vs Autonomous Machines

Proponents of autonomous weapons argue that the precision and capacity of robots greatly exceed those of humans. There is no doubt this is the case; however, the law requires so much more. This chapter will explore the inherently human components of IHL which cannot be emulated by a machine.

The Balance Between War and Humanity

The purpose of IHL is to establish a balance between humanitarian concerns and the military needs of States.45 Article 31 of the Vienna Convention on the Law of Treaties, opened for signature 23 May 1969, 1155 UNTS 331 (entered into force 27 January 1980), informs us that a treaty should be interpreted, inter alia, 'in the light of [the treaty's] object and purpose'.46 Thus it is possible to infer that humans must be involved when decisions controlled by IHL are being made, as they provide a unique human factor which cannot be emulated by a machine. The removal of humans from the decision-making process is the problem, rather than the introduction of robotics.47

Henri Meyrowitz identified two senses of humanity: the first understood humanity to be the 'defining characteristic of the human race';48 the second, more on point, suggested humanity meant 'a feeling of compassion towards other human beings'.49 Decreasing the human role in decision-making and replacing it with robots will see respect for humanity decrease.50

45 International Committee of the Red Cross, What is International Humanitarian Law? (July 2004) Advisory Service on International Humanitarian Law <http://www.icrc.org/eng/assets/files/other/what_is_ihl.pdf>.
46 Article 31, Vienna Convention on the Law of Treaties, opened for signature 23 May 1969, 1155 UNTS 331 (entered into force 27 January 1980). Also see art 2.
47 Vik Kanwar, 'Post-Human Humanitarian Law: The Law of War in the Age of Robotic Weapons' (2011) 2 Harvard National Security Journal 616, 618.
48 Ibid.

There are, however, proponents of autonomous weapons. Gordon Johnson of the Joint Forces Command at the Pentagon is one, having stated: 'The lawyers tell me there are no prohibitions against robots making life or death decisions.'51 There may not be any international instruments prohibiting such use, nor is it spelled out in customary law. However, the law inherently requires humans to make life and death decisions, either at the automation programming level or during the OODA loop process in the heat of battle.

Human Weakness and Proponents of Autonomous Weapons

The 'Asian disease' experiment52 suggests that when a choice is framed in terms of gains, humans are risk averse. Offered a risk-free option in which 200 people will definitely be saved, as opposed to a risky option in which there is a one-third chance that 600 people will be saved and a two-thirds chance that no one will be saved, people will generally choose the risk-free option. When a decisional outcome is framed in terms of losses, humans instead prefer risk: when the scenario changes to 400 people dying for sure, or a one-in-three chance that no one will die and a two-in-three chance that 600 people will die, human decision-makers will choose the risk that no one will die. Machines do not suffer such limitations, and they are not influenced by how a question is framed.
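The gain and loss frames in the Tversky and Kahneman scenario are numerically identical in expectation, as a short calculation makes plain:

```python
# Expected outcomes in both frames of the 'Asian disease' problem.

# Gain frame: 200 saved for certain vs a 1/3 chance that all 600 are saved.
certain_saved = 200
risky_saved = 600 / 3          # expected number saved: 200.0

# Loss frame: 400 die for certain vs a 2/3 chance that all 600 die.
certain_die = 400
risky_die = 2 * 600 / 3        # expected number of deaths: 400.0

# Every option leaves the same expected 200 survivors out of 600,
# yet humans choose differently depending on the frame.
print(certain_saved == risky_saved)  # True
print(certain_die == risky_die)      # True
```

A machine comparing expected values would treat all four options as equivalent, regardless of wording.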

                                                                                                               49Ibid.  50Ibid.  51  Noel  Sharkey,  ‘Cassandra  or  False  Prophet  of  Doom:  AI  Robots  and  War’  (2008)  23(4)  IEEE  Intelligent  Systems  14,15.  52  A  Tversky  and  D  Kahneman  ‘The  framing  of  decisions  and  the  psychology  of  choice’  (1981)  211(4481)  Science  453,  453-­‐458.  

Autonomous  Weapons,  are  they  Legal?     a1630695    

  18  

Significant bias also exists in human decision-making. Humans use a system of cognitive processing that ignores some information, enabling us to cope with the massive amount of information surrounding us; this is known as heuristics.53 Additionally, humans encounter certain biases in decision-making, such as availability, hindsight and fundamental attribution biases.54 These shortcuts and biases can lead to decisional errors. Machines are not susceptible to such biases and can process information more quickly, and often more accurately, avoiding human psychological problems.55

Technology continues to develop rapidly, and information-based systems are producing data overload, making it difficult if not impossible for humans to intervene directly in decision making.56 For example, giving humans the capacity to veto a machine decision can be problematic. Decisions must often be made in a split second, and through automation bias humans will usually trust the machine.57 In such a scenario humans do not really have the capacity to override the machine’s decision.

Autonomous weapons undoubtedly have additional advantages, as Gordon Johnson

pointed out,

‘They don’t get hungry. They’re not afraid. They don’t forget their orders. They

don’t care if the guy next to them has just been shot.’58

53 Gerd Gigerenzer and Wolfgang Gaissmaier, ‘Heuristic Decision Making’ (2011) 62 Annual Review of Psychology 451.
54 Stephen Choi and AC Pritchard, ‘Behavioural Economics and the SEC’ (2003) 56(1) Stanford Law Review 1, 27–33. Other biases include overconfidence and groupthink.
55 Marchant et al, above n 39, 280.
56 Adams, above n 4, 2.
57 Human Rights Watch, above n 3, 13.
58 Sharkey, above n 51.


Unless development is restricted, autonomous weapons may take over the more dangerous jobs, saving military lives.59 Autonomous robots can be self-sacrificing, and proponents argue they will act without negative emotions such as fear or anger.60 At first glance, autonomous weapons do appear to offer valuable assistance to military operations. Unfortunately, the disadvantages far outweigh these advantages.

Humanity – an Inherent Aspect of IHL.

Combatant Human Emotion.

Advocates of autonomous weapons suggest that human emotion leads to error and violation of the law, but in fact the opposite appears to be the case.

‘One of the greatest restraints for the cruelty in war has always been the natural inhibition of humans not to kill or hurt fellow human beings. The natural inhibition is, in fact, so strong that most people would rather die than kill somebody…Taking away the inhibition to kill by using robots for the job could weaken the most powerful psychological and ethical restraint in war. War would be inhumanely efficient and would no longer be constrained by the natural urge of soldiers not to kill.’61

Human emotion acts as a safeguard for the protection of life in times of war, as humans find it difficult to kill their fellow humans. Psychological studies reveal that

59 Marchant et al, above n 39, 275–276.
60 Marchant et al, above n 39, 280.
61 Armin Krishnan, Killer Robots: Legality and Ethicality of Autonomous Weapons (Ashgate Publishing, 2009) 130.


‘the average and healthy individual…has such an inner and usually unrealized resistance towards killing a fellow man that he will not of his own volition take life if it is possible to turn away from that responsibility’.62

An apt example was provided by the United States Army Air Corps during the Second World War. It was discovered that only one percent of the enemy aircraft destroyed could be attributed to being shot down by United States fighter pilots. Some believe this was due to fear; others more accurately identify that when the time to kill eventuated the pilots,

‘looked into the cockpit at another man, a pilot, a flier, one of the brotherhood of the air, a man frighteningly like themselves; and when faced with such a man…could not kill him’.63

War veteran Lieutenant Colonel Dave Grossman believed that the ‘resistance to kill one’s fellow man is…strong, and it gives us cause to believe that there just may be hope for mankind after all’.64 The introduction of autonomous weapons onto the battlefield will rapidly erode such hope.

A killer robot does not feel restraining emotions, and it will have no problem killing civilians or combatants. To an autonomous weapon, killing is no different from navigating its way around a building: it feels nothing. Human emotion enables a combatant to show characteristics such as compassion, which in turn protects life. For example, a robot

62 Dave Grossman, On Killing: The Psychological Cost of Learning to Kill in War and Society (Hachette Book Group, revised ed, 2009) 30.
63 Ibid 31.
64 Ibid 40.


may shoot a child and the attack may be lawful. A combatant, on the other hand, may find a more compassionate or merciful solution because of the emotion felt. Alternatives may include the capture of the child, or advancement by the combatant in a different direction, keeping the child alive.65 The law inherently relies on this restraining emotion, without which additional law would be required to ensure the safeguards currently provided by human combatants.

Commentators have suggested that war waged via autonomous weapons would become no different from a video game,66 in which killing is unrestrained because human emotion is lacking. Presently humans are in the OODA loop and operate automated systems, such as drones, from a distance. A United States Air Force pilot who operates Predator drones says,

‘People think we’re sitting here with joysticks playing a video game, but that’s simply not true…These are real situations and real-life weapons systems. Once you launch a weapon, you can’t hit a replay button to bring people back to life’.67

Indeed, a psychologist and professor of cognitive science suggests that the detail and level of monitoring prior to a drone attack mean combatants are more emotionally affected, as they see the scene before and after the detonation of the weapon clearly on a computer screen.68 Removing this emotional restraint by replacing pilots with autonomous weapons, so that only the weapon ‘sees’ the screen before and after detonation, would be emotionless

65 Human Rights Watch, above n 3, 38.
66 Ibid 40.
67 Denise Chow, Drone Wars: Pilots Reveal Debilitating Stress Beyond Virtual Battlefield (5 November 2013) Live Science <http://www.livescience.com/40959-military-drone-war-psychology.html>.
68 Ibid.


and would certainly mimic a video game in which the loss of human life is of no consequence. IHL relies on these emotional restraints.

Public Emotion.

The development of autonomous weapons may also lower the political barriers to entry into war. The citizens of most nations are usually against going to war when it can be avoided, and almost always against any war deemed unjust.69 Because of these attitudes, when democratic nations opt to partake in armed conflict a certain amount of war propaganda is required.70 Technologies that decrease the risk to a nation’s military personnel are popular propaganda choices, and this often leads to significant funding of such technologies. Autonomous weapons are deemed a ‘safe’ method of fighting.71

NATO’s conduct during the Kosovo conflict in 1999 illustrates this response to political pressure and the idea of ‘safe’ fighting: NATO implemented a zero-casualties policy.72 NATO jets flew at heights out of enemy missile range, ensuring pilots were not shot down.73 Arguably this policy led to increased civilian casualties74 because the pilots were too high to see their targets adequately. NATO attacks involving civilian casualties were not found to violate the principles of distinction or proportionality.75 It should be noted, however, that during this NATO campaign humans were clearly in the OODA loop, so emotional restraint was present. The situation may become very different if humans are outside the

69 Asaro, above n 44, 54.
70 Ibid.
71 Ibid.
72 ICTY, above n 30.
73 Asaro, above n 44.
74 Ibid.
75 ICTY, above n 30.


loop and autonomous weapons are utilised to make a war more palatable to a nation’s population.

Furthermore, decreasing combatant casualties may be domestically popular, but for how long? At what point does a war become one-sided killing, with no risk to one side and only economic cost?76 There is a difference between decreasing risk and removing combatants from any danger at all.77 Political barriers must remain intact to ensure wars are only ever fought when there is no other feasible option.

Additional Concerns

Autonomous weapons wielded by a repressive autocrat could be catastrophic for civilians, because there would be no emotional constraint on killing. Combatants find it exceptionally difficult to fire on their own people for illegitimate reasons; in such cases even the most hardened combatant can turn on their leader. Such is not the case with autonomous robots: again, the lack of human emotion endangers more life than it can ever protect.78

The battle area could also be expanded, as robots can travel and strike further afield.79 When an autonomous weapon is deployed as part of a hybrid human-machine mix, the robot can independently and objectively monitor behaviour on the battlefield80 and report any perceived violations of IHL to superiors. Some advocates for autonomous weapons go as

76 Christof Heyns, Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, UN Doc A/HRC/23/47/Add.4 (23 April 2013), Agenda Item 3, 12.
77 Ibid.
78 Human Rights Watch, above n 3, 38.
79 Marchant et al, above n 39, 275–276.
80 Ibid 280.


far as to suggest that if a robot is good enough it may be able to refuse an order;81 such a suggestion offends humanity, as it elevates the status of a robot above that of a combatant.

Robotic Behaviour in Humans that Leads to IHL Violations.

Certain robot-like emotional states can be identified in people who violate IHL. Violations of IHL have been found to occur when a combatant experiences moral disengagement from the enemy, which often happens when the combatant identifies themselves as the victim of the enemy.82 A robot will not experience feelings of victimisation, and it will never experience moral disengagement, because it is purely amoral.

Combatants have also been found to violate the law when they become part of a group and locate their self-identity within that group rather than as an individual. Group membership serves to limit their feelings of personal responsibility and the value judgements they make about their actions.83 Without humanity, the notion of personal responsibility, or the need to assess whether a particular decision was right or wrong, is not possible; autonomous weapons could not perform this task because they lack moral conviction.

The enemy is often dehumanised in training, and the combatant accepts this dehumanisation as legitimate.84 Killer robots by their very nature feel nothing; every target of a robot is dehumanised. On the basis that these characteristics within humans

81 Ibid 281.
82 Daniel Munoz-Rojas and Jean-Jacques Fresard, ‘The Roots of Behaviour in War’ (2004) 86(853) IRRC 189, 198.
83 Ibid 194.
84 Ibid.


have been shown to lead to violations of IHL, robots on the battlefield are even more concerning.

These human traits, or the lack thereof (moral disengagement, diminished personal responsibility, lack of value judgement and a dehumanised enemy), allegedly lead to violations of IHL. Programming an autonomous weapon with these uniquely human traits is impossible, and it is incomprehensible that autonomous weapons, with the capacity to think for themselves, to target and to fire, should lack such fundamental human restraints. Violation of IHL by autonomous robots appears inevitable, and the idea that they will one day be on the battlefield is simply inconceivable.

Many experts share these concerns. Last year over 270 artificial intelligence experts, robotics experts and scientists from 37 countries submitted an open letter requesting that the development and deployment of, inter alia, autonomous weapons be prohibited.85 Robotics expert Noel Sharkey states, ‘We don’t want [unmanned aerial vehicles] selecting targets and working out how best to carry out an attack’.86 The ICRC warns that ‘[a]ll predictions agree that if man does not master technology, but allows it to master him, he will be destroyed by technology’.87 The Special Rapporteur reiterates this warning, suggesting that immediate action is required by way of transparency, accountability and the rule of law, and that these items must be placed on the agenda.88

85 Noel Sharkey, Computing Experts from 37 Countries Call for Ban on Killer Robots (16 October 2013) International Committee for Robot Arms Control <http://icrac.net/2013/10/computing-experts-from-37-countries-call-for-ban-on-killer-robots/>.
86 Ibid.
87 International Committee of the Red Cross, Commentary on the Additional Protocols of 8 June 1977 to the Geneva Conventions of 12 August 1949 (Geneva: Martinus Nijhoff Publishers, 1987).
88 Heyns, above n 76.

Autonomous  Weapons,  are  they  Legal?     a1630695    

  26  

This chapter has illustrated the benefits of uniquely human emotions and the role these sentiments play in armed conflict. The story is not complete: another reason for violations of the law is that, whilst combatants may have knowledge of the law, when it comes to its application in real-life situations, respect for and understanding of the law may become blurred.89 Autonomous weapons are likely to suffer similar issues, as the application of IHL often requires a subjective analysis of which a robot is incapable; this is discussed further in the next chapter.

89 Munoz-Rojas, above n 82, 197.


Chapter 3. Targeting Law

Targeting law, an aspect of IHL, consists of two fundamental principles: distinction and proportionality. Both require a subjective human element to ensure they are not breached, which is why autonomous weapons will not be able to adhere to IHL to the same degree combatants can.

Distinction

Customary Law

The ICRC considers the principle of distinction to be customary law90 because many military manuals identify the principle, various States have passed legislation ensuring adherence, and several official statements have been made acknowledging and adopting the rule.91 The International Court of Justice identified distinction as one of the ‘cardinal’ and ‘intransgressible principles of international customary law’.92 Once a principle is identified as customary, all States are bound to comply with it; thus States must comply with the principle of distinction.

The Codification of Distinction.

The principle of distinction was codified in 1977 with the adoption of Additional Protocol I.93 Distinction is a fundamental rule: civilians and civilian objects must be distinguished from military objectives, and civilians and their objects must not be made the object of attack.94 The concept is restated in Article 51(2)95 and Article 52(2).96 In practice this means the aiming point of an attack must be a military target, although some collateral damage is acceptable.97 The Rome Statute holds violation of the principle of distinction to be a war crime.98

Combatants and Civilians

The International Criminal Tribunal for the former Yugoslavia (ICTY) asserts the principle of distinction, stating that a ‘person shall not be made the object of attack when it is not reasonable to believe…that the potential target is a combatant’.99

Autonomous-systems expert Dr Peter Asaro suggests it would be extremely difficult for an autonomous weapon to adhere to the principle of distinction:

‘The relevance of the civilian-combatant distinction to robotic soldiers is that if they are to be autonomous in choosing their targets, they will have to be able to reliably distinguish enemy combatants from civilians…it is extremely difficult to identify particular people, or even types of people, much less to categorize them reliably into groups such as “friend” or “foe,” the boundaries of which are often poorly defined and heavily value-laden.’100

94 Ibid Article 48.
95 Ibid.
96 Ibid.
97 Krishnan, above n 61, 93.
98 Rome Statute of the International Criminal Court, opened for signature 17 July 1998, 2187 UNTS 90 (entered into force 1 July 2002) Art 8(2)(b)(ii).
99 Prosecutor v Galic (Trial Judgment) (International Criminal Tribunal for the Former Yugoslavia, Trial Chamber, IT-98-29-A, 5 December 2003) [56].
100 Asaro, above n 44, 63.


The following scenario illustrates the types of failures autonomous weapons would make in trying to apply the principle of distinction. There may be children playing with toy, or even real, guns near a soldier. A robot could easily mistake the children for combatants, with disastrous consequences. The intent of the children is easily recognised by a human soldier, because humans understand each other’s emotional states in ways robots cannot. Indeed, in this situation the combatant may empathise with the child, remembering his or her own youth or children he or she is presently close to. The ICTY asks what is ‘reasonable’101 to believe; only a person can answer this question. There is no such thing as a reasonable or unreasonable robot. Autonomous weapons acting outside acceptable parameters set by humans are simply broken, and personification should not be attempted.

Direct Part in Hostilities

Article 51(3) of Additional Protocol I102 provides that civilians who take ‘direct part in

hostilities’ are not protected by the principle of distinction.103 The ICTY endorsed the

exception,

‘The protection of civilians and civilian objects provided by modern international law may cease entirely or be reduced or suspended…if a group of civilians takes up arms…and engage in fighting against the enemy belligerent, they may be legitimately attacked by the enemy belligerent…’104

101 Prosecutor v Galic (Trial Judgment) (International Criminal Tribunal for the Former Yugoslavia, Trial Chamber, IT-98-29-A, 5 December 2003) [56].
102 Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts (Protocol I), opened for signature 8 June 1977, 1125 UNTS 3 (entered into force 7 December 1979).
103 Ibid Art 51(3).
104 Prosecutor v Kupreskic (Trial Judgment) (International Criminal Tribunal for the Former Yugoslavia, Trial Chamber, IT-95-16-T, 14 January 2000) 205 [522]-[523].


Modern warfare increasingly takes place in urban surrounds, amongst the civilian population.105 ‘Direct participation in hostilities’, rather than a distinguishing uniform, often identifies combatants.106 The meaning of ‘direct participation in hostilities’ is uncertain and there is presently no consensus,107 despite the ICRC drafting a set of controversial guidelines, which in summary suggest it means engaging in, or directly supporting, military operations.108

Despite this uncertainty, ascertaining intent would play a key role in complying with the direct-participation-in-hostilities aspect of distinction. An autonomous weapon would struggle to determine direct participation in hostilities because of its inability to determine the intent of an individual. Paul Bello suggests that,

‘[i]n context where we cannot assume that everyone present is a combatant, then

we have to figure out who is a combatant and who is not. This frequently requires

the attribution of intention’.109

Humans use an emotional state to interpret intention, but

‘a system without emotion…could not predict the emotions or action of others based on its own states because it has no emotional states’.110

105 Consider the wars in Iraq and Afghanistan and the current Ukraine crisis, all of which have seen fighting in urban settings.
106 Human Rights Watch, above n 3, 30.
107 Ibid.
108 Nils Melzer, Interpretive Guidance on the Notion of Direct Participation in Hostilities under International Humanitarian Law, International Committee of the Red Cross, 22-24 <http://www.icrc.org/eng/assets/files/other/icrc-002-0990.pdf>.
109 Human Rights Watch, above n 3, 31.
110 Ibid.


Clearly autonomous weapons will struggle with even the most basic forms of distinction. The principle only becomes more complex from here, further supporting the conclusion that autonomous weapons have no role in modern warfare, because they cannot adhere to basic IHL principles.

Dual Use Objects

The principle of distinction also applies to dual use objects, that is, objects used by both civilians and the military. Subjective human interpretation is required to ensure respect for the law, and the following example illustrates just how inept autonomous weapons would be at applying the principle of distinction to a slightly more complex decision.

NATO claimed its bombing of the Serbian TV and Radio Station in Belgrade on 23 April 1999 was directed at a legitimate military target because it ‘disrupted and degraded the C3 (Command, Control and Communications) network’. The report found the station was a legitimate military target on the C3 grounds.111

However, a NATO military spokesperson added that NATO needed to stop the propaganda of the Milosevic regime, and this was done through the bombing of the civilian television and radio network.112 British Prime Minister Tony Blair endorsed the bombing, stating that the civilian network,

111 ICTY, above n 28, [75].
112 Ibid [74].


‘is the apparatus that keeps him [Milosevic] in power and we are entirely justified

as NATO allies in damaging and taking on those targets’.113

The report into NATO’s actions stated that disrupting propaganda was, in this instance, insufficient to constitute a military advantage; had there been no C3 justification, the attack would have been illegal. The report went on to note that propaganda of the kind aired in Rwanda to facilitate genocide might have made the radio station a legitimate military target.114

An autonomous weapon will struggle to understand these arguments and distinctions because they are inherently human, requiring a thought process dealing in shades of grey rather than the black and white programming of a machine. Determining the meaning of radio broadcasts in a foreign language, and then ascertaining the difference between an attack on civilian morale and an attack on a broadcast designed to incite violence, would be extremely challenging, and it would be very easy for an autonomous weapon to make an error. What the law inherently requires is a human’s understanding of the language and the messages conveyed; a human would have to remain in the loop to ensure compliance, as it would be reckless to let a killer robot make such decisions.

113 ICTY, above n 28, [74].
114 Ibid [76].


Cultural Property

Customary law protects cultural property. ICRC Customary IHL Rule 39115 provides:

‘The use of property of great importance to the cultural heritage of every people

for purposes which are likely to expose it to destruction or damage is prohibited,

unless imperatively required by military necessity.’116

The Convention for the Protection of Cultural Property in the Event of Armed Conflict with Regulations for the Execution of the Convention, opened for signature 14 May 1954 (entered into force 7 August 1956), establishes a regime of special protection. The regime enables property ‘of very great importance’ to be placed under special protection.117 Such property must be marked with a special emblem during hostilities.118 States have obligations under the regime to ensure potential military objectives are not placed within range of cultural property.119 Few States have signed up to the scheme.120

Provided States adhere to the obligation to ensure military objectives are not placed near protected sites, this concept may be workable for autonomous weapons. Autonomous weapons may actually be able to comply with IHL in this instance. Indeed, the concept could be taken even further, and such sites could become shelters for

115 Henckaerts, above n 90, 131.
116 Ibid.
117 Convention for the Protection of Cultural Property in the Event of Armed Conflict with Regulations for the Execution of the Convention, opened for signature 14 May 1954 (entered into force 7 August 1956) Article 8.
118 Ibid Article 16.
119 Ibid Article 9 and Article 8(5).
120 CJS Forrest, ‘The Doctrine of Military Necessity and the Protection of Cultural Property During Armed Conflicts’ (2007) 37(2) California Western International Law Journal 177, 208.


civilians. One possible means of compliance, however, does not negate a multitude of failures.

Hors De Combat

ICRC Customary Rule 47121 prohibits attacks on persons who are hors de combat, or out of the fight, including ‘anyone who is in the power of an adverse party; anyone who is defenceless because of unconsciousness, shipwreck, wounds or sickness; or anyone who clearly expresses an intention to surrender, provided he or she abstains from any hostile act and does not attempt to escape’.

Article 41 of Additional Protocol I122 codifies hors de combat. A person is hors de combat if they are ‘in the power of an adverse Party’,123 if an intention to surrender has been plainly expressed,124 or if the person has been ‘rendered unconscious or is otherwise incapacitated by wounds or sickness, and therefore is incapable of defending [themselves]’.125

Intent has been discussed above, and its application to the intention to surrender is clearly fraught with difficulty for an autonomous weapon. The robot could be tricked, or could fail to understand the intent to surrender and instead use inappropriate lethal force.

Autonomous weapons will also struggle with incapacitated combatants. Sufficient funding could add medical equipment to the robot, enabling it to scan the heart rate, monitor breathing and check other vital signs to determine whether the combatant is out of the fight or

121 Henckaerts, above n 90, 164.
122 Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts (Protocol I), opened for signature 8 June 1977, 1125 UNTS 3 (entered into force 7 December 1979).
123 Ibid Article 41(2)(a).
124 Ibid Article 41(2)(b).
125 Ibid Article 41(2)(c).


just pretending. What the robot cannot account for is sheer will power, or the mind-over-matter quality unique to the human psyche. The robot will always be vulnerable: it will either have to use lethal force to ensure an injured combatant does not find strength when medics say there is no strength to find, or place itself in a vulnerable position. Furthermore, medical equipment can be sensitive to certain devices, such as mobile phones, giving the enemy a relatively easy way to confuse the robot.

At a more basic level, where an autonomous weapon does not carry medical triage equipment, the robot may have great difficulty determining whether a person it has shot has been killed or has merely fallen to the ground faking injury.126 How can a robot make such decisions when it is incapable of instinct, fear and the drive for survival, and as such unable to emulate human emotion? Hors de combat, like all other aspects of the principle of distinction, relies on these instincts.

Proportionality

Customary Law

Proportionality initially developed as customary law contributing to just war theory, requiring the overall evil of the war to be proportionate to the overall good: a just war was a proportionate war.127 Current customary law pertaining to proportionality can be defined as

‘Launching an attack which may be expected to cause incidental loss of civilian

life, injury to civilians, damage to civilian objects, or a combination thereof, which

126 Human Rights Watch, above n 3, 35.
127 Judith Gardam, 'Proportionality as a Restraint on the Use of Force' (1999) 20 Australian Year Book of International Law 161, 163.


would be excessive in relation to the concrete and direct military advantage

anticipated, is prohibited.’ 128

Combatants received customary law protection via limitations on the weaponry an opposing party could use: methods of harm that would impose unnecessary suffering or superfluous injury were prohibited.129 In other words, combatants are not able to inflict whatever damage they like on each other.130

Many military manuals incorporate the principle of proportionality,131 as does some State legislation, which makes violations of the principle of proportionality an offence; this legislation is supported by official statements of States that were not party to Additional Protocol I132 at the time. The International Court of Justice received State submissions in the Nuclear Weapons Advisory Opinion,133 and in some of those submissions States that were not party to Additional Protocol I134 endorsed proportionality.135 The International Court of Justice identified the principle of proportionality and deemed 'respect for the environment' to be 'one of the elements that go to assessing whether an action is in conformity with the principles of necessity and proportionality'.136

128 Henckaerts, above n 88, 46.
129 Gardam, above n 127, 165.
130 Henckaerts, above n 88, 237.
131 Ibid 46.
132 Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts (Protocol I), opened for signature 8 June 1977, 1125 UNTS 3 (entered into force 7 December 1978).
133 [1996] ICJ Rep 225 No 93.
134 Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts (Protocol I), opened for signature 8 June 1977, 1125 UNTS 3 (entered into force 7 December 1978).
135 Nuclear Weapons (Advisory Opinion) [1996] ICJ Rep 225 No 93.
136 Nuclear Weapons (Advisory Opinion) [1996] ICJ Rep 225 No 93, [41].


Proportionality Codified

Additional Protocol I137 codified proportionality,

‘[A]n attack which may be expected to cause incidental loss of civilian life, injury

to civilians, damage to civilian objects, or a combination thereof, which would be

excessive in relation to the concrete and direct military advantage anticipated.’138

The Rome Statute139 provides that failure to comply with the principle of proportionality

is a war crime.

The principle of proportionality is entrenched as part of International Humanitarian Law, but it must be noted that it 'is easily stated, [but] there is no question that proportionality is among the most difficult of [the Law of Armed Conflict] norms to apply'.140 'The military says [calculating proportionality] is one of the most difficult decisions that a commander has to make.'141

The Challenge of Proportionality

The proportionality test is not about balancing quantitative data. Prosecutor v Gotovina (Trial Judgment) (International Criminal Tribunal for the Former Yugoslavia, Trial Chamber, IT-06-90-T, 15 April 2011) provides an excellent example of how

trying to balance quantitative data fails to adequately measure proportionality. Through

137 Article 51(5)(b), Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts (Protocol I), opened for signature 8 June 1977, 1125 UNTS 3 (entered into force 7 December 1978).
138 Ibid; see also Article 57(2)(iii), precautions in attack.
139 Article 8(b)(iv), Rome Statute of the International Criminal Court, opened for signature 17 July 1998, 2187 UNTS 90 (entered into force 1 July 2002).
140 Michael N Schmitt, Essays on Law and War at the Fault Lines (Asser Press, 2012) 190.
141 Noel Sharkey, 'Killing Made Easy: From Joysticks to Politics' in Patrick Lin, George Bekey and Keith Abney (eds), The Ethical and Social Implications of Robotics (MIT Press, Cambridge, 2012) 123.


expert witnesses the Trial Chamber examined all the quantitative data pertaining to artillery fire. Elements such as the type of guns and potential firing distances, as well as external factors such as muzzle velocity, wind speed, and air temperature and density, were all accounted for in the formula.

Using this data the Trial Chamber concluded that if a shell landed further than 200 meters from a lawful target, the attack was an intentional or indiscriminate attack on civilians.142 This assumption was the keystone of the guilty verdict for Gotovina and others.143

Prior to the appeal, an amicus curiae144 comprising experts in the practical application of the law of war in military operations found the Trial Chamber's 200-meter finding to be 'inconsistent with the realities of operational employment of artillery and other indirect fire assets'.145 The decision was subsequently overturned by the Appeals Chamber, which unanimously held the 200-meter standard to be an invalid legal standard;146 once it was removed, the attacks were found not to be in violation of IHL.147

Purely quantitative measurements such as the 200-meter standard are inadequate; a uniquely psychological input is also required. The ICRC suggests the proportionality test allows for a 'fairly broad margin of judgement…[and]…must above all be a question of common

142 Prosecutor v Gotovina (Trial Judgment) (International Criminal Tribunal for the Former Yugoslavia, Trial Chamber, IT-06-90-T, 15 April 2011) [1898].
143 Gary D Solis, 'The Gotovina Acquittal: A Sound Appellate Course Correction' (2013) 215 Military Law Review 78, 86.
144 Amicus curiae is Latin for 'a friend of the court. A person, usually a barrister who, with the court's permission may advise the court on a point of law or on a matter of practice'. They have no personal interest in the case. Peter E Nygh and Peter Butt (eds), Butterworths Australian Legal Dictionary (Butterworths, 1997) 52.
145 Prosecutor v Gotovina and Markac (Ante Gotovina's Response to 'Application and Proposed Amicus Curiae Brief') (International Criminal Tribunal for the Former Yugoslavia, IT-06-90-A, 13 January 2012) [9(4)].
146 Solis, above n 143, 94.
147 Ibid 87. The joint criminal enterprise charge was also overturned.


sense and good faith for military commanders’.148 The Red Cross has accurately identified

that subjective human input is required.

Additionally, the ICTY advised that:

'In determining whether an attack was proportionate it is necessary to examine whether a reasonably well informed person in the circumstances of the actual perpetrator, making reasonable use of the information available to him or her, could have expected excessive civilian casualties to result from the attack'.149

Again, reasonableness is a uniquely human quality, not something that can be assigned to an autonomous weapon. The difficulties humans have with proportionality in practice were highlighted in the NATO bombing campaign against the Federal Republic of Yugoslavia.

Allegations were made that during the NATO campaign in Kosovo the rule of proportionality had been breached by NATO in the form of environmental damage.150 The resulting report suggests:

‘It is difficult to assess the relative values to be assigned to the military advantage

gained and harm to the natural environment, and the application of the principle of

proportionality is more easily stated than applied in practice’.151

148 International Committee of the Red Cross, Commentary – Precautions in Attack, International Committee of the Red Cross <http://www.icrc.org/ihl.nsf/COM/470-750073?OpenDocument>.
149 Prosecutor v Galic (Trial Judgment) (International Criminal Tribunal for the Former Yugoslavia, Trial Chamber, IT-98-29-T, 5 December 2003) [58].
150 Article 8(b)(iv), Rome Statute of the International Criminal Court, opened for signature 17 July 1998, 2187 UNTS 90 (entered into force 1 July 2002).


The report found that while a commander has an obligation to use the weaponry least likely to cause incidental damage, operational reality is recognized by the Rome Statute.152 Long-term and severe damage to the natural environment was not found to have occurred.153 Again, this is not a simple concept that can easily be emulated; indeed, it is 'abstract, not easily quantified, and highly relative to specific contexts and subjective estimates of value'.154

Why will Autonomous Weapons Struggle with Proportionality?

Humans clearly struggle with the application of proportionality, and the notion that an autonomous weapon would be more capable is somewhat ludicrous. Even the most war-seasoned commander has a sense of compassion for the taking of a human life, however justified, and this acts as a natural constraint when determining the proportionate number of civilian casualties. A robot cannot emulate compassion.

Targeting decisions also require value judgements heavily based on context.155 Constant change is a predominant factor of battle and requires the elements of good faith and common sense, and experts question the ability of autonomous robots to contend with these factors.156 By way of example, an enemy leader may be detected by an autonomous weapon in a city; the robot has to contend with constant movement, buses, cars, trains,

151 ICTY, above n 28, [19].
152 Article 8(b)(iv), Rome Statute of the International Criminal Court, opened for signature 17 July 1998, 2187 UNTS 90 (entered into force 1 July 2002).
153 ICTY, above n 28, [21].
154 Peter Asaro, Modeling the Moral User (16 March 2009) IEEE Xplore Digital Library, 21 <http://peterasaro.org/writing/Asaro%20Modeling%20Moral%20User.pdf>.
155 Human Rights Watch, above n 3, 32.
156 Noel Sharkey, 'Automated Killers and the Computing Profession' (2007) 40(11) Computer 122-124.


planes and pedestrians, as well as the fact that the value of the detected leader could change as the conflict changes; a value judgement at this point is required, by a human.157 Experts in the field of autonomy question the ability of an autonomous weapon to comply with the proportionality test.158

A practical example illustrates that programming an autonomous weapon to determine proportionality becomes more complicated when the limited arsenal the robot can carry is taken into consideration. One of the big decisions that must be made when analysing proportionality is determining the appropriate level of force.159 Autonomous weapons will struggle to apply the correct level of force needed on the battlefield. Judgements of this nature need humans, because the decision relies on reason, reflection and the interpretation of the information available to formulate an opinion.160

Returning to our simple autonomous robot from the beginning of this dissertation: it may decide a door must be opened due to 'military necessity', but if it is armed only with heavy artillery fire there can clearly be no proportionality. In taking down the door the robot will engage in a disproportionate act; alternatively, if the robot does not fire, the 'military necessity' will not be achieved. Robots will generally have to be limited by what they are able to carry, and such limitation will make adhering to proportionality all the more difficult.

157 Human Rights Watch, above n 3, 34.
158 Human Rights Watch, above n 3, 33.
159 Human Rights Watch, 'Shaking the Foundations: The Human Rights Implications of Killer Robots' (12 May 2014) 12 <http://www.hrw.org/reports/2014/05/12/shaking-foundations>.
160 Mary-Ann Russon, 'Should Killer Robots be Banned in Policing?', International Business Times (United Kingdom), 12 May 2014 <http://www.ibtimes.co.uk/should-killer-robots-be-banned-policing-1448189>.


Thus it is clearly evident that an autonomous weapon is not able to conform to the IHL principle of proportionality, due to its incapacity to emulate human decision making by way of value judgements and reasonableness in context. Furthermore, the weaponry it is able to carry will limit the autonomous weapon's choice as to the level of force applied.

Proposed Solutions

Krishnan161 suggests that before autonomous weapons are released onto the battlefield they should be subject, in the development stage, to a reliability requirement such as that used by the United States in relation to cluster weapons. Under the moratorium on the export of cluster munitions, a 99 percent reliability rate that the bomblets will explode in the blast is required; anything less is unacceptable.162 Understandably this is appealing for proponents of autonomous weapons, but the controls determining the reliability of cluster munitions are relatively narrow compared with the immense number of scenarios that would need to be tested before a 99 percent reliability rate could be ascertained. It is questionable whether testing would even be effective: this is a weapon with an independent thought capacity, not an automated bomb programmed so that 'if' I am detonated 'then' 100 percent of my bomblets must explode.

Autonomous weapons will violate IHL, and their development should be banned; however, the amount of money being poured into this technology is staggering. In the event that development is not stopped, States that use weaponry breaching their international obligations, one of which is compliance with IHL, can be held to account. Individuals will not escape accountability for war crimes where the autonomous weapon is the principal perpetrator

161 Krishnan, above n 61, 98.
162 Ibid.


in the commission of a crime. Corporations, however, may escape accountability, as discussed in the next chapter.


Chapter 4. Accountability

Accountability is an important aspect of law, as it seeks to deter future violations and provides victims with a sense of retribution.163 When autonomous robots violate the law, and they will, who will we hold to account for the violation? There are three potential means of accountability. The first is State Responsibility: provided the State has an obligation and the breach can be attributed to the State, the State using the autonomous weapon may be held to account. Second, deliberate interference with an autonomous weapon by individuals, leading to the robot being the principal perpetrator of crimes, may invoke individual accountability through Joint Criminal Enterprise. These areas of law are not clear cut, and issues certainly exist, but the law can accommodate autonomous weapons. An accountability gap arises, however, when a malfunction, or the machine acting unpredictably, causes a violation due to the fault of the corporation that manufactured the machine; accountability in this instance is a matter of luck.

Autonomous weapons will violate IHL; they will not necessarily act the way they have been programmed to, or the way we want them to. Unpredicted robotic behaviours have arisen simply through the complexity of programs interacting within a robot.164 Additionally, '[c]omplex systems are prone to component failures and malfunctions, and to intermodal inconsistencies and misunderstandings'.165 Such complexity means it is not possible for any one person to accurately predict with 100 percent certainty the way the robot will respond to a given command.166

163 Human Rights Watch, above n 3, 42.
164 Marchant et al, above n 39, 275-284.
165 Roger Clarke, 'Asimov's Laws of Robotics: Implications for Information Technology – Part II' (1994) 27(1) Computer 57, 65.
166 Ibid.


Autonomous Weapons and State Responsibility

States are the principal players in international law167 and as such should be held accountable for any violations. The following fictitious scenario illustrates State Responsibility. The United States purchases an autonomous weapon from the United Kingdom and some months later becomes involved in a war in which the autonomous weapon is deployed by two combatants, despite the United States' rules of engagement stating that deployment of the weapon requires a Commander's approval. The deployment violated the principle of distinction, which the United States had an obligation to uphold; furthermore, the violation can be attributed to the United States.

Article 1 of the Draft Articles on Responsibility of States for Internationally Wrongful Acts, GA Res 56/83, UN GAOR, 53rd Sess, Agenda Item 162, Supp No 10, UN Doc A/RES/56/83 (adopted November 2001) states that 'any violation by a State of any obligation, of whatever origin, gives rise to State responsibility'. The elements are, first, that the violation or conduct must be attributable to the State and, second, that an international obligation must be breached.168

The fictitious scenario illustrates a breach of an international obligation to distinguish between civilians and combatants. The attribution element raises the question whether the United Kingdom, which sold the weapon, and the United States, which utilised it, can be held to account.

167 Stephen Hall, Principles of International Law (LexisNexis, 2nd ed, 2011) 238 [5.1].
168 Article 2, Draft Articles on Responsibility of States for Internationally Wrongful Acts, GA Res 56/83, UN GAOR, 53rd Sess, Agenda Item 162, Supp No 10, UN Doc A/RES/56/83 (adopted November 2001).


Article 4 of the Draft Articles169 provides that 'the conduct of any State organs shall be considered an act of that State…'. Public military personnel are considered organs of the State,170 and violations of domestic military policy should not inhibit attribution. Article 7171 states that a person exercising governmental authority continues to do so even if they exceed their authority. The Caire Claim (France v Mexico) (Arbitration Tribunal) (1929) 5 RIAA 516 is the case on point, in which the tribunal stated that the officers 'acted in their capacity as officers and used the means placed at their disposition by virtue of that capacity'.172 The officers in this case were Mexican and the wrong was attributed to Mexico. The wrong in our scenario should be attributed to the United States with relative ease. Reparations are payable by the violating State173 and can take the form of restitution,174 compensation175 and satisfaction,176 provided the purported offending State cannot rely on any of the exceptions that would excuse its failure to meet the obligation.

Attribution to the United Kingdom for supply of the weapon that caused the harm would not be feasible. Article 6177 provides that the conduct of an organ placed at the full disposal of a State by another State is attributed to the receiving State, negating attribution to the sending State, provided the State using the organ is exercising the elements of governmental authority.178 Thus the nation that funds the research and

169 Draft Articles on Responsibility of States for Internationally Wrongful Acts, GA Res 56/83, UN GAOR, 53rd Sess, Agenda Item 162, Supp No 10, UN Doc A/RES/56/83 (adopted November 2001).
170 Hall, above n 167, 246 [5.19].
171 Draft Articles on Responsibility of States for Internationally Wrongful Acts, above n 169.
172 Caire Claim (France v Mexico) (Arbitration Tribunal) (1929) 5 RIAA 516.
173 Article 31, Draft Articles on Responsibility of States for Internationally Wrongful Acts, above n 169.
174 Ibid Article 35.
175 Ibid Article 36.
176 Ibid Article 37.
177 Ibid.
178 Report of the Commission to the General Assembly on the Work of its Fifty-third Session [2001] II(2) Yearbook of the International Law Commission 1, 44 [1].


development of autonomous weapons, and capitalises on them, would not be accountable under State Responsibility.

The use of autonomous weapons in violation of IHL should not, however, prevent accountability via State Responsibility.

Autonomous Weapons and Joint Criminal Enterprise (JCE)

International criminal law and human rights law are unique exceptions in international law because they are concerned with individuals rather than States.179 Modes of liability are used to hold individual perpetrators to account for the various roles played in the commission of substantive crimes. They are recognised as customary law180 and, despite some differences, are also contained in the Rome Statute.181

The autonomous weapon was the principal perpetrator in our story, but the two combatants deployed the weapon against regulations. Furthermore, the override of the built-in safety was performed by a robotics engineer in a private company in the United Kingdom, and the change in programming to enable the weapon to move over different terrain was carried out by a disgruntled employee of a company in the United States. All participants were aware that the weapon would be utilised in a crime to further their beliefs. All of these individuals can be held to account; that the autonomous weapon was the principal perpetrator should not be significant.

179 Hall, above n 167, 448 [10.1].
180 International Criminal Law Services, International Criminal Law & Practice Training Materials, Modes of Liability: Commission & Participation, United Nations Interregional Crime and Justice Research Institute, 3 [9.2.1] <http://wcjp.unicri.it/deliverables/docs/Module_9_Modes_of_liability.pdf>.
181 Article 25(3), Rome Statute of the International Criminal Court, opened for signature 17 July 1998, 2187 UNTS 90 (entered into force 1 July 2002).


The complexity of autonomous weapons suggests that Joint Criminal Enterprise (JCE) should ensure individual accountability. Many years ago computer programs could be written and interpreted by individuals; this is no longer the case, and many teams of programmers now work on writing code for the different functions of a single robot.182 There is a strong possibility that an autonomous weapon used in the commission of a crime will involve several perpetrators.

IHL ensures there are mechanisms in place to hold orchestrators to account. Criminal masterminds often deliberately distance themselves geographically from the actual crime,183 and autonomous weaponry acting as the principal perpetrator further facilitates one's ability to distance oneself. JCE, derived from the term 'committed' in the ICTY Statute,184 establishes an accountability mechanism. The ICTY utilised JCE, which has its foundations in customary law,185 and the Rome Statute has incorporated the concept in co-perpetration,186 indirect co-perpetration187 and other types of common purpose liability.188

182 Marchant, above n 39, 275-284.
183 Katrina Gustafson, 'The Requirement of an "Express Agreement" for Joint Criminal Enterprise Liability' (2007) 5(1) Journal of International Criminal Justice 134, 135.
184 Prosecutor v Tadic (Appeals Judgment) (International Criminal Tribunal for the Former Yugoslavia, Appeals Chamber, Case No IT-94-1-A, 15 July 1999) [190], referring to Article 7(1), Statute of the International Criminal Tribunal for the Former Yugoslavia, SC Res 808, UN SCOR, 3175th mtg (22 February 1993). The term 'committed' was interpreted to mean JCE.
185 Prosecutor v Tadic (Appeals Judgment) (International Criminal Tribunal for the Former Yugoslavia, Appeals Chamber, Case No IT-94-1-A, 15 July 1999) [190]. JCE first emerged here and was subsequently used in ICTY decisions.
186 Article 25(3)(a), Rome Statute of the International Criminal Court, opened for signature 17 July 1998, 2187 UNTS 90 (entered into force 1 July 2002).
187 Ibid Article 25(3)(a).
188 Ibid Article 25(3)(d).


Three types of joint criminal enterprise189 are recognised, and all three require the same actus reus elements.190 These are, first, a 'plurality of persons';191 second, the 'existence of a common plan, design or purpose which amounts to or involves the commission of a crime';192 and third, '[p]articipation of the accused in the common design'.193 The accused does not have to be present during the commission of the crime, ensuring accountability exists when the crime is carried out by an autonomous weapon.

The mens rea varies between the three categories.194 At the most basic level, JCE I, criminal intent to commit crimes that form part of a common plan must exist,195 and the accused must intend to partake in the plan.196

accused 'have personal knowledge of an organized criminal system and intent to further the criminal purpose of the system'.197 Under the final category of JCE:

‘[T]he accused can only be held responsible for a crime outside the common

purpose if, under the circumstances of the case: (i) it was foreseeable that such a

crime might be perpetrated by one or other members of the group and (ii) the

189 Prosecutor v Tadic (Appeals Judgment) (International Criminal Tribunal for the Former Yugoslavia, Appeals Chamber, Case No IT-94-1-A, 15 July 1999) [195]-[226].
190 Ibid [227].
191 Ibid.
192 Ibid.
193 Ibid.
194 Prosecutor v Vasiljevic (Appeals Judgment) (International Criminal Tribunal for the Former Yugoslavia, Appeals Chamber, Case No IT-98-32-A, 23 February 2004) [101].
195 Ibid [97].
196 Prosecutor v Kvocka and Others (Appeals Judgment) (International Criminal Tribunal for the Former Yugoslavia, Appeals Chamber, Case No IT-98-30/1-A, 28 February 2005) [82].
197 Prosecutor v Tadic (Appeals Judgment) (International Criminal Tribunal for the Former Yugoslavia, Appeals Chamber, Case No IT-94-1-A, 15 July 1999) [202]-[203], [228].


accused willingly took that risk'.198 It must also be shown that the crime was foreseeable to the particular accused.199

The fact that an autonomous weapon was the principal perpetrator and not a member of the JCE should be of no consequence. The issue turns on whether the crime itself falls within the common purpose of the JCE and whether the crime can be attributed to at least one member of the JCE.200 Furthermore, the principal perpetrator of the crime, in our case the autonomous weapon, does not have to share the mens rea of the JCE member.201 This is advantageous to a prosecutor, as establishing the mens rea of an autonomous weapon would be difficult if not impossible because, as discussed above, killer robots cannot emulate the human element of intent. Although this has not been tested before a legal authority, there is no obvious reason why an autonomous weapon could not act as the principal perpetrator and the JCE members be held liable for those crimes.

Other modes of liability are available when JCE elements cannot be met.202

Private Corporations and Accountability

A prominent accountability gap exists with the private companies that develop and promote autonomous weapons. Experts predict autonomous weapons will have a level of unpredictability.203 Robert Sparrow, a professor of philosophy, states:

198 Ibid [228].
199 Ibid [220].
200 Prosecutor v Martic (Appeals Judgment) (International Criminal Tribunal for the Former Yugoslavia, Appeals Chamber, Case No IT-95-11-A, 8 October 2008) [168].
201 Prosecutor v Brdanin (Appeals Judgment) (International Criminal Tribunal for the Former Yugoslavia, Appeals Chamber, Case No IT-99-36-A, 3 April 2007) [410].
202 See, for example, aiding and abetting, planning, instigating or inciting, and others.
203 Clarke, above n 165.

Autonomous  Weapons,  are  they  Legal?     a1630695    

  51  

‘[T]he possibility that an autonomous system will make choices other than those predicted and encouraged by its programmers is inherent in the claim that it is autonomous’.204

The use of private firms in the design and manufacture of weapons has expanded significantly since the Second World War.205 There appears to be a distorted notion that, simply because a weapon may have the capacity to think for itself, holding a programmer or developer to account is somehow analogous to holding parents responsible for the actions of their adult children. Weapons do not have rights as children do; they can be locked in a shed for all eternity. There is obviously a reasonably foreseeable risk that an unpredictable product designed to kill will not act in accordance with its programming or the law. Releasing such an unpredictable product would be nothing short of negligent.

Philosopher Peter Asaro argues,

Corporations can be held legally responsible for their practices and products,

through liability laws and lawsuits. If their products harm people through poor

design, substandard manufacturing, or unintended interactions or side effects that

corporation can be compelled to pay damages to those who have been harmed, as

well as punitive damages.206

Civil actions attracting large amounts in damages would certainly induce a high level of care from private companies.207 There would be additional benefits for the victims, who would

204 Robert Sparrow, ‘Killer Robots’ (2007) 24(1) Journal of Applied Philosophy 62, 69-70.
205 Marcus Sumner, ‘Studies of the Defense Contracting Process’ (1964) Law and Contemporary Problems 19, 20.
206 Krishnan, above n 59, 104.
207 Eric Mongelard, ‘Corporate Civil Liability for Violations of International Humanitarian Law’ (2006) 88(863) International Review of the Red Cross 665, 666.


receive financial compensation. International law can be applied to non-state actors, as seen with individual criminal responsibility; it can also extend to corporations and determine what constitutes a civil wrong by a company.208 In other words, international law can impose obligations on corporations. Accountability becomes problematic, however, because enforcement of these obligations is left to individual States.

The United States provides an excellent example of a State ensuring its private corporations are accountable. It has enacted the Alien Tort Claims Act 28 USC § 1350. This part of the code enables,

‘The district courts [to] have original jurisdiction of any civil action by an alien for a tort only, committed in violation of the law of nations or a treaty of the United States’.

This means a foreign national can sue an American corporation. Doe v Unocal Corp 110 F Supp 2d 1294 (CD Cal 2000), a US Federal District Court judgment of 31 August 2000, is an excellent example of how a private corporation can be held to account. The case was heard within the United States judicial system. A group of Myanmar citizens sued Unocal Corp, an American-based company, for aiding and abetting the Myanmar military in human rights violations including rape, torture and forced displacement, all of which would have been violations of IHL had they occurred during war rather than peace. A pipeline was being constructed in partnership between the Myanmar government and Unocal, under which the Myanmar government was to supply security and labour.

At first instance the court found that responsibility could be attached to Unocal under joint action theory, which is not dissimilar to the JCE doctrine discussed above. One of the requirements was

208 Ibid 670.


working towards a common design, which the court identified to be a profitable project. The court also applied a ‘proximate cause’ test, which required the plaintiff to establish that the company exercised control over the government. The court held that, whilst Unocal was aware of the slave labour and utilised it to achieve the common design, this was not enough to hold the company liable.

The decision was overturned on appeal. The Court of Appeals found that an aiding and abetting standard, borrowed from the criminal jurisdiction, should have been applied. The court said that in international cases such as this, international law, whether United States treaty law or customary law, was preferable to domestic law.209 The court provided sound reasoning for this decision:

‘First, the needs of the international system are better served by an international standard.210 Second, the relevant policies of the forum cannot be ascertained by disregarding the decision that has favoured international law.211 Third, reliance on international standards of secondary liability promotes consistency, uniformity, predictability, and protection of justified expectations.212 And fourth, the overarching purpose of the ATCA is to provide a civil remedy for violations of international law, a goal furthered by application of international standards.213’

Unocal was held to be liable under the Alien Tort Claims Act.

209 Doe v Unocal, 395 F.3d 932, 42 (9th Circuit) (2002).
210 Ibid.
211 Ibid.
212 Ibid.
213 Ibid [42-43].


Unfortunately this policy of the United States is not mimicked globally. Reasons for this include the fact that some corporate giants wield more economic power than some individual states.214 Better economic yields are often achieved by corporations than by the nation states in which they operate.215 The new millennium mantra of ‘deregulation’ has seen States vying for investment by multinational companies,216 placing governments at a disadvantage and leading to a decrease in control. Additionally, many large companies are multinational, making it difficult for domestic law to govern them effectively. In some instances the labour and component market is divided between nations,217 as in our case, where the component governing the autonomous weapon’s movement is produced in one country and another component in another. Again, States have the power to create agreements within international law to combat such difficulties and ensure these companies are held to account. States have the requisite power to enable victims to hold corporations to account through civil litigation;218 they just need the will to do so.

In the event that States are prepared to risk autonomous weapons on the battlefield, it may be in their best interests to ensure that the firms from which they purchase such weapons can be held to account for violations of IHL.

Where to from Here?

To date there are no international instruments prohibiting autonomous weapons, but the time has come to discuss the implementation of laws banning them. The urgently required prohibitions may be in the form of amendment by

214 Mongelard, above n 207, 669.
215 Beth Stephens, ‘The Amorality of Profit’ (2002) 20 Berkeley Journal of International Law 45, 57. In 2002 only seven nations had revenues greater than General Motors.
216 Stephens, above n 215, 58.
217 Ibid 54.
218 Ibid 60.


way of an additional protocol to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May be Deemed to be Excessively Injurious or to Have Indiscriminate Effects, opened for signature 10 October 1980, 1342 UNTS 132 (entered into force 2 December 1983). The preamble recalls the principle ‘that prohibits the employment in armed conflicts of weapons, projectiles and material and methods of warfare of a nature to cause superfluous injury or unnecessary suffering’.219 This dissertation has made it clear that killer robots will not be able to emulate the human emotion necessary not only to restrain acts of war but also to accurately apply the touchstone principles of IHL, distinction and proportionality, which should satisfy the preamble requirements. An expert panel has met under this Convention220 to discuss autonomous weapons.

A meeting of experts on Lethal Autonomous Weapons Systems was held at the United Nations in Geneva from 13 to 16 May 2014221 as part of this Convention; 87 countries participated.222 Problems can arise with such instruments, as they bind only those states that choose to be bound, leaving insurgent groups outside these agreements223 and free to purchase such weapons from private corporations who will most likely not be held to account for any violations their products cause. States should ensure their domestic legislation enables aliens to hold private companies to account for breaches of IHL.

219 Ibid.
220 Ibid.
221 United Nations Office at Geneva, Disarmament, Lethal Autonomous Weapons (2013) United Nations Office at Geneva <http://www.unog.ch/80256EE600585943/%28httpPages%29/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument>.
222 Human Rights Watch, UN: ‘Killer Robots’ on the Run (16 May 2014) Human Rights Watch <http://www.hrw.org/news/2014/05/16/un-killer-robots-run>.
223 Marchant et al, above n 32, 305.


These measures are probably too extreme for the palates of most states. Perhaps, prior to the introduction of international instruments, soft law should be utilised to begin discussions and raise awareness of the issues autonomous weapons create. Soft law approaches aim to create substantive principles and norms which are not legally binding.224 Additionally, they allow flexibility as new issues become better understood.225 The following approaches have been used for emerging technologies and may be suited to autonomous weaponry, although an outright ban is preferred: codes of conduct,226 which are often industry based and can link into liability;227 trans-governmental dialogue, that is, informal and flexible arrangements encouraging states to meet and discuss developments and policies; international sharing and confidence-building measures, the aim of which is to increase trust by sharing information; and finally framework conventions, which at the onset may be very basic and may start simply with agreement that an agreement is required.

Conclusion

The removal of humans from decision-making during times of war takes away important restraints such as the desire not to kill. Indeed, the more robotic humans become, the greater the chance of IHL violations. Human decision-making may appear erroneous, but overall the decisions made by humans involve very important emotions that provide an imperative restraining factor. Decisions made without emotion by way of autonomous weapons will require new law; the inherent restraints which form part of IHL will need to be explicitly contained within it.

224 Ibid 306.
225 Ibid 305.
226 Ibid 307.
227 Ibid 310.


Autonomous weapons cannot replace humans in the OODA loop decision-making process. The thought capacity of these weapons cannot emulate that of humans; as such, their legality becomes questionable. Killer robots will not be able to adequately distinguish a combatant from a civilian, and as such one of the cardinal principles of IHL, the principle of distinction, would be difficult to comply with. The principle of proportionality requires a subjective judgement that cannot be quantified; again, it will be impossible for a killer robot to apply.

Finally, autonomous weapons themselves cannot be held to account for violations of IHL. Unpredictability is an inherent aspect of autonomous robots, and the result, should private corporations choose to release them, is an accountability gap unless the weapon was manufactured in the United States. The use of an autonomous weapon as the principal perpetrator in the commission of a crime will not preclude the application of the modes of liability or of State Responsibility; both individuals and States can be held to account for crimes committed by the autonomous weapon.


Bibliography

A Articles/Books/Reports

Adams, Thomas K, ‘Future Warfare and the Decline of Human Decisionmaking’ (2001) 31(4) Parameters: US Army War College 57
Asaro, Peter, ‘How Just Could a Robot War Be?’ (2008) Proceedings of the 2008 Conference on Current Issues in Computing and Philosophy 50
Choi, Stephen and AC Pritchard, ‘Behavioural Economics and the SEC’ (2003) 56(1) Stanford Law Review 1
Clarke, Roger, ‘Asimov’s Laws of Robotics: Implications for Information Technology, Part II’ (1994) 27(1) Computer 57
Creamer, Mark, Ric Marshall and Anne Goyne, Military Stress and Performance: The Australian Defence Force Experience (Melbourne University Press, 2003)
Farnsworth, Allen and Steve Sheppard (ed), An Introduction to the Legal System of the United States (Oxford University Press, 4th ed, 2010)
Forrest, CJS, ‘The Doctrine of Military Necessity and the Protection of Cultural Property During Armed Conflicts’ (2007) 37(2) California Western International Law Journal 177
Gardam, Judith, ‘Proportionality as a Restraint on the Use of Force’ (1999) 20 Australian Year Book of International Law 161
Gigerenzer, Gerd and Wolfgang Gaissmaier, ‘Heuristic Decision Making’ (2011) 62 Annual Review of Psychology 451
Grossman, Dave, On Killing: The Psychological Cost of Learning to Kill in War and Society (Hachette Book Group, revised ed, 2009)
Gustafson, Katrina, ‘The Requirement of an “Express Agreement” for Joint Criminal Enterprise Liability’ (2007) 5(1) Journal of International Criminal Justice 134
Hall, Stephen, Principles of International Law (LexisNexis, 2nd ed, 2011)
Henckaerts, Jean-Marie and Louise Doswald-Beck, Customary International Humanitarian Law Volume I: Rules (Cambridge University Press, 2005)
Human Rights Watch, ‘Losing Humanity: The Case against Killer Robots’ (2012) International Human Rights Clinic 1
International Committee of the Red Cross, Commentary on the Additional Protocols of 8 June 1977 to the Geneva Conventions of 12 August 1949 (Martinus Nijhoff Publishers, 1987)
Kanwar, Vik, ‘Post-Human Humanitarian Law: The Law of War in the Age of Robotic Weapons’ (2011) 2 Harvard National Security Journal 616


Krishnan, Armin, Killer Robots: Legality and Ethicality of Autonomous Weapons (Ashgate Publishing Limited, 2009)
Marchant, Gary E, et al, ‘International Governance of Autonomous Military Robots’ (2011) 12 The Columbia Science and Technology Law Review 272
Marra, William C and Sonia K McNeil, ‘Understanding “The Loop”: Regulating the Next Generation of War Machines’ (2013) 36(3) Harvard Journal of Law and Public Policy 1139
McManners, Hugh, The Scars of War (Harper Collins Publishers, 1994)
Mongelard, Eric, ‘Corporate Civil Liability for Violations of International Humanitarian Law’ (2006) 88(863) International Review of the Red Cross 665
Munoz-Rojas, Daniel and Jean Jacques Fresard, ‘Roots of Behavior in War’ (2004) 86(853) IRRC 189
Nygh, Peter E and Peter Butt (eds), Butterworths Australian Legal Dictionary (Butterworths, 1997)
Report of the Commission to the General Assembly on the Work of its Fifty-Third Session [2001] II(2) Yearbook of the International Law Commission 1
Schmitt, Michael, Essays on Law and War at the Fault Lines (Asser Press, 2012)
Schmitt, Michael and Jeffrey S Thurnher, ‘Out of the Loop: Autonomous Weapon Systems and the Law of Armed Conflict’ (2012-2013) 4 Harvard National Security Journal 213
Sharkey, Noel, ‘Killing Made Easy: From Joysticks to Politics’ in Patrick Lin, George Bekey and Keith Abney (eds), The Ethical and Social Implications of Robotics (MIT Press, 2012)
Sharkey, Noel, ‘Cassandra or False Prophet of Doom: AI Robots and War’ (2008) 23(4) IEEE Intelligent Systems 14
Sharkey, Noel, ‘Automated Killers and the Computing Profession’ (2007) 40(11) Computer 122
Solis, Gary, ‘The Gotovina Acquittal: A Sound Appellate Course Correction’ (2013) 215 Military Law Review 78
Sparrow, Robert, ‘Killer Robots’ (2007) 24(1) Journal of Applied Philosophy 62
Stephens, Beth, ‘The Amorality of Profit’ (2002) 20 Berkeley Journal of International Law 45


Sumner, Marcus, ‘Studies of the Defense Contracting Process’ (1964) Law and Contemporary Problems 19
Tversky, A and D Kahneman, ‘The Framing of Decisions and the Psychology of Choice’ (1981) 211(4481) Science 453

B Cases

Caire Claim (France v Mexico) (Arbitration Tribunal) (1929) 5 RIAA 516
Doe v Unocal, 395 F.3d 932, 42 (9th Circuit) (2002)
Nuclear Weapons (Advisory Opinion) [1996] ICJ Rep 225 No 93
Prosecutor v Bradanin (Appeals Judgment) (International Criminal Tribunal for the Former Yugoslavia, Appeals Chamber, Case No IT-99-36-A, 3 April 2007)
Prosecutor v Galic (Trial Judgment) (International Criminal Tribunal for the Former Yugoslavia, Trial Chamber, IT-98-29-A, 5 December 2003)
Prosecutor v Gotovina (Trial Judgment) (International Criminal Tribunal for the Former Yugoslavia, Trial Chamber, IT-06-90-T, 15 April 2011)
Prosecutor v Kupreskic (Trial Judgment) (International Criminal Tribunal for the Former Yugoslavia, Trial Chamber, IT-95-16-T, 14 January 2000)
Prosecutor v Kvocka and Others (Appeals Judgment) (International Criminal Tribunal for the Former Yugoslavia, Appeals Chamber, Case No IT-98-30/1-A, 28 February 2005)
Prosecutor v Martic (Appeals Judgment) (International Criminal Tribunal for the Former Yugoslavia, Appeals Chamber, Case No IT-95-11-A, 8 October 2008)
Prosecutor v Tadic (Appeals Judgment) (International Criminal Tribunal for the Former Yugoslavia, Appeals Chamber, Case No IT-94-1-A, 15 July 1999)
Prosecutor v Vasiljevic (Appeals Judgment) (International Criminal Tribunal for the Former Yugoslavia, Appeals Chamber, Case No IT-98-32, 23 February 2004)
Rainbow Warrior (New Zealand v France) (Arbitration Tribunal) (1986) 19 RIAA 199, 251 [75]
D Treaties

Articles on Responsibility of States for Internationally Wrongful Acts, GA Res 56/83, UN GAOR, 53rd sess, Agenda Item 162, Supp No 10, UN Doc A/RES/56/83 (adopted November 2001)
Convention on Cluster Munitions, opened for signature 3 December 2008, 2688 UNTS (entered into force 1 August 2010)
Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May be Deemed to be Excessively Injurious or to Have Indiscriminate Effects,


opened for signature 10 October 1980, 1342 UNTS 132 (entered into force 2 December 1983)
Convention for the Protection of Cultural Property in the Event of Armed Conflict with Regulations for the Execution of the Convention, opened for signature 14 May 1954, UNESCO (entered into force 7 August 1956)
Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts (Protocol I), opened for signature 8 June 1977, 1125 UNTS 3 (entered into force 7 December 1979)
Rome Statute of the International Criminal Court, opened for signature 17 July 1998, 2187 UNTS 90 (entered into force 1 July 2002)
Statute of the International Criminal Tribunal for the Former Yugoslavia, SC Res 808, UN SCOR, 3175th mtg (22 February 1993)
Vienna Convention on the Law of Treaties, opened for signature 23 May 1969, 1155 UNTS 331 (entered into force 27 January 1980)

E Other

Asaro, Peter, Modeling the Moral User (16 March 2009) IEEE Xplore Digital Library 21 <http://peterasaro.org/writing/Asaro%20Modeling%20Moral%20User.pdf>
Alton Parrish, Gee Whiz! Sea Whiz 4500 Rounds Per Minute For Close In Naval Fighting (10 September 2012) Before It’s News <http://beforeitsnews.com/science-and-technology/2012/09/gee-whiz-sea-whiz-4500-rounds-per-minute-for-close-in-naval-fighting-2467644.html>
Chow, Denise, Drone Wars: Pilots Reveal Debilitating Stress Beyond Virtual Battlefield (5 November 2013) Live Science <http://www.livescience.com/40959-military-drone-war-psychology.html>
Heyns, Christof, Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, Agenda Item 3, UN Doc A/HRC/23/47/Add.4 (23 April 2013)
Horowitz, Michael C, The Looming Robotics Gap: Why America’s Global Dominance in Military Technology Is Starting to Crumble (5 May 2014) Foreign Policy <http://www.foreignpolicy.com/articles/2014/05/05/the_looming_robotics_gap_us_military_technology_dominance>
Human Rights Watch, UN: ‘Killer Robots’ on the Run (16 May 2014) Human Rights Watch <http://www.hrw.org/news/2014/05/16/un-killer-robots-run>
Human Rights Watch, ‘Shaking the Foundations: The Human Rights Implications of Killer Robots’ (12 May 2014) Human Rights Watch 12 <http://www.hrw.org/reports/2014/05/12/shaking-foundations>
Karl Sims, Evolved Virtual Creatures, YouTube <http://www.youtube.com/watch?v=F0OHycypSG8>


Lawand, Kathleen, Robin Coupland and Peter Herby, A Guide to the Legal Review of New Weapons, Means and Methods of Warfare: Measures to Implement Article 36 of Additional Protocol I of 1977 (November 2006) International Committee of the Red Cross 1, 10 <http://www.icrc.org/eng/assets/files/other/icrc_002_0902.pdf>
Melzer, Nils, Interpretive Guidance on the Notion of Direct Participation in Hostilities under International Humanitarian Law, International Committee of the Red Cross 22-24 <http://www.icrc.org/eng/assets/files/other/icrc-002-0990.pdf>
Ministry of Defence, ‘The UK Approach to Unmanned Aircraft Systems’ (Joint Doctrine Note 2/11, The Development, Concepts and Doctrine Centre, 30 March 2011)
Moynihan, Tim, Watch the Astounding Dexterity of Honda’s Dancing Humanoid Robot (19 April 2014) Wired <http://www.wired.com/2014/04/honda-asimo/>
International Committee of the Red Cross, Commentary – Precautions in Attack, International Committee of the Red Cross <http://www.icrc.org/ihl.nsf/COM/470-750073?OpenDocument>
International Committee of the Red Cross, What is International Humanitarian Law? (July 2004) Advisory Service on International Humanitarian Law <http://www.icrc.org/eng/assets/files/other/what_is_ihl.pdf>
International Criminal Law Services, International Criminal Law & Practice Training Materials, Modes of Liability: Commission & Participation, United Nations Interregional Crime and Justice Research Institute
ICTY, Final Report to the Prosecutor by the Committee Established to Review the NATO Bombing Campaign Against the Federal Republic of Yugoslavia (initiated 14 May 1999) International Criminal Tribunal for the Former Yugoslavia <http://www.icty.org/sid/10052#pagetop>
The Prosecutor v Gotovina and Markac (Ante Gotovina’s Response to ‘Application and Proposed Amicus Curiae Brief’) (International Criminal Tribunal for the Former Yugoslavia) IT-06-90-A (13 January 2012) [9(4)]
Russon, Mary-Anne, ‘Should Killer Robots be Banned in Policing?’, International Business Times (United Kingdom), 12 May 2014 <http://www.ibtimes.co.uk/should-killer-robots-be-banned-policing-1448189>
Sharkey, Noel, Computing Experts from 37 Countries Call for Ban on Killer Robots (16 October 2013) International Committee for Robot Arms Control <http://icrac.net/2013/10/computing-experts-from-37-countries-call-for-ban-on-killer-robots/>
United Nations Office at Geneva, Disarmament, Lethal Autonomous Weapons (2013) United Nations Office at Geneva


<http://www.unog.ch/80256EE600585943/%28httpPages%29/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument>
Worsnip, Patrick, ‘UN Official Calls for Study of Ethics, Legality of Unmanned Weapons’, Washington Post (Washington, USA), 24 October 2010