Into the Caves of Steel: Precaution, Cognition and Robotic Weapon Systems Under the International Law of Armed Conflict

The pace of development with respect to robotic weapon systems is staggering. Often driven by the desire of the 'haves' States to minimise battlefield casualties and to reduce monetary costs, technological advancement holds a number of ramifications for the law of armed conflict. Specifically, as technology introduces the possibility of increasingly autonomous forms of robotic weapon systems, the implications of augmenting precision while removing, for all intents and purposes, direct control by or involvement of human beings must be examined, along with the differentiated responsibilities of the 'haves' States versus the 'have-nots' States. This article frames the discussion in terms of the international humanitarian law principle of precaution, as codified in Article 57 of Additional Protocol I, in order to assess various aspects of the applicability of the relevant provisions to these new weapon systems, and in particular draws conclusions as to how precaution could influence future developments.


Introduction
By now, most people are aware of the operations that involve the use of unmanned combat aerial vehicles (UCAVs), also known as drones. 1 Though deployment of these machines has increased over the past few years, particularly as part of counter-terrorism operations under the umbrella of Operation Enduring Freedom, the origins of UCAVs can be traced back roughly 30 years. However, the modern incarnation of the drone, the Predator, was introduced in the mid-1990s during the conflicts in the former Yugoslavia. In the relatively short time since, technological advancement on the battlefield has only picked up pace, to the point that the US Congress set a goal for the Department of Defense to replace a third of its armed air and ground combat vehicles and weaponry with unmanned systems by the year 2015. 2 In 2010 the US military possessed nearly 7,500 unmanned aircraft, compared to only 167 in 2002. 3 The US is not alone in its efforts to augment the share of its military weapon systems comprised of unmanned technologies. More than 40 States have known military robotics programs, 4 and such programs are not restricted to operations in the air but also comprise unmanned vehicles and systems for operations at sea and on the ground. 5 Unmanned, in the sense used here, specifically refers to the fact that no person is present aboard the vehicle, although it does not mean that humans have been entirely removed from operations that make use of unmanned technologies. The MQ-1 Predator aircraft, for instance, is flown remotely by human operators ostensibly fulfilling the piloting role within the military chain of command structure. In this case, a person who is able to review incoming data remains 'in the loop' to make decisions based on the data, and to act upon and be responsible for subsequent actions, such as carrying out an attack. This scenario as such does not, in principle, cause any legal difficulty. 6

This is not to say that the use of UCAVs to target, for example, militants and alleged terrorists in Pakistan and Yemen is without controversy. It is one thing to maintain clear accountability for operations using unmanned military vehicles, but this does not resolve due process concerns pertaining to targeted killings, nor does it address the legal implications of using battlefield weaponry outside of armed conflict. 7 These latter two issues are highly relevant, but fall outside the scope of this article, which only looks at situations of international and non-international armed conflict governed by the rules of international humanitarian law (IHL); in any event, these topics have been dealt with extensively in a number of strong studies. 8 The increasing deployment of unmanned weapon systems results not only from technological development, but also from the changing nature of 21st century armed conflicts. Targeted enemies are more mobile, more difficult to identify, and are often ensconced among the civilian population within populated urban areas, a situation in which the elements of the principle of precaution are of the highest importance. 9 The time-lag from detection to engagement means that targets may slip away, eluding military action. 10 In addition, populations of States involved in armed conflict are often less tolerant of military casualties, which has led to an increased use of robots to perform tasks of the 'dull, dirty and dangerous variety', including, as part of explosive ordnance disposal (EOD) units, the disarming of e.g. improvised explosive devices (IEDs) in Iraq. In Singer's estimation, the continuing focus on military operations aimed at terrorists will only serve to intensify the development of robotic technologies, resulting in robots, such as UCAVs, becoming the 'new normal'. 11

Continuing leaps in the research and development sector related to artificial intelligence have catalysed the shift toward increasingly autonomous robotic weapon systems. 'Weapon system' in this context is defined as 'a combination of one or more weapons with all related equipment, materials, services, personnel, and means of delivery and deployment (if applicable) required for self-sufficiency'. 12 The logic behind using the term 'weapon system' for the purposes of the present study is to demonstrate that developments in automated robotic technologies are creating means of warfare that are platforms for carrying out operations, as opposed to simply 'weapons', or currently deployed UCAVs that are labelled as 'carriers'. Modified by the term 'robotic', then, the phrase refers to a programmable, self-controlled device that monitors, detects changes in and reacts to its environment. A certain level of autonomy is inherent in this definition, though it is the precise extent of the autonomy that is changing. Take, for instance, the X-47B: a UAV which flies solely by means of onboard computers. 13 Admittedly, human operators establish the flight plan and are able to override decisions, though that does not erase the fact that for the duration of the flight, from take-off to touchdown, the X-47B acts independently.
As technological progress moves in the direction of excising humans from the loop, the question becomes how to apply the rules of IHL to these new technologies. Technological change is, of course, meant to be harnessed in such a way as to adhere to the law. The remainder of this article will discuss the use of (semi-)autonomous robotic weapon systems in armed conflict specifically with respect to the principle of precaution, which relates fundamentally to the discrimination between civilians and combatants and between civilian and military objects in military operations. As Boothby has pointed out, it is against the backdrop of precaution under international law that one should assess the requirements pertaining to the use of robotic combat forces, particularly when the software itself determines essential elements of the operation, from what and when to attack to perhaps even deciding upon the appropriate weapon to use. 14 The following section of this article will examine the precaution principle, first as laid down in Article 57 of the 1977 Protocol Additional to the Geneva Conventions of 12 August 1949 relating to the Protection of Victims of International Armed Conflicts (Protocol I), including the notion of 'those who plan or decide upon an attack', and secondly as a rule of customary international (humanitarian) law. After laying out the general rules, the article will focus in particular on the concept of 'feasibility' and how the precision engagement capabilities of robotic weapon systems impact the feasibility calculation. The subsequent section of the article will look at the technologies in light of certain cognitive elements inherent in IHL principles, in order to examine whether 'de-humanisation' 15 is possible while safeguarding adherence to the pertinent rules. The final section will draw conclusions, including as to the potential influence of the principle of precaution on the future development and use of robotic warfare technologies. It is not the intention of this article to be an exhaustive study of all aspects of the law of armed conflict relevant to the military use of robot technology, which would require a dissertation-sized manuscript, but rather to contribute to the international legal discourse.

I. The Principle of Precaution under IHL
The provisions pertaining to precautions to be taken before or in the course of conducting a military operation under IHL are set forth in Article 57 of Protocol I. These provisions are highly relevant for the purposes of the current study, as they include the requirement to determine the appropriate means and methods of warfare. The provisions in Article 57 follow from the fundamental rule of distinction as found in Article 48 of Protocol I, which requires States parties to distinguish between civilians and combatants and between civilian and military objects, directing military actions only at military objectives. In addition, Article 57 is related to other rules contained in Protocol I that provide inter alia for the protection of the civilian population (Article 51) 16 and objects (Article 52), as well as for the 'protection of objects indispensable to the survival of the civilian population' (Article 54). Article 57 reads:

1. In the conduct of military operations, constant care shall be taken to spare the civilian population, civilians and civilian objects.
2. With respect to attacks, the following precautions shall be taken:
(a) those who plan or decide upon an attack shall:
(i) do everything feasible to verify that the objectives to be attacked are neither civilians nor civilian objects and are not subject to special protection but are military objectives within the meaning of paragraph 2 of Article 52 and that it is not prohibited by the provisions of this Protocol to attack them;
(ii) take all feasible precautions in the choice of means and methods of attack with a view to avoiding, and in any event to minimizing, incidental loss of civilian life, injury to civilians and damage to civilian objects;
(iii) refrain from deciding to launch any attack which may be expected to cause incidental loss of civilian life, injury to civilians, damage to civilian objects, or a combination thereof, which would be excessive in relation to the concrete and direct military advantage anticipated;
(b) an attack shall be cancelled or suspended if it becomes apparent that the objective is not a military one or is subject to special protection or that the attack may be expected to cause incidental loss of civilian life, injury to civilians, damage to civilian objects, or a combination thereof, which would be excessive in relation to the concrete and direct military advantage anticipated;
(c) effective advance warning shall be given of attacks which may affect the civilian population, unless circumstances do not permit.
3. When a choice is possible between several military objectives for obtaining a similar military advantage, the objective to be selected shall be that the attack on which may be expected to cause the least danger to civilian lives and to civilian objects.
4. In the conduct of military operations at sea or in the air, each Party to the conflict shall, in conformity with its rights and duties under the rules of international law applicable in armed conflict, take all reasonable precautions to avoid losses of civilian lives and damage to civilian objects.

5. No provision of this article may be construed as authorizing any attacks against the civilian population, civilians or civilian objects.
Precaution, by definition, is a measure taken in advance of a particular action in order to prevent or avoid harm foreseeably caused by that action. 17 Precaution therefore, similar to the use of the same term in the field of international environmental law, 18 incorporates uncertainty, meaning that the risk of harm or undesired results is the measuring stick rather than the certainty of outcomes. Article 57 in particular was subject to a number of interpretive declarations submitted by States upon ratification, accession or succession. For example, a number of States entered interpretive declarations to make clear that the information available and the circumstances prevailing at the time decisions are made will be determinative of whether the legal obligations in Article 57 are met by the military planners and others responsible for planning, deciding upon or executing attacks. 20 Several States also employed interpretive declarations to expound upon the notion of 'military advantage', viewing it as the advantage that is anticipated from an attack in its totality, instead of only from isolated or certain parts of the attack. 21

Interestingly, New Zealand pointedly included in its declaration that the military advantage under Article 57(2)(a)(iii) must also take account of the security of the attacking forces, among a number of other, unnamed considerations. Sub-paragraph (a)(iii) contains an iteration of the proportionality requirement, mandating that if the anticipated harm to protected persons and objects is excessive relative to the expected concrete and direct military advantage, the attacking State must refrain from launching the attack. New Zealand, at the time, most likely wished to clarify that not only the achievement of striking a significant military target is included in the proportionality calculation, but that avoiding unnecessarily placing its military forces in harm's way is as well. Read in that context, one could point to unmanned combat vehicles as being the most effective way of ensuring the security of the attacking forces, though there would also need to be other metrics of military advantage, otherwise any incidental loss of civilian life or collateral damage to civilian objects could be deemed disproportionate.

I.1 Those Who Plan or Decide Upon an Attack
The language used in Article 57 to establish the precautions required in an attack leaves substantial discretion to the attacking State, for example in determining the excessiveness of incidental loss of civilian life relative to military advantage, while at the same time Protocol I makes failure to comply with the terms of Article 57 a potential grave breach which may duly be prosecuted. 22 A number of delegations to the Diplomatic Conference expressed concern at the lack of precision in the wording of the Article, which would be necessary for enacting any sort of penal legislation and prosecuting alleged grave breaches, some going so far as to deem the Article dangerously imprecise. 23 The imprecision referred to has to do with the deferral on most accounts to the judgment of the commanders, decision-makers or planners of an attack. Switzerland went some way toward clarifying the scope of paragraph 2 in particular in its reservation that the obligations created by this paragraph (the text states 'those who plan or decide upon an attack') are only applicable to commanding officers at the level of battalion or group and above. 24

The wording singling out planners or those who decide upon the attack could lead to added complications with respect to robotic weapon systems. Even in the extreme case of technological autonomy, in which a decision to initiate an attack would ostensibly be made by means of artificial intelligence, robotic systems would still require initial decisions regarding, inter alia, the programming of algorithms for area of movement and targeting. Such activities undertaken in the attack preparation phase, prior to attack execution, would foreseeably fall under the scope of planning. However, equating the operational programming of a military combat robot with attack planning might draw civilian technicians into non-civilian roles, or in other words result in civilians taking direct part in hostilities, as the armed forces may not have the knowledge or training to carry out the more technical tasks on increasingly advanced systems. The AMW Manual helps to clarify such issues with respect to aerial combat vehicles by including in the definition of 'military aircraft' those aircraft that are 'pre-programmed by a crew subject to regular armed forces discipline', 25 though this definition is subject to fairly restrictive application. As one can imagine, the lines could get quite convoluted as military planning must be translated, perhaps by non-military IT specialists, into concrete attacks relying on the exactitude of the self-controlling robot's execution.

I.2 Precaution as Customary International Humanitarian Law
Though some 170 States are party to Protocol I, those that are not include States with active, and in some cases significant, robotic weapon system programs, such as the U.S., Israel, Pakistan and Iran. 26 To determine whether these States are also bound by the rules as laid out in Article 57 of Protocol I, one must examine the customary nature of the principle of precaution. Without consent to be bound, treaties such as Protocol I do not create rights and obligations for a State. 27 However, customary international law is, in principle, binding on all States regardless of whether or not there is explicit consent, save for instances of persistent objection. For this reason, it is necessary to first determine whether the provisions of Article 57 of Protocol I as discussed above are rules of customary international law before applying them generally to robotic weapon systems.
For a rule to enter the corpus of customary international law, two general elements must be present: virtually uniform State practice and opinio juris, i.e. the sense that a State is bound by the rule as a matter of law. 28 Luckily, with respect to IHL, the task of determining whether or not a rule is customary law has been made considerably easier thanks to the Customary International Humanitarian Law database (CIHL database) launched by the International Committee of the Red Cross (ICRC) in August 2010. 29 According to the CIHL database, Article 57 of Protocol I is essentially a codification of customary international law, albeit again only with respect to international armed conflicts, including, of particular relevance here, the provisions of paragraphs (2)(a)(i), (2)(a)(ii) and (2)(a)(iii). 30 In order to support the determination of customary law status for these rules, the CIHL database examines not only internationally binding instruments, such as Protocol I, but also State military manuals, domestic case law where available, the case law of international judicial bodies such as the International Criminal Tribunal for the former Yugoslavia, and governmental reports and statements.
The CIHL database extends the rules with respect to the principle of precaution to reflect customary international law pertaining to non-international armed conflict as well. These rules, according to the CIHL database, are rooted in the principle of distinction. Because distinction is customary in both international and non-international armed conflicts, and because respect for the precautionary rules is inherently required by the principle of distinction, the reasoning goes that the rules are customary law. In other words, distinction between military and civilian persons and objects is the central rule, and precautionary measures are actions prescribed to ensure adherence to this rule, in effect making precaution inextricable from distinction. The fact that contrary practice by States is rare is further put forward to support this argument. These rules are, as mentioned above, indeed integrally related to distinction. However, as the rules are not expressly laid out in provisions of the counterpart to Protocol I applicable to non-international armed conflicts, namely the Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of Non-International Armed Conflicts (Protocol II), it is more difficult than with respect to international armed conflicts to articulate the precise customary rules when it comes to precaution. It does seem clear, at least, that customary law which is also applicable to non-international armed conflicts requires a reasonable amount of care (precaution) to be taken in the selection of means and methods of warfare and in the general conduct of operations to minimise or avoid unnecessary harm to civilians and civilian objects. 31 Additionally, it can be concluded that the principle of proportionality, which entails that injury and death of civilians and damage to civilian objects must not be out of proportion with the military advantage, and the illegality of using indiscriminate means or methods of warfare even against legitimate military objectives, or of using means and methods of warfare indiscriminately thereby causing damage to civilians, are rules of customary international law also pertaining to non-international armed conflict. 32 In general, therefore, there are rules of IHL that bind all States and are of direct relevance to the use of robotic weapon systems in international and non-international armed conflict alike, founded upon the 'bedrock of modern humanitarian law': the protection of civilians. 33

To this point, unmanned combat vehicles, namely UCAVs, have proven to be relatively discriminate means of warfare, as evidenced by an ongoing study conducted by the New America Foundation investigating the rate of 'militant to non-militant' deaths in UCAV strikes carried out in Pakistan. 34 Since 2004, the civilian fatality rate in these strikes has been estimated at 16 percent, though the exact number of civilian deaths from such strikes is far from uncontroversial, as the distinction between the two categories, for example, cannot always be easily delineated on the ground, and as information from areas of armed conflict is not always complete in any case. Whether or not these numbers reach the level of excessiveness is not an exact science, an aspect of the law that will be elaborated upon below. Other investigations have, however, called the discriminate use of UCAVs which are remotely controlled by human operators into question. They claim, for instance, that such operations have targeted civilians arriving at the scene of previous strikes to aid victims, as well as mourners at subsequent funerals. 35

This would seem an example of indiscriminate use of an otherwise discriminate means of warfare, thereby contravening customary international law. That said, such attacks could be made in error due to certain lapses in the decision-making, precautionary chain. The question with respect to autonomous robotic weapon systems, which will be further examined below, is whether the norm of distinction, which forms the basis for the requirements under the principle of precaution, can be (perhaps better) adhered to by such self-controlled systems. It should be mentioned here that a human 'in the loop' may establish accountability much more evidently.

II. The Feasibility Requirement and Robotic Precision
The concept of feasibility is used as a determinative benchmark when it comes to the choice of means and methods of warfare, under both the terms of Article 57 paragraph (2)(a)(ii) of Protocol I and, identically, Rule 17 of the CIHL database. In particular, the obligation placed on those who decide upon or plan an attack to take all feasible precautions is formulated as a fundamental element in avoiding or minimising incidental loss of civilian life, injury to civilians or damage to civilian objects, again entwining the feasibility analysis with the cardinal humanitarian law principle of distinction. 36 With respect to robotic weapon systems, the requirement to undertake feasible measures represents, again, a recognition of differences in capabilities among States. Robotic warfare technologies, and the continuing development thereof, have created a stratified system of 'haves' and 'have-nots', not dissimilar to the situation with nuclear weapons, in which certain States are recognised as nuclear weapons possessing States, whereas others are non-nuclear weapon States and have committed to not obtaining nuclear weapons. 37 The use of the term 'feasible', therefore, does not denote a specific obligation of result, but rather one of effort, or due diligence, in accordance with military capabilities. In interpretive declarations at the time of ratification of or accession to Protocol I, a number of States, including Italy, Germany, Ireland, Canada, the Netherlands, Spain and the UK, made their viewpoints clear that 'feasible' is meant to be understood as practicable or practically possible, taking into account all circumstances, including humanitarian and military considerations prevailing at the time that the plans or decisions are made or the actions undertaken. 38 In other words, feasibility is a determination made in context. 39

The reference to military considerations could be read as encompassing everything from the conservation of munitions to the availability of weapon systems to the security of military forces. These interpretive declarations mirror nearly to a word the definition of 'feasible precautions' given in Protocol II to the 1980 UN Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May be Deemed to be Excessively Injurious or to Have Indiscriminate Effects. 40 This understanding of the term feasibility serves the dual purpose of further highlighting the adaptability of the requirements to change, as feasibility has much to do with the state of and developments in technology, while also qualifying the notion of feasibility with the caveat of practicability. Using practicability as a measuring stick for determining what is feasible underlines the substantial discretion left to military commanders: instead of setting unequivocal benchmarks for assessing compliance with the relevant obligations, it incorporates subjective standards, which a determination of practicality surely is, for ensuring adherence to the rules.
What is deemed feasible, however, has been and will assuredly continue to be affected by the development of robotic weapon systems, in large part due to the enhanced capacity to carry out precision engagement. First, a point of clarification is necessary. As Schmitt has explained, precision is not a synonym of accuracy; accuracy in the military context refers to the ability of a weapon to strike a specific location at which it is aimed, while precision involves the ability to locate and identify targets, to strike those targets in a timely and accurate manner, and to determine whether the desired effects have been accomplished or whether another strike is necessary. 41 Precision engagement, then, is an operational strategy based on this concept of precision, meant both to reduce risks to the attacking State's forces and to minimise collateral damage, which certainly can be enabled by technological innovation. 42 Therefore, if the fundamental humanitarian law principle of distinction begets the obligation to take precautions in an attack, the precision capabilities of available weapon systems must play a role in the operational calculation of 'all feasible precautions'. Does the availability of precision robotic weapon systems represent a limitation to the discretion of those who plan or decide upon an attack? In 1999, for example, in the context of the NATO air operations against the Federal Republic of Yugoslavia, NATO Secretary-General Lord Robertson seemed to indicate that precision weaponry is required under international law. 43

To evaluate this statement in the present-day context, it is important to briefly consider the nature of 21st century conflict. As mentioned above, the battlefield in modern-day conflicts often extends to, or is primarily located within, urban populated environments. Attack tactics frequently involve quick strikes, whereupon the opposing forces, not necessarily wearing clearly marked uniforms or even openly carrying arms, immediately slip back into the civilian populace. The battlefield, in other words, is not removed from civilians and civilian objects.
Contemplating additional precautionary requirements with respect to military operations in highly populated areas is not a new phenomenon. The 1956 Draft Rules for the Limitation of the Dangers incurred by the Civilian Population in Time of War (New Delhi Draft Rules), for instance, explicitly linked such combat to an obligation to employ precision weaponry and methods of attack. 44 Article 9 of the New Delhi Draft Rules in relevant part states: "In particular, in towns and other places with a large civilian population, which are not in the vicinity of military or naval operations, the attack shall be conducted with the greatest degree of precision". 45 This would be a clear, extended legal obligation to choose the most precise means and methods of warfare, though determining 'the greatest degree of precision' leaves room for discretion. As the New Delhi Draft Rules predate Protocol I, it is interesting to note how the elements of the principle of precaution with regard to the choice of means and methods of attack changed: the separate, specific reference to, and respective obligation regarding, population centres was removed, leaving rules of more general application instead. The New Delhi Draft Rules do, though, demonstrate that the role of precision has long been contemplated in the context of legally mandated precautionary measures.
It has been argued, admittedly in reference to the first Persian Gulf War in 1991, that the use of precision munitions, even when attacking opposing forces in civilian populated centres, is neither obligatory nor mandated under the laws of armed conflict. 46 However, this analysis considered the specific case of precision-guided missiles, and took into account certain inhibitive aspects such as the cost per missile, the interference of the weather with pilots' visual recognition capabilities, technological malfunctions and the issues encountered with anti-aircraft defences. 47 Of course, a number of those issues, including cost and malfunction, could similarly apply to any range of robotic weapon systems. These are the military considerations that are to be weighed in addition to humanitarian concerns when determining what is feasible. Other scholars, though, would qualify this absolute argument by conceding that precision weapons may in fact be obligatory if their use would be likely to result in less damage to civilians or civilian property than alternative non-precision weapons. 48 This is also the reasoning subscribed to by the present author. It is, in any case, predicated on the attacking force possessing such weapons.
Following from the foregoing, in order to evaluate the potential obligation to employ robotic weapon systems, when possessed by the attacking forces, as precision platforms in the context of the 'feasible precautions' requirement of IHL, 49 one must address the particular relevant humanitarian and military considerations.Take, as an example, the Modular Advanced Armed Robotic System (MAARS) developed by the company QinetiQ North America, a remote-controlled robot capable of carrying an M240B machine gun, among other accessories. 50MAARS is the new updated version of a weaponised robot called the Special Weapons Observation Reconnaissance Detection System (SWORDS) which could reportedly determine the type of weapon carried by an enemy as well as read a nametag at a distance of 300 to 400 metres, in addition to having been proven 100 percent accurate in tests with its equipped weaponry. 51MAARS certainly falls within the category of precision weapons.It is designed for reconnaissance, surveillance and target acquisition tasks, as it can quite simply confirm the open carrying of arms as well as what type of weapon it is, which already makes it a strong tool for the determination of combatant status, in addition to being able to engage the enemy with the utmost accuracy.As such, the characteristics of this weapon system fulfil certain aspects of Article 57(2)(a)(ii) of Protocol I: "as regards weapons, their precision and range should be taken into account; such precautions coincide with the concerns of military commanders wishing to economise on ammunition and avoid hitting points of no military interest". 
One could reason, then, that with such pinpoint recognition capabilities, essential to distinguishing between combatants and civilians (provided the recognition is somehow confirmed prior to engagement), plus the engagement accuracy, in an urban environment where the military objective is to kill enemy combatants the attacking State could be obliged under the principle of precaution to use such a robot if available. Also, in terms of military considerations, employing MAARS as opposed to soldiers increases the security of military personnel. Of course, MAARS would not be an appropriate means of attack if the military objective is, for instance, the destruction of an enemy encampment. Broadening this scenario to a statement of more general application: when it is practicable or practically possible, taking into account all circumstances including the availability of such robotic weapon systems, the evidence would support the contention that the attacker is obliged to use the robotic weapon system as the means of attack. This obligation, of course, would continue to depend on the total context in which plans and decisions are made.

47 Idem, p. 131. 48 See Schmitt 2010, supra note 38, p. 261. 49 To take 'all feasible precautions' in choosing the means and methods of warfare to avoid or otherwise minimise incidental loss of civilian life, injury to civilians and damage to civilian objects. 50 For more information see http://www.qinetiq-na.com/products/unmanned-systems/maars/ (accessed on 4 July 2012). 51 Singer 2009, supra note 10, pp. 30-31.
One potentially prohibitive element in choosing a robotic weapon system as the means of attack could be the costs. The costs associated with the necessary technical innovation make such weapon systems unattainable for many States,53 let alone non-State actors that may be belligerents in an international or non-international armed conflict. The question here, however, is whether the costs of robotic systems compared to other possible means of attack affect the feasibility calculation.54 The F-35 Joint Strike Fighter, the U.S.'s next-generation manned combat aircraft, now has an estimated baseline cost per plane, including research and development, of a little over $110 million, plus more than $20 million for the engine (for the single-engine plane).55 Compare this to the MQ-9 Reaper UCAV, with a cost of roughly $13 million per aircraft,56 and the relative cost could ostensibly tip decision-makers in favour of the unmanned system. That said, what is relevant for the feasibility assessment is not the expense but the Reaper's hovering ability, its "capability to precisely designate targets for employment of laser-guided munitions",57 and the fact that no human pilot can be injured or killed. With this in mind, it would be difficult to argue that, in choosing between these two types of aircraft, the latter would not be the one that most closely fulfils the requirements of precaution in an attack. Again, though, these considerations are part of the full range of circumstances, including the best way to achieve the military objective, applicable at any given time upon which operational choices are to be made.
To sum up this section, robotic weapon systems' increasing capability to carry out all manner of operations on land, in the air and at sea with a high level of precision, from reconnaissance to surveillance to targeting, creates further varying legal obligations for the 'have' States. In other words, "belligerents bear different legal burdens of care determined by the precision assets they possess".58 One can compare this scenario to the concept of 'common but differentiated responsibilities' under international environmental law.59 The choice of means or methods may be left up to the discretion of those who plan or decide upon an attack, but failure to use robotic weapon systems that can increasingly ensure, or are at least the most likely to provide for, the minimisation or avoidance of incidental loss of life or harm to civilians or damage to civilian objects would be in contravention of legal obligations under IHL. In this way, precision robotics have a profound effect on the notion of feasible precautions and on the conduct of feasibility assessments.

III. Robotic Autonomy and the Cognitive Dilemma of IHL
The previous section extolled the precision capacity of robotic weapon systems as perhaps the determinative factor in feasibility assessments pursuant to the precautionary obligations of attacking forces. However, the robotic weapon systems discussed thus far have been semi-autonomous, meaning they are either remotely controlled or piloted. Interestingly, the description of the MQ-9 Reaper on the U.S. Air Force Fact Sheet contains the following statement: "it [Reaper] provides a unique capability to autonomously execute the kill chain (find, fix, track, target, execute, and assess) against high value, fleeting, and time sensitive targets (TSTs)".60 The MQ-9 Reaper in fact has a two-member remote crew, a pilot and a sensor operator. The term 'autonomous' as used here seems to indicate that the Reaper is a self-contained system, including its crew, capable of carrying out the 'kill chain' without further outside input. Control of the aircraft and command of the mission are the responsibility of the pilot. As such, humans remain in the loop, and determining accountability for violations of the law should not be overly complicated. That might not always be the case.
Looking ahead, as was done by the Joint Forces Command in its 2010 report Joint Operating Environment (JOE 2010), not only will advanced weaponry become available to more actors, but robotic systems will increasingly take up a share of the 'standard military toolkit'.61 The JOE 2010 also acknowledges that as robotic weapon systems are integrated into the military, they will become progressively capable of a certain level of autonomy, whether it be "adjustable autonomy, or supervised autonomy, or full autonomy - within mission bounds".62 With the amount of talk and speculation, it seems to be a matter of 'when', rather than 'if', robotic systems that are to an extent autonomous will be rolled out widely within the military. In assessing the potential legal implications of this development, Boothby points out that the current legal requirements comprise the precautions that must be taken in an attack, such as distinguishing between military and civilian objects, but remain silent on who or what fulfils the requisite tasks.63 The basic question must then be asked: are autonomous robotic weapon systems in fact able to effectuate the precautionary tasks in accordance with the law?

III.1 The Awareness Necessary to Discriminate
Having the capacity to avoid or minimise harm to civilians and civilian objects through precision engagement is not the same as being able to independently discriminate between combatants and civilians, or between military objectives and civilian objects. This is the foundation for what can be termed the cognitive dilemma of IHL. First, consider the definition of civilians and the civilian population under the law of armed conflict, as codified in Protocol I, Article 50:

1. A civilian is any person who does not belong to one of the categories of persons referred to in Article 4 A (1), (2), (3) and (6) of the Third Convention64 and in Article 43 of this Protocol.65 In case of doubt whether a person is a civilian, that person shall be considered to be a civilian.

2. The civilian population comprises all persons who are civilians.

3. The presence within the civilian population of individuals who do not come within the definition of civilians does not deprive the population of its civilian character.

The Article above extends protection, in a catch-all manner, to all persons who do not fall into certain defined combat roles. The characteristics of the various actors, or groups, who may be present on the battlefield are the key to discriminating between legitimate targets and protected persons. It is the concept of doubt in paragraph 1 of this formulation, denoting a situation of uncertainty upon which decisions must be made, that raises particular questions as to whether autonomous robotic weapon systems would possess the requisite abilities of judgment and restraint in order to adhere to the law.66

Similarly, recall that Article 57(2)(a)(iii) of Protocol I, which represents the principle of proportionality as a function of precaution, is also formulated in such a way as to take account of uncertainties on the battlefield. It uses the phrase 'may be expected' in the sense that an attack must not be undertaken if it "may be expected to cause incidental loss of civilian life, injury to civilians, damage to civilian objects, or a combination thereof, which would be excessive in relation to the concrete and direct military advantage anticipated". Schmitt notes that this provision necessitates a subjective evaluation comparing dissimilar values, civilian harm and military gain, by means of a comparative concept, 'excessive', as opposed to an absolute one.67

Article 57(2)(b) subsequently restates the proportionality requirement, but this time in terms of control during the actual execution of the attack, by mandating that the attack must be aborted or suspended if the objective turns out not to be a military one or, again, if it 'may be expected' that the attack will cause 'incidental loss of civilian life, injury to civilians, damage to civilian objects, or a combination thereof, which would be excessive in relation to the concrete and direct military advantage anticipated'. Robotic weapon systems, in order to fulfil the requirements laid out in the law of armed conflict, would need to be capable of taking courses of action in light of uncertainties and based on subjective analyses of specific situations. Otherwise, violations are more likely than not to occur, thus contravening the object and purpose of the provisions. It is worth mentioning that maintaining a formalised system of human control and oversight is not a panacea for ensuring the protection of civilians and civilian objects, as evidenced by the failure to override an automated strike sequence resulting in the downing of a civilian passenger airliner in the Persian Gulf in 1988.68 However, humans in the loop could be the only way to ensure the basic ability to adhere to legal rules in armed conflict.

60 Shalal-Esa 2012, supra note 54, emphasis added. 61 'The Joint Operating Environment', Joint Forces Command, 18 February 2010, pp. 55-56. 62 Ibid., citing P.W. Singer, supra note 10, p. 133 (emphasis added). 63 B. Boothby, supra note 6, p. 85. 64 Convention (III) relative to the Treatment of Prisoners of War, Geneva, 12 August 1949, Article 4: A. Prisoners of war, in the sense of the present Convention, are persons belonging to one of the following categories, who have fallen into the power of the enemy: (1) Members of the armed forces of a Party to the conflict as well as members of militias or volunteer corps forming part of such armed forces. (2) Members of other militias and members of other volunteer corps, including those of organized resistance movements, belonging to a Party to the conflict and operating in or outside their own territory, even if this territory is occupied, provided that such militias or volunteer corps, including such organized resistance movements, fulfil the following conditions: (a) that of being commanded by a person responsible for his subordinates; (b) that of having a fixed distinctive sign recognizable at a distance; (c) that of carrying arms openly; (d) that of conducting their operations in accordance with the laws and customs of war. (3) Members of regular armed forces who profess allegiance to a government or an authority not recognized by the Detaining Power. (...) (6) Inhabitants of a non-occupied territory, who on the approach of the enemy spontaneously take up arms to resist the invading forces, without having had time to form themselves into regular armed units, provided they carry arms openly and respect the laws and customs of war. 65 Protocol I, Article 43, Armed Forces: 1. The armed forces of a Party to a conflict consist of all organized armed forces, groups and units which are under a command responsible to that Party for the conduct of its subordinates, even if that Party is represented by a government or an authority not recognized by an adverse Party. Such armed forces shall be subject to an internal disciplinary system which, 'inter alia', shall enforce compliance with the rules of international law applicable in armed conflict. 2. Members of the armed forces of a Party to a conflict (other than medical personnel and chaplains covered by Article 33 of the Third Convention) are combatants, that is to say, they have the right to participate directly in hostilities. 3. Whenever a Party to a conflict incorporates a paramilitary or armed law enforcement agency into its armed forces it shall so notify the other Parties to the conflict.
Such concerns have prompted a number of studies regarding essential system requirements for unmanned systems, partly to ensure compliance with IHL. For instance, certain prescribed design specifications provide for moments in which an authorised entity, meaning "an individual operator or control element authorized to direct or control system functions or mission",69 must play a role. The concept of 'authorised entity' according to such studies, however, is sufficiently broad so as to incorporate systems of which a human operator is not in control. Reports have acknowledged the added complications inherent in the development of increasingly autonomous systems, such as how to deal with elements of uncertainty and subjective decision-making in operations. Relevant recommendations have included, among other elements, 1) the requirement that levels of uncertainty in the identification of enemy combatants and military objectives be presented to the authorised entity, with assurances that this requirement is incorporated in the decision process, and 2) the mandate that "whenever a non-human authorised entity cannot interpret the available information and meet the proportionality criteria at, or above, a pre-determined confidence level, it must refer the decision to a higher authorized entity in the command chain".70 Similarly, but significantly more vaguely, the commentary on the AMW Manual sets the requisite technological threshold at programming that allows engagement of potential targets solely on the basis of reliable information that they are lawful targets, the programming for lawful target identification needing to be comparable to that of manned aircraft or remotely piloted UCAVs.71 It is clear from the existence of these types of reports that research is under way into how to take advantage of the benefits presented by increasingly autonomous weapon systems while recognizing the limitations and the aspects that would contravene legal obligations.
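The referral mandate quoted above is, at bottom, a threshold decision procedure. A minimal sketch of that logic follows, purely for illustration: the class names, the numeric confidence values and the pre-determined level of 0.95 are all assumptions of this author, not specifications drawn from the cited studies, which give no numbers.

```python
from dataclasses import dataclass

@dataclass
class EngagementAssessment:
    """Hypothetical output of a non-human authorised entity's sensors."""
    target_id_confidence: float        # confidence (0.0-1.0) that the target is lawful
    proportionality_confidence: float  # confidence that expected harm is not excessive

# Pre-determined confidence level; the value 0.95 is an illustrative assumption.
CONFIDENCE_LEVEL = 0.95

def decide(assessment: EngagementAssessment) -> str:
    """Proceed only when BOTH criteria meet the pre-determined level;
    any shortfall refers the decision up the command chain."""
    if (assessment.target_id_confidence >= CONFIDENCE_LEVEL
            and assessment.proportionality_confidence >= CONFIDENCE_LEVEL):
        return "engage"
    return "refer to higher authorised entity"

# Doubt on either criterion is never resolved in favour of attack:
print(decide(EngagementAssessment(0.99, 0.99)))  # engage
print(decide(EngagementAssessment(0.80, 0.99)))  # refer to higher authorised entity
```

The design point the sketch makes explicit is that the rule is conjunctive and fail-safe: uncertainty about either identification or proportionality defaults to referral, never to engagement, mirroring the presumption of civilian status in case of doubt under Article 50(1).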
The primary issue is whether or not (foreseen) autonomous robotic weapon systems have the awareness necessary to discriminate between combatants (and military objectives) and civilians (and civilian objects). This has an impact both on the determination of the excessiveness of civilian harm as compared to the realisation of a military objective under the principle of proportionality, though admittedly there is no concrete threshold established for what counts as excessive, and, more fundamentally, on the basic act of distinction, particularly under combat circumstances in which enemy combatants are insurgents who do not wear clearly marked uniforms but rather blend in with the civilian population. Noel Sharkey describes what would be required of robotic systems in order to fulfil the legal obligations as 'situational awareness', which involves, for example, "understanding someone else's intentions and predicting their likely behaviour in a particular situation".72 Potential ambiguities on the battlefield, such as a child picking up an abandoned rifle, are nearly infinite and often subtle, and thus require a level of human understanding that robots currently do not possess.73 Awareness is what allows for adaptation and reaction in the case of doubt, doubt being a fundamental factor in the catch-all reach of distinction. Without complete knowledge of what is being attacked and an adequate understanding of how civilians are affected, the result is typically collateral damage and injury to protected persons.74 How far will the robotic 'sense-think-act paradigm',75 on its own, allow for compliance with the applicable rules?
In practical terms, therefore, under most circumstances the current state of technological advancement would make the fielding of autonomous combat robots, those meant to identify, track, target and strike opposing forces, illegal under the law of armed conflict. This is due to the fact that States must "never use weapons that are incapable of distinguishing between civilian and military targets",76 and currently robotic weapon systems are not independently capable of such distinction. One could of course envision a scenario in which, for instance, opposing military forces are removed from the vicinity of civilians or civilian objects. With precise programming, in this example meaning the definition of location parameters, an autonomous robotic system could carry out the operation in accordance with legal rules. Under the more typical circumstances characteristic of 21st century conflict as described above, though, human targeting and command judgments77 would be required to ensure compliance with discrimination and proportionality obligations under IHL. With distinction as its foundation, the principle of precaution would dictate that belligerents not field autonomous combat robots as a means of warfare, because the prospects for avoidance or minimisation of incidental injury to or loss of civilian life or damage to civilian objects could simply not be gauged in the feasibility calculation. Therefore, with respect to autonomous military robots under the principle of precaution, only non-combat variations (e.g. surveillance and reconnaissance systems) can clearly be legally deployed under most circumstances at this time.

Conclusions
Pilots flying sorties on another continent before going home for dinner, and EOD units dismantling IEDs by remote control, are no longer confined to the realm of science fiction. Described by some commentators as the most significant development relevant to the conduct of war since nuclear weapons, advancements in the technology of robotic weapon systems are challenging the traditional interpretation of certain rules of the law of armed conflict. It is important that the legal requirements of IHL be taken into account even at the technological development stage, thus preventing, to the greatest possible extent, future ambiguities or compliance concerns.
It is clear that, from a humanitarian law perspective, robotic weapon systems already hold great value, and perhaps even greater potential, for safeguarding the cardinal principles of the law of armed conflict, namely discrimination between combatants and non-combatants and, more generally, the protection of civilians and civilian objects. The capability to carry out operations with precision, from intelligence-gathering to surveillance and reconnaissance to striking and analysing the aftermath of an attack, means in practice that errors, even those involving minimal collateral damage, are in theory becoming less likely, and therefore less tolerable. In principle, this is a positive development in terms of humanitarian concerns. From a military operational point of view, the effect is a narrowing of the planning and decision-making discretion when it comes to the feasibility assessments required under the principle of precaution. Similar to the case of nuclear weapons, however, developments in robotics technology for use on the battlefield are creating a dichotomy of 'have' and 'have-not' States, solidifying a normative relativism under IHL that accounts for differences in capabilities among States.
The future is pointing in the direction of autonomous robotic systems. In particular, this has to do with the drastic increase in the number of deployed systems and the lack of human resources to control each one individually; human operators have already been found to have difficulty controlling multiple robotic units at once.78 Not all, of course, will be used or equipped to carry out attacks. The points raised in this article with respect to autonomous robotic weapon systems concern the awareness necessary to adhere to the legal rules, pursuant both to Article 57 of Protocol I and the related customary norms, and more generally to the principle of distinction that underlies the precautionary elements of IHL. Current technologies placed in an urban combat environment, as is typical of 21st century conflict, would simply be incapable of distinguishing between combatants and the civilian population. The ability to assess and react to ambiguities, which at the moment is available only to humans, is necessary to comply with IHL principles such as distinction and proportionality. This means either that the objective of developing autonomous robotic combat systems is abandoned, or that international law requirements, for example pursuant to the principle of precaution, play an integral role in guiding technological research and development.
The latter is the more likely and the more desirable scenario. In practice, this would mean that robotic weapon systems are developed in line with the central guiding principle of fulfilling military needs while minimising civilian impact. Programming that is adaptable to the various possible battlefield scenarios, and streamlined so as to allow control of increasingly advanced technology by military personnel on the ground, could also greatly influence the feasibility calculation under the principle of precaution in favour of robotic weapon systems. The benefits of such systems in terms of both humanitarian and military considerations warrant continued development. In so doing, however, the issues raised in this article must be taken up and resolved.
Luckily, we have not yet reached the apocalyptic future imagined by Isaac Asimov in The Caves of Steel, where tension and distrust prevail as humans are relegated to an underground existence while highly capable humanoid robots replace them in their societal roles. That has not stopped some writers from jumping quickly to conclusions based on portrayals of robots in popular culture. Through sober analysis, it will be possible to harness the significant (potential) benefits of technological advancement in armed conflict while effectively upholding the fundamental protections afforded by international humanitarian law.

in a State's arsenal, with the capability of such systems to engage in precision operations, the question becomes in what way and to what extent possession of and access to these weapon systems must impact a State's feasibility calculations.This section, therefore, will look more closely at what is meant by the term 'feasible' and what role (robotic) precision plays in fulfilling the related precautionary obligations.

Article 57 essentially obliges the attacking State to choose the method or means of attack which minimises or avoids, to the greatest possible extent, collateral damage or incidental injury to and loss of civilian life relative to the anticipated 'direct and concrete' military advantage. It is formulated in such a way as to apply different requirements to different States based on capabilities; in other words, it establishes a normative relativism.19 Creating general standards that encompass relative capabilities is logical, and is certainly apt when one looks at the disparities among States in terms of technological development. As discussed in more detail above, what is, in the parlance of Article 57, feasible for one State may not be for another.