Killer Robots

Lethal Autonomous Weapon Systems: Legal, Ethical and Moral Challenges

By Dr. U C Jha

eBook - ePub

  1. 260 pages
  2. English
  3. ePUB (mobile friendly)
  4. Available on iOS & Android


About This Book

Nearly 45 countries are at different stages of developing robotic weapons or lethal autonomous weapon systems (LAWS). The United States, for example, has recently test launched its robotic vessel Sea Hunter, a self-driving, 132-foot ship designed to travel thousands of miles without a single crew member on board. As reported, the vessel has the capability to detect and destroy stealth diesel-electric submarines and sea mines. Yet while the militaries of developed countries are racing to develop LAWS to perform varied functions on the battlefield, a large section of robotic engineers, ethical analysts and legal experts firmly believe that robotic weapons will never meet the standards of distinction and proportionality required by the laws of war, and will therefore be illegal. This book provides insight into lethal autonomous weapon systems and debates whether it would be morally correct to give machines the power to decide who lives and who dies on the battlefield.

IV LAWS and International Law
International humanitarian law (IHL) is a branch of international law which limits the use of violence in armed conflicts. The basic principle of IHL is that in any armed conflict, the right of the parties to conflict to choose methods and means of warfare is not unlimited. IHL has developed over more than a century with a two-fold aim: to save civilians from the consequences of armed conflict, and to protect soldiers from cruelty and unnecessary suffering. The rapid advancement in autonomous technologies, in particular lethal autonomous weapons systems (LAWS), presents certain challenges to the basic tenets of IHL. Though there are international agreements to specifically ban or regulate a number of inherently problematic weapons, such as expanding bullets, poisonous gases, antipersonnel landmines, biological and chemical weapons, blinding lasers, incendiaries, and cluster munitions, there is no regime for LAWS.
The Latin phrases jus ad bellum and jus in bello1 describe the law governing resort to force and the law governing the conduct of hostilities. These are recognized branches of international law and are generally independent. The morality and legality of a state deciding to go to war (jus ad bellum) is something that is decided by a state’s political leadership. LAWS are likely to present complex jus ad bellum issues related to lowering the moral, political and financial cost of warfare for those States that have the capability to develop and deploy them in an armed conflict. The use of LAWS in armed conflict or in enforcement operations could be incompatible with international human rights law, and may lead to unlawful killings, injuries and other violations of human rights.
This chapter will address the issue of jus ad bellum proportionality and three key concerns related to LAWS: their implications for the principles of distinction, proportionality and precautions in attack. It will also discuss the challenge posed by LAWS to accountability and enforcement. The likely impact of the recently adopted Arms Trade Treaty (ATT) in regulating LAWS will also be covered in this chapter. The human rights implications of LAWS will be dealt with in the last part of the chapter.
Jus Ad Bellum Proportionality
Jus ad bellum comprises six principles: just cause, right intention, proper authority, last resort, probability of success, and proportionality of the response.2 The principle of ‘just cause’ relates to the normative reasons for waging war, such as self-defence or defence of others, while ‘right intention’ prescribes the proper reasons for acting. The principle of ‘proper authority’ dictates that only legitimately recognized authorities may declare a war. Last resort requires states to attempt all reasonable alternatives available, such as diplomacy or arbitration, before resorting to hostilities. Probability of success requires states to assess whether an actor is able to achieve its just cause through fighting a war. The principle of proportionality dictates that we must consider the overall consequences of a proposed war.
LAWS pose a distinct challenge to the jus ad bellum proportionality principle. Proportionality is closely linked to the other ad bellum principles, and it requires that we consider the overall consequences of a proposed war. If we cannot satisfy the principle of proportionality, the other principles can never meet the obligations of a just war.3 A State that has LAWS in its arsenal will have an advantage in using them in defence, particularly in cases where the other side lacks the same level of technology.4 LAWS save soldiers’ lives, and this would weigh heavily in favour of deploying such weapons, for the harms caused would be blamed on the unjust aggressor and not on the defending State. If LAWS cause collateral harm, the State could justify it by declaring that the harm was unintended and that the LAWS were used in pursuance of legitimate military objectives. The ability to use LAWS against an unjust threat must be seen as a benefit in one’s proportionality calculation.5
The presence of LAWS might influence a nation’s choice to go to war in two ways: (i) it could directly threaten the sovereignty of a nation, and (ii) it could make it easier for leaders who wish to start a conflict to actually start one. In other words, the availability of LAWS would lower the barrier to initiating an armed conflict. The presence of LAWS is likely to increase the potential for armed deployments; the current use of armed drones by the US illustrates this. It is thought that future LAWS would be capable of learning and adapting their functioning in response to changing circumstances in the environment in which they are deployed, as well as of making firing decisions on their own. Such systems could be directly responsible for starting an armed conflict accidentally. The use of LAWS in armed conflict is also likely to accelerate the arms race.6
Distinction
Distinction is one of the most important principles of IHL. It requires combatants to direct their attacks solely at other combatants and military targets and to protect civilians and civilian property.7 Under the principle of distinction, indiscriminate attacks are prohibited. Indiscriminate attacks are those that are not directed at a military objective,8 or that employ a method or means of combat whose effects cannot be directed or restricted as required. The principle of distinction also requires defenders to distinguish themselves from civilians and to refrain from placing military personnel or material near civilian objects.9 A major IHL issue is that LAWS cannot discriminate between combatants and non-combatants or other persons likely to be present at the place of conflict. The list includes civilian workers, aircrew members, war correspondents, military doctors, religious personnel, civilian drivers and porters, as well as combatants who are unwilling to fight or are wounded or sick.
According to Sharkey (2012), LAWS lack three of the main components required to ensure compliance with the principle of distinction: (i) adequate sensory processing systems for distinguishing between combatants and civilians; (ii) a programming language capable of defining a non-combatant or a person hors de combat; and (iii) battlefield awareness or common-sense reasoning to assist in discrimination decisions. Even if LAWS have adequate sensing mechanisms to detect the difference between civilians and combatants, they would lack the ‘common sense’ that an experienced soldier uses on the battlefield for making various decisions.10 At present there is no evidence to suggest that a computer has the independent capability to apply the principle of distinction as a human soldier does.
Due to the vagueness of the legal definitions contained in the Geneva Conventions of 1949 and AP I, it is not possible to incorporate the essence of the principle of discrimination into the programming language of a computer. A human combatant has to take positive steps to understand a situation, develop his or her own mental model of offence or defence, and then recommend or demand engagement, a process that is extremely difficult to incorporate in LAWS. Even if we equip LAWS with mechanisms to distinguish between civilians and military combatants, these devices lack the capacity to reach the human level of common sense that is indispensable for the correct application of the principle of discrimination. The application of the principle of proportionality is more difficult than that of distinction, since it involves comparing an action’s potentially excessive collateral damage to its anticipated military benefits. It requires a case-by-case strategic and military evaluation, which a machine simply cannot comprehend.11
Sparrow has highlighted an important shortcoming of LAWS, i.e., the question of the capacity of LAWS to recognize surrender, and the implication of this for the ethical deployment of such weapon systems.12 A fundamental requirement of the principle of distinction is that combatants should not attack enemy units that have clearly indicated their desire to surrender.13 By ceasing to participate in hostilities and signalling surrender, military units can acquire the moral status of non-combatants, such that deliberate attacks on them are no longer permissible.
It can be argued theoretically that if a combatant can recognize a signal to surrender then it should not be impossible for LAWS to do so. However, Sparrow has advanced two reasons why recognizing surrender is likely to be difficult for robots. The first relates to the fact that ‘perception’ would be a formidable task for computers. In spite of recent developments in robotic technology, real-time recognition of objects in motion across a range of environments remains beyond the capacity of even the most sophisticated computer vision systems.14 The second relates to the contextual nature of the means used to signal surrender in different circumstances and the problem of distinguishing between surrender and perfidy. Human beings have a tremendously sophisticated and powerful capacity to interpret the actions of other human beings and to identify their intentions. It will be extremely challenging to design a machine that comes close to replicating these capabilities.15
It may be argued that though an autonomous system might be unlawful because of its inability to distinguish civilians from combatants in the operational conditions of infantry urban warfare, for example, it may be lawful in battlefield environments with very few, if any, civilians present.16 However, the casualty figures of post-World War II armed conflicts indicate that an increasing number of civilians are becoming victims of modern warfare. Therefore, LAWS may not be suitable weapons for future armed conflict.
Proportionality
The principle of proportionality is related to the principle of distinction. It prescribes that belligerent parties in an armed conflict are not to inflict collateral damage that is excessive in relation to the military advantage they seek with any hostile action. This principle is considered part of customary international law, which binds all states.17 It provides that an attack which may be expected to cause incidental loss of civilian life, injury to civilians, damage to civilian objects, or a combination thereof, which would be excessive in relation to the concrete and direct military advantage anticipated is prohibited.18 In order to make the principle of proportionality more effective, Article 57 of AP I obliges belligerents to take all feasible precautions and constant care with a view to implementing proportionality and distinction.19 If damage to civilian objects or civilian death or injury is anticipated prior to targeting a military object, then an assessment must be undertaken in which the anticipated military advantage to be gained is weighed against the anticipated “collateral” damage to protected civilians or civilian objects. Viewed from this perspective, LAWS, which are incapable of making a distinction between combatants and civilians, would not be able to follow the principle of proportionality.
Arkin (2009) suggests that it is possible to develop a proportionality optimization algorithm for LAWS. The algorithm would select the weapon system that ensures no violation of any proportionality prohibitions or IHL. It would calculate the potential unintended non-combatant carnage and civilian property damage (collateral damage) that would result from available combinations of weapon systems and release positions, choosing the most effective weapon that would cause the lowest acceptable collateral damage.20 On the other hand, Human Rights Watch has serious doubts about whether LAWS could exercise comparable judgment to assess proportionality in complex and evolving situations. It would be difficult to programme LAWS to carry out the proportionality test or to visualize every situation of armed conflict, because there is an infinite number of possible situations.21
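The kind of optimization Arkin envisages can be sketched, in a purely illustrative and greatly simplified form, as a selection over candidate weapon-and-release combinations. The sketch below is not Arkin's actual algorithm: the EngagementOption structure, the numeric effectiveness and collateral scores, and the acceptability threshold are hypothetical placeholders introduced only to show the shape of the computation the text describes.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class EngagementOption:
    """One hypothetical combination of weapon system and release position."""
    weapon: str
    release_position: str
    estimated_effectiveness: float  # assumed probability of achieving the military objective (0-1)
    estimated_collateral: float     # assumed prediction of civilian harm, in arbitrary units


def select_engagement(options: List[EngagementOption],
                      max_acceptable_collateral: float) -> Optional[EngagementOption]:
    """Discard every option whose predicted collateral damage exceeds the
    threshold, then pick the most effective of the remainder (ties broken
    in favour of lower collateral). Returns None, meaning do not attack,
    when no option is acceptable."""
    acceptable = [o for o in options
                  if o.estimated_collateral <= max_acceptable_collateral]
    if not acceptable:
        return None  # no permissible option: withhold the attack
    return max(acceptable,
               key=lambda o: (o.estimated_effectiveness, -o.estimated_collateral))
```

Even this toy version presupposes the hard part: dependable numeric estimates of civilian harm and a defensible threshold for what counts as acceptable, which is precisely what Human Rights Watch doubts can be specified in advance.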
Professor Sharkey is of the opinion that though it might be possible for LAWS to be programmed to observe the principle of proportionality in a limited way, or to minimize collateral damage by selecting appropriate weapons and properly directing them, it would not be possible to guarantee respect for the principle of proportionality in the near future. Only a human being can make qualitative and subjective decisions on when damage to civilians would exceed the anticipated military advantage provided by an attack.22 Kastan is of the firm belief that whatever technological advances may come about, the relevant analysis and assessment of the principle of proportionality would have to be left to human beings.23
The US Air Force maintains that proportionality in attack is an inherently subjective determination that could be resolved on a case-by-case basis.24 In reality, it would be nearly impossible to pre-programme LAWS to handle the infinite number of scenarios they might face in the fog of war. It would therefore be extremely difficult to programme a machine to replicate the decision-making capabilities of a military commander or a combatant. Non-compliance with the principle of proportionality, in addition to failure to distinguish between civilians and combatants, could therefore lead to an unlawful loss of innocent lives.25 A breach of the rule of distinction or proportionality is regarded as a serious violation of the law of armed conflict. It is listed as a grave breach of the 1977 AP I and as a war crime under the 1998 Rome Statute of the International Criminal Court.26
It would be very challenging to program a machine to measure anticipated civilian harm and anticipated military advantage, weigh the one against the other by some determined standard of “excessive” and, if the harm is excessive, refrain from attacking an otherwise lawful target. From a programming point of view, this would require attaching values to various targets, objects, and categories of human beings, and making probabilistic assessments based on many complex contextual factors. It may also include inductive machine learning from human examples of judgments about proportionality, seeking...
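To make the difficulty concrete, the steps listed above can be written out as a naive proportionality check. Every quantity in this hypothetical sketch, including the harm and advantage scores and the excessiveness_ratio, is an invented placeholder; the point is that the law's qualitative standard of "excessive" must be collapsed into arbitrary numbers before a machine can apply it.

```python
def proportionality_check(anticipated_civilian_harm: float,
                          anticipated_military_advantage: float,
                          excessiveness_ratio: float = 1.0) -> bool:
    """Naive, purely illustrative proportionality test.

    Returns True if the attack may proceed, False if the anticipated
    civilian harm is deemed 'excessive' relative to the anticipated
    military advantage. The comparison hides the real problem: both
    inputs and the ratio itself are value judgments, not measurable
    quantities.
    """
    if anticipated_military_advantage <= 0:
        return False  # no concrete and direct military advantage: do not attack
    return (anticipated_civilian_harm / anticipated_military_advantage) <= excessiveness_ratio


# Example: harm scored 4 against advantage scored 5 is permitted under this
# toy rule; harm scored 8 against the same advantage is withheld. The scores
# themselves are the contested, context-dependent judgments the chapter
# argues machines cannot make.
print(proportionality_check(4.0, 5.0))  # True
print(proportionality_check(8.0, 5.0))  # False
```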

Table of contents

  1. Cover
  2. Title Page
  3. Copyright
  4. Contents
  5. Preface
  6. List of Abbreviations
  7. I    Introduction
  8. II   Lethal Autonomous Weapon Systems: Definition
  9. III  Autonomous Weapons in Use
  10. IV   Lethal Autonomous Weapon Systems (LAWS) and International Law
  11. V    LAWS: Ethical and Moral Issues
  12. VI   LAWS: Legal Review
  13. VII  LAWS: International Concerns
  14. VIII Conclusion and the Way Ahead
  15. Appendix A: Relevant International Humanitarian Law Provisions
  16. Bibliography
  17. Index