The Applied Ethics of Emerging Military and Security Technologies
ISBN: 135189482X, 9781351894821, 1351894838, 9781351894838

The essays in this volume illustrate the difficult real-world ethical questions and issues arising from accelerating technological change.


English · 531 pages · 2016


Table of contents :
pt. I. Changing context and overview --
pt. II. Robots and autonomous systems --
pt. III. Unmanned aerial vehicles and the transition from military to civilian systems --
pt. IV. Cyberconflict and cybersecurity --
pt. V. Genomics and neuroscience engineering.


The Applied Ethics of Emerging Military and Security Technologies

The Library of Essays on the Ethics of Emerging Technologies
Series Editor: Wendell Wallach

Titles in the series:

The Applied Ethics of Emerging Military and Security Technologies
Braden R. Allenby

The Ethics of Biotechnology
Gaymon Bennett

The Ethical Challenges of Emerging Medical Technologies
Arthur L. Caplan and Brendan Parent

Emerging Technologies
Gary E. Marchant and Wendell Wallach

The Ethics of Nanotechnology, Geoengineering, and Clean Energy
Andrew Maynard and Jack Stilgoe

The Ethics of Information Technologies
Keith W. Miller and Mariarosaria Taddeo

The Ethics of Sports Technologies and Human Enhancement
Thomas H. Murray and Voo Teck Chuan

Machine Ethics and Robot Ethics
Wendell Wallach and Peter Mario Asaro

The Applied Ethics of Emerging Military and Security Technologies

Edited by

Braden R. Allenby Arizona State University, USA

Routledge
Taylor & Francis Group
LONDON AND NEW YORK

First published 2015 by Ashgate Publishing
Published 2016 by Routledge
2 Park Square, Milton Park, Abingdon, Oxon OX14 4RN
711 Third Avenue, New York, NY 10017, USA

Routledge is an imprint of the Taylor & Francis Group, an informa business

Copyright © Braden R. Allenby 2015. For copyright of individual articles please refer to the Acknowledgements.

All rights reserved. No part of this book may be reprinted or reproduced or utilised in any form or by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying and recording, or in any information storage or retrieval system, without permission in writing from the publishers.

Notice: Product or corporate names may be trademarks or registered trademarks, and are used only for identification and explanation without intent to infringe.

British Library Cataloguing in Publication Data
A catalogue record for this book is available from the British Library.

Library of Congress Control Number: 2014951750

ISBN 13: 978-1-4724-3003-8 (hbk)

Contents

Acknowledgements ix
Series Preface xi
Introduction xiii

PART I CHANGING CONTEXT AND OVERVIEW

1 Brad Allenby (2013), 'The Implications of Emerging Technologies for Just War Theory', Public Affairs Quarterly, 27, pp. 49-68. 3
2 P.W. Singer (2010), 'The Ethics of Killer Applications: Why is it so Hard to Talk about Morality When it Comes to New Military Technology?', Journal of Military Ethics, 9, pp. 299-312. 23
3 National Research Council and National Academy of Engineering (2014), 'Summary', in Emerging and Readily Available Technologies and National Security - A Framework for Addressing Ethical, Legal, and Societal Issues, Washington DC: The National Academies Press, pp. 1-13. 37
4 Qiao Liang and Wang Xiangsui (1999), Unrestricted Warfare, Beijing: People's Liberation Army Literature and Art Publishing House, translated by Central Intelligence Agency Foreign Broadcast Information Service, pp. 1-35, 204-27. 51
5 International Committee of the Red Cross (2011), 'International Humanitarian Law and the Challenges of Contemporary Armed Conflicts', Geneva: International Conference of the Red Cross Red Crescent, pp. 3-53. 111
6 Noetic Corporation (2013), 'Technology as Dialectic: Understanding Game Changing Technology', paper prepared for the Emerging Capabilities Division, Rapid Fielding, Office of the Secretary of Defense, pp. 1-10. 163
7 Alan Backstrom and Ian Henderson (2012), 'New Capabilities in Warfare: An Overview of Contemporary Technological Developments and the Associated Legal and Engineering Issues in Article 36 Weapons Reviews', International Review of the Red Cross, 94, pp. 483-514. 173

PART II ROBOTS AND AUTONOMOUS SYSTEMS

8 Gary E. Marchant, Braden Allenby, Ronald Arkin, Edward T. Barrett, Jason Borenstein, Lyn M. Gaudet, Orde Kittrie, Patrick Lin, George R. Lucas, Richard O'Meara and Jared Silberman (2011), 'International Governance of Autonomous Military Robots', Columbia Science and Technology Law Review, 12, pp. 272-315. 207
9 Wendell Wallach (2013), 'Terminating the Terminator: What to do about Autonomous Weapons', Science Progress, pp. 251-54. 251


10 Peter Asaro (2012), 'On Banning Autonomous Weapon Systems: Human Rights, Automation, and the Dehumanization of Lethal Decision-Making', International Review of the Red Cross, 94, pp. 687-709. 255
11 Human Rights Watch (2012), 'Losing Humanity: The Case against Killer Robots', International Human Rights Clinic, pp. 1-5. 279
12 Ronald C. Arkin (2010), 'The Case for Ethical Autonomy in Unmanned Systems', Journal of Military Ethics, 9, pp. 332-41. 285

PART III UNMANNED AERIAL VEHICLES AND THE TRANSITION FROM MILITARY TO CIVILIAN SYSTEMS

13 Stuart Casey-Maslen (2012), 'Pandora's Box? Drone Strikes under jus ad bellum, jus in bello, and International Human Rights Law', International Review of the Red Cross, 94, pp. 597-625. 297
14 Bradley Jay Strawser (2010), 'Moral Predators: The Duty to Employ Uninhabited Aerial Vehicles', Journal of Military Ethics, 9, pp. 342-68. 327
15 US Department of Justice (2011), 'Lawfulness of a Lethal Operation Directed against a US Citizen who is a Senior Operational Leader of Al-Qa'ida or an Associated Force', pp. 1-16. 355
16 Daniel Rothenberg (2013), 'What the Drone Debate is Really About: It's not Privacy or State Power', Slate, pp. 1-3. 371
17 Brad Allenby (2013), 'The Golden Age of Privacy is Over: But Don't Blame Drones', Slate, pp. 1-3. 375

PART IV CYBERCONFLICT AND CYBERSECURITY

18 Patrick Lin, Fritz Allhoff and Neil Rowe (2012), 'Computing Ethics War 2.0: Cyberweapons and Ethics', Communications of the ACM, 55, pp. 24-26. 381
19 Herbert Lin (2012), 'Cyber Conflict and International Humanitarian Law', International Review of the Red Cross, 94, pp. 515-31. 385
20 George R. Lucas Jr (2013), 'Jus in Silico: Moral Restrictions on the Use of Cyberwarfare', in F. Allhoff, N. Evans and A. Henschke (eds), Routledge Handbook of Ethics and War, New York: Routledge, pp. 367-81. 403
21 Randall R. Dipert (2010), 'The Ethics of Cyberwarfare', Journal of Military Ethics, 9, pp. 384-410. 419
22 James Cook (2010), '"Cyberation" and Just War Doctrine: A Response to Randall Dipert', Journal of Military Ethics, 9, pp. 411-23. 447

PART V GENOMICS AND NEUROSCIENCE ENGINEERING

23 Maxwell J. Mehlman, Patrick Lin and Keith Abney (2013), 'Enhanced Warfighters: A Policy Framework', in Michael L. Gross and Don Carrick (eds), Military Medical Ethics for the 21st Century, Farnham: Ashgate, pp. 113-26. 463


24 Gary Marchant and Lyn Gulley (2010), 'National Security Neuroscience and the Reverse Dual-Use Dilemma', American Journal of Bioethics Neuroscience, 1, pp. 20-22. 477
25 Victoria Sutton (2005), 'A Multidisciplinary Approach to an Ethic of Biodefense and Bioterrorism', Journal of Law, Medicine and Ethics, pp. 310-22. 481

Name Index 495


Acknowledgements

Ashgate would like to thank the researchers and the contributing authors who provided copies, along with the following for their permission to reprint copyright material.

Brad Allenby for the essay: Brad Allenby (2013), 'The Implications of Emerging Technologies for Just War Theory', Public Affairs Quarterly, 27, pp. 49-68. Copyright © 2013 by the Board of Trustees of the University of Illinois.

Cambridge University Press for the essays: Alan Backstrom and Ian Henderson (2012), 'New Capabilities in Warfare: An Overview of Contemporary Technological Developments and the Associated Legal and Engineering Issues in Article 36 Weapons Reviews', International Review of the Red Cross, 94, pp. 483-514. Copyright © 2012 International Committee of the Red Cross, published by Cambridge University Press, reproduced with permission; Peter Asaro (2012), 'On Banning Autonomous Weapon Systems: Human Rights, Automation, and the Dehumanization of Lethal Decision-Making', International Review of the Red Cross, 94, pp. 687-709. Copyright © 2012 International Committee of the Red Cross, published by Cambridge University Press, reproduced with permission; Stuart Casey-Maslen (2012), 'Pandora's Box? Drone Strikes under jus ad bellum, jus in bello, and International Human Rights Law', International Review of the Red Cross, 94, pp. 597-625. Copyright © 2012 International Committee of the Red Cross, published by Cambridge University Press, reproduced with permission; Herbert Lin (2012), 'Cyber Conflict and International Humanitarian Law', International Review of the Red Cross, 94, pp. 515-31. Copyright © 2012 International Committee of the Red Cross, published by Cambridge University Press, reproduced with permission.

International Committee of the Red Cross for the essay: International Committee of the Red Cross (2011), 'International Humanitarian Law and the Challenges of Contemporary Armed Conflicts', Geneva: International Conference of the Red Cross Red Crescent, pp. 3-53. http://www.icrc.org/eng/assets/files/red-cross-crescent-movement/31st-international-conference/31-int-conference-ihl-challenges-report-11-5-1-2-en.pdf.

Patrick Lin, Fritz Allhoff and Neil Rowe for the essay: Patrick Lin, Fritz Allhoff and Neil Rowe (2012), 'Computing Ethics War 2.0: Cyberweapons and Ethics', Communications of the ACM, 55, pp. 24-26. Copyright © 2012 the Authors.

The National Academies Press for the essay: National Research Council and National Academy of Engineering (2014), 'Summary', in Emerging and Readily Available Technologies and National Security - A Framework for Addressing Ethical, Legal, and Societal Issues, Washington DC: The National Academies Press, pp. 1-13.


PARS International Corp. for the essays: Daniel Rothenberg (2013), 'What the Drone Debate is Really About: It's not Privacy or State Power', Slate, http://www.slate.com/articles/technology/future_tense/2013/05/drones_in_the_united_states_what_the_debate_is_really_about.html; Brad Allenby (2013), 'The Golden Age of Privacy is Over: But Don't Blame Drones', Slate, http://www.slate.com/articles/technology/future_tense/2013/04/domestic_drone_surveillance_the_golden_age_of_privacy_is_over.html.

Taylor & Francis for the essays: P.W. Singer (2010), 'The Ethics of Killer Applications: Why is it so Hard to Talk about Morality When it Comes to New Military Technology?', Journal of Military Ethics, 9, pp. 299-312. Copyright © 2010 Taylor & Francis; Ronald C. Arkin (2010), 'The Case for Ethical Autonomy in Unmanned Systems', Journal of Military Ethics, 9, pp. 332-41. Copyright © 2010 Taylor & Francis; Bradley Jay Strawser (2010), 'Moral Predators: The Duty to Employ Uninhabited Aerial Vehicles', Journal of Military Ethics, 9, pp. 342-68. Copyright © 2010 Taylor & Francis; George R. Lucas Jr (2013), 'Jus in Silico: Moral Restrictions on the Use of Cyberwarfare', in F. Allhoff, N. Evans and A. Henschke (eds), Routledge Handbook of Ethics and War, New York: Routledge, pp. 367-81. Reproduced by permission of Taylor & Francis Books UK; Randall R. Dipert (2010), 'The Ethics of Cyberwarfare', Journal of Military Ethics, 9, pp. 384-410. Copyright © 2010 Taylor & Francis; James Cook (2010), '"Cyberation" and Just War Doctrine: A Response to Randall Dipert', Journal of Military Ethics, 9, pp. 411-23. Copyright © 2010 Taylor & Francis; Gary Marchant and Lyn Gulley (2010), 'National Security Neuroscience and the Reverse Dual-Use Dilemma', American Journal of Bioethics Neuroscience, 1, pp. 20-22, reprinted by permission of Taylor & Francis LLC (http://www.tandfonline.com).

U.S. Department of Defense for the essay: Noetic Corporation (2013), 'Technology as Dialectic: Understanding Game Changing Technology', paper prepared for the Emerging Capabilities Division, Rapid Fielding, Office of the Secretary of Defense, pp. 1-10.

Wendell Wallach (2013), 'Terminating the Terminator: What to do about Autonomous Weapons', Science Progress. This article was first published by Science Progress, http://scienceprogress.org/2013/01/terminating-the-terminator-what-to-do-about-autonomous-weapons/.

John Wiley & Sons for the essay: Victoria Sutton (2005), 'A Multidisciplinary Approach to an Ethic of Biodefense and Bioterrorism', Journal of Law, Medicine and Ethics, pp. 310-22.

Every effort has been made to trace all the copyright holders, but if any have been inadvertently overlooked the publishers will be pleased to make the necessary arrangement at the first opportunity.

Publisher's Note

The material in this volume has been reproduced using the facsimile method. This means we can retain the original pagination to facilitate easy and correct citation of the original essays. It also explains the variety of typefaces, page layouts and numbering.

Series Preface

Scientific discovery and technological innovation are producing, and will continue to generate, a truly broad array of tools and techniques, each of which offers benefits while posing societal and ethical challenges. These emerging technologies include (but are not limited to) information technology, genomics, biotechnology, synthetic biology, nanotechnology, personalized medicine, stem cell and regenerative medicine, neuroscience, robotics and geoengineering. Societal and ethical issues, which arise within those fields, go beyond safety and traditional risks such as public health threats and environmental damage, to encompass privacy, fairness, security and the acceptability of various forms of human enhancement.

The Library of Essays on the Ethics of Emerging Technologies demonstrates the breadth of the challenges and the difficult tradeoffs entailed in reaping the benefits of technological innovation while minimizing possible harms. Editors selected for each of the eight volumes are leaders within their respective fields. They were charged to provide a roadmap of core concerns with the help of an introductory essay and the careful selection of essays that have or will play an important role in ongoing debates. Many of these essays can be thought of as 'golden oldies', important works of scholarship that are cited time and again. Other essays selected address cutting-edge issues posed by synthetic organisms, cognitive enhancements, robotic weaponry and additional technologies under development.

In recent years information technologies have transformed society. In the coming decades advances in genomics and nanotechnologies may have an even greater impact. The pathways for technological progress are uncertain as new discoveries and convergences between areas of research afford novel, and often unanticipated, opportunities.
However, the determination of which technological possibilities are realized or probable, and which are merely plausible or highly speculative, is a central question that cuts across many fields. This in turn informs which of the ethical issues being raised warrant immediate attention. Calls for precautionary measures to stave off harms from speculative possibilities can unnecessarily interfere with innovation. On the other hand, if futuristic challenges, such as smarter-than-human robots, are indeed possible, then it behooves us to invest now in means to ensure artificial intelligence can be controlled and is provably beneficial.

Most of the ethical concerns discussed in the volumes are less dramatic, but just as intriguing. What criteria must be met before newly created organisms can be released outside a laboratory? Should fears about the possible toxicity of a few unidentified nanomaterials, among thousands, significantly slow the pace of development in a field that promises great rewards? Does medical research that mines large databases (big data), including the genomes of millions of people, have a downside? Are geoengineering technologies for managing climate change warranted, or more dangerous than the problem they purport to solve?

The ethical languages enlisted to evaluate an innovative technology go beyond the utilitarian analysis of costs and benefits. For example, the principles of biomedical ethics and the laws of armed conflict play a central role in judgements made about whether the


healthcare industry or the military should adopt a proposed device or procedure. The differing ethical languages underscore different considerations, each of which will need to be factored into a final decision regarding whether to embrace, regulate, or reject the new technology.

Scientific discovery and technological innovation proceed at an accelerating pace, while attention to the ethical, societal and governance concerns they raise lags far behind. Scholars have been diligent in trying to bring those concerns to the fore. But as the essays in these volumes will make clear, there is a great deal of work ahead if we, humanity as a whole, are to successfully navigate the promise and perils of emerging technologies.

WENDELL WALLACH
Series Editor
Yale University, USA

Introduction

The link between technology and military and security domains has existed throughout history, but is of particular importance today. The accelerating evolution of technology across its entire frontier, frequently driven by military or security applications or research, is combining with an increasing ability of states and violent non-state actors to turn technology to different uses: terrorists use airplanes as bombs, cyber teams only loosely associated with states attack perceived enemies and sometimes augment more traditional combat initiatives, and all elements of society can learn to use drones as platforms for surveillance, stalking and less savoury activities. Moreover, technologies developed for military or security applications can have very different effects when they diffuse throughout civil society: hummingbird drones are of obvious use in combat, especially in counterinsurgency environments where collateral damage must be minimal, but they have far different implications when adopted by divorce lawyers, political parties and partisan news organizations. A social networking function that provides a wide network of friends and information means something very different when adopted by citizens trying to overthrow a despotic government or for disinformation campaigns intent on sowing confusion and misdirection to hide deliberate armed destabilization of a neighbouring country.

The importance of emerging military and security technologies is amplified by several basic characteristics of technology systems. First, technologies are not just artefacts: they are social, cultural and economic phenomena. A powerful technology is not just deployed for security or military purposes: it is integrated into society, thereby changing it.
Technologies that may be developed for one purpose in military environments - say, to provide a 'smart' prosthesis to a wounded soldier - may be, and often are, turned to other purposes in civil society (for example to augment existing human capabilities). Moreover, technologies of all kinds are being 'democratized'; in particular, that means that technologies of violence and mass destruction are no longer controlled by the state but are available to non-state actors, some of whom are highly ideological and violent. In the past, the most advanced military and security technologies tended to rest with the most powerful states - recently, the Soviet Union, the United States and their principal allies - but now, especially with the rise of asymmetric warfare and the increased difficulty of keeping technical secrets in the era of internet espionage, that is no longer the case. In short, understanding and ethically managing emerging military and security technologies has always been important; it is now critical.

Providing a coherent introduction to the literature on the applied ethics of emerging military and security technologies in a single volume presents a number of challenges. The potential domain is vast: because war has been such a constant in human history, people have been thinking about the ethics of it for millennia. Virtually every major religious text, from the Bible to the Koran to the Tao Te Ching, has touched upon the morality of war from many perspectives. Strategic and historical treatments of war, from Sun Tzu's The Art of War (c. 512 BC) and Thucydides' History of the Peloponnesian War (c. 400 BC) to Alfred Thayer Mahan's 1890 The Influence of Sea Power Upon History to Vasily Sokolovsky's 1962


Military Strategy, with its heavy emphasis on pre-emptive and massive nuclear strikes on both military and civilian targets, necessarily engage the complex integration of technology, ethics, values, cultural norms and behaviour during military activities.1 Indeed, the Athenian treatment of the neutral island of Melos in 416 BC, where the Athenians, dependent upon and dominant in naval technology, gave the Melians the choice of surrender and tribute or death, as described in Thucydides, is still used as a touchstone for exploration of deep ethical issues.2 That the Athenians were in that position in the first place is in large part due to their mastery of the trireme ship technology, including the tactics of using massed ships effectively, and the wealth that their control of the oceans provided (not unlike the case of the British Empire and Navy, many centuries later). Moreover, the complexity of conflict in the modern age is also daunting, a point effectively made in the selected essays from Brad Allenby (Chapter 1), the International Committee of the Red Cross (ICRC) (Chapter 5), the National Research Council (NRC) (Chapter 3), and Qiao Liang and Wang Xiangsui (Chapter 4) in Part I of this volume. Leaving aside technical issues of when a conflict was legally a 'war' under either domestic or international law, it was easier in past eras to know when one was being attacked. Conflict was a physical reality, albeit often cloaked in cultural or social patterns and rituals which were and are frequently quite complex. Now, in an age of cyberconflict, it is not so clear: is putting a back door or logic bomb in a software system the equivalent of a kinetic attack? And even if it is, given the anonymity of the internet, who does one react against? Moreover, since 1648 it has generally been the case that nation-states have been the relevant actors under international law and policy;3 the United Nations and many other organizations reflect that assumption. 
But today, conflict zones often include not just national militaries, governed by prevailing international law, but also non-state actors that may be just as powerful as nation-states, private entities that may be more or less under the contractual control of nation-states, independent private actors and espionage organizations that may be operating under very different rules and laws than anyone else. Especially in an age where international terrorism networks based in no particular state, and strategic doctrines of 'unrestricted warfare' in which all elements of cultural competition are engaged (see Liang and Xiangsui, Chapter 4 in this volume), are becoming more widespread, it is not clear that the once-clear lines between war and peace, and civilian and

1 A good compilation of writings on military strategy from a number of cultures and historical epochs may be found in Chaliand (1994).
2 Melos argued that as a weak, open, peaceful society, justice demanded that they be spared. The Athenians took a 'might is right' position: 'Of the gods we believe, and of men we know, that by a necessary law of their nature they rule wherever they can.' In the event, Melos refused, and Athens destroyed their city, killing all men and enslaving all women and children. The context is complicated: there is, for example, some evidence that Melos was helping Sparta, Athens' enemy. Nonetheless, the Melian dialogue, and the brutality on the part of a society that had heretofore been the model of Greek civilization, continues to be a fertile source of analysis and discussion.
3 The Peace of Westphalia and implementing treaties, which ended the Thirty Years War in Europe, is generally accepted as establishing the modern, Western, international system of states. Whether the Westphalian system, derived from European experience, is still valid is increasingly being questioned, especially in light of global non-state actor terrorist networks, rising multiculturalism and new doctrines such as the 'Responsibility to Protect', which asserts a right on the part of states to intervene in a state's internal affairs if a subgroup is being oppressed or attacked (a significant challenge to state sovereignty).


military activities and personnel, are still applicable. More precisely, it is not clear where such assumptions still prevail and where they may be questionable, or even fail.

In Chapter 4 Liang and Xiangsui emphasize another point as well: military and security technologies do not exist in a vacuum; they exist in a tactical and strategic environment that they may well cause to evolve in new directions. Asymmetric warfare and unrestricted warfare are attractive strategic options against a dominant conventional power such as the US in part because they are enabled by advances in cyber technology.

Similarly, it is also clear that technological evolution is in a period of accelerating change. This dynamic is particularly challenging because it holds not just within technology categories - especially the so-called 'Five Horsemen' of nanotechnology, biotechnology, information and communications technology, robotics and applied cognitive science - but, because of the foundational role such technologies play for all engineered systems, across virtually the entire frontier of technology. This is particularly challenging with regard to the military and security domains for several reasons. Most obviously, new technologies often provide critical military advantage, which is why technological evolution and military and security activities and organizations have been tightly coupled throughout history. But technology is also one of the fundamental links between the military and civil elements of society, as noted by the NRC in Chapter 3 in this volume.

Thus, for example, chariots, because they required supporting personnel and money to operate, not to mention the maintenance of horses, with all the logistics that implies, privileged aristocratic warriors. Later, stirrups privileged mounted horsemen by making horses a more stable platform from which to fight and shoot, making the Steppe warrior with his composite bow feared from Europe to China for centuries.
The evolution of corned gunpowder had similar far-reaching effects.4 The development of handguns, continuing the impact of crossbow weapons, shifted military and cultural power away from elite mounted knights to infantry. Moreover, because a gunpowder army was both far more expensive than earlier feudal armies and required a significantly more substantial logistics system, gunpowder technology created economies of scale in military activities: feudal manors could field knights, but not effective gunpowder armies. Accordingly, gunpowder technologies were a significant factor privileging large-scale military activities and thus the evolution of nation-states. Indeed, that is why historians have coined the term 'Gunpowder Empire' to refer to empires such as the Turkish Ottoman Empire, the Safavid Empire of Persia, the Mughal Empire in India, and Spain and the Spanish New World, which arose in part as the economies of scale and potency of gunpowder technology played out over time. Additionally, the integration of advances in ship design, cannon construction and gunpowder technology created the integrated technology platform that enabled European

4 Gunpowder, a combination of sulphur, charcoal and potassium nitrate, was known to the Chinese by the ninth century, but the powder tended to separate out over time, thus reducing its effectiveness, and early mixtures, while they burned in impressive ways, had difficulty providing the rapid energy release necessary for weaponization. 'Corned gunpowder' was developed in Europe in the late fourteenth century: it entailed mixing the materials when they were wet and letting them dry in small pellets, which among other things resulted in far longer storage times with significantly less moisture absorption than powdered formulations and a substantial increase in explosive power per unit of weight (for technical reasons, corned gunpowder was up to three times more powerful).


navies to outcompete all others, leading eventually to a world where the British Navy in particular dominated the Seven Seas.5

Additionally, questions of military advantage and national security take on a different meaning when faced with informally networked terrorism combined with the democratization of potential weapons of mass destruction such as biological agents and chemical and cyberweapons (see, for example, Chapter 15, the US Department of Justice memorandum on the lawfulness of lethal operations against US citizens, Chapter 13, Stuart Casey-Maslen's essay on drone strikes under international law, and Chapters 18 and 19 by Patrick Lin et al. and Herbert Lin, respectively, regarding cyberweapons). Security in a world dominated by nation-states often involved identifiable actors and surrogates; security in today's world, where religious and ideological frameworks support the growth of non-state and sometimes informal actors, is a much more complex question. Technologies can support greater security but usually at a cost to other values, such as privacy, freedom of expression and freedom of religion, a point that has come up in particular with regard to drone technologies and their impact on privacy, as the essays by Daniel Rothenberg (Chapter 16) and Brad Allenby (Chapter 17) in Part III illustrate. Whether and how to deploy technologies then becomes a question of balancing these values, and different cultures will come to different conclusions about what is appropriate. The recent concern with the scope of America's National Security Agency activities is an example of such a dialogue playing out in real time: there is little question that the 'big data' and data mining techniques developed by the NSA have contributed significantly to enhanced security, but also little question that such practices have encroached on individual privacy expectations.

Finally, applied ethics is clearly coupled in complicated ways to legal structures and norms.
This is especially true in this case because where war and conflict are involved there is a large body of international practice and law that has been developed over millennia with input from many cultures. Most people who have served in traditional military organizations have heard of the Geneva Conventions, for example, and have probably had some training in the laws of armed conflict. Behind these formulations is a philosophical and ethical line of inquiry going back to antiquity that is frequently referred to as 'just war theory'. The historical dialogue surrounding such questions as when conflict can be rightfully initiated, how it should be conducted once it is initiated and how it should be terminated has engaged many thinkers, from many different cultures, over the ages and is an important part of, albeit not coextensive with, the applied ethics of conflict and war today. Underlying this legal discourse, and its application under various conditions, are supportive cultural norms, which in turn are underpinned by deeply buried assumptions concerning conflict, which are usually implicit and sometimes not even widely recognized. This is a critical framework to understand because it is possible if not probable that these assumptions have been undermined by technological

That technology must always be understood in its cultural, social and economic context is made clear by this example. The Chinese, under their legendary admiral Zheng He, fielded a fleet between 1405 and 1433 that explored India and Africa, and was far larger and more powerful than anything the Europeans could have put together. But, for reasons that historians still argue over but probably had at least some economic basis, the fleet was deliberately destroyed when it returned from its last voyage, and China turned inward, leaving the oceans open for European powers. European ascendancy was supported by their naval technology platforms, but that is not to make the (demonstrably superficial) claim of technological determinism that European technologies ensured the historical dominance of Western Europe in recent centuries.

The Applied Ethics of Emerging Military and Security Technologies

xvii

and geopolitical evolution - and if they indeed have been, then the more explicit legal and behavioural frameworks erected on them are fragile, and perhaps even unrealistic or dysfunctional (see Mattick et a/., 2012; see also Allenby, NRC and ICRC, Chapters 1, 3 and 5, respectively, in this volume). Given all these complications, and the rapid evolution of many of these domains, clarity is difficult to achieve. The first sections of this introduction will therefore begin at the beginning by presenting a basic discussion of the relevant terms 'applied ethics' and 'emerging military and security technologies'. The next section will discuss the laws of armed conflict because without a basic understanding of these institutions it is difficult if not impossible to navigate much of the relevant discussion, both within this volume and in the larger public policy dialogue. Finally, the concluding section will provide a few observations that may prove useful to the reader. Two initial points, however, deserve emphasis. First, most of the material included in this volume either implicitly or explicitly adopts certain normative positions and frameworks. This is clearly appropriate to their context but the reader should remember that, especially in the current geopolitical environment, no perspective can be regarded as universal. What may be enshrined in human rights and United Nations documents and treaties may represent a fairly common viewpoint, but it will likely not represent the perspective of a fundamentalist nonstate terrorist network.6 Second, even though this is a fairly hefty volume, it can only cover selected topics, and then only lightly. Behind each of these essays is a large literature and there are many important topics that cannot be covered in the space allotted. 
So, for example, even as one considers the numerous and difficult implications of any of the technologies mentioned in this volume, from cyberconflict to unmanned aerial vehicles,7 to increasingly powerful robotic systems, to designed warriors, one must also remember that the entire frontier of technology is evolving and that the definitional boundaries between civilian and military and security technologies are fuzzy at best and misleading in many cases, so that the real challenge is the integrated complexity of the technological domain.

What are Applied Ethics?

Large questions of ethics and morality have challenged theologians and philosophers for millennia. The relevant branch of modern philosophy is known as 'moral philosophy'; only more recently has philosophy recognized the domain of 'practical ethics' (see, for example, LaFollette, 2003).

6 Indeed, as many have noted, 'terrorist' is itself a loaded term often used pejoratively to impose a negative status on particular parties or movements whose normative, political and/or cultural positions differ from one's own.

7 Unmanned aerial vehicles may be remotely controlled, such as Predators and Ravens in combat areas, or autonomous under certain conditions (for example, once launched and on station). These devices are frequently called 'drones' in civil society, although military writers prefer other terms because a drone in military usage is a robotic device used for target practice. Unless confusion may result, in this introduction the term 'drone' will be used, reflecting common usage as appropriate in a non-military document.


But the complex ethical, social, cultural and practical issues raised by emerging military and security technology are beyond the remit of philosophy, or indeed of any single domain. That is why this volume is about 'applied ethics', a discourse that includes, indeed requires, participation in public debate and dialogue, and deep engagement in political, cultural and industrial systems. Discussions of applied ethics draw on many domains, especially engineering and law, and practitioners other than professional academics. The fundamental difference between applied ethics and practical ethics, moral ethics, normative ethics and other sub-disciplines of philosophy, therefore, is where the discourse is housed and where it is focused. In the case of the latter, the discourse is academic, and it is housed in the discipline of philosophy. In the case of the former, the discourse may be informed by academic study but it is clearly housed in the real world of the practitioner, the professional and the institutions of economic and political power.

This is reflected in the choice of essays for this volume: the essays by Rothenberg and Allenby on drones (Chapters 16 and 17) appeared in the online journal Slate and are thus part of an ongoing public dialogue around the applied ethics of drones; Chapter 9, by Wendell Wallach, on autonomous robots, appeared in a lay journal, Science Progress, as part of a public dialogue, while the Human Rights Watch essay (Chapter 11) was self-published as part of an activist campaign and the Noetic Corporation essay on identifying and managing game-changing military and security technologies (Chapter 6) was prepared as a report to the Office of the US Secretary of Defense. Others, such as Chapter 8 by Gary E. Marchant et al., on governance of autonomous military robots, and Victoria Sutton's essay on biodefense and bioterrorism (Chapter 25), appeared in law journals or legal publications.
These essays were chosen for inclusion in part because they were not academic but rather reflected the wide variety of inputs that an applied ethics dialogue about a particular technology will often entail. Moreover, the applied ethics discourse does not rely on any single discipline for either its framing or its intellectual content, and it emphasizes action in the world rather than a more removed and contemplative perspective. This is again apparent in the choice of materials for this volume, which includes not just essays from relevant academic journals such as the Journal of Military Ethics and the Columbia Science and Technology Law Review but also high-quality journals from practitioners, such as the International Review of the Red Cross, and indeed non-peer-reviewed non-governmental organization (NGO) reports and even public media (for example, Slate).

In philosophic ethics, the academician educated in classic philosophy is privileged; in applied ethics, no one is privileged. Good argumentation and reason are still critical but the thoughtful military officer is as likely to present useful perspectives on the applied ethics of military and security technologies as the academic philosopher - perhaps even more so, since the officer may well have relevant experience with the technologies, and equally importantly with the culture and institutions within which the technologies are deployed, that the philosopher lacks. In fact, it is important to note up front that, from an applied ethics rather than a philosophic perspective, academic critics of the military are becoming further and further separated from that which they critique - a point which concerns many observers. As the then US Secretary of Defense Robert M. Gates noted in a speech at the US Military Academy at West Point:

An all-volunteer military is to a large degree self-selecting. In this country, that propensity to serve is most pronounced in the South and the Mountain West, and in rural areas and small towns nationwide - a propensity that well exceeds these communities' portion of the population as a whole. Concurrently, the percentage of the force from the Northeast, the West Coast, and major cities continues to decline ... In addition, global basing changes in recent years have moved a significant percentage of the Army to posts in just five states: Texas, Washington, Georgia, Kentucky, and North Carolina. For otherwise rational environmental and budgetary reasons, many military facilities in the northeast and on the west coast have been shut down, leaving a void of relationships and understanding in their wake. (2011)

For various reasons, the same pattern of increasing separation between the military and civilian populations, especially the highly educated academic population, seems to be occurring, albeit in different ways, in many countries. This, along with the fact that most philosophers have little to no understanding of technological systems, greatly complicates the project of studying and understanding the applied ethics of emerging military and security technologies. Thus, for example, those who have used technology in military or security environments are more likely to have realistic appraisals of the strengths and weaknesses of the technology system in the real world, while those who are inexperienced seem to be more drawn to utopian or dystopian scenarios.

Emerging Military and Security Technologies

To begin with, it is important to understand technologies as systems: technologies are not just things; they are complicated patterns of social, institutional and earth systems change (Bijker et al., 1997). It is essentially a truism that throughout history any technology system of any power whatsoever, from the wheel to the stirrup, to railways to electrification to the internet, has destabilized the economic, social, political, cultural and institutional environment within which it develops (Rosenberg and Birdzell, 1986; Freeman and Louca, 2001). Moreover, while one must always be wary of the tendency to overemphasize the rate of change in the particular era one lives in, there are reasons to suggest that the implications of current emerging technologies differ in kind from those in the historical past (Allenby, 2011; see also NRC, 2005 and 2010, and Chapter 3 in this volume).

One of the obvious differences between current and past experience is that past waves of technological innovation have generally revolved around one core system such as textiles, electrification, railways or motor vehicles.
Today, however, technological evolution is occurring across the entire technological frontier, driven by, and reflecting, accelerating evolution in at least five core technologies: nanotechnology, biotechnology (including genetic engineering), information and communication technology (ICT), robotics and applied cognitive science. Each of these is not just powerful in itself but is an enabling technology that supports unpredictable innovation in many separate domains: nanotechnology enables more powerful ICT, which in turn supports biological printing capability, which will lead to the development of 3D-printed replacement organs.

It is also the case that the rate of technological change is accelerating at unprecedented rates (Kurzweil, 2005). Indeed, for most of human history technological evolution was so slow that its effects were mitigated by small increases in human population, leaving virtually the entire world at subsistence levels with no productivity increases, except for the select few (Clark, 2007). That changed with the Industrial Revolution, and the rate of technological change continues to accelerate.

A third difference arises from the fact that these technologies are now operating at scales from the nano to the planetary. In particular, two critical domains, the planet and the human entity, are increasingly becoming design spaces in themselves (some of the implications of this evolution are discussed by Maxwell J. Mehlman, Patrick Lin and Keith Abney in Chapter 23 and by Gary Marchant and Lyn Gully in Chapter 24). Thus, the global and increasingly interconnected patterns of human technological, economic and social systems mean that human activity now operates at planetary scale. The contemporary concern about global climate change is only the first naive response to the reality of a terraformed planet (indeed, many scientists are increasingly calling the modern age the 'Anthropocene', roughly meaning the Age of Humans). Responses such as ethanol-based biofuels have not just created local impacts, such as land use and water consumption concerns. They have also changed planting patterns and affected food prices around the world, with consequent political disruption, as well as further distorted hydrologic, nitrogen and phosphorus cycles. Other geoengineering proposals, such as releasing sulphur particles into the atmosphere to seed clouds to reflect incoming sunlight back into space, are problematic because even though they are intended to operate at planetary scale, their other impacts, such as major shifts in weather patterns and atmospheric physics, have yet to be determined. This raises the important observation that virtually any potent technology, once it is sufficiently understood, can be weaponized, at least by sophisticated actors.

A final complication is that the processes associated with technological change are nonlinear.
This is more of a change than most moderns realize: while technological change has always been characteristic of human populations, for virtually all of human history until the last several hundred years, it did not occur at such a rate as to lift the essentially subsistence level of human existence. That changed only with the Industrial Revolution, with what economic historians call 'The Great Divergence': a rapid and growing disparity in income per person and technological capability separating industrial from undeveloped countries (Rosenberg and Birdzell, 1986; Allenby, 2011). The economic, social, cultural and institutional aspects of the waves of technological change that lay behind The Great Divergence are profoundly nonlinear and, like accelerating change, are not captured by any single discipline, including industrial ecology. Such changes have powerful geopolitical and security implications; although they have not traditionally been considered part of the military remit, that may be changing with the concept of unrestricted warfare (Liang and Xiangsui, Chapter 4 in this volume).

At the same time, the evolution of these integrated technologies is making the human a design space in ways that it has never been before. While people have always enhanced themselves - think of coffee or ethanol - the direct interventions that are possible today are far more powerful. Many students, for example, now use widely available drugs that improve concentration and dramatically reduce the need for sleep; others use steroids that help build impressive bodies, sometimes at the cost of physiological and emotional damage. Vaccines routinely create seriously enhanced immune systems, thus extending life, and increasingly powerful prosthetics are created as machine and biology continue to be integrated at the tissue level.
Scenarios for future human design include development of computer-to-brain interfaces that someday may enable telepathic technologies (greatly benefitting small unit operations). Even today, militaries are leaders in 'augmented cognition' or 'augcog', where cognition does not occur at the individual level but increasingly emerges from integrated techno-human networks. One example of this is the US Defense Advanced Research Projects Agency (DARPA) XDATA programme, which funds research on integrated techno-human systems which can process the increasing information streams generated in modern combat far better than either human analysts or computers operating alone (DARPA, 2014).

Even more radically, some medical researchers claim that the first people who will have a lifespan of at least 150 years with high quality of life throughout that entire period have already been born in developed countries (de Grey and Rae, 2007). As with all claims of coming technology, such assertions are best viewed as scenarios (albeit perhaps probable scenarios) rather than predictions. Nonetheless, many of the technologies that would support radical life extension are being developed at least in part by military organizations; the United States, for example, supports a wide variety of research intended to create heavily enhanced 'super soldiers' (DARPA, 2013). Not only does this illustrate the important point that technologies developed for specific military applications may have wildly unpredictable effects but it also illustrates how weak the institutional and disciplinary tools are that might allow their reasoned evaluation - think for just a moment about what radical life extension might imply for population control, for environmental systems, for pensions and work patterns or for intergenerational relationships. Indeed, it is fair to argue that right now there are no good ways to perceive, much less understand or manage, the serious challenge of what it means for the human to be a design space.
Part of the difficulty of defining military and security technologies, therefore, arises from the characteristics of contemporary technological evolution, one reason that the overview essays in Part I, especially those by P.W. Singer (Chapter 2), Brad Allenby (Chapter 1), Noetic (Chapter 6), the NRC (Chapter 3) and Alan Backstrom and Ian Henderson (Chapter 7), were chosen for inclusion in this volume. What constituted a military or security technology used to be fairly clear: it was something that projected, or protected against, organized physical attack. But any powerful technology can be weaponized: big data powers the security work of the US National Security Agency, airplanes are flown into buildings, cyber networks enable logic bombs or attacks on civilian internet infrastructure and even climate change is a geopolitical weapon. If the US were to unilaterally deploy technology to pull carbon dioxide directly from the atmosphere (and such technologies exist today, although they are expensive), it would likely increase cooling at the poles, affecting the ability of countries such as Russia and Canada to develop their northern reaches. This is a particularly important point when 'unrestricted warfare' - that is, warfare that extends across all civilian and military systems in a clash of cultures - is an explicit strategy of major powers such as China (see Liang and Xiangsui, Chapter 4 in this volume).

What is clear is that, by any measure, the world invests a huge amount of money in military, defence and security sectors. In 2012, total direct global military expenditures were approximately US$1,756 billion, which amounted to 2.2 per cent of global GDP. The United States spent US$685 billion, or 4.6 per cent of domestic GDP; China spent US$166 billion, or 2.6 per cent of GDP; Russia spent US$91 billion, or 3.9 per cent of GDP; and the United Kingdom spent US$61 billion, or 2.5 per cent of GDP. Countries in less settled regions spent even more relative to GDP: in 2012 Israel dedicated 7.4 per cent of its GDP to defence expenditures, while the figure for Saudi Arabia was 9.1 per cent (CIA, 2013; SIPRI, 2013). Even though these figures are approximate, understated in that they reflect direct military expenditures and not the huge industrial establishments that often lie behind the military sectors, and frequently hard to verify because of their sensitive nature, they clearly indicate the primary role that military and security technologies, industries and institutions play in the global economy and global society.

Moreover, all indications are that military and security concerns will increase, not decrease, over the coming decades, regardless of whether immediate conflicts, such as those in Syria, the Congo, Ukraine and elsewhere, wind down (Kagan, 2012; Kupchan, 2012). The shifts in relative power associated with the rise of China in contrast to the United States, the evolution of a profoundly multicultural world from the remnants of the Cold War and its bipolar power structure, and supra-state terrorism augmented by a democratization of weaponized technology ensure continued conflict (Huntington, 1996; Boot, 2013). As Trotsky is alleged to have observed, 'You may not be interested in war. But war is interested in you.'8

Thus, military and security technologies deserve consideration in part simply because of the size, economic importance and cultural and social significance of the sector, especially when its technologies are deployed in conflict situations. Second, however, is the always morphing, but always strategic, relationship between technological evolution and military and security activity. Warfare and conflict often pose existential challenges to societies, and technological innovation and diffusion is an important differentiator in such conflicts. But the effects of technological advances are almost never simple: technologies such as stirrups, the composite bow, corned gunpowder, the machine gun and nuclear weapons have not simply changed the military balance of power but often significantly altered social systems (Boot, 2007).
Gunpowder weapons, for example, played a part in the democratization of conflict in Europe and the shift from a mounted feudal system of personal combat to larger professional armies, and eventually to the universal conscription citizen military that Napoleon deployed. Such technological developments had profound institutional and geopolitical effects. A mounted elite such as the European knighthood favoured individual power and the feudal system; the gunpowder revolution not only de-privileged the mounted elite (who could simply be shot out of their saddles, a shift in power that had been signalled by the earlier adoption of the crossbow and longbow) but, because of the massive increase in logistics that a gunpowder army implied, led to significant governance economies of scale. A manorial system could not fund cannon armies but a royal government could. Geopolitically, technologies such as the machine gun enabled European imperial powers to dominate far larger but less well-equipped and trained indigenous military forces, as suggested by the famous doggerel by the British writer and poet Hilaire Belloc (Keegan, 1993; Boot, 2007):

Whatever happens, we have got
The Maxim gun, and they have not.

But the picture is not simple. Even though the advantages offered by technological advances, such as corned gunpowder, made for rapid diffusion of new technologies (and newer technologies in response), the inherent conservatism of military organizations and the high costs of experimentation in conflict environments should anything go wrong have traditionally served to limit technological evolution. This is especially true where technologies are dual use and concern arises not from the military but from the social domain.

8 The quote is apparently a very loose translation of an early Trotsky observation perhaps better translated as, 'You may not be interested in the dialectic, but the dialectic is interested in you.' But in any event, it is so apropos that he should have said it even if he didn't: http://en.wikiquote.org/wiki/Leon_Trotsky.

Railways, a very potent technology usually not thought of in military terms, provide an interesting example. Most Americans, for example, know that the strong railway infrastructure of the North in the Civil War contributed substantially to victory in that conflict, not just because of the obvious advantage rapid rail transport provided in terms of troop movements and logistics, but more generally because the growing rail network, although nowhere near as sophisticated as it was by the turn of the century, constituted a critical infrastructure supporting the North's greater productivity and industrial efficiency (Wolmar, 2012). But it is important to remember that, as with all complex technologies, it was not just the built structures but the technology as embedded in its larger social, cultural and economic context that was critical: the North had a track density (track laid per unit land area) three times that of the South. As one indication of the importance of culture in technology patterns, the same was true of the other great transportation system of the day, canals, demonstrating not only the systemic advantages of the North but, more generally, the strategic and tactical advantages of an industrial versus an agrarian society in a period of rapid technological change (Keegan, 1993; Acemoglu and Robinson, 2012).
Importantly, the most significant barriers to Southern US rail development were cultural and political: states' rights philosophies meant that Southern railways were generally not permitted to integrate with each other across state lines, and a fear that rail travel could undermine the agrarian slavery economy of the South led to serious distrust of the technology (a pattern repeated in the Austrian Empire, pre-revolutionary Russia and even pre-First World War France, all worried about the social stability of their large, and somewhat archaic, agrarian systems) (Parker, 2005; Wolmar, 2012).

As in this case, technologies often thought of as purely civilian - or for that matter purely military - are in fact dual use, a point made by the NRC in Chapter 3 of this volume. This was true of railways; it is also true of modern road networks, airplanes, robotics and cognitive pharmaceuticals. It is especially true of the internet and cyber capabilities generally; when countries adopt unrestricted warfare strategies they are not contemplating conventional military action against overwhelming US superiority, but rather the use of integrated infrastructure technology, and especially cyber, in asymmetric warfare that allows them to challenge and wear down otherwise dominant opponents through strategic use of dual-use technologies (Liang and Xiangsui, 1999, partially reprinted as Chapter 4 in this volume). Indeed, it has been suggested that a 'death of a thousand cuts' strategy is the optimal way to attack the US, especially given the unclear status of cyberconflict options under the prevailing laws of war (Mattick et al., 2012). With such strategies, the line between military, security and other domains becomes at best fuzzy and, depending on circumstances, essentially irrelevant.

As this example also suggests, the applied ethicist must be careful of unconsciously importing a bias towards existing power structures and concomitant institutional practices.
No doubt the British and their Hessian mercenaries felt that the American colonists' guerrilla tactics were unethical and improper, and the Japanese samurai felt that gunpowder weapons were unethical, just as many Americans would consider a 'death of a thousand cuts' strategy relying on internet degradation of existing civilian systems to be unethical. New technologies, new techniques and the need to fight effectively against the status quo powers usually mean changes in patterns of conflict, and changed patterns may change what is ethically acceptable. Indeed, in post-Second World War anti-colonial conflicts, counterinsurgency came to be seen as more ethical by some.

In short, there are no good definitions of 'military and security technologies' that would not potentially be seriously misleading. The technologies selected in this volume - robots and autonomous systems, unmanned aerial vehicles, cyber and information technologies, and biotechnology and bioengineering, in Parts II to V respectively - should be regarded as case studies, selected to introduce a wide variety of the applied ethics issues that arise in this complex domain, but not as definitive of the class of military and security technologies as a whole. Indeed, an important theme in Part I, the context and overview section, is precisely that the question of what constitutes a technology with important military and security implications cannot be answered definitively in today's world. Nor is this simply an academic point; militaries around the world are struggling with how to avoid strategic surprise, and defend their societies, when current technological challenges are not easily identifiable, and future surprises are so likely - and yet so hard to perceive and defend against a priori (NRC, 2005; NRC, 2010).

Applied Ethics and the Laws of War

The challenge of emerging technologies generally and recent significant initiatives in technology development, such as the Human Genome Project and the US National Nanotechnology Initiative, have led to the development of an ELSI (ethical, legal and societal issues) framework.
While this is still a somewhat inchoate and nascent approach, it does reinforce an important point about applied ethics in general, and this domain in particular: it is impossible to understand and assess the applied ethical implications of military and security technologies without at least a brief introduction to the applicable frameworks, norms and existing legal and operational structures, formal and informal, which have developed over many centuries around the complex reality of human conflict. The NRC essay (Chapter 3), in fact, was selected in part because it explicitly links ELSI analyses with emerging military and security technologies. More broadly, while applied ethics are not the same as law, it is also the case that legal structures, especially ones that many nations support in full or in large part, are an important consideration when evaluating the ethical implications of military and security technologies, and a reasonable guide to the underlying, and often somewhat inchoate, norms.

To begin with, there are three philosophic frameworks within which conflict and war are generally perceived: realism, pacifism and just war theory. Realism and pacifism are opposites of each other in many ways. To a pure realist, a state has no ethical constraints whatsoever: people, not states, are governed by ethics, and whatever a state does to advance its interests in times of war, when its existence may be at stake, is appropriate. The only relevant question about conflicts and wars, therefore, is operational: are the interests of the state being advanced by the action taken? To a pure pacifist, on the other hand, no conflict or war can be ethical, so to discuss laws of war is to already be in unethical territory. A pure pacifist might even oppose any laws of war on the grounds that they are an attempt to make a per se unethical activity more ethical, which is not logically possible.
Moreover, by making war less horrible, the laws of war may make it more likely or more acceptable, therefore encouraging conflict. Very few people, and virtually no states, are either pure pacifists or pure realists. Rather, most tend towards a third perspective, just war theory, which begins from the premise that

The Applied Ethics of Emerging Military and Security Technologies

xxv

history demonstrates that war and conflict are part of the human condition and that when they cannot be prevented they should be conducted as humanely as possible. In particular, care must be taken to avoid unnecessary harm to those civilians unfortunate enough to be caught up in them. For the last several hundred years, since the 1648 Peace of Westphalia and Hobbes' 1651 Leviathan, and especially with the founding of the United Nations, just war theory has been tied to the state sovereignty model of international governance: membership of the UN and treaty and statutory obligations flow to states, not individuals, firms or NGOs. This state-based model has come under increasing pressure from a number of developments. Among these are cases where states assert the right to intervene in the internal affairs of another state where internal mistreatment of minorities is occurring, leading to a new theory postulating a 'duty to intervene' (often called R2P or 'Responsibility to Protect'), a clear violation of the state sovereignty doctrine. Additionally, the evolution of transnational, often informal, networks of non-state actors, such as the Islamic jihadists, challenges the entire structure: even if such groups wanted to sign relevant treaties, they cannot, because they are not a state and they are not controlled by any state. While there have always been those who acted through terrorism - after all, that was the spark behind the First World War - such terrorists were either manageable as criminals or strongly, if informally, associated with a particular state (the Serbian Black Hand terrorist organization that set off the First World War was founded by Serbian military personnel). A modern, global, informal, non-state terrorist network capable in theory of obtaining and deploying weapons of mass destruction is a relatively new phenomenon and a significant challenge to the state-based just war legal regime.
Extensive privatization of military activities, such as the US implemented in Afghanistan and Iraq, presents another challenge: private armies are not bound by international treaty (although they may be bound by their home country laws implementing such treaties or through contract clauses) (Singer, 2008). Moreover, in today's confused conflict zones it is not unusual to find military, private armies and espionage entities engaging simultaneously; military personnel may be bound by the laws of war but espionage and surveillance agencies are not. Finally, some technologies, such as cyberwar conducted within a strategy of unrestricted warfare, raise a number of potential issues for the state model and more generally for just war theory (see George R. Lucas (Chapter 20) and Lin (Chapter 19) in this volume). These complexities do not mean that just war theory is obsolete as a body of law or as a valid source of guidance for applied ethical discussions of military and security technology. But they do caution that this is a period of rapid evolution, both of the technologies and of the laws and ethics governing them (Mattick et al., 2012). Thus, it should not surprise the alert reader to see many of these conflicts implicit in most of the essays in this volume, and to detect a tension between existing legal and ethical structures and the changing realities on the ground (the US Department of Justice White Paper discussing when a lethal attack can be carried out on a US citizen who is also a terrorist (Chapter 15) is a particularly good example, as is the dialogue between Randall R. Dipert and James Cook regarding cyberwarfare in Chapters 21 and 22, and the contrast between Peter Asaro and Bradley Jay Strawser, Chapters 10 and 14). Just war theory itself can be broken down into three components, all of which combine norms, written laws and treaties, and customary international law.9 Jus ad bellum addresses

9. Customary international law is a body of law that derives from custom rather than explicit treaty or statutory obligations. While it may be informal, it is often cited and relied on in national and international legal decisions and documents, although there is, not surprisingly, no universal agreement on specifics.


the question of when a war can be ethically begun. Jus in bello addresses how wars may be ethically fought once they are begun. Jus post bellum is a much newer component of just war theory and deals with questions arising from the termination of war and subsequent peace agreements and associated activities (for example payment of reparations). While these categories have traditionally been framed separately, they are obviously coupled: an initially just war (say, responding to an attack in self-defence) may become an unjust war if fought using unjust means. Similarly, even if a war is begun unjustly, the parties are not thereby relieved from having to prosecute the war itself in a just and legal fashion. Traditional jus ad bellum held that in order for a war to be just it had to satisfy three conditions. First, it required a compelling cause or justification, such as self-defence in the face of aggression. Second, there must be a public declaration issued by a legitimate authority, such as a state. Third, the use of deadly force must be undertaken only as a last resort. While a pre-emptive attack is possible, there must be a powerful, immediate and identifiable threat if such an attack is to be justified. With the founding of the UN, jus ad bellum became a function of the UN Charter, adopted in 1945. The UN Charter provides that the use of force by states against each other (but not against internal minorities) is prohibited unless it is necessary to restore international peace and security in the face of aggression or a breach of the peace, or a matter of self-defence (see generally Articles 2(4), 39, 42 and 51). The first of the major treaties implementing jus in bello, the initial Geneva Convention treaty, was signed in 1864. This began the development of what is often referred to as international humanitarian law, or IHL (IHL is often referred to as the 'laws of armed conflict' as well). IHL has subsequently developed two major branches.
The first deals with the need to separate combatants from non-combatants, and the protection of the latter to the extent possible. The second regulates the practices, technologies and methods of armed conflict. Perhaps the most important basic requirement of the laws of war taken as a whole is military necessity. This principle requires that only those engagements and actions that are conducted in the pursuit of legitimate military objectives are ethical. This is obviously only the beginning of analysis, because it is easy to identify many actions which might be militarily useful but would still be unethical: many attacks on civilians, for example, might be argued by their proponents to be in pursuit of military objectives but they would still be regarded under the prevailing interpretations of IHL as unethical (as well as unlawful). But it is an important touchstone in that it requires at least some effort to identify a military purpose for a particular action. While there are many principles, treaties, practices and norms that have arisen over the centuries which, taken together, constitute a complex and robust domain, there are several additional general requirements that the reader should be familiar with, as they arise frequently in the literature, as well as in readings in this book. The core requirement of the principle of discrimination, sometimes referred to as distinction, is that combatants be distinguished from non-combatants and that only valid military targets should be engaged. Obviously, applying this principle can be difficult in counterinsurgency operations and even more difficult in terrorism environments where someone who is a civilian almost all of the time dons an explosive vest and becomes a terrorist (whether terrorism is or


should be subject to IHL in the first place or whether it represents something other than war is another complicated question). This principle often arises in discussions of autonomous robotic systems: some argue that robots will never be able to discriminate between combatants and non-combatants (see, for example, Human Rights Watch, Chapter 11 in this volume), while others point out that humans are not terribly good at discrimination in many cases, such as terrorism, and that the assumption that robots can never develop such capabilities, or at least become better than humans, is only a hypothesis about a technology that is still evolving rapidly and unpredictably (see Ronald C. Arkin, Chapter 12 in this volume) (for a general review of the legal implications of autonomous military robots, see Marchant et al., Chapter 8 in this volume). The proportionality principle requires that the force exerted to achieve a legitimate military objective must not be excessive. A soldier cannot, for example, shoot and kill an individual who is yelling insults absent other factors because the lethal response is far in excess of the amount required by the situation. Similarly, excessive collateral damage to civilian property or excessive civilian injuries or deaths are not permitted under this principle. It is important to understand that under just war theory and its current incarnation in the laws of armed conflict, collateral damage - harm to civilians and civilian infrastructure and materials - is not per se unlawful. Under the Doctrine of Double Effect, collateral damage is not unlawful if the causative combat action is otherwise permissible and the military and not the collateral effect is the one intended (civilians may not be deliberately targeted). In addition, the collateral effect cannot be the means to achieve the combat effect and, in a variant of the proportionality principle, the potential benefits of the combat activity must outweigh the collateral impacts.
Thus, for example, it is not unlawful if a drone attack on a terrorist group kills a civilian, assuming that the other tests of lawfulness are met. Whether it is politically or strategically wise is a different question, especially as many opponents of Western military operations have learned to use 'lawfare' against such operations: Lawfare denotes 'the use of the law as a weapon of war' or, more specifically, the abuse of Western laws and judicial systems to achieve strategic military or political ends. It consists of the negative manipulation of international and national human rights laws to accomplish purposes other than, or contrary to, those for which they were originally enacted. (Lawfare Project, 2014)

Obviously, the use of IHL against militaries that are charged with following its requirements is generally done by organizations that do not follow or support IHL principles, since the long-term effect of such strategies may well be to weaken IHL compliance. Many treaties and agreements act to prohibit 'weapons of a nature to cause superfluous injury or unnecessary suffering' (ICRC, 2014). Thus, for example, treaties and/or customary international law prohibit use of such weapons as lances or spears with barbed heads and serrated bayonets; expanding or explosive bullets; weapons with poison on or in them, and poison gas; biological weapons; weapons that produce fragments that cannot be detected by X-ray; and blinding lasers (ICRC, 2014). This principle obviously overlaps with the previous one: a weapon that causes unnecessary injury and suffering usually violates proportionality as well. Finally, the principle of command responsibility makes commanding officers responsible for the actions of those under their command. This principle, which will be very familiar to any officer in a modern military, in principle assures that there is an identifiable chain of


responsibility in case of any violation of the laws of armed conflict and provides incentives for those in charge to ensure that their personnel are trained in, and comply with, IHL. While these principles sound simple, practice is often far more complicated; additionally, many studies have shown that even well trained and led soldiers can commit war crimes under the stress of combat. Command responsibility, for example, is an excellent and necessary principle, and is needed to protect lower ranking soldiers from being unfairly blamed for poor training and orders that lead to inappropriate behaviours, but it doesn't always work that way (some argued, for example, that it was a violation of this principle when lower ranking soldiers were disciplined for the problems at Abu Ghraib while more senior personnel, including civilians in the chain of command who made such behaviour possible and arguably encouraged it, were generally not). Moreover, it should be remembered that, especially in heterogeneous conflict environments, which may involve elements of traditional military combat, policing activities, espionage and sabotage efforts by intelligence agencies, private firms acting in many different capacities for many different actors, and individuals slipping from combatant to non-combatant roles unpredictably and without outward sign (for example donning explosive vests under civilian clothes), such principles apply only to military personnel carrying out military missions. The opportunities for confusion are legion and often cannot be reduced, in part because it may well suit some of the parties to the conflict to encourage such confusion.10

10. The intersections among IHL, the laws of armed conflict, morality and ethics as applied to armed conflict, and developing international practices in the rapidly shifting and increasingly complex environment of armed conflict in today's world are highly complicated, and the literature is substantial. On the subject of IHL itself, Orend (2006) is an excellent overview text of the topic and Bovarnick et al. (2010) an excellent resource containing relevant text and summaries of many of the source documents. The ICRC is an excellent, authoritative and balanced source of interpretation and analysis of current issues and dilemmas, as the chapters in this volume by the ICRC and Backstrom and Henderson suggest.

Concluding Observations

While this is obviously a difficult and highly complex domain, there are a few general observations regarding approach and perspective that may be useful to the reader, both in this field and more broadly with applied ethics issues generally.

• Appreciate the co-evolution of, interconnections among and tensions between law, applied ethics and norms. Law is often established and relatively clear, at least in principle, and because it is a function of jurisdiction, it is conceptually clear that most laws are not universal: even the UN Charter applies only to signatory states. Major treaties bind only certain parties (that is, states) and only indirectly, through national or local legislation, bind individuals, firms or NGOs. Norms are trickier in that they are often implicit, and they are often assumed, especially by dominant cultures, to be universal when they may not be. Thus, for example, many states with very different cultural backgrounds accede to the UN Charter and various treaties, such as the Hague and Geneva Conventions, implementing the 'standard' view of the laws of armed conflict, but it is not clear that the norms underlying them are accepted by, for example, global terroristic networks such as that within which al-Qaeda is embedded or non-state actors such as ISIS, the Islamic State in Iraq and Syria. Recognizing that value systems other than one's own exist and are regarded by adherents as equally valid is not ethical relativism but rather a simple matter of appreciating the world as it is, and as such is an important step in applied ethics.

• Appreciate the power of hypotheticals but don't reify them, and don't develop current policy based on them. Many emerging military and security technologies are, after all, designed to cause damage and death, and are therefore simply quite scary: there is a tendency for activists and others to generate hypotheticals that are highly dystopian (it is much rarer for hypotheticals to be utopian in this domain). This process is, by and large, helpful to policy-makers and applied ethicists in that it develops practice in thinking about scenarios and how to respond to them - indeed, this is one reason why militaries, well aware of the chaos and unpredictability of conflict, play war games. But it must always be remembered that the future path of any technology of sufficient power to be interesting is impossible to predict a priori: the interaction of technologies, cultures and societies, politics, and existing economic and technological interests and institutions is far too complex, and often too strange, to even guess at, and can only be adapted to in real time. That is why the people who built the first computers had no idea how ubiquitous they would become, and the military researchers who built the initial internet had no idea what we would be using it for today. Thus, it is a serious category mistake to demand policy today based on either utopian or dystopian hypotheticals, because that implies a level of certainty about the future that is, in practice or in principle, impossible. It is an exercise in ideology, not rational decision-making, and not applied ethics.

• Similarly, watch for 'coded language' wherever it appears in an argument.
For example, one of the most brilliant public relations coups of modern times was the coining of the word 'frankenfood'. The integration of the deep distrust of technology implicit in the Frankenstein narrative, with something - food - that all people require is profoundly unsettling and avoids entirely the need to understand underlying issues of science, technology, economics and culture (contrary to the implications of the term, most studies have found no adverse human health or mortality effects from genetically modified food crops). Coded language thus becomes a way to avoid analytical and rational discussion, critical to applied ethics, by directly engaging emotional responses. But also note that the use of coded language does not, by itself, invalidate the underlying position: the applied ethicist has a responsibility to consider all facets of an issue as fairly, and with as much integrity, as possible, regardless of how inartfully or even duplicitously they may be presented.

• Especially with emerging military and security technologies, remember that the applied ethicist must reach beyond traditional boundaries. Thus, for example, miniature flying robot platforms might be quite useful in counterinsurgency environments, enabling targeted surveillance and attack that significantly reduce collateral damage to civilian property and health. On the other hand, virtually no interesting technology introduced in military environments will stay there: what do such miniature flying robot platforms mean when they proliferate in civilian society, used by everyone from divorce lawyers to local news channels to political candidates and parties? What do they imply if used in societies where surveillance, because of fear of crime, is already at high levels?




The initial build/buy decision for a technology will often reflect immediate uses and demands; a sophisticated analysis of the technology and its implications should move far beyond that, using scenarios, war game techniques and other tools to try to identify potential costs and benefits as the technology spreads - not because such costs and benefits will happen, for technological evolution is unpredictable, but because such a process enables real time responses that are more ethical, rational and responsible.11

• Watch for embedded and unstated assumptions that drive the results of analyses; while these may not be illegitimate, that can only be determined once they are explicated and evaluated in light of the purpose of the analysis. For example, some writers may assume that war must be 'fair' and that it is 'unfair' when technology gives one side or the other an advantage in killing. The implied underlying assumption is that conflict is a game environment where the two sides should be equally matched, a sort of egalitarian perspective. Does this assumption suit the analysis being made? Is viewing combat as a game useful under some circumstances? Alternatively, some analyses may assume that the proponent will always have air superiority, as Western powers have generally had in recent conflicts (in Iraq and Afghanistan, for example, deployment of unmanned aerial vehicles has been possible partially because the US and NATO powers had absolute air superiority). Where are the boundaries within which this assumption operates, given that it obviously fails under easily foreseeable scenarios (for example, in a conflict between Europe and Russia or the US and China it is highly unlikely that drones could be deployed as they were in Iraq and Afghanistan by NATO forces)? Regardless of what the implicit assumptions are, it is usually a good idea, and results in a better analysis, if they are made explicit.

Finally, it is critical to realize that conflict and war, and the balancing desire for security, are part of the human condition. Combat is one of the few activities that, in many and various forms, has characterized humans and their societies apparently from the beginning. Moreover, while many of the essays in this volume make strong, even passionate, appeals for one or another perspective, it is impossible to select any particular position or set of values and identify it as objectively, obviously, 'right'. Indeed, recent scholarly work has even made a strong case that, while the damage and costs of wars are obvious, over time and for most people wars have been beneficial, creating order, security, growth and prosperity (Morris, 2014), a most counterintuitive argument. A Chinese planner facing US conventional forces, an ISIS commander, a harried US platoon leader, an insurgent force subverting a weak government, a private military firm hired by an NGO to fight a predatory state committing internal genocide, an intelligence agent operating outside traditional institutional boundaries - all will have different perspectives on the applied ethics of their activities, some of which overlap, some of which don't. The rich chorus of voices in this volume do not agree with each other, but in that disagreement they reflect the real world of conflict and emerging technologies - and it is that real world that is the domain of the applied ethicist.11

11. Allenby and Sarewitz (2011) differentiate between the immediate use stage, called 'Level I'; the system within which that immediate use is embedded, called 'Level II'; and potential systemic impacts as the technology diffuses, and co-evolves with, social and economic institutions, called 'Level III'. The important point is that while Level I effects may be fairly obvious and explicit, Level II and III effects are just as real, even if they are inchoate until they actually develop, and are unpredictable a priori.


Acknowledgement

The author would like to thank the Lincoln Center for Applied Ethics at Arizona State University for its support for his work in the applied ethics of emerging military and security technologies.

References

Acemoglu, D. and Robinson, J.A. (2012), Why Nations Fail, New York: Crown Business.
Allenby, B.R. (2011), The Theory and Practice of Sustainable Engineering, Upper Saddle River, NJ: Pearson/Prentice-Hall.
Allenby, B.R. and Sarewitz, D. (2011), The Techno-Human Condition, Cambridge, MA: MIT Press.
Bijker, W.E., Hughes, T.P. and Pinch, T. (eds) (1997), The Social Construction of Technological Systems, Cambridge, MA: MIT Press.
Boot, M. (2007), War Made New, New York: Gotham.
Boot, M. (2013), Invisible Armies, London: W.W. Norton.
Bovarnick, J.A., Harlow, P., Rush, T.A., Brown, C.R., Marsh, J.J., Musselman, G.S. and Reeves, S.R. (2010), Law of War Deskbook, Charlottesville, VA: International and Operational Law Department, The Judge Advocate General's School, US Army.
Chaliand, G. (ed.) (1994), The Art of War in World History, Berkeley: University of California Press.
CIA (US Central Intelligence Agency) (2013), The World Factbook, Country Comparison: Military Expenditures, at: https://www.cia.gov/library/publications/the-world-factbook/rankorder/2034rank.html (accessed September 2013).
Clark, G. (2007), A Farewell to Alms, Princeton: Princeton University Press.
DARPA (Defense Advanced Research Projects Agency) (2013), Defense Sciences Office: Neuroscience, at: http://www.darpa.mil/Our_Work/DSO/Focus_Areas/Neuroscience.aspx (accessed September 2013).
DARPA (Defense Advanced Research Projects Agency) (2014), XDATA, at: http://www.darpa.mil/Our_Work/I2O/Programs/XDATA.aspx (accessed May 2014).
De Grey, A. and Rae, M. (2007), Ending Aging, New York: St. Martin's Press.
Freeman, C. and Louca, F. (2001), As Time Goes By: From the Industrial Revolutions to the Information Revolution, Oxford: Oxford University Press.
Gates, R.M. (2011), 'Remarks Upon Receiving the Thayer Award', US Military Academy, West Point, NY, 6 October 2011, at: http://www.westpointaog.org/page.aspx?pid=4843 (accessed June 2014).
Huntington, S.P. (1996), The Clash of Civilizations and the Remaking of World Order, New York: Simon & Schuster.
ICRC (International Committee of the Red Cross) (2014), Rule 70, at: http://www.icrc.org/customary-ihl/eng/docs/v1_rul_rule70 (accessed July 2014).
Kagan, R. (2012), The World America Made, New York: Alfred A. Knopf.
Keegan, J. (1993), A History of Warfare, New York: Vintage Press.
Kupchan, C.A. (2012), No One's World, New York: Oxford University Press.
Kurzweil, R. (2005), The Singularity is Near, New York: Viking.
LaFollette, H. (ed.) (2003), The Oxford Handbook of Practical Ethics, Oxford: Oxford University Press.
Lawfare Project (2014), Lawfare: The Use of Law as a Weapon of War, at: http://www.thelawfareproject.org/what-is-lawfare.html (accessed July 2014).
Liang, Q. and Xiangsui, W. (1999), Unrestricted Warfare, Beijing: People's Liberation Army Literature and Arts Publishing House. Translation by US Central Intelligence Agency Foreign Broadcast Information Service, at: http://www.cryptome.org/cuw.htm (accessed September 2013).


Mattick, C.S., Allenby, B.R. and Lucas, G. (2012), 2012 Chautauqua Council Final Report: Implications of Emerging Military/Security Technologies for the Laws of War, Tempe: Arizona State University Lincoln Center for Applied Ethics, at: http://indianstrategicknowledgeonline.com/web/Chautauqua%20Final%20Report%20v8%20sept%202012.pdf (accessed September 2013).
Morris, I. (2014), War! What Is It Good For?, New York: Farrar, Straus and Giroux.
NRC (US National Research Council) (2005), Avoiding Surprise in an Era of Global Technology Advances, Washington DC: National Academy Press.
NRC (US National Research Council) (2010), Persistent Forecasting of Disruptive Technologies, Washington DC: National Academy Press.
Orend, B. (2006), The Morality of War, Peterborough, Ontario: Broadview Press.
Parker, G. (2005), The Cambridge History of Warfare, Cambridge: Cambridge University Press.
Rosenberg, N. and Birdzell, L.E. Jr. (1986), How the West Grew Rich: The Economic Transformation of the Industrial World, New York: Basic Books.
Singer, P.W. (2008), Corporate Warriors: The Rise of the Privatized Military Industry, Ithaca: Cornell University Press.
SIPRI (Stockholm International Peace Research Institute) (2013), SIPRI Military Expenditure Database, at: http://www.sipri.org/research/armaments/milex/milex_database (accessed September 2013).
Wolmar, C. (2012), The Great Railroad Revolution, Philadelphia: Public Affairs.

Part I
Changing Context and Overview


[1]

THE IMPLICATIONS OF EMERGING TECHNOLOGIES FOR JUST WAR THEORY

Brad Allenby1

1. INTRODUCTION

Technological evolution and military activity have been linked throughout history. The relationship is not, however, straightforward. The existential challenge to society represented by warfare, combined with the immediate advantage that new technology can deliver, tends to accelerate technological innovation and diffusion; the inherent conservatism of military personnel, the emphasis on tradition and culture that marks many military organizations, and the high costs of experimentation in conflict environments serve as a powerful brake on technological evolution. Similarly, the relationships among military and security technology systems and consequent institutional, cultural, and social changes are profound, complex, unpredictable, and often subtle. Many technologies of sufficient power to be of interest militarily have at least the potential to be deeply destabilizing to existing economic, social, and technological systems, especially as they are introduced into civil society.2 As military radio frequency identification (RFID) and sensor systems, and robots and cyborgs at many different scales, are shifted from theatre intelligence and combat to civil society environments, for example, the implications for privacy, and for the balance between national security and civil rights, could be substantial. Technologies that can accelerate the development of human varietals within the overall population could be very effective for warriors, but raise difficult issues for social stability (many cultures, after all, do not deal very gracefully or equitably with the race, gender, and sexual preference differences that have long been part of the human story).3 Equally important, emerging technologies are likely to have similar destabilizing effects within the military as well, potentially affecting not just operations, but military culture and organization as well.
A military leadership class that has developed in traditional combat environments will not have the same values, nor behave in the same way, as a military leadership class selected for its ability to play video games in high school (an issue that Singer points to as the US Air Force leadership,4 at present consisting almost entirely of pilots, is affected by incoming gamers who


PUBLIC AFFAIRS QUARTERLY

are proficient at flying unmanned aerial vehicles (UAVs)). A military that prizes physical and mental toughness in marines and special operations forces will have a difficult time embracing and retaining the computer geeks and nerds necessary for effective cyberconflict operations. Equally important, it is clear that leading contenders for great power status—including at least the United States, the BRICS (Brazil, Russia, India, China, and, to some, South Africa), the EU and certain member states, and perhaps others such as Indonesia, Turkey, Iran, and Mexico—realize that scientific and technological capability is a critical competency for achieving and defending such a position. This is not just true in the obvious terms of economic performance, and in the less obvious but equally critical realm of "soft power,"5 but in a purely military sense as well. Especially as more traditional great power conflicts, such as that between the rising power of China and the existing power of the United States, are reframed informally or formally as confrontations that must be engaged across all domains of culture and society,6 the importance of broad technological competence and innovation is enhanced. Such issues are particularly pertinent given the accelerating technological change that characterizes the current era, combined with the changes in military operations discussed below. Accordingly, this article will provide an overview of emerging technologies and the environment within which such technologies are being developed, and suggest some of the concomitant implications for just war theory. It should be emphasized, however, that because of the complexity and unpredictability necessarily associated with such an evolutionary process, any such effort should be regarded as illustrative and partial, rather than predictive (as many reviews of the implications of emerging technology for the military and security domains have emphasized7).

2. TECHNOLOGY SYSTEMS: TALE OF THE RAILROAD

It is very common for both technologists and social scientists to misunderstand both the essence, and the implications, of technology systems. To begin with, technology systems are not just artifacts; rather, they are integrated cultural, social, psychological, economic, institutional, and built phenomena.8 Moreover, any technology system of more than trivial power tends to be profoundly destabilizing of existing institutions, norms, and power relationships—as well as of the technology systems that it replaces, along with the firms and employment patterns built on the now-obsolete technologies—the well-known capitalist "gale of creative destruction," as the Austrian economist Joseph Schumpeter famously put it. Accordingly, emerging technologies tend to generate substantial and potentially powerful opposition.9 In societies where conservative forces are able to dominate, they are therefore highly likely to impede technological evolution, thus creating less competitive cultures; whether a culture will evolve technologically
may therefore depend to a large extent on whether conservative forces are able to merely hinder technological evolution, or whether they can stifle it completely.10 Thirdly, technology systems are complex adaptive systems, which means that trying to predict their future evolutionary paths, as opposed to exploring possibilities through techniques such as scenario analysis, is essentially impossible. One can prepare for the future by, for example, creating agile institutions through war gaming and scenario exercises; one cannot predict it.11 Consider the seemingly simple and mundane example of the railroad, a technology system that to moderns may be trite, but that was frightening and, indeed, devastating to the societies it affected as it diffused across the global landscape and through cultures and societies in the early 1800s. Like all foundational technologies, it was a seemingly inexorable juggernaut that profoundly disrupted not just other technologies, local economies and family businesses, and small farms, but also cultural, institutional, and psychological verities.12 The railroad may indeed have been critical in building a new world, but the cost was the destruction of much of the old, the comfortable, and the familiar. It is not surprising, therefore, that the story of the railroad is in some ways the framing of the nineteenth century. The rapid increase in speed and performance of railroad technology over that century, while impressive, does little to provide an idea of its real impacts.13 Remembering that technologies are coevolving parts of complex adaptive systems rather than causal mechanisms, the breadth of change of which railroads were a major forcing component is still remarkable. For example, because railroads were a network technology, they required a uniform, precise system of time that was coextensive with, and matched to the characteristics of, their physical network;14 accordingly, our modern structure of time coevolved with the railroads.
Similarly, large integrated networks also require coextensive signaling and communications systems if they are to be coordinated. Thus, the railroads not only provided a convenient right of way but also a raison d'etre for telegraph technology (including the "software," Morse Code).15 But this by no means exhausts the social and cultural impacts of railroad technology. Before railroads, or in areas where railroads were less common (e.g., the American South because of its plantation agrarian economy, or Austria and Russia by mandate16), local, fairly isolated economic institutions were the norm. This changed because railroads could inexpensively transport bulk commodities, people, and information rapidly over long distances. They therefore provided a technological basis for economies of scale in industrial operations, which in turn led to the growth of national economic institutions such as trusts and monopolies in commodities such as sugar, tobacco, oil, and steel. More subtly, railroads enabled a fundamental shift in the essential nature of industrial economies: it was not only that economic power passed to industrial firms from agriculture. So, too, did cultural authority; the yeoman farmer was replaced by the urban factory worker and the capitalist, and economists began
measuring national status not by agricultural prowess but by industrial production; the "nature as sacred" teleology that still energizes environmentalism today was challenged, and in many instances replaced, by a human-designed, human-built, high-technology sublime.17 Agriculture in continental cores in countries such as Canada and the United States was significantly expanded and transformed to industrial scale, and entire continental ecologies were altered; Chicago grew where and how it did, and enabled an economically, politically, physically, and environmentally new American Midwest, because of railroads.18 In institutional terms, railroad firms represented a significant increase in the complexity and size of private firms. In terms of finance, railroad firms with their vast need for land (rights of way) and material demanded far more substantial flows of capital than the simpler factory capitalism they replaced (in western Europe, railroad construction was the single most important stimulus to industrial growth by the 1840s).19 In terms of management, railroad firms encouraged the expansion of the principle of division of labor from blue collar to white collar positions, a critical development supporting the evolution of the modern hierarchical firm. In psychological terms, travel by railroad was different in kind, not just degree, from the forms of personal travel it replaced. The horse and carriage and canal had a pace and simplicity, and energy sources (e.g., animal and wind power), that led people to experience them as "natural." 
The mechanical railroad was faster and went where its designers intended, through rather than with its landscape, substituting a technological sublime for the natural sublime that characterized earlier modes of transport.20 More subtle was the commodification that many felt with the routine mass movement of people enabled by railroad technology; contemporary travelers complained of feeling like packages rather than individuals.21 The military and national security implications of the railroad were also nontrivial, especially in large, relatively unpopulated countries such as the United States, where efficient and inexpensive long distance transportation systems were critical. But although the military advantage of more rapid and efficient transportation of men and war materials to areas of immediate need was direct and explicit, there were other, perhaps more important, advantages. During the Civil War, for example, the Northern war manufacturing economy relied to a far greater extent than that of the South on railroad infrastructure, enabling greater productivity and industrial efficiency.22 Another example of the integrated strategic and cultural power of railroad technology is provided by the rise of Prussia. In 1815, the Congress of Vienna concluded the Napoleonic Wars and left Prussia as one minor state among many in Central Europe. But tensions between reactionary absolutism and the citizen nationalism that the French Revolution had unleashed continued, and a series of popular revolts broke out across Europe in 1848. In Prussia, the uprisings were controlled because the Prussian military used its railroads to rush troops from trouble spot to trouble spot, illustrating the strategic value of rail technology
to Prussian leaders such as Helmuth von Moltke.23 Consequently, the Prussian military constructed a "dual use" railroad network, supporting both commercial and military functions. Prussian commercial railroad cars, for example, were explicitly designed so that, in addition to their routine commercial purpose, they could carry soldiers, horses, military supplies, and military equipment if necessary, while Prussian railroad networks were integrated into strategic military planning (in contrast, the French were lukewarm on railroads, and Russian and Austrian elites deliberately stifled railroad construction because of the potential for perturbing feudalistic structures24). The results became obvious in 1866 at the battle of Königgrätz, where the upstart Prussians stunned and essentially destroyed the Austrian Empire (which, remember, had eschewed the new technology because of its socially destabilizing potential—a possible caution to societies that attempt to avoid the creative destruction of powerful new technology systems). A major factor in this battle was the Prussian feat in managing to transport 197,000 men and 55,000 horses to the front using railroads, an accomplishment that caught the Austrians by surprise in part because of their lack of familiarity with railroad technology.25 Of course, railroads weren't the only factor; the Prussians also had the needle gun, arguably the most advanced rifle in Europe, as well as world-class military management (no one else had von Moltke) and highly advanced training. In the event, as Austria fell, Prussia rose, a point that was lost on no one when the Prussians defeated the French in 1870 (again, partially because of their sophisticated use of railroad technology). In this systemic restructuring of society and its institutions, railroads are not unique.
Economic historians have developed a theory of "long waves" or "Kondratieff waves" of innovation, where periods of economic expansion are driven by constellations of technologies and institutions that form around core technology systems, with concomitant social, cultural, legal, psychological, technological, and economic change. This is a theory of developed, industrialized systems, so the first Kondratieff wave is generally understood to involve the mechanization of textile manufacture, the basis of the Industrial Revolution in the UK.26 Although this is not something about which the literature is exact, subsequent examples might include railroads and steam technology as discussed above, which powered a wave from about 1840 to 1890; a following wave from about 1890 to 1930, which developed around steel, heavy engineering, and electricity; and an automobile, fossil fuel, and aviation wave from about 1930 to 1990.27 Each wave is characterized by coevolution of the core technology cluster with institutional, organizational, economic, cultural, and political institutions and systems, leading to profound and unpredictable change, as the railroad example illustrates. Thus, for example, some have suggested that modern developed economies are reacting to an information and communication technology wave by shifting from a mass production, heavy industry paradigm of specialized professional
managerial systems and associated "Taylorism" industrial efficiency techniques to a far more networked, adaptive, and flexible structure characterized by, for example, virtual offices and informal and protean regional and global information networks.28 Technology systems, both in the limited context of military operations and national security, and across social and cultural systems generally, are not just profoundly destabilizing. Rather, they define new earth system states; they change and integrate built, human, and natural systems and networks in unpredictable, contingent, and fundamental ways.29 The observations about some of the impacts of railroad technology may seem trivial, but that is only because the backward-looking perspective of history hides the traumas, conflicts, confusion, and dislocations experienced by contemporaries of that technology. Individuals and societies today are, in critical ways, children of the railroad, and it is impossible to realistically even imagine a world that has not been shaped by railroad technology. More recently, anthropogenic climate change, a product of a global civilization based on fossil fuel technology; major shifts in nitrogen and phosphorous cycles, products of the agricultural revolution (specifically, fertilizer production and use); and cultural phenomena such as the environmentalist and sustainability discourses all illustrate the continuing transformative power of human technologies. That being the case, it is sobering to realize that, at this point in human history, it is not one or two core technologies that are undergoing rapid evolution, but five, the Five Horsemen of emerging technologies: nanotechnology, biotechnology, robotics, information and communication technology (ICT), and applied cognitive science (NBRIC).30

3. EMERGING TECHNOLOGIES: THE FIVE HORSEMEN

Taken as a whole, the NBRIC technologies in some ways are the logical end of the chapter of human history that began 2,500 years ago with the Greeks, the human effort to rationally understand and control physical reality (which has now been extended to synthetic realities, combining virtual and real realities in different patterns). Nanotechnology, generally defined as technology that manipulates materials and devices at scales of 100 nanometers or less (one nanometer is a billionth of a meter), extends human will and design to the molecular and atomic level. While it is certainly true that the complexity of biological systems remains daunting, biotechnology is rapidly providing new tools and knowledge that will enable the extension of human design to the level of genes and proteins, so that life can be designed and controlled at scales from the molecular to the cell to the organism to the community to the global (e.g., extinction events). Advances in robotics continue not only to provide enhanced mechanical power and functionality (e.g., manufacturing robots in factories, unmanned aerial vehicles) but also to blur the line between human and biological systems and mechanism
(for example, the "ratbot," a hardware robot guided by hybrid rat neuron/chip configurations consisting of some 300,000 rat neurons in a soup31). Information and communication technology (ICT) is a somewhat more complicated set of technologies; robots, artificial life, and control of matter have long been the stuff of myth and archetype, but ancient wizards and gods did not, for example, generally exercise their control through social networking or Google. Nonetheless, some of the effects of ICT, such as the facilitation of migration of functionality to information rather than physical structures, certainly act over historical time. Thus, money used to be valuable artifacts such as shells, which then evolved into coins that had some value because of their precious metal content. Now, however, to the extent physical money is still used, it consists of coins composed of cheap metal and paper bills—still physical, but with value determined symbolically, rather than inherently. Most financial value, of course, no longer even pretends to a physical premise, but consists of electrons somewhere in cyberspace; indeed, most financial instruments no longer have a symbolic specific value at all, but are mathematical (informational) constructs.32 Similarly, economies evolve from mainly physical (exchange of produce and artifacts in a barter economy) to virtual (by value, most stock trading today is machine-to-machine, with people completely out of the loop). Of course, the explosion in the amount of information in the ambient environment has just begun to affect social, cultural, and institutional systems (Eric Schmidt, the CEO of Google, famously noted that in today's world, more information is created every two days than was created in the whole of human history up until 2003—whether that is a technically correct statement is, inevitably, contested; that it is directionally correct is not33).
What, for example, will be the eventual effect on human knowledge structure of the growth of search engines such as Google, which facilitate real-time recovery of virtually any desired piece of information—or, put functionally, of the way that memory as a cognitive function is now being distributed across the Internet? What is the effect on employment of the rapid mechanization of any task that is relatively repetitive (including, for example, much of the legal work traditionally performed by new hires at law firms, which has led to a precipitous drop in entry-level legal positions)? To judge by earlier examples—printing in Europe, which enabled the growth of individualized theology via the Reformation, and thus several centuries of religious warfare—significant changes in ICT are not neutral in effect.34 Perhaps a hint of this can be seen in the globalizing interconnectivity illustrated when publication of a few cartoons in a small country in Europe resulted in riots and deaths around the Islamic world, or when radical elements that ordinarily would have been isolated in their geographical communities link across the Internet to form terrorist or activist networks. Certainly, changes in information availability and density can be expected to destabilize the structures—cultural constructs, ideas, ideologies, and even language—that humans use to create and simplify their world, with highly unpredictable effects.

This is especially the case when changes in ICT are coupled to accelerating changes in applied cognitive science. For example, the need for rapid, comprehensive cognition in complex and chaotic environments is pushing the development of "augmented cognition," where elements of the cognitive function that were previously performed by individuals are built into technology networks within which humans function.35 This may be a fundamental shift in the way humans integrate with technology, but it is also quotidian; many car manufacturers are building more and more cognitive function into their cars (an early and now familiar example is speed control; more recently, lane control, technologies that guard against sleepy or drunk drivers, and autonomous parking capability are being deployed in high-end vehicles—and, famously, Google has obtained licenses in Nevada for its completely autonomous vehicles, although the state still requires that a person be present in the vehicle just in case). Indeed, the ability to couple with ambient technology systems is perhaps an underappreciated aspect of being human (remember the beginning of 2001: A Space Odyssey); that coupling to potent ICT technologies, so closely associated with cultural and psychological dimensions of the human, should be particularly potent should be no surprise. Not all applied cognitive science developments, of course, involve ICT. Another familiar area of technological evolution is psychoactive pharmaceuticals: although humans have been using drugs, from alcohol and caffeine to more esoteric (and less legal) formulations, for thousands of years, new cognitive enhancers—such as anti-ADD (attention deficit disorder) drugs like Ritalin for concentration, modafinil (Provigil) for alertness, and many others in the pharmaceutical pipeline—are coming more quickly, and are increasingly designed to achieve specific cognitive effects.
Psychopharma directed to psychological states critical to military activities is an active subset of this category; drugs that will enable memory control to minimize PTSD, and that will reduce perception of risk, are in development. If the current off-label use of first generation pharmaceutical enhancers by certain sub-populations (e.g., college students) is any indication, use of psychoactive pharmaceuticals, even if developed for military applications, will rapidly spread to civil society. To take a final example, recent research in cognitive science suggests that at least some perceived forms of free will are illusory, in that, for example, the conscious decision to move a part of the body appears to lag the actual, unconscious decision.36 The implications of weakening the assumption of free will flow not just to the psychological distress that might result, but deep into the practical world, which has generally been formulated with an implicit assumption that free will exists. Thus, for example, the growing evidence that adolescent brains differ from adult brains in ways that weaken the fundamental assumption of mens rea, or guilty intent, that is a necessary element of a legal finding of criminal liability for an action, has already begun to create difficult issues for the American legal system (is it unconstitutional to impose the death
penalty on adolescents because their brains are not developed enough to exhibit mature free will?). Other examples include the identification of probabilistic linkages between certain genetic patterns and a high probability of violent and antisocial behaviors. Clearly, such work raises difficult issues for legal systems predicated on an assumption of free will (not to mention theological positions requiring free will). Practical issues cannot be ignored: is pre-criminal activity restraint based on genetic propensities permissible, and if so, to what degree? Can a different brain structure than the norm form a legal defense against criminal liability? How should protection of society be balanced against the rights of the individual when probabilities of different types of antisocial behaviors may be a result of genetics or physical neurobiology? It is not just that each one of these technology systems is powerful in itself (and, of course, there is an element of arbitrariness in what technology system is identified as core, and how that system is bounded for purposes of analysis). Rather, it is that they are being combined, and enable each other's development, in complex and unpredictable ways. The ratbot, combining robotics with biotechnology, has already been mentioned; augmented cognition, where different aspects of cognition (memory, perception, response to immediate environment) are dispersed across technology networks rather than centered in the Cartesian individual, another. A third example, involving a number of different foundational scientific and technological domains, is the possibility of radical human life extension. This is an active area of research and, like any emerging technology that is not yet clearly demonstrated, is best regarded as a scenario, rather than a prediction. 
That said, there are some who believe that, given the progress being made in genetics, biotechnology, and other enabling fields, the first people to live to 150 years, with a high quality of life, have already been born in developed economies.37 Such predictions are viewed by other experts as highly unlikely. Nonetheless, there is some evidence that substantial extensions of average lifespan, with a high quality of life, are achievable (if nothing else because of the increase in lifespan over the last two centuries that has resulted from economic development, medical science, infrastructure engineering, and other factors—before 1800, life expectancy virtually everywhere was only thirty to thirty-five years). Indeed, South Korea has seen a thirty-year jump in lifespan since 1960.38 What makes the possibility of radical lifespan extension interesting is not just the science and technology implied by such a goal, but how unaware of the possibility virtually all policymakers and the public generally remain. After all, even a cursory consideration of the implications of such an evolution reveals significant challenges for, for example, pension and senior health systems, reproductive freedom and demographic trends, material and energy consumption patterns, and global equity and stability. Moreover, radical life extension is merely another example of a broad trend in human enhancement technologies (such as the psychopharma mentioned above)—the human as design space.39 It is notable that, at
least initially, military researchers and organizations are among the institutions most interested in enhancing humans, and human design, for better performance. It is entirely premature to speculate as to what effects the NBRIC Kondratieff wave will generate. But one thing is clear. Powerful technologies destabilize psychological, institutional, social, and cultural systems; the combination of NBRIC technologies taken as a whole with the demographic and economic growth of the past two hundred years suggests that most of the assumptions that underlie current systems are now contingent. The Earth is now a continuing experiment in terraforming (e.g., anthropogenic climate change; perturbations to the hydrologic, nitrogen, sulfur, phosphorous, and other systems; urbanization and life extension); the human is becoming a design space to a far greater degree than in the past. It is therefore not inappropriate to wonder whether emerging technologies are, in fact, destabilizing the traditional laws of war. To that question we now turn.

4. EMERGING TECHNOLOGIES AND THE LAWS OF WAR

For purposes of this discussion, the "laws of war" will be treated as a comprehensive, albeit informal, corpus that has developed, particularly in the West, over many centuries and which, in complicated and sometimes partial ways, governs the behavior of most states to some degree as they contemplate, conduct, and conclude combat activities.40 Moreover, as the previous discussion will have made clear, a focus only on the physical technologies themselves will be entirely inadequate for considering deeper questions of technological impact. In the instant case, for example, it is impossible to understand the implications of planned or potential military and security technologies without a deeper initial understanding of the cultural, operational, and institutional frameworks within which the technologies are being conceived and will be used. It is axiomatic, given the competition implicit in warfare, that technological change is endemic to the domain. It is also apparent, however, that modern conflict exhibits both an accelerated rate of change, and increased complexity, when compared to historical eras.41 For example, the sorts of conflicts that developed countries have engaged in most recently—Libya, Afghanistan, Iraq, Bosnia, United Nations interventions in African conflicts—are very different from the European continental and religious wars of past centuries, the Napoleonic Wars, or World Wars I and II, not just in conduct, but in intent (recent European, American, and UN conflicts have not been for conquest of territory or establishment of colonies, but to achieve ideological aims and defeat disfavored elites). More fundamentally, the Iranian-Israeli conflict has to date occurred primarily in cyberspace, where the relationship between the physical and the ethereal, and hence the relatively clean assumptions of the laws of war (e.g., you know when
aggression occurs and who is responsible, because it is physically obvious, as opposed to cyberconflict, where attribution of an attack, even if it can be ascertained that it has occurred, is difficult if not impossible) are brought into question. Even in the case of Afghanistan, a more traditional conflict, the geographical boundaries of the battlefield become vague when US attacks with Predator unmanned aerial vehicles are guided from sites outside Las Vegas, Nevada. It appears to be a sociological truism that, the more complex a situation, the more desirable technological fixes appear to be. This has certainly been the case with climate change, where both the policy framework and the proposed technological responses are dysfunctionally simplistic (geo-engineering as a "silver bullet" response to climate change has been seriously questioned42). Similarly, there appears to be a tendency for leaders of technologically superior military forces to rely overmuch on the (unquestioned) advantages their technology provides them, an observation that may well apply to the US experience in Iraq. Indeed, while it is rare for any nation to achieve unquestioned dominance in conventional military technologies, the US experience suggests a danger of misperceiving true sources of power under such conditions—perhaps even more dangerous if such a country has a technologically optimistic culture. But overreliance on technology and under-appreciation of soft power is not the only trap posed by modernity.43 The entire environment within which conflict occurs is changing in fundamental ways. Figure 1 presents one highly schematic framework for conceptualizing this complex domain. Initially, it is apparent that many, if not most, of the working assumptions that have been stable over much of the past centuries are stable no more.
The Libyan intervention, for example, demonstrates the increasing emphasis on "Duty to Protect," which is clearly in conflict with the recent historical assumption that the internal activities of states directed against their minorities are not subject to intervention by the international community. Global terrorist networks have made obsolete another significant historical assumption: states are clearly no longer the only operational entities in international conflicts (contrary to the 1648 Treaties of Westphalia and subsequent international law). Is a soldier sent to Afghanistan there to conduct combat operations, to conduct espionage, or to police? Those are very different missions requiring very different cultures, training, and, importantly, technology. An incapacitating directed energy weapon is a police weapon, not a combat weapon. Moreover, the legal and ethical frameworks within which each of those missions occurs is profoundly different. For example, applicable military law would not permit an American soldier to deliberately target a civilian; policing activities necessarily target civilians. So not only is the system as mapped highly complex and unpredictable, but the domains are not independent, as Figure 1 might suggest; indeed, they interact in many new and uncertain ways.44 That said, consider the four major domains given in Fig. 1. The first, Revolutions in Military Technology, or RMT, is the traditional technology dimension. It is the

[Figure 1. Conflict in the 21st Century. A schematic of four interacting domains. Revolutions in Military Technologies: lethal autonomous robots, pharma-enhanced warriors, cyberspace weapons, miniature surveillance devices, information-dense warfare. Revolutions in Nature of Conflict: asymmetric warfare, military policing operations, non-state actors, global diffuse conflict (including cyberspace), others. Revolutions in Military Operations and Culture: destabilization of military cultures and institutions, privatization of military activities and institutions. Revolutions in Civilian Systems: privatization of government activities, human enhancement technology in civil society, new weapons in civil society, militarization of society/erosion of civil rights.]

one that seems most obvious and simple, although as the discussion of Kondratieff waves has suggested, that is not necessarily the case, because it is the cumulative implications of entire suites of technologies, taken across the technological frontier as a whole, that pose such a difficult conceptual and analytical problem.45 The second realm is that of Revolutions in Nature of Conflict, or RNC. Fundamental change is occurring in this domain as well. For example, cultures and governments have always mistreated minorities, but the post-WWII focus on human rights, combined with some egregious recent examples—including Bosnia, Sudan, Rwanda, and the Congo—have led to the development of a principle of "responsibility to protect," whereby nations are permitted under applicable international law to intervene in the internal affairs of other countries under certain circumstances. This remains a new, ill-defined, and contentious principle opposed by a number of countries that have difficult internal conflicts of their own, such as China and Russia, but it is increasingly the basis of armed intervention in what heretofore would be regarded as unfortunate civil wars.46 The movement away from the state as sole actor is reflected in the rise of counterinsurgency, policing actions, violent non-governmental regional and global networks, and other models of conflict (sometimes combining in different ways, as states project their power using actors that are not part of the formal military establishment, a pattern that appears to characterize the current cyberconflict environment, and is perhaps not unlike the use of privateers in an earlier age).47 The US dominance of conventional military technologies actually complicates this space; not only can it lead to dangerous hubris, but it encourages potential adversaries to invest heavily in asymmetric warfare. The most obvious example at this point is cyberconflict: the United States and to some extent Europe,


IMPLICATIONS OF EMERGING TECHNOLOGIES


because of their economic, cultural, and military reliance on cyber systems, are obviously highly vulnerable to cyberattack.48 The third realm, Revolutions in Civilian Systems, or RCS, is also undergoing unpredictable and accelerating change. Consider, for example, the implications of the ongoing privatization of conflict, such as has characterized the US experience in Iraq and Afghanistan over a period of more than a decade.49 Additionally, many technologies that have highly positive implications in a military context, and thus are likely to be funded, are at least more problematic if introduced into civil society: UAVs may erode privacy and civil rights; pharmaceuticals that enhance performance may become endemic on college campuses; and nonlethal weapons developed for use in military policing environments may, given US culture, become domestic technological staples not just for police but for civil society. Finally, one would be remiss not to point out that technologies do not just interact with civil society but also have profound effects on military organization and structure—hence, Revolutions in Military Operations and Culture, or RMOC. This is not a trivial issue; after all, a significant reason some countries are coup-prone while others are not is military culture. The stirrup, the crossbow, the handgun, and field artillery all dramatically changed military tactics, strategies, logistics, and balances of power, both between militaries and within military organizations.50 Here one also sees interactions with changes in mission: policing operations require not only different technologies but different training, a different attitude toward the indigenous civilian population, and different institutional cultures, which is one reason why crack soldiers are seldom good policemen, and why it is extremely difficult, if not impossible, to be both at the same time or to switch rapidly from role to role.
Similarly, the rapid introduction of unmanned aerial vehicles—UAVs—creates deep cultural strains as US Air Force officers who have absorbed military culture for many years as they worked their way through the ranks find themselves managing (and sometimes competing with) geek gamers, and military personnel in active combat situations in Afghanistan are augmented with remote control operators who go to their homes in American suburbs when their day is done.51

5. THOUGHTS ON THE CONTINUING VALIDITY OF THE LAWS OF WAR

It is against this background of coevolving complexity and destabilization in not just a single domain, but across RMT, RNC, RCS, and RMOC taken both individually and together, that one must ask whether the laws of war, conceptually and as embodied in existing treaties and agreements, are adequate. Given the changes occurring across the relevant landscape, and the nascent state of the dialogue, any discussion at this point should be regarded as preliminary and tentative—indeed, what is sketched out here in rough outline is more a research agenda than any sort of substantive analytical effort.


But there are a few thoughts that are (probably) reasonably robust. Most importantly, it is doubtful that the answer to the question as to the continuing applicability and efficacy of the laws of war is a simple "yes" or "no." It is unlikely that fundamental and pervasive change across virtually all relevant domains will not affect in some meaningful way doctrines and principles that were formulated and tested under very different, and in relevant ways less complex, conditions. If nothing else, many of the deep assumptions underlying the laws of war, such as the primacy of the state and the physical nature of warfare—and, indeed, that a condition called "war" is still separable from other conditions of conflict and conflict management—are being destabilized. On the other hand, very few human systems change discontinuously, and even fundamental change, upon inspection, has roots deep in the past and occurs far more gradually than a superficial inspection might indicate. Moreover, the laws of war have been developed over a long period, with commentary and input from many cultures (including Christian, Islamic, and Chinese thinkers, for example), and it is unlikely that a body of work that is so robust has suddenly become completely inapplicable to present conditions. After all, many argue that the embodiment of the laws of war in humanitarian law, treaties, and institutions such as the United Nations is one of the great, albeit imperfect, strides in human civilization, and that such an achievement should not be jettisoned lightly (others, of course, might take issue with this, arguing that the laws of war and modern humanitarian law are a Western framework, and thus merely superficially universal, when in fact they constitute continuing Western imperialism). 
If one is to devise a research and dialogue agenda, therefore, the more pertinent question is to what extent, and where, the existing formulation of the laws of war should be revised, or even rejected, and what should take the place of the doctrines that currently exist. In addressing this question, one must also recognize that, regardless of what may be happening in new domains such as cyberconflict, there remain a number of "traditional" situations where the laws of war remain applicable. Moreover, one may well have conditions where traditional and nontraditional conflict coexist, raising difficult analytical challenges, but perhaps not fundamentally challenging the applicability of the laws of war, at least to specific aspects of operations. Thus, for example, the AfPak (Afghanistan and Pakistan) war includes traditional ground combat operations, where the laws of war presumptively are fully applicable (at least to NATO operations); policing activities, where the laws of war may not be entirely appropriate; espionage and sabotage operations (e.g., UAV flights over Pakistan operated by the CIA, not by military institutions), which are not governed by the laws of war; and privatized military operations, which are also not governed by the laws of war. If these were completely separate operational domains, this would be less of a problem, but the fact that they overlap considerably in practice makes analysis difficult.


More subtly, AfPak also raises difficult issues of weaponization of the laws of war, in that they, or at least alleged violations of them, are converted to effective propaganda weapons against Western forces. This also leads to interesting questions as to whether, or how, the laws of war, modern conflict, and soft power considerations should be integrated: one of the main arguments against alleged American mistreatment of prisoners is that it led to loss of the moral high ground against terrorists, which is a soft power argument (e.g., that America gains strength from maintaining the moral high ground that should not be undermined by its own actions even if they allegedly provide some short-term operational benefit). Note that this latter case suggests at least two different, if intertwined, research questions: first, to what extent is morality an operational component of the exercise of power (a realist sort of inquiry); and second, what is the proper ethical framework from a philosophical and moral perspective (a normative inquiry). Similarly, for purposes of clarity, one must be careful to differentiate between traditional critiques of the laws of war and new conditions that may call their applicability into question. For example, realists have consistently questioned whether the very idea of "laws of war" is viable, given that states engaging in warfare are usually doing so for fundamental values that cannot be defended by anything short of total war (and the more basic argument that states are not moral agents but can, and should, care only about their interests and their power).
On the other side, pacifists have argued that no war, no matter what the rationale or how conducted, can be moral; "laws of war," therefore, is oxymoronic, an attempt to pretend that the initiation, conduct, and termination of war may be, under some conditions, just.52 These arguments and others—Keegan's argument, for example, that war has, over thousands of years, become a habit that can no longer be tolerated because humans have developed such effective weaponry and a capability to push war beyond savagery to extremes (consider World War II's air wars against civilian targets, and what nuclear war would do if it ever happened53)—are important ones, and they indeed affect the way one might consider the viability and desirability of the laws of war, but they are not new. A research and dialogue agenda around the current status of the laws of war should therefore differentiate between their philosophic and cultural underpinnings and justification, which remains a legitimate and contentious arena, and the different and less explored question of whether changes in the nature of conflict, and the environment within which it is conducted, have, at least in some cases, rendered the laws of war inapplicable not because they are philosophically questionable but because they are obsolete, and thereby dysfunctional. To begin to explore this latter question requires a more sophisticated understanding of technology systems and their far-reaching implications than is usually brought to the table. It is still necessary to understand the engineering, operational, and tactical implications of new military artifacts as they are introduced to the battlefield, but as the battlefield, civil societies and cultures, and the technologies


mutate in unpredictable and complex ways, it is no longer sufficient. Doing so is not a choice; it is a requirement for military and strategic adequacy in the modern world of conflict, war, and competition. Especially given the democratization of weaponized violence, and the always serious impacts of conflict on civilian populations and institutions, it is also a requirement for ethical and rational behavior in the twenty-first century.

NOTES

1. Brad Allenby is President's Professor of Civil, Environmental, and Sustainable Engineering; Lincoln Professor of Ethics and Engineering; and the Founding Chair of the Consortium for Emerging Technology, Military Operations, and National Security at Arizona State University in Tempe, Arizona. This manuscript was supported by a grant from the National Science Foundation (Award SES-0921806). Additional funding was provided by the Lincoln Center for Applied Ethics at Arizona State University. The views expressed in this manuscript are those of the author and do not necessarily represent the views of the National Science Foundation, US Government, or the Lincoln Center.
2. Allenby and Sarewitz (2011).
3. Riddihough et al. (2012).
4. Singer (2009), pp. 326ff.
5. Nye (2011).
6. Liang and Xiangsui (1999).
7. NRC (2005, 2010).
8. Bijker, Hughes, and Pinch (1997).
9. Schumpeter (1942).
10. Landes (1998); Acemoglu and Robinson (2012).
11. Allenby (2011), pp. 267ff.
12. Allenby and Sarewitz (2011), p. 71; Schivelbusch (1977).
13. PSRM (2008); Schivelbusch (1977).
14. Rosenberg and Birdzell (1986).
15. Grübler (1998).
16. Acemoglu and Robinson (2012).
17. Marx (1964); Nye (1994).
18. Cronon (1991).
19. Freeman and Louçã (2001).
20. Nye (1994); Nye (2003).
21. Schivelbusch (1977), p. 121.
22. Keegan (1993); Acemoglu and Robinson (2012).
23. Boot (2007); Allenby and Sarewitz (2011), p. 75.
24. Parker (2005); Acemoglu and Robinson (2012).
25. Boot (2007), pp. 116ff.
26. Rosenberg and Birdzell (1986).
27. Freeman and Louçã (2001).
28. Castells (2000).
29. Allenby (2011).
30. Ibid.; Allenby and Sarewitz (2011), p. 80.
31. Marks (2008).
32. Allenby (2011), p. 131; IEEE (2012).
33. Finley (2011).
34. Allenby and Sarewitz (2011), p. 118.
35. Augcog (2012).
36. Baer, Kaufman, and Baumeister (2008).
37. De Grey and Rae (2007); more dramatic claims of "functional human immortality" within fifty years, either as a result of biotechnology, or downloading of human consciousness into information networks, have also been made in Moravec (1988); and Kurzweil (2005).
38. World Bank (2012).
39. Allenby and Sarewitz (2011).
40. Bovarnick et al. (2010).
41. Tainter (1988); Keegan (1993); Nye (2011).
42. Allenby (2012).
43. Nye (2011).
44. Allenby (2011), pp. 293ff.
45. NRC (2005); NRC (2010).
46. Orend (2006), pp. 90ff.
47. U.S. Army (2007).
48. Liang and Xiangsui (1999).
49. Singer (2003).
50. McNeill (1982); Parker (2005).
51. Singer (2009), pp. 326ff.
52. Orend (2006), pp. 244ff.
53. Keegan (1993).


REFERENCES

Acemoglu, Daron, and James A. Robinson. 2012. Why Nations Fail (New York: Crown Business).
Allenby, Braden R. 2011. The Theory and Practice of Sustainable Engineering (Upper Saddle River, NJ: Pearson/Prentice-Hall).
———. 2012. "Durban: Geoengineering as a Response to Cultural Lock-in," Proceedings of the 2012 IEEE International Symposium on Sustainable Systems and Technology, May 16-18, pp. 1-4.
Allenby, Braden R., and Daniel Sarewitz. 2011. The Techno-Human Condition (Cambridge, MA: MIT Press).
Augcog. 2012. Augmented Cognition International Society, www.augmentedcognition.org.
Baer, John, James C. Kaufman, and Roy F. Baumeister. 2008. Are We Free?: Psychology and Free Will (Oxford: Oxford University Press).
Bijker, Wiebe E., Thomas P. Hughes, and Trevor Pinch, eds. 1997. The Social Construction of Technological Systems (Cambridge: MIT Press).
Boot, Max. 2007. War Made New (New York: Gotham).
Bovarnick, Jeff A., Porter Harlow, Trevor A. Rush, Christopher R. Brown, J. Jeremy Marsh, Gregory S. Musselman, and Shane R. Reeves. 2010. Law of War Deskbook (Charlottesville, VA: International and Operational Law Department, The Judge Advocate General's School, U.S. Army).
Castells, Manuel. 2000. The Rise of the Network Society (2nd edition) (Oxford: Blackwell).
Cronon, William. 1991. Nature's Metropolis: Chicago and the Great West (New York: Norton).
de Grey, Aubrey, and Michael Rae. 2007. Ending Aging: The Rejuvenation Breakthroughs That Could Reverse Human Aging in Our Lifetime (New York: St. Martin's).
Finley, Klint. 2011. "Was Eric Schmidt Wrong about the Historical Scale of the Internet?," ReadWrite, February 7. www.readwriteweb.com/cloud/2011/02/are-we-really-creating-as-much.php#.T9PaRW-w8Q.
Freeman, Chris, and Francisco Louçã. 2001. As Time Goes By: From the Industrial Revolutions to the Information Revolution (Oxford: Oxford University Press).
Grübler, Arnulf. 1998. Technology and Global Change (Cambridge: Cambridge University Press).
IEEE. 2012. "The Last Days of Cash: How E-money Technology Is Plugging Us into the Digital Economy," IEEE Spectrum, Special Report on the Future of Money, http://spectrum.ieee.org/static/future-of-money.
Keegan, John. 1993. A History of Warfare (New York: Vintage).
Kurzweil, Ray. 2005. The Singularity Is Near (New York: Viking).
Landes, David S. 1998. The Wealth and Poverty of Nations (New York: Norton).
Liang, Qiao, and Wang Xiangsui. 1999. Unrestricted Warfare (Beijing: PLA Literature and Arts Publishing House). CIA translation.
Marks, Paul. 2008. "Rat-Brained Robots Take Their First Steps," New Scientist, vol. 199, no. 2669, pp. 22-23.
Marx, Leo. 1964. The Machine in the Garden: Technology and the Pastoral Ideal in America (Oxford: Oxford University Press).
McNeill, William H. 1982. The Pursuit of Power: Technology, Armed Force, and Society Since A.D. 1000 (Chicago: University of Chicago Press).
Moravec, Hans P. 1988. Mind Children: The Future of Robot and Human Intelligence (Cambridge: Harvard University Press).
NRC (US National Research Council). 2005. Avoiding Surprise in an Era of Global Technology Advances (Washington, DC: National Academies Press).
———. 2010. Persistent Forecasting of Disruptive Technologies (Washington, DC: National Academies Press).
Nye, David E. 1994. American Technological Sublime (Cambridge: MIT Press).
———. 2003. America as Second Creation: Technology and Narratives of New Beginnings (Cambridge: MIT Press).
Nye, Joseph S., Jr. 2011. The Future of Power (Philadelphia, PA: Perseus Books Group).
Orend, Brian. 2006. The Morality of War (Peterborough, ON, Canada: Broadview).
Parker, Geoffrey. 2005. The Cambridge History of Warfare (Cambridge: Cambridge University Press).
PSRM (Pacific Southwest Railway Museum). 2012. Railroad History: Important Milestones in English and American Railway Development, www.sdrm.org/history/timeline.
Riddihough, Guy, Gilbert Chin, Elizabeth Culotta, Barbara Jasny, Leslie Roberts, and Sacha Vignieri, eds. 2012. "Human Conflict: Winning the Peace" (special section), Science, vol. 336, no. 6083, pp. 819-884.
Rosenberg, Nathan, and L. E. Birdzell, Jr. 1986. How the West Grew Rich: The Economic Transformation of the Industrial World (New York: Basic Books).
Schivelbusch, Wolfgang. 1977. The Railway Journey: The Industrialization of Time and Space in the 19th Century (Berkeley: University of California Press).
Schumpeter, Joseph A. 1942. Capitalism, Socialism, and Democracy (London: Routledge, 1994).
Singer, Peter W. 2003. Corporate Warriors: The Rise of the Privatized Military Industry (Updated edition) (Ithaca, NY: Cornell University Press, 2008).
———. 2009. Wired for War: The Robotics Revolution and Conflict in the 21st Century (New York: Penguin).
Tainter, Joseph A. 1988. The Collapse of Complex Societies (Cambridge: Cambridge University Press).
U.S. Army. 2007. The U.S. Army/Marine Corps Counterinsurgency Field Manual (USAFM 3-24; MCWP 3-33.5) (Chicago: University of Chicago Press).
World Bank. 2012. http://www.google.com/publicdata/explore?ds=d5bncppjof8f9_&met_y=sp_dyn_le00_in&idim=country:KOR&dl=en&hl=en&q=average%20lifespan%20south%20korea.


[2]

The Ethics of Killer Applications: Why Is It So Hard To Talk About Morality When It Comes to New Military Technology?

P. W. SINGER
The Brookings Institution, Washington, DC, USA

ABSTRACT We live in a world of rapidly advancing, revolutionary technologies that are not just reshaping our world and wars, but also creating a host of ethical questions that must be dealt with. But in trying to answer them, we must also explore why exactly it is so hard to have effective discussions about ethics, technology, and war in the first place. This article delves into the all-too-rarely discussed underlying issues that challenge the field of ethics when it comes to talking about war, weapons, and moral conduct. These issues include the difficulty of communicating across fields; the complexity of real-world dilemmas versus the seminar room and laboratory; the magnified role that money and funding sources play in shaping not just who gets to talk, but what they research; cross-cultural differences; the growing role of geographic and temporal distance issues; suspicion of the actual value of law and ethics in a harsh realm like war; and a growing suspicion of science itself. If we hope better to address our growing ethical concerns, we must face up to these underlying issues as well.

KEY WORDS: robotics, military robotics, ethics, Moore's Law, Predator, Reaper

Introduction

We live in a world of 'killer applications.' It used to be in history that every so often a new technology would come along and change the rules of the game. These were technologies like the printing press, gunpowder, the steam engine, or the atomic bomb. Indeed, such technologies were so rare in history that many other ones were oversold as being 'revolutionary' when they actually were far from it, such as the Rumsfeld Pentagon's infatuation with 'network-centric warfare.' What is different today, though, is the incredible pace of technologies' emergence. It used to be that an entire generation would go by without one technologic breakthrough that altered the way people fought, worked, communicated, or played. By the so-called "age of invention" in the late

Correspondence address: P. W. Singer, The Brookings Institution, 1775 Massachusetts Avenue, NW, Washington, DC 20036, USA. E-mail: [email protected]


1800s, they were coming one every decade or so. Today, with the ever-accelerating pace of technologic development (best illustrated by Moore's Law, the finding that microchips, and related developments in computers, have doubled in their power and capability every 18 months or so), wave after wave of new inventions and technologies that are literally rewriting the rules of the game are bursting onto the scene at an ever-increasing pace. From robotic planes that strike targets 7,000 miles away from their human operators to 'synthetic life,' man-made cells assembled from DNA born out of laboratory chemicals, these astounding technologies grab today's headlines with such regularity that we have become almost numb to their historic importance. Looking forward, the range of technologies that are already at the point of prototyping is dazzling in its potential impact, both in war and beyond. Directed energy weapons (aka lasers), the proliferation of precision-guided weapons ('smart' IEDs), nanotech and microbotics (The Diamond Age), bioagents and genetic weaponry (DNA bombs), chemical and hardware enhancements to the human body (Iron Man meets Captain America), autonomous armed robots (Terminators), electromagnetic pulse weaponry (The Day After, Ocean's 11), and space weaponry (Star Wars) all may seem straight from the realm of science fiction, but are on track to be deployable well before most of us have paid off our mortgages. What makes such technologies true 'killer applications' is that they rock an existing understanding or arrangement back to its fundamentals. A prototypical example from the field of business is how the iPod device changed the way people bought music. That is, what makes a new development a 'killer app' is not merely the incredible, science fiction-like capabilities it might offer, but the hard questions it forces us to ask. The most difficult of these questions are not about what is possible that was unimaginable before.
Rather, the more challenging and perhaps more important questions are about what is proper. True killer apps raise issues of right and wrong which we did not have to think about before. These issues of ethics are not just fascinating; the disputes they raise can often have immense consequences for foreign affairs and international security. The science fiction of a submarine being used to attack shipping (Jules Verne), for example, not only became reality, but dispute over the right and wrong of it was what drew the United States into World War I, ultimately leading to its superpower rise. Similarly, H. G. Wells' concept of an 'atomic bomb,' which inspired the real world scientists of the Manhattan Project, helped keep the Cold War cold, but continues to haunt the world today. Why are these questions of ethics and technology so difficult, especially in the realm of war? Why is it that we are, as General Omar Bradley once put it, 'giants' when it comes to the technology of battle, but at the same time 'ethical infants'? As he remarked in a speech on the eve of Armistice Day in November 1948: 'The world has achieved brilliance without wisdom, power without conscience. Ours is a world of nuclear giants and ethical infants. We know more about war than we know about peace, more about killing than we know about living' (Bradley 1948). To my mind, this comes down to seven key


factors, or 'sins,' so to speak, that make it so difficult to talk about the ethical ramifications of emerging technology in war.

The Disconnect of the Fields

My background is in the social sciences, rather than in engineering or natural science, which means many in those fields would joke that I have no background in the Sciences. Going from my own academic field into another can, on occasion, be like crossing into a foreign land. What is simple in one field can be viewed as incredibly dense and expert in another. Whenever we cross fields or disciplines, the paths and norms of behavior that are familiar become confusing; even the very language and words we have to use become exclusionary. Take the field of cyber-security, which literally has a language of zeros and ones, but also encompasses very complex issues of law and ethics. Recently, one of the virtual world companies discovered that an individual had come up with a very sophisticated program for stealing virtual currency from the accounts of users in the virtual world and then transferring it into real money abroad. The company documented the hacking effort and then reached out to the Federal Bureau of Investigation (FBI) to explore the legal means for responding. The FBI agent they first met with then asked, 'So, um, this is on the Internet, right?'1 This same sort of issue is a particular one for discussions of ethics and the sciences, especially in war. As scary as someone trained in the fields of law, politics, philosophy, or ethics might find a discussion of the design parameters of the MQ-X unmanned combat air system, the scientists or engineers behind that design might find entering discussions of the ethical dilemmas with people who are trained to do so equally daunting. As one robotics professor I interviewed put it, 'Having discussions about ethics is very difficult because it requires me to put on a philosopher's hat, which I don't have.'2 The result is that we often steer away from such discussions and stay within our own worlds.
A good example of this comes from a survey conducted of the top 25 stakeholders in the robotics field, as identified by the Association for Unmanned Vehicle Systems International (the field's professional trade group). When asked whether they foresaw 'any social, ethical, or moral problems' that the continued development of unmanned systems might bring, 60 percent answered with a simple 'No.'3 Frankly, if we continue with this ostrich attitude, both the policy and ethical discussions will be uninformed, or even worse, guided by those who do not have scientific fact on their side. And, in turn, scientists and engineers with real world ethical issues will be abandoned. Consider the following series of questions:

• From whom is it ethical to take research and development money? From whom should one refuse to accept funding?
• What attributes should one design into a new technology, such as its weaponization, autonomy, or intelligence? What attributes should be limited or avoided?


• What organizations and individuals should be allowed to buy and use the technology? Who should not?
• What type of training or licensing should the users have?
• When someone is harmed as a result of the technology's actions, who is responsible? How is this determined?
• Who should own the wealth of information the technology gathers about the world around them? Who should not?

These are all examples of real questions emerging in the robotics field that are not so much about the technology as the ethics. Most importantly, ethics without accountability is empty, so each of these questions also has potential legal ramifications. But, if a young robotics graduate student wanted to do the ethical thing in answering these questions, he or she would have no code to turn to for guidance, the way, for example, a young doctor or medical researcher would in their field. This is perhaps to be expected, as robotics is a much newer field than medicine. The key is not whether there is a gap, which is to be expected in any cutting-edge field, but what is being done to fill this gap. For example, the scientists working in the Human Genome Project set aside 5 percent of their annual budget to push discussions on the 'ethical, legal, and social implications' of their work (Moore 2002). The result is not that we have solved these tough issues related to the field of genetics. But we are certainly far better equipped to handle these debates, with the tenor and content of the discussion far past some of the inanity that dominated the project's early years. Yet very few other fields are doing something similar. Indeed, more common is the sort of troubling attitude perhaps best illustrated by an email I once received after a talk at a leading engineering school. A professor wrote to chastise me for 'troubling' his students 'by asking them to think about the ethics of their work.' Both Socrates and Asimov are likely chuckling.

Applied Ethics: The Revenge of the Real World

Technology is often described as a way to reduce war's costs, passions, and thus its crimes. The poet John Donne, for example, claimed in 1621 that the invention of better cannons would mean that wars would 'come to quicker ends than heretofore, and the great expence of bloud is avoyed' (Dunlap 1999: 5). Almost 400 years later, we still hear similar discussions when it comes to a variety of technologic efforts. The Economist and Discovery Magazine both recently covered attempts to create an 'ethical governor,' essentially software that weapons might be programmed with to make them act ethically in war, even more morally than their human masters. As Ron Arkin, a professor of computer science at Georgia Tech, who is working on such a project, observes: 'Ultimately these systems could have more information to make wiser decisions than a human could make. Some robots are already stronger, faster and smarter than humans. We want to do better than people, to ultimately save more lives' (Bland 2009).

It is a noble sentiment, but also one that ignores the seamy underside of war, which may be becoming even darker in the twenty-first century. My own past books, for example, looked at the emergence of a global private military industry and the growing role of money and greed in war, and then at the sad reality of child soldiers; contrary to our idealized visions of war and who fights it, one of every ten combatants today is a child. When we own up to the reality of war, rather than how we wish it were, we see the double-edged sword of technology. We see that, for example, while human enhancements research is taking us beyond the prior limitations of the human body, those resulting biological enhancements do not take us past our all too human limitations and the inherent flaws or original sins that have also characterized us, such as our capacity for arrogance, greed, and hate. Similarly, just as a fork can be a tool for eating as well as for plucking out eyeballs, we recognize that a non-lethal weapon can chase away Somali pirates, but can also be used by Japanese fishermen to chase away environmentalists protesting their illegal slaughter of endangered whales.

Too frequently, discussions of new technologies assume that the 'fog of war has been lifted' (as the technophile thinkers who once surrounded Donald Rumsfeld argued) by either the perfection of our technology or the perfection of our souls. Instead, war in the twenty-first century shares the same qualities with past centuries: it is a mess, and maybe even more of a complicated mess. So, for example, when someone asserts confidently that new war technologies will lead to less bloodshed or greater compliance with established moral principles, we should check such sentiments with a look through a different, dirtier lens.
While scientists might note the fact that such promises often prove empty (the ethical governor, for example, remains a 'black box': all design concept, but no reality), we should also recognize that even if such fantasies were to come true, we would still have problems. For example, an enduring aspect of war is that regardless of how novel and advanced the technology, the enemy still has a vote. Making all this even more difficult today is that contemporary terrorist and insurgent groups are doing all they can to take advantage of the very laws they are supposed to follow. Charles J. Dunlap, Jr. has described this tactic of deliberately violating the Geneva Conventions that divide soldier from civilian as fighting a form of 'lawfare.' In Somalia, for example, fighters would fire at US soldiers from behind non-combatants. One Ranger recalled how a gunman shot at him with an AK-47 that was propped between the legs of two kneeling women, while four children sat on the shooter's back. The Somali warrior had literally created for himself a living suit of non-combatant armor (Edwards 2005: 288). Other examples come from the Kosovo War, where one tank was parked in a schoolyard, while another drove through a town on an ethnic cleansing mission with women and children riding on top. In the 2006 Lebanon War, civilians were blackmailed by Hizballah into launching remote-controlled rockets from their farms to rain down on Israeli cities.

Ethicists and lawyers could fill pages arguing back and forth over when one should use force in response to such scenarios. To argue that the problem is going to be easily solved with some imaginary, yet un-invented artificial intelligence is a bit of a stretch. That is, even if we could invent a system that always followed the Geneva Conventions, in the mess of modern war, the problem is that applying them is just not a simple zero-versus-one proposition in programming terms. The laws themselves are often unclear about what to do and, even more, are under siege by the very people supposed to follow them. In short, in the harsh reality of war, there are no silver-bullet technologic solutions for ethics.

The Dirty Role of Money

Whether it is a research laboratory, a think-tank seminar room, or a congressional hearing room, there is one topic that is generally considered impolite to talk too much about: money. Yet there is perhaps no more important factor in determining who gets to talk, who does not, what gets worked on, and what does not. My own experience perhaps can illustrate.

Last year I was invited by the US Naval Academy's Stockdale Center for Ethical Leadership to give the annual ethics lecture to the student body about some of the questions that robotics was raising for warfighters. A day before the event, the center received an angry letter, expressing shock that it would invite such an 'evil,' 'unethical' person to corrupt the young minds of future leaders and arguing that it should dis-invite me. The writer of the letter was the chief executive officer (CEO) of a private military firm whose employees had been identified in several US military reports as being involved in the abuses at Abu Ghraib prison, one of the worst scandals in American history. But I had apparently done something far worse than billing the taxpayer for acts of abuse for which soldiers would later be court-martialed, but contractors would escape accountability because of a gap in the law. I had argued in an editorial piece that the contractors of this CEO's multi-billion dollar firm should be held to the same standards as those in uniform, and that the government should 'investigate the issue, bring people to justice, and ensure that lessons are learned so that the same mistakes are not repeated' (Singer 2004).

Normally, such a letter would be ignored, or the rich irony of it would merely prompt laughter. But the administrators of the center had to take it seriously for the sole reason that the writer was a wealthy individual whom the school had previously been cultivating as a donor. Money, as we know, talks louder in our world than perhaps it should.
Or, as the American writer Napoleon Hill might have commented about my CEO friend, 'Money without brains is always dangerous.' I have a deep respect for how the Stockdale Center responded to this threat. The center did not let money shut down conversation, but instead investigated the situation and decided there was no merit to silencing my talk on ethics simply because a millionaire was upset. We all know that many other places would not have responded the same way.

My point here in reflecting on this incident is not to rehash old academic war stories, but to wonder what lessons the very first American conservatives might have for us. The authors of the Federalist Papers, who helped craft and defend the US Constitution, warned about the role of any private interests not responsive to the general interests of a broadly defined citizenry.4 Among the Founding Fathers' worries for the vitality of democracy was that, when private interests move into the public realm and the airing of public views on public policy is stifled, governments tend to make policies that do not match the public interest. I think this is something to keep in mind when we reflect on the issues of ethics and technology, which of course also touch on vast amounts of money sloshing about. How do we handle discussions that call into question multimillion-dollar interests versus those that do not? Who has a louder bully pulpit?

This issue of money does not just shape the public debate, but goes all the way down to the individual decisions that a scientist or researcher working on such emerging technologies has to wrestle with. Benjamin Kuipers, a computer scientist at the University of Texas, perhaps best described this real-world dilemma that we often do not reflect upon:

DARPA [Defense Advanced Research Projects Agency] and ONR [Office of Naval Research] and other DOD [Department of Defense] agencies support quite a lot of research that I think is valuable and virtuous. However, there is a slippery slope that I have seen in the careers of a number of colleagues. You start work on a project that is completely fine. Then, when renewal time comes, and you have students depending on you for support, your program officer says that they can continue to fund the same work, but now you need to phrase the proposal using an example in a military setting. Same research, but just use different language to talk about it. OK.
Then, when the time comes for the next renewal, the pure research money is running a bit low, but they can still support your lab, if you can work on some applications that are really needed by the military. OK. Then, for the next round, you need to make regular visits to the military commanders, convincing them that your innovation will really help them in the field. And so on. By the end of a decade or two, you have become a different person from the one you were previously. You look back on your younger self, shake your head, and think, 'How naive.' (as quoted in Singer 2009: 172-173)

The same sort of financial pressures, either positive or negative, also happen outside the laboratory. Those who study in such areas as ethics or policy often similarly depend on some sort of financial support from foundations or donors. The more money such a donor has, the more able they are to get people to research the questions they want, in the way they want them answered, and the more likely it is that their agenda will be advanced. Indeed, this issue of money and bias can prove to be a problem even when a donor is guided by the loftiest ideals of charity. A good illustration of this comes from a meeting with a representative of one of the world's leading foundations for academic research about a potential initiative that would look at ways to establish global norms in cyber-security. They explained how their foundation thought the topic was interesting, but that their board 'had not yet decided whether cyber issues were important or not'. They felt that they would be in a position to decide whether cyber-security was an important issue in 'about three or four years.'5

While science works on the cutting edge, donors to the social sciences and ethics tend to want to sponsor what is already established (with the irony that they are least often there at the point in time when their support might have the most impact). The sad truth is that if you are an ethics or policy researcher seeking funding, which, in turn, can be crucial in determining such things as whether you get tenure or not, you are better off starting new projects on old problems, rather than new projects on new problems.

Your Ethics Are Not My Ethics

One of the most important ripple effects of a technology in terms of its impact on foreign policy takes place when culture encounters ethics. This issue is playing out right now in the vastly different perceptions of unmanned aerial systems inside America and 7,000 miles away, where they are actually being used daily in war. While we use such adjectives as 'efficient' and 'costless' and 'cutting edge' to describe the Predator in our media, a vastly different story is being told in places like Lebanon, where a leading newspaper editor called them 'cruel and cowardly,' or in Pakistan, where 'drone' has become a colloquial word in Urdu and rock songs have lyrics about America not fighting with honor. This issue of narrative, of differing realities when it comes to talking about the exact same technology, is hugely important to the overall 'war of ideas' that we are fighting against radical groups and their own propaganda and recruiting efforts. It helps explain how painstaking efforts to act with precision emerge on the other side of the world through a cloud of anger and misperceptions.

But this issue of perception is something that goes beyond just drone strikes. We live in a diverse world and, as Star Trek creator Gene Roddenberry put it, 'If man is to survive, he will have learned to take a delight in the essential differences between men and between cultures. He will learn that differences in ideas and attitudes are a delight, part of life's exciting variety, not something to fear.'6 Yet we must also acknowledge that these differing cultural understandings can have a huge impact, creating differing perceptions of what is ethical or not. We see this greatly illustrated in the differing perceptions of robotics in East and West. In Western culture, going back to its very first mention in the play R.U.R. in 1921, the robot is consistently portrayed as the mechanical servant that wises up and then rises up.
The technology is portrayed as something heartless to be scared of. Yet the very same technology is viewed exactly the opposite way in Asian culture: going back to its first mention in post-World War II anime comics such as Astro Boy, the robot is consistently the ethical hero, rather than the would-be Terminator to be feared. 'The machine is a friend of humans in Japan. A robot is a friend, basically,' says Shuji Hashimoto, a robotics professor at Waseda University in Tokyo (Jacob 2006: 7). This difference is not just something that comes out of popular culture. As opposed to the strict Western division between living and dead, the traditional Japanese religion of Shintoism holds that both animate and inanimate objects, from rocks to trees to robots, have a spirit or soul just like a person. Thus, to endow a robot with a soul is not a logical leap in either fiction or reality. Indeed, in many Japanese factories, robots are given Shinto rites and treated like members of the staff (Hornyak 2006).

The result is that popular attitudes over what is acceptable or not when it comes to this technology, and how to use it, diverge widely. In Asia, 'companion' robots for the elderly or babysitter robots for children are marketed with little controversy. By contrast, Rodney Brooks, a Massachusetts Institute of Technology professor and Chief Technical Officer of iRobot, explains that the concepts would not work in the US, for the simple reason that Americans find them 'too artificial and icky' (Singer 2009: 168). One can see similar differences in perception, influenced by culture, in a wide variety of emerging areas, from work in genetics and enhancements to what were once cutting-edge but are now quite normal human medical treatments - normal, that is, only in some societies. The Japanese, for example, may love their robotics but are often quite horrified by organ transplants, to the extent that the very first heart transplant doctor in Japan was prosecuted for murder.7

The same cultural attitudes flow out and influence what different cultures think is acceptable in war or not. The issue of arming an autonomous weapons system is hugely controversial; controversial, that is, to Western minds. By contrast, in South Korea, it is not. The country's military forces sent two robotic sniper rifles to Iraq in April 2004, with little public debate.
Indeed, Samsung not only manufactures the Autonomous Sentry Gun, a 5.5 millimeter machine gun with two cameras (infrared and zooming) and pattern-recognition software processors that can identify, classify, and destroy moving human targets from 1.25 miles away, but even made a promotional commercial extolling it, set to jazzy music.

This issue of differing senses of right and wrong behavior can even spark conflict. Much of the recent dispute in cyber-security between the US and China, especially as it relates to the Google incident, is woven into different cultural attitudes towards privacy and individual rights. Similarly, US State Department officials like Hillary Clinton often make a point of extolling the power of 'social networking.' They fail to realize that others describe the West's very same push for it as its own form of cyber-warfare.

The Distance Problem

Our codes of law and our sense of ethics have long focused on our intent and actions. Take this quote from Aristotle: 'We do not act rightly because we have virtue or excellence, but rather we have those because we have acted rightly.'8 The challenge of this new wave of technologies is that the concepts of virtuous intent and action are being modulated in ways that Aristotle, and even the Geneva Conventions, crafted more than two millennia later, did not imagine.

Advancements used to be distinguished by the increase they created in a weapon's power, range, or capacity. The World War II era B-24 bomber, for example, was an improvement over the B-17 because it could carry a greater number of bombs over a greater distance. Similarly today, the MQ-9 unmanned Reaper plane is an improvement over the MQ-1 Predator because it can carry a greater number of bombs over a greater distance. But there is one more fundamental difference: the Reaper drone is smarter. That is, the newer technology can do much more on its own: take off and land autonomously, fly out mission way-points without human intervention, and carry targeting software that allows it not only to pick out small details (such as footprints in a field), but also to start to make sense of them (such as backtracking the footprints to their point of origin).9

Such improvements present dilemmas for which our prevailing laws of war, best exemplified by the 1949 Geneva Conventions, may not be fully prepared. In essence, our new technology does not remove human intent and human action, but it can move that human role geographically and chronologically. Decisions now made thousands of miles away, or even years ago, may have great relevance to the actions of a machine (or to its inactions) in the here and now. This does not mean that there is no ability to talk about ethics or apply the law, but rather that this aspect of distancing makes it far more complex and difficult, especially in fields that very much focus on the role of the 'commander on the scene' and concepts of individual intent and responsibility for one's actions. The Conventions date from the middle of the twentieth century; the year they came out, the average American home cost $7,400, and the most notable invention was the 45 rpm record player. But while there is little chance of the global cooperation needed to update and ratify them emerging anytime soon, technology is guaranteed to advance.
Under what is known as Moore's Law, for instance, the computing power and capacity of microchips (and thus the many devices, like computers, that use them) has essentially doubled every 18 months or so, while continually going down by similar measures in price. If Moore's Law holds true over the next 25 years, the way it has over the last 40, our technologies will be a billion times more powerful in their computing. Indeed, even if we see just 1 percent of the improvement we have experienced historically, our technology will still be a mere 10,000,000 times more powerful than today. This may be the most challenging part of the distance dilemma: while our understanding of law and ethics moves at a glacial pace, technology moves at an exponential pace. Thus, the distance between them grows further and further.

The Suspicion of Ethics

In the spring of 2010 I had the honor of sharing a panel with two US military Judge Advocate General officers at a session hosted by the Institute for Ethics and Public Affairs at Old Dominion University, Norfolk, Virginia. The session's focus was to be on the ethical questions of using new robotic technologies in war. However, a person in the audience stood up and asked a more fundamental question of the three of us: 'What would you tell a mother who has lost her son to one of these terrorists in Iraq as to why we should even care about something like ethics or the laws of war? Why should we even bother?' The question took the discussion to a deeper level. For all the discussion in academic journals and seminar rooms, many people wonder about something far more fundamental: why do we even care about trying to figure out right from wrong in a realm like war, where so much wrong happens?

It is a difficult issue and cuts to the heart of why we care about ethics and morality in any human endeavor. Linked to the specific realm of war, two factors carry weight in my mind. First, the son was a 'serviceman,' and that is a powerful term. It meant he was not fighting out of anger or hate, but serving something beyond himself. When he joined the US military, he took an oath to serve the Constitution and respect the laws, including those of war. It is this notion of service, this sense of ideals, that is why we honor servicemen and women, and even more so it is the essence of what distinguishes them from their foes. I think they serve on the side of right, not just because they are my countrymen, but because the other side does not respect the laws of war, and thus is the equivalent of barbarians, which is how historically we have viewed those who willfully violate the law.10 This sense of service is not just why we fight, but is why our way of fighting can be made just.

The second factor is something we do not like to talk about in issues of ethics, but it very much matters in war: raw self-interest (something we may try to sugarcoat by terming it 'pragmatism'). Rather than it being an advantage to break the laws of war, the facts show something different.
In the history of war, the side that has fought with a sense of ethics, respected the laws, and fought as 'professionals' has tended to win more often than those that have not. That is, in the historic record of over 5,000 years of war, professionals almost always triumph over those who are willing to behave like barbarians. Indeed, this distinction proved to be a key factor in Iraq, where that son was killed. Even though the US forces in Al-Anbar province were far more alien to the local society than the members of Al-Qaeda, and many in the media complained that our forces were stymied by a webwork of laws and lawyers, we won in the end for this very reason. These extremists, these twenty-first century barbarians, carried the seeds of their own downfall, by the very fact that they were wanton with violence and our forces were not. Eventually this was what persuaded the local tribes to turn on them, which was the actual key to the victory in the 'surge.' But while this may be the case, the narrative that ethics is only a hindrance remains powerful and popular. Those who care about ethics must acknowledge that they face a basic problem of convincing folks both inside and outside the military that ethics is worthwhile, not just because to be selfless is morally praiseworthy, but also because we are self-interested and want to win.

Beware of Magic

When the warriors of the Hehe tribe in Tanzania surrounded a single German colonist in 1891, they seemingly had little to worry about. But he had strong magic: a box that spat out bullets (what we call the 'machine gun'). Armed with such seemingly mystical power, he killed almost 1,000 spear-armed warriors (Ellis 1986: 89). As the English physicist and science fiction author Arthur C. Clarke famously put it, 'Any sufficiently advanced technology is indistinguishable from magic.'11 And in no realm is this truer than in war, where we do not merely view advanced technology as magical at times; we also fear it for that very reason. We fear what we do not understand.

While we might think such times have passed, the problem of technology's magical side continues to bedevil us even in the twenty-first century, perhaps even more so as technology truly performs magical feats, even while large parts of the world live their lives no differently than they did in centuries past. For instance, an official with the US military's Joint Special Operations Command recounted a meeting with elders in the tribal region of Pakistan, the area where Al-Qaeda leaders were reputed to be hiding out, and the site of more than 175 drone strikes in the last few years. One of the elders was enamored of the sweet-tasting bread that was served to them at the meeting. He went on, however, to tell how the Americans had to be working with forces of 'evil,' because of the way that their enemies were being killed from afar, in a way that was almost inexplicable. 'They must have the power of the devil behind them.' As the official recounted with a wry chuckle, 'You have a guy who's never eaten a cookie before. Of course, he's going to see a drone as like the devil, like black magic.'12 Again, the elder felt we were doing something evil not because of civilian casualties or the like, but for the very reason that he did not understand it.
As Marian Anderson once put it, 'Fear is a disease that eats away at logic.'13 But this suspicion of advancement is not limited to distant, tribal regions. It is something that is increasingly playing out here in the US, and it is harming our ability to have effective discussions on policy and ethics in the twenty-first century. We are seeing what CNN (Cable News Network) has characterized as a growing American 'fear of science,' or what writer Michael Specter explored in his book Denialism. As Specter puts it, the problem now with discussions that involve the intersection of science and public policy is that 'when people don't like facts, they ignore them.'14 We see this in all sorts of areas, from the widespread fear of vaccines, to the useless trust placed in the multibillion-dollar industry of dietary supplements, to the climate change debate. Indeed, a major political party nominated for Vice President of the United States a person who described the scientific method as 'snake oil.'15 But we should not be too hard on them; they are simply reflections of a populace in which 11 percent of Americans cling to the belief that Elvis really is still alive, and 18 percent believe that the Inquisition did have it right and the Sun revolves around the Earth.


Our challenge is thus often not only to make sure the policymaker and the public understand the ethical issues of emerging technologies, but also to ensure that they accept the scientific principles underlying those technologies.

Conclusions

These challenges are certainly daunting, but by no means do they imply that any discussion of ethics, morality, and technology is hopeless, or that there is no way to think, speak, and act ethically when it comes to emerging technologies in war. Rather, it is the very opposite. The difficulty makes the project all the more important and the efforts to overcome it all the more worthy. Our success or failure in navigating the moral dilemmas of a world of killer apps will depend on recognizing that these very problems are part and parcel of the discussion. We must own up to these challenges, face them, and overcome them. Otherwise we will continue to spin in circles. And we had better act soon, for the thread that runs through all of this is how the fast-moving pace of technology and change is making it harder for our all too human institutions, including those of ethics and law, to keep pace.

Notes

1 Author interview with Linden Laboratory executive, 27 March 2010.
2 Author interview with Prof. Illah Nourbakhsh, 31 October 2006.
3 Survey released at AUVSI-San Diego, 27 October 2009, conducted by Kendall Haven.
4 See especially Federalist Papers #10 and #51, available for download at http://avalon.law.yale.edu/subject_menus/fed.asp; Internet.
5 Author meeting with Foundation grant officer, 13 April 2010.
6 Quotation of Star Trek creator and producer Eugene W. Roddenberry, recorded in Santa Barbara, CA (1971): http://memory-alpha.org/wiki/Template:Gene_Roddenberry_quotes [accessed 26 October 2010].
7 For more information see Lock 2002. The next transplant in Japan was not performed until 30 years later; see also http://news.bbc.co.uk/2/hi/health/287880.stm.
8 From Nicomachean Ethics, Book II, 4.
9 For more on this, see Singer 2009: ch. 5.
10 For more on this, see Peters 1994.
11 This is known widely as 'Clarke's Third Law,' after Arthur C. Clarke, Profiles of the Future (1961): see, e.g., http://www.quotationspage.com/quotes/Arthur_C._Clarke/; Internet [accessed 26 October 2010].
12 Author interview with JSOC official, 10 March 2009.
13 See, e.g., http://quotationsbook.com/quote/14686/; Internet [accessed 26 October 2010].
14 See http://articles.cnn.com/2010-04-13/opinion/specter.denying.science_l_organic-food-geneticallysupplements?_s=PM:OPINION; Internet [accessed 26 October 2010]; see also Specter 2009.
15 See, e.g., http://www.cbsnews.com/stories/2010/02/09/politics/main6189211.shtml; Internet [accessed 26 October 2010].
References

Bland, E. (2009) Robot Warriors will get a Guide to Ethics, Discovery News, 18 May 2009, accessed 26 October 2010, available at: http://www.msnbc.msn.com/id/30810070/; Internet.
Bradley, O. (1948) An Armistice Day Address: 10 November 1948, accessed 26 October 2010, available at: http://www.opinionbug.com/2109/armistice-day-1948-address-general-omar-n-bradley/; Internet.


Dunlap, C. J. (1999) Technology and the 21st Century Battlefield: Re-complicating Moral Life for the Statesman and the Soldier (Carlisle, PA: Strategic Studies Institute, US Army War College), accessed 26 October 2010, available at: http://www.strategicstudiesinstitute.army.mil/pdffiles/pub229.pdf; Internet.
Edwards, S. J. A. (2005) Swarming and the Future of Warfare. Originally presented as the author's doctoral thesis (Pardee RAND Graduate School), RAND.
Ellis, J. (1986) The Social History of the Machine Gun (Baltimore: Johns Hopkins University Press).
Federalist Papers, accessed 26 October 2010, available at: http://avalon.law.yale.edu/subject_menus/fed.asp; Internet.
Hornyak, T. N. (2006) Loving the Machine: The Art and Science of Japanese Robots, 1st edn. (Tokyo; New York: Kodansha International).
Jacob, M. (2006) Japan's Robots Stride into Future, Chicago Tribune, 15 July 2006, p. 7.
Lock, M. (2002) Twice Dead (Berkeley, CA: University of California Press).
Moore, J. A. (2002) The Future Dances on a Pin's Head: Nanotechnology: Will it be a Boon - or Kill Us All? Los Angeles Times, 26 November 2002, accessed 26 October 2010, available at: http://www.wilsoncenter.org/index.cfm?fuseaction=news.item&news_id=14638; Internet.
Peters, R. (1994) The New Warrior Class, Parameters, 24(2), pp. 16-26.
Singer, P. W. (2004) The Contract America Needs to Break, Washington Post, 12 September 2004, accessed 26 October 2010, available at: http://www.brookings.edu/opinions/2004/0912defenseindustry_singer.aspx; Internet.
Singer, P. W. (2009) Wired for War: The Robotics Revolution and Conflict in the 21st Century (New York: Penguin).
Specter, M. (2009) Denialism: How Irrational Thinking Hinders Scientific Progress, Harms the Planet and Threatens our Lives (New York: Penguin).

Biography

Peter Warren Singer is Director of the 21st Century Defense Initiative and Senior Fellow in Foreign Policy at The Brookings Institution (Washington, DC). He is the author of numerous articles and three books, including Corporate Warriors: The Rise of the Privatized Military Industry (Cornell UP, 2003), Children at War (Pantheon, 2005), and most recently the New York Times bestseller Wired for War: The Robotics Revolution and Conflict in the 21st Century (Penguin, 2009). For more information visit www.pwsinger.com

[3] Emerging and Readily Available Technologies and National Security
National Research Council and National Academy of Engineering

Summary

FRAMING THE ISSUES

The United States faces a complex array of challenges to its national security. Technology is an essential element of the U.S. strategy for meeting those challenges, and it continues to be U.S. policy to seek technological military superiority over U.S. adversaries. To enhance and expand technological superiority, the Department of Defense and other government agencies invest in science and technology on an ongoing basis. These investments cover a broad range of efforts, from fundamental research that might eventually support national security needs, broadly defined, to specific development and eventual production of weapons and other military materiel intended to address particular national security problems. The U.S. government also adapts technologies originating in the civilian sector, initially without national security purpose, to national security needs.

Developments in science and technology (S&T) for military and national security use have often raised a variety of ethical, legal, and societal issues (ELSI). These ELSI-related challenges are accentuated in a context of emerging and readily available (ERA) technologies, that is, new technologies that are accessible at relatively low cost compared to more traditional militarily relevant technologies, such as nuclear weapons, and thus are within the reach of less technologically advanced nations, nonstate actors, and even individuals. This is true because ERA technologies do not require construction of large engineered systems for their exploitation, and in some cases have the potential for doing harm to U.S. interests on a large scale.

In 2010, the Defense Advanced Research Projects Agency asked the National Academies to develop a framework for policy makers, institutions, and individual researchers to use in thinking through ethical, legal, and societal issues as they relate to research and development (R&D) on ERA technologies with military or other national security relevance. What are the ethics of using autonomous weapons that may be available in the future? How should we think about the propriety of enhancing the physical or cognitive capabilities of soldiers with drugs or implants or prostheses? What limits, if any, should be placed on the development of cyber weapons, given the nature and extent of the economic damage that they can cause? Such questions illustrate the general shape of ethical, legal, and societal issues considered in this report.

This report begins with the assumption that defending and protecting national security against external threats are morally sound and ethically supportable societal goals. A related premise is that individuals who are part of the national security R&D establishment want to behave ethically. That said, the notion of deliberately causing death and destruction, even in defense of the nation from external threats, raises ethical, legal, and societal issues for many. Those who engage in combat, those who support combatants, directly or indirectly, and those whom they defend—that is, the American public at large—all have a considerable stake in these issues and the questions they raise.

Knowledge regarding ethical, legal, and societal issues associated with R&D for technology intended for military purposes is not nearly as well developed as that for the sciences (especially the life sciences) in the civilian sector more generally. (This is generally true, even recognizing that the line between military and civilian technologies is not always entirely clear.)
Some of the important differences between the two contexts include the following:

• Unlike civilian technologies, some military technologies are designed with the explicit purpose of causing harm to people and to property.
• Civilian technologies and products may unexpectedly turn out to be relevant to a military need and in that context raise the possibility of heightened and/or new ELSI implications.
• Technologies developed in a military context may turn out to have significant ELSI implications when applied in a civilian context.
• Advancing military technologies may also outpace the evolution of the laws designed to govern their use. For example, cyber weapons offer the possibility that a nation might be brought to economic ruin without physical death and destruction.
• Some military research is conducted in a classified environment.

A full investigation of ethical, legal, and societal issues associated with technology for military or national security purposes is beyond the scope of this report. To make its task more manageable, the committee explored three areas with respect to ERA technologies:

• The conduct of research, which includes the selection of research areas, the design of particular research investigations (e.g., protocols, experiments), and the execution of those investigations. ELSI concerns relating to the conduct of research focus primarily on the effects of the research on parties other than those who are explicitly acknowledged as being research subjects, such as individuals living close to where the research is being performed, family members of research subjects, and so on. (ELSI concerns related to acknowledged research subjects are important, but there is today a well-developed infrastructure to address such concerns, and the adequacy of this infrastructure is not within the scope of this report.)
• Research applications, which relate to capabilities intended to result from research on ERA technologies. ELSI concerns associated with specified applications fall into two categories: concerns about the intended effects or purposes of the application and concerns about undesired effects (sometimes known as side effects) that might occur in addition to the intended effects. Concerns about technologies that can be used for both military and civilian purposes fall into this category.
• Unanticipated, unforeseen, or inadvertent ELSI consequences of either research or applications; such consequences are usually manifested by something going awry, as when research does not proceed as expected and thus causes harm outside the original bounds on the research or when unanticipated applications raise additional ELSI concerns.

FOUNDATIONAL TECHNOLOGIES AND APPLICATIONS

For illustrative purposes, this report considers three foundational technologies (foundational sciences and technologies) that enable progress and applications in a variety of problem domains: information technology, synthetic biology, and neuroscience. In addition, four application domains associated with specific operational military problems are addressed: robotics, prosthetics and human enhancement, cyber weapons, and nonlethal weapons. These technologies and applications are examples of ERA technologies as defined above—a multitude of state and nonstate actors, friendly or not, can adopt and adapt such technologies for a multitude of purposes even without large budgets and infrastructures. The report examines each illustrative ERA technology and application domain from the perspective of technology maturity (how close the science or technology is to producing useful applications) and possible military applications, and it highlights some of the ELSI implications that emerge for each technology or application.

SOURCES OF INSIGHT ON ETHICAL, LEGAL, AND SOCIETAL ISSUES AND AN ANALYTICAL FRAMEWORK

A number of ideas, intellectual disciplines, and related efforts are sources of ELSI insight into both new and existing technologies and their applications. These include philosophical and disciplinary ethics; international law (especially the law of armed conflict and various arms control treaties); social and behavioral sciences; scientific and technological framing; the precautionary principle and cost-benefit analysis; and risk science and communication. Considered together, they help to provide an analytical framework consisting of three types of questions:

• Questions regarding various stakeholders that might have a direct or indirect interest in particular ELSI concerns and perspectives. Among these stakeholders are subjects of research, military users of a technology or application, adversaries, nonmilitary individuals or groups that might use a technology or application once R&D has been completed, organizations, noncombatants, and other nations.
• Questions that cut across these stakeholder groups and that cluster around a number of themes reflecting ELSI impacts related to scale, including, for example, degree of harm; humanity, including what it means to be human; technological imperfections; unintended military uses; and opportunity cost, among others.
• Questions that arise from a consideration of the different sources of ELSI insight described in Chapter 4.
Drawing on ELSI-related insights from the consideration of the three foundational ERA technologies and four ERA technology-based applications discussed in Chapters 2 and 3, the report sets forth a framework to help identify ethical, legal, and societal issues that might not otherwise be apparent to program officials. Addressing the relevant questions associated with each stakeholder should help to develop useful knowledge on ethical, legal, and societal issues regarding specific military R&D programs and projects. Such knowledge can be used to determine how and to what extent, if any, a program or project might be modified—or in extreme cases abandoned—because of ELSI concerns. Use of this framework can thus provide input to policy makers, who will then have to make judgments about how, if at all, to proceed with a particular program or project; such judgments should be undertaken after, and not before, the policy makers have examined the issues raised by the questions posed in the framework.

GOING BEYOND AN INITIAL ANALYSIS

Using the analytical framework offered by this report is likely to bring to light some, although not all, of the ethical, legal, and societal issues associated with R&D on ERA technologies of military significance. Anticipating unanticipated ethical, legal, and societal issues is, literally, oxymoronic. But the ability to respond quickly to unanticipated issues that do arise can be enhanced by addressing in advance a wide variety of identified issues, because that exercise provides building blocks upon which responses to unanticipated ELSI concerns can be crafted.

In general, the task of anticipating ethical, legal, and societal issues that might emerge in the future would be much easier if the specific path of a given science or technology development were known in advance. However, the history of technology forecasting suggests that inaccurate technology forecasts are not unusual, because a variety of paths for any given scientific or technological development are possible. Also, it sometimes happens that military technologies are used in ways that differ significantly from the original conceptions of use.

Taking an approach that complements predictive analysis, policy makers have sometimes turned to deliberative processes that seek to include a broad range of perspectives and possible stakeholders in discussions of a given issue. From these different perspectives may well come the identification of new risks, questions of fact that have not previously been addressed, and specific knowledge or information that might not have been considered before.
To improve their ability to identify and respond to previously unanticipated ethical, legal, and societal issues that may emerge during the course of an R&D effort, policy makers have sometimes also used adaptive planning that allows them to respond quickly as new information and concerns arise in the course of technology development. Adaptive planning can be a useful way of proceeding despite profound uncertainties about the future. Policies for coping with uncertain environments should take into account the possibility of new information and/or new circumstances emerging tomorrow that can reduce these uncertainties, thus allowing (and indeed including planning for) midcourse corrections.

MECHANISMS FOR ADDRESSING ETHICAL, LEGAL, AND SOCIETAL ISSUES

Various organizations, both public and private, use a number of mechanisms to address different types of ethical, legal, and societal issues. Perhaps the most important mechanism for identifying problematic ELSI concerns that may be associated with a given research project is good judgment. That is, project proposers are expected to exercise good judgment in not submitting proposals that are unethical with respect to either the conduct of the research that would be supported or the applications that might result from that research. The same applies to program officials, who are expected not to approve or support projects that are unethical.

To support, develop, and enhance the judgment of individual project proposers and program officials, a number of mechanisms, sometimes topic specific, have been used to address ethical, legal, and societal issues—some apply to research, and some to actual deployments of technology. Mechanisms discussed in Chapter 7 and Appendix D include self-regulation and self-awareness; DOD law-of-armed-conflict review and treaty compliance; codes of ethics and social responsibility in science, engineering, and medicine; ELSI research; oversight bodies (such as institutional review boards); advisory boards; research ethics consultation services; chief privacy officers; environmental assessments and environmental impact statements; and drug evaluation and approval. However, these mechanisms have been developed for use primarily in civilian environments. Adapting these ELSI mechanisms for the military R&D context must take into account the special characteristics of the military environment.
In addition, those responsible must have an awareness of potential ethical, legal, and societal issues in the R&D effort; clear accountability and responsibility for addressing them; access to necessary expertise in ethics, law, and the social sciences, and to ELSI experts who in turn have access to relevant scientific and technical information; time to address ELSI concerns; and finally the involvement of a wide variety of perspectives, as well as comprehensiveness of and cooperation in attention to ethical, legal, and societal issues. Depending on their goals, policy makers will have to decide how far to go in any of these dimensions.

FINDINGS AND RECOMMENDATIONS

(Boldface below includes findings of the report.)

This report finds that some developments in emerging and readily available technologies in a military context are likely to raise complex ethical, legal, and societal issues, some of which are different from those associated with similar technologies in a civilian context. ERA technologies by their nature are associated with a very high degree of uncertainty about their future developmental paths, and thus a correspondingly broad range in the ethical, legal, and societal issues that are likely to emerge. Such breadth means that the ELSI concerns that may be associated with a given technology development are very hard to anticipate accurately at the start of that development. Using a diversity of sources of input with different intellectual and political perspectives on a given technology increases the likelihood that relevant ethical, legal, and societal issues will be revealed. Of course, when a particular technology development effort is classified, the universe of sources from which ELSI insights can be derived is more limited, and mechanisms for addressing ethical, legal, and societal issues that are predicated on the relative openness of civilian R&D (that is, unclassified work) are not likely to work as well.

Sustainable policy—policy whose goals and conduct can be supported over the long run—regarding science and technology requires decision makers to attend to the ELSI aspects of the S&T involved. High-quality science is one of the more important and obvious factors that contribute to the success of any particular R&D effort involving that science or technology. But inattention to ELSI aspects of an R&D endeavor can undermine even scientifically sound R&D efforts and call into question policy decisions that led to those efforts, regardless of their initial intent. Public reaction to a given science or technology effort or application is sometimes an important influence on the degree of support it receives.
A lack of support may manifest itself through adverse journalistic and editorial treatment, greater political scrutiny, reduced budgets (especially in a time of constrained finances), additional restrictions on research, and so on. On the other hand, a positive perception regarding the ethics of an R&D project may enhance public support for pursuit of that science or technology, irrespective of the scientific or technical basis for such pursuit.

Finally, any approach to promote consideration of ethical, legal, and societal issues in R&D of military significance will have to address how such plans are implemented at both the program and the project levels. Controversy and concern can easily be fueled by inadequate attention to detail and the manner of implementing oversight processes. For example, it is important that policies for addressing ethical, legal, and societal issues systematically have a "light footprint" when they are implemented by program managers. The intent of the committee's findings and recommendations is not to impose undue compliance requirements on program managers or agencies, but rather to help well-meaning program managers in these agencies do their jobs more effectively and to help ensure that basic American ethical values (such as those embodied in the U.S. Constitution's Bill of Rights) are not compromised. The exercise of common sense, judgment, and understanding of the fundamental intent of policies to address ethical, legal, and societal issues—not simply formal compliance—is the goal and is an important foundation for developing an ELSI-sensitive culture.

The foregoing findings (shown in boldface type) help to shape the committee's five recommendations, the first four of which are directed to agencies sponsoring research with military significance. The term "interested agency" as used below means agencies interested in addressing ethical, legal, and societal issues associated with the research they support.

Recommendation 1: The senior leaders of interested agencies that support R&D on emerging and readily available technologies of military significance should be engaged with ethical, legal, and societal issues in an ongoing manner and declare publicly that they are concerned with such issues. Such a public declaration should include a designation of functional accountability for ethical, legal, and societal issues within their agencies.

High-level support from senior agency leadership is required if an agency is to seriously address ethical, legal, and societal issues associated with the research it funds. Such support must be visible and sustained over time; in its absence, little will happen. An agency's senior leadership sets the tone by publicly communicating to the organization and its stakeholders the importance of addressing ethical, legal, and societal issues, the willingness of the agency to learn from outside perspectives, and the intent of the ELSI-related processes. In the long run, these are key elements in creating an institutional culture that is sensitive to ELSI concerns.
Accountability at all levels of an agency, including at the senior management level, is necessary to ensure that attending to ethical, legal, and societal issues is not haphazard and uncoordinated. To maximize the likelihood that ethical, legal, and societal issues will be addressed, an agency's senior leadership should designate a point of functional accountability for this responsibility. Parties with functional accountability provide a second line of defense against overlooking ELSI concerns that complements the primary role played by project teams in executing a program.

Recommendation 2: Interested agencies that support R&D on emerging and readily available technologies of military significance should develop and deploy five specific processes to enable these agencies to consider ethical, legal, and societal issues associated with their research portfolios.

2.a - Initial screening of proposed R&D projects

Before supporting any research in a particular S&T area, agencies should conduct a preliminary assessment to identify ethical, legal, and societal issues that the research might raise. In addition, all researchers should identify in their proposals to an agency plausible ELSI concerns that their research might raise. Using such information as a starting point, the funding agency should then make its own assessment about the existence and extent of such issues. Note that this initial assessment should be carried out for all R&D projects (both classified and unclassified). At this stage, the goal is to identify explicitly whether the research would raise significant ethical, legal, and societal issues that require further consideration. Mostly, the answer will be "no," and assessment of the proposed research project will proceed without any further consideration of ethical, legal, and societal issues. For the proposals that warrant a "yes," the process in Recommendation 2.b comes into play.

2.b - Reviewing proposals that raise ELSI concerns

Once an agency has identified research proposals or projects that may raise significant ethical, legal, and societal issues, some closer scrutiny is needed to ascertain how likely it is that such issues will arise, how serious they are likely to be, and whether there are ways to mitigate them. Use of a systematic methodology, such as the analytical framework described in this report, can be helpful for identifying ethical, legal, and societal issues. If and when such issues are identified, program managers should have the opportunity to take action in response. (Of course, program managers are themselves subject to higher authorities, and the latter may take action as well.)
Possible responses include not pursuing a given R&D effort, pursuing it more slowly, pursuing it in a modified form that mitigates the identified ethical or societal concerns, pursuing the original effort but also pursuing research to better understand the ethical or societal impacts, and so on. The responses should not be limited simply to a decision to proceed or not to proceed. Furthermore, it should be expected that the initial assessment will not be correct in all aspects. But the initial assessment will assemble resources that are likely to be helpful in formulating a response to unforeseen circumstances, even if these resources are used in ways that are very different from what an original plan specified. In addition, the initial assessment is a concrete point of departure for evolving an approach to handling ethical, legal, and societal issues as circumstances change. Information from the assessment should be made available to modify the research proposal for mitigating ELSI concerns should that be appropriate.

2.c - Monitoring R&D projects for the emergence of ethical, legal, and societal issues and making midcourse corrections when necessary

Perfect prediction of significant ELSI concerns is virtually impossible, especially in an area as fraught with uncertainty as research on emerging and readily available technologies. Projects that seemed to raise significant ethical, legal, and societal issues may turn out to raise none; projects that seemed to have no ethical or societal implications may turn out to have hugely important consequences. A process for monitoring the course of R&D projects is thus essential to help agencies to adjust to such changing realities. If the perceived ethical, legal, and societal issues change significantly during the course of a project (that is, if and when new issues are identified or previous attempts to address already-identified issues prove inadequate), the program or project plan can be modified accordingly. Such an adaptive approach plans for and relies on continual (or at least frequent) midcourse changes in response to such feedback.

A monitoring process could, in principle, be similar to the initial screening process, with the important proviso that the baseline be updated to take into account what has been learned since the project was last considered. To catch ethical, legal, and societal issues that may have appeared in the interim, the monitoring process should touch all projects in the agency's R&D portfolio, so that projects that were previously determined not to raise ethical, legal, and societal issues can be reexamined.
But the intent of this requirement is not to reopen a debate over a project as initially characterized but rather to see if new issues have arisen since the last examination—and in most cases, a project originally determined not to raise ethical, legal, and societal issues will retain that status upon reexamination. It may also be the case that projects originally determined to raise ethical, legal, and societal issues have evolved in such a way that it becomes clear that they do not.

2.d - Engaging with various segments of the public as needed

With the stipulation that engagement with various segments of the public does not necessarily mean coming to consensus with them, an agency's ELSI deliberations will often benefit from such external engagement. For example, public concerns about a given R&D project are often formulated in ELSI terms rather than in technical terms. Policy makers must be prepared for the emergence of unforeseen outcomes and thus must have structures in place that will detect such outcomes and focus attention on them in a timely way. When unforeseen outcomes do emerge, policy makers must be prepared to communicate with the public using proven techniques. A developed strategy for public communication is also useful when anticipated ELSI concerns become public. Government actions in the United States ultimately depend, legally and practically, on the consent of the governed. Building public understanding of an agency's actions, the reasons for those actions, and the precautions the agency has taken will normally be the best strategy, for democracy and for the agency.

In addition, members of the public (including, for example, technical experts, experts on risk assessment and communication, and those with ELSI expertise broadly defined) may have points of view that were not well represented in an agency's internal deliberations about a given R&D project. Ongoing engagement throughout the course of a project may reveal the impending appearance of initially unanticipated ethical, legal, and societal issues, and thus provide early warning to program managers and enable a more rapid response if and when these new issues do appear. The mere fact of consultation and engagement with a wide range of stakeholders also helps to defuse later claims that one perspective or another was ignored or never taken into account.

Finally, a relevant stakeholder group is the community of researchers themselves. An agency should not suddenly introduce substantive changes in its requirements for proposals without informing the research community about what those changes mean. What is the rationale for these changes? How, if at all, will research projects have to change? What, if anything, does "attending to ethical, legal, and societal issues" mean in the context of decisions about specific proposals?
For R&D projects that are classified, public engagement is obviously constrained to a certain extent. Nevertheless, even if such projects can be discussed only with the cleared subsets of the various stakeholder groups, the result will still be more robust and defensible than if the project had not been discussed at all.

2.e - Periodically reviewing ELSI-related processes in an agency

Well-meaning policy statements are sometimes translated into excessively bureaucratic requirements. To ensure that ELSI-related processes do not place undue burdens on researchers or on program managers in an agency, these processes should themselves be reviewed periodically to ensure that they are consistent with the intent of high-level policy statements regarding the agency's handling of ethical, legal, and societal issues.

Recommendation 3: Interested agencies supporting R&D on emerging and readily available technologies of military significance should undertake an effort to educate and sensitize program managers to ethical, legal, and societal issues.

If funding agencies are to screen, assess, and monitor research proposals and projects for possibly significant ethical, legal, and societal issues, they will need people with the ability to recognize those issues. The fields that assess ELSI concerns arising with various technologies have their own vocabularies. At the very least, the agency personnel dealing with these issues will have to understand, at some level, the relevant "language." At the same time, those with ELSI responsibilities and/or expertise must have some understanding of the underlying research in order to identify issues that may or may not emerge.

One crucial, and easily overlooked, aspect of building internal expertise is building history. If an agency has no institutional memory of what ethical, legal, and societal issues it has faced in its history, how it dealt with those issues, and what the consequences were, its ability to learn from that past is diminished. This diminished capability will be a particular problem for agencies that have frequent turnover. An interested agency needs to make it a priority to collect—and to use—information about how it has dealt with these issues. The agency person or group in charge of screening proposals or projects for ethical, legal, and societal issues might be in a good position to collect and organize that kind of information.

Recommendation 4: Interested agencies supporting R&D on emerging and readily available technologies of military significance should build external expertise in ethical, legal, and societal issues to help address such issues.

Not all expertise should be, or can be, internal to an agency.
Agencies should seek advice from external experts because properly addressing some ELSI concerns will require a depth of knowledge that cannot realistically be expected of program managers or scientists. If such expertise is not immediately available, it should be cultivated. Such cultivation would have both immediate and longer-term benefits. It would help the agency directly by providing that expertise, but, in the longer run, it could also build knowledge, expertise, and even trust outside the agency about what it does about ethical, legal, and societal issues, and why.

The committee also makes one recommendation to research-performing institutions.

The Applied Ethics of Emerging Military and Security Technologies SUMMARY


Recommendation 5: Research-performing institutions should provide assistance for researchers attending to ethical, legal, and societal issues in their work on emerging and readily available technologies of military significance.

Recommendations 1 through 4 address government agencies that fund research on emerging and readily available technologies of military significance. To the extent that these recommendations are adopted, researchers supported by these agencies may need assistance in identifying and responding to ethical, legal, and societal issues with which they may be unfamiliar. The committee believes that universities and other research-performing organizations should provide such assistance when needed by the researchers working under their aegis, in much the same way that they provide other functional support to these researchers.

In addition, many institutions performing research on emerging and readily available technologies with military significance already have in place policies and procedures to address a variety of ethical, legal, and societal issues that arise in S&T research; institutional review boards for research involving human subjects, for example, are quite common. Leveraging policies and procedures already in place to address ELSI concerns associated with certain kinds of research will help to minimize unnecessary overhead in institutions performing research on ERA technologies with military significance. Where policies and procedures already exist to address ethical, legal, and societal issues common to both military and civilian-oriented research, new ones should not be created.


[4] Unrestricted Warfare
Qiao Liang and Wang Xiangsui

Unrestricted Warfare, by Qiao Liang and Wang Xiangsui (Beijing: PLA Literature and Arts Publishing House, February 1999)

[FBIS Editor's Note: The following selections are taken from "Unrestricted Warfare," a book published in China in February 1999 which proposes tactics for developing countries, in particular China, to compensate for their military inferiority vis-à-vis the United States during a high-tech war. The selections include the table of contents, preface, afterword, and biographical information about the authors printed on the cover. The book was written by two PLA senior colonels from the younger generation of Chinese military officers and was published by the PLA Literature and Arts Publishing House in Beijing, suggesting that its release was endorsed by at least some elements of the PLA leadership. This impression was reinforced by an interview with Qiao and a laudatory review of the book carried by the party youth league's official daily Zhongguo Qingnian Bao on 28 June. Published prior to the bombing of China's embassy in Belgrade, the book has recently drawn the attention of both the Chinese and Western press for its advocacy of a multitude of means, both military and particularly non-military, to strike at the United States during times of conflict. Hacking into websites, targeting financial institutions, terrorism, using the media, and conducting urban warfare are among the methods proposed. In the Zhongguo Qingnian Bao interview, Qiao was quoted as stating that "the first rule of unrestricted warfare is that there are no rules, with nothing forbidden." Elaborating on this idea, he asserted that strong countries would not use the same approach against weak countries because "strong countries make the rules while rising ones break them and exploit loopholes . . . The United States breaks [UN rules] and makes new ones when these rules don't suit [its purposes], but it has to observe its own rules or the whole world will not trust it." (See FBIS translation of the interview, OW2807114599.) End FBIS Editor's Note]

THIS REPORT MAY CONTAIN COPYRIGHTED MATERIAL. COPYING AND DISSEMINATION IS PROHIBITED WITHOUT PERMISSION OF THE COPYRIGHT OWNERS.


TABLE OF CONTENTS

Preface ........ 1
Part One: On New Warfare ........ 8
Chapter 1: The Weapons Revolution Which Invariably Comes First ........ 15
Chapter 8: Essential Principles ........ 204
Conclusion ........ 220
Afterword ........ 225
Author's Background ........ 227


Preface [pp 1-5 in original]

[FBIS Translated Text] Everyone who has lived through the last decade of the 20th century will have a profound sense of the changes in the world. We don't believe that there is anyone who would claim that there has been any decade in history in which the changes have been greater than those of this decade. Naturally, the causes behind the enormous changes are too numerous to mention, but there are only a few reasons that people bring up repeatedly. One of those is the Gulf War.

One war changed the world. Linking such a conclusion to a war which occurred one time in a limited area and which only lasted 42 days seems like something of an exaggeration. However, that is indeed what the facts are, and there is no need to enumerate one by one all the new words that began to appear after 17 January 1991. It is only necessary to cite the former Soviet Union, Bosnia-Herzegovina, Kosovo, cloning, Microsoft, hackers, the Internet, the Southeast Asian financial crisis, the euro, as well as the world's final and only superpower — the United States. These are sufficient. They pretty much constitute the main subjects on this planet for the past decade. However, what we want to say is that all these are related to that war, either directly or indirectly.

However, we definitely do not intend to mythicize war, particularly not a lopsided war in which there was such a great difference in the actual power of the opposing parties. Precisely the contrary. In our in-depth consideration of this war, which changed the entire world in merely half a month, we have also noted another fact, which is that war itself has now been changed.
We discovered that, from those wars which could be described in glorious and dominating terms to the aftermath of the acme of what it has been possible to achieve to date in the history of warfare, war, which people originally felt was one of the more important roles to be played out on the world stage, has at one stroke taken the seat of a B actor.


A war which changed the world ultimately changed war itself. This is truly fantastic, yet it also causes people to ponder deeply. No, what we are referring to are not changes in the instruments of war, the technology of war, the modes of war, or the forms of war. What we are referring to is the function of warfare.

Who could imagine that an insufferably arrogant actor, whose appearance has changed the entire plot, suddenly finds that he himself is actually the last person to play this unique role. Furthermore, without waiting for him to leave the stage, he has already been told that there is no great likelihood that he will again handle an A role, at least not a central role in which he alone occupies center stage. What kind of feeling would this be?

Perhaps those who feel this most deeply are the Americans, who probably should be counted as among the few who want to play all the roles, including savior, fireman, world policeman, and an emissary of peace, etc. In the aftermath of "Desert Storm," Uncle Sam has not been able to again achieve a commendable victory. Whether it was in Somalia or Bosnia-Herzegovina, this has invariably been the case. In particular, in the most recent action in which the United States and Britain teamed up to carry out air attacks on Iraq, it was the same stage, the same method, and the same actors, but there was no way to successfully perform the magnificent drama that had made such a profound impression eight years earlier. Faced with political, economic, cultural, diplomatic, ethnic, and religious issues, etc., that are more complex than they are in the minds of most of the military men in the world, the limitations of the military means, which had heretofore always been successful, suddenly became apparent. However, in the age of "might makes right" — and most of the history of this century falls into this period — these were issues which did not constitute a problem.
The problem is that the U.S.-led multinational forces brought this period to a close in the desert region of Kuwait, thus beginning a new period. At present it is still hard to see whether this age will lead to the unemployment of large numbers of military personnel, or whether it will cause war to vanish from this world. All these things are still undetermined. The only point which is certain is that, from this point on, war will no longer be
what it was originally. Which is to say that, if in the days to come mankind has no choice but to engage in war, it can no longer be carried out in the ways with which we are familiar. It is impossible for us to deny the impact on human society and its soul of the new motivations represented by economic freedom, the concept of human rights, and the awareness of environmental protection, but it is certain that the metamorphosis of warfare will have a more complex backdrop. Otherwise, the immortal bird of warfare will not be able to attain nirvana when it is on the verge of decline: When people begin to lean toward and rejoice in the reduced use of military force to resolve conflicts, war will be reborn in another form and in another arena, becoming an instrument of enormous power in the hands of all those who harbor intentions of controlling other countries or regions.

In this sense, there is reason for us to maintain that the financial attack by George Soros on East Asia, the terrorist attack on the U.S. embassy by Usama Bin Laden, the gas attack on the Tokyo subway by the disciples of the Aum Shinri Kyo, and the havoc wreaked by the likes of Morris Jr. on the Internet, in which the degree of destruction is by no means second to that of a war, represent semi-warfare, quasi-warfare, and sub-warfare, that is, the embryonic form of another kind of warfare. But whatever you call them, they cannot make us more optimistic than in the past.

We have no reason for optimism. This is because the reduction of the functions of warfare in a pure sense does not mean at all that war has ended. Even in the so-called post-modern, post-industrial age, warfare will not be totally dismantled. It has only re-invaded human society in a more complex, more extensive, more concealed, and more subtle manner. It is as Byron said in his poem mourning Shelley, "Nothing has happened, he has only undergone a sea change."
War which has undergone the changes of modern technology and the market system will be launched even more in atypical forms. In other words, while we are seeing a relative reduction in military violence, at the same time we definitely are seeing an increase in political, economic, and technological violence. However, regardless of the form the violence takes, war is war, and a change in the external appearance does not keep any war from abiding by the principles of war.


If we acknowledge that the new principles of war are no longer "using armed force to compel the enemy to submit to one's will," but rather are "using all means, including armed force or non-armed force, military and non-military, and lethal and non-lethal means to compel the enemy to accept one's interests," then this represents a change: a change in war, and a change in the mode of war occasioned by it.

So, just what has led to the change? What kind of changes are they? Where are the changes headed? How does one face these changes? This is the topic that this book attempts to touch on and shed light on, and it is also our motivation in deciding to write this book.

[Written on 17 January 1999, the 8th anniversary of the outbreak of the Gulf War]


Part One: On New Warfare [pp. 1-9 in original]

"Although ancient states were great, they inevitably perished when they were fond of war" - Sima Rangju

Technology is the Totem of Modern Man [1]

Stirred by the warm breeze of utilitarianism, it is not surprising that technology is more in favor with people than science is. The age of great scientific discoveries had already been left behind before Einstein's time. However, modern man is increasingly inclined to seeing all his dreams come true during his lifetime. This causes him, when betting on his own future, to prostrate himself and expect wonders from technology through a 1000-power concave lens. In this way, technology has achieved startling and explosive developments in a rather short period of time, and this has resulted in innumerable benefits for mankind, which is anxious for quick success and instant rewards. However, we proudly term this technological progress, not realizing that at this time we have already consigned ourselves to a benighted technological age in which we have lost our hearts [2].

Technology today is becoming increasingly dazzling and uncontrollable. Bell Labs and Sony continue to put out novel toys, Bill Gates opens new "Windows" each year, and "Dolly," the cloned sheep, proves that mankind is now planning to take the place of God the Creator. The fearsome Russian-built SU-27 fighter has not been put to use on any battlefield, and already the SU-35 has emerged to strike a pose [3], but whether or not, once it has exhausted its time in the limelight, the SU-35 will be able to retire having rendered meritorious service is still a matter of considerable doubt. Technology is like "magic shoes" on the feet of mankind, and after the spring has been wound tightly by commercial interests, people can only dance along with the shoes, whirling rapidly in time to the beat that they set.


The names Watt and Edison are nearly synonymous with great technical inventions, and using these great technological masters to name their age may be said to be reasonable. However, from then on the situation changed, and the countless and varied technological discoveries of the past 100 years or so make it difficult for the appearance of any new technology to take on any self-importance in the realm of human life. While the formulations "the age of the steam engine" and "the age of electrification" may be said to be names which reflect the realities of the time, today, with all kinds of new technology continuously beating against the banks of the age so that people scarcely have the time to accord them brief acclaim before being overwhelmed by an even higher and newer wave of technology, the age in which an era could be named for a single new technology or a single inventor has become a thing of the past. This is the reason why, if one calls the current era the "nuclear age" or the "information age," it will still give people the impression that one is using one aspect to typify the whole situation.

There is absolutely no doubt that the appearance of information technology has been good news for human civilization. This is because it is the only thing to date that is capable of infusing greater energy into the technological "plague" that has been released from Pandora's box, and at the same time it also provides a magic charm as a means of controlling it [technology]. It is just that, at present, there is still a question of who in turn will have a magic charm with which to control it [information technology]. The pessimistic viewpoint is that, if this technology develops in a direction which cannot be controlled by man, ultimately it will turn mankind into its victim [4]. However, this frightening conclusion is totally incapable of reducing people's ardor for it.
The optimistic prospects that it displays itself are intensely seductive for mankind, which has a thirst for technical progress. After all, its unique features of exchanging and sharing represent the light of intelligence which we can hope will lead mankind out of the barbarism of technology, although this is still not sufficient to make us like those futurists who cannot see the forest for the trees, and who use its name to label the entire age. Its characteristics are precisely what keep it from being able to replace the various technologies that we already have in great quantity, that
are just emerging, or which are about to be born, particularly those such as biotechnology, materials technology, and nanotechnology, technologies which have a symbiotic relationship with information technology in which they rely on and promote one another.

Over the past 300 years, people have long since become accustomed to blindly falling in love with the new and discarding the old in the realm of technology, and the endless pursuit of new technology has become a panacea to resolve all the difficult questions of existence. Infatuated with it, people have gradually gone astray. Just as one will often commit ten other mistakes to cover up one, to solve one difficult problem people do not hesitate to bring ten more on themselves [5]. For example, for a more convenient means of transportation, people invented cars, but a long string of problems followed closely on the heels of the automobile — mining and smelting, mechanical processing, oil extraction, rubber refining, and road-building, etc., which in turn required a long string of technical means to solve, until ultimately it led to pollution of the environment, destruction of resources, the taking over of farmland, traffic accidents, and a host of thornier problems. In the long run, compared with these derivative problems, the original goal of using cars for transportation almost seems unimportant. In this way, the irrational expansion of technology causes mankind to continually lose his goals in the complex ramifications of the tree of technology, losing his way and forgetting how to get back. We may as well dub this phenomenon the "ramification effect."

Fortunately, at this time, modern information technology made its appearance. We can say with certainty that this is the most important revolution in the history of technology.
Its revolutionary significance is not merely that it is a brand-new technology itself, but more that it is a kind of bonding agent which can lightly penetrate the layers of barriers between technologies and link various technologies which appear to be totally unrelated. Through its bonding, not only is it possible to derive numerous new technologies which are neither one thing nor the other while they also represent this and that, but it also provides a kind of brand-new approach to the relationship between man and technology. Only from the perspective of mankind can mankind clearly perceive the essence of technology as
a tool, and only then can he avoid becoming a slave to technology — to the tool — during the process of resolving the difficult problems he faces in his existence. Mankind is completely capable of fully developing his own powers of imagination so that, when each technology is used, its potential is exhausted, and not being like a bear breaking off corncobs, only able to continually use new technology to replace the old. Today, the independent use of individual technologies is becoming more and more unimaginable. The emergence of information technology has presented endless possibilities for match-ups involving various old and new technologies and among new and advanced technologies. Countless facts have demonstrated that the integrated use of technology is able to promote social progress more than even the discovery of the technology [6].

The situation of loud solo parts is in the process of being replaced by a multi-part chorus. The general fusion of technology is irreversibly guiding the rising globalization trend, while the globalization trend in turn is accelerating the process of the general fusion of technology, and this is the basic characteristic of our age. This characteristic will inevitably project its features on every direction of the age, and naturally the realm of war will be no exception. No military force that thirsts for modernization can get by without nurturing new technology, while the demands of war have always been the midwife of new technology. During the Gulf War, more than 500 kinds of new and advanced technology of the 80s ascended the stage to strike a pose, making the war simply seem like a demonstration site for new weaponry. However, the thing that left a profound impression on people was not the new weaponry per se, but rather the trend of systemization in the development and use of the weapons.
Like the "Patriots" intercepting the "Scuds," it seemed as simple as shooting birds with a shotgun, while in fact it involved numerous weapons deployed over more than half the globe: After a DSP satellite identified a target, an alarm was sent to a ground station in Australia, which was then sent to the central command post in Riyadh through the U.S. Cheyenne Mountain command post, after which the "Patriot" operators were ordered to take their battle stations, all of
which took place in the mere 90-second alarm stage, relying on numerous relays and coordination of space-based systems and C3I systems, truly a "shot heard 'round the world." The real-time coordination of numerous weapons over great distances created an unprecedented combat capability, and this was precisely something that was unimaginable prior to the emergence of information technology.

While it may be said that the emergence of individual weapons prior to World War II was still able to trigger a military revolution, today no one is capable of dominating the scene alone. War in the age of technological integration and globalization has eliminated the right of weapons to label war and, with regard to the new starting point, has realigned the relationship of weapons to war, while the appearance of weapons of new concepts, and particularly new concepts of weapons, has gradually blurred the face of war.

Does a single "hacker" attack count as a hostile act or not? Can using financial instruments to destroy a country's economy be seen as a battle? Did CNN's broadcast of an exposed corpse of a U.S. soldier in the streets of Mogadishu shake the determination of the Americans to act as the world's policeman, thereby altering the world's strategic situation? And should an assessment of wartime actions look at the means or the results? Obviously, proceeding with the traditional definition of war in mind, there is no longer any way to answer the above questions. When we suddenly realize that all these non-war actions may be the new factors constituting future warfare, we have to come up with a new name for this new form of war: warfare which transcends all boundaries and limits, in short: unrestricted warfare.

If this name becomes established, this kind of war means that all means will be in readiness, that information will be omnipresent, and the battlefield will be everywhere.
It means that all weapons and technology can be superimposed at will, it means that all the boundaries lying between the two worlds of war and non-war, of military and non-military, will be totally destroyed, and it also means that many of the current principles of combat will be modified, and even that the rules of war may need to be rewritten.


However, the pulse of the God of War is hard to take. If you want to discuss war, particularly the war that will break out tomorrow evening or the morning of the day after tomorrow, there is only one way, and that is to determine its nature with bated breath, carefully feeling the pulse of the God of War today.

Footnotes

[1] In Man and Technology, O. Spengler stated that "like God, our father, technology is eternal and unchanging, like the son of God, it will save mankind, and like the Holy Spirit, it shines upon us." The philosopher Spengler's worship of technology, which was just like that of a theologian for God, was nothing but a manifestation of another type of ignorance as man entered the great age of industrialism, one which increasingly flourished in the post-industrial age.

[2] In this regard, the French philosopher and scientist Jean Ladrière has a unique viewpoint. He believes that science and technology have a destructive effect as well as a guiding effect on culture. Under the combined effects of these two, it is very difficult for mankind to maintain a clear-headed assessment of technology, and we are constantly oscillating between the two extremes of technical fanaticism and "anti-science" movements. Bracing oneself to read through his The Challenge Presented to Cultures by Science and Technology, in which the writing is abstruse but the thinking recondite, may be helpful in observing the impact of technology on the many aspects of human society from a broader perspective.

[3] Although the improvement of beyond-visual-range (BVR) weapons has already brought about enormous changes in the basic concepts of air combat, after all is said and done it has not completely eliminated short-range combat. The SU-27, which is capable of "cobra" maneuvers, and the SU-35, which is capable of "hook" moves, are the most outstanding fighter aircraft to date.

[4] F. G. Ronge [as published 1715 2706 1396 2706] is the sharpest of the technological pessimists. As early as 1939, Ronge had recognized the series of problems that modern
technology brings with it, including the growth of technological control and the threat of environmental problems. In his view, technology has already become an unmatched, diabolical force. It has not only taken over nature, it has also stripped away man's freedom. In Being and Time, Martin Heidegger termed technology an "outstanding absurdity," calling for man to return to nature in order to avoid technology, which posed the greatest threat. The most famous technological optimists were [Norbert] Wiener and Steinbuch. In Wiener's Cybernetics, God and Robots, and The Human Use of Human Beings and Steinbuch's The Information Society, Philosophy and Cybernetics, and other such works, we can see the bright prospects that they describe for human society, driven by technology.

[5] In David Ehrenfeld's book The Arrogance of Humanism, he cites numerous examples of this. In Too Clever, Schwartz states that "the resolution of one problem may generate a group of new problems, and these problems may ultimately preclude that kind of resolution." In Rational Consciousness, Rene Dibo [as published 3583 0355 6611 0590] also discusses a similar phenomenon.

[6] In The Age of Science and the Future of Mankind, E. Shulman points out that "during the dynamic development of modern culture, which is based on the explosive development of modern technology, we are increasingly faced with the fact of multidisciplinary cooperation ... it is impossible for one special branch of science to guide our practice in a sufficiently scientific manner."


Chapter 1: The Weapons Revolution Which Invariably Comes First [pp. 10-33 in original]

"As soon as technological advances may be applied to military goals, and furthermore are already used for military purposes, they almost immediately seem obligatory, and also often go against the will of the commanders in triggering changes or even revolutions in the modes of combat" - Engels

The weapons revolution invariably precedes the revolution in military affairs by one step, and following the arrival of a revolutionary weapon, the arrival of the revolution in military affairs is just a matter of time. The history of warfare is continually providing this kind of proof: bronze or iron spears resulted in the infantry phalanx, and bows and arrows and stirrups provided new tactics for cavalry [1]. Black powder cannons gave rise to a full complement of modern warfare modes. From the time when conical bullets and rifles [2] took to the battlefield as the vanguard of the age of technology, weapons straightaway stamped their names on the chest of warfare. First, it was the enormous steel-clad naval vessels that ruled the seas, launching the "age of battleships"; then its brother the "tank" ruled land warfare, after which the airplane dominated the skies, up until the atomic bomb was born, announcing the approach of the "nuclear age."

Today, a multitude of new and advanced technology weapons continues to pour forth, so that weapons have solemnly become the chief representative of war. When people discuss future warfare, they are already quite accustomed to using certain weapons or certain technologies to describe it, calling it "electronic warfare," "precision-weapons warfare," or "information warfare." Coasting along in their mental orbit, people have not yet noticed that a certain inconspicuous yet very important change is stealthily approaching.

No One Has the Right to Label Warfare


The weapons revolution is a prelude to a revolution in military affairs. What is different from the past is that the revolution in military affairs that is coming will no longer be driven by one or two individual weapons. In addition to continuing to stimulate people to yearn for and be charmed by new weapons, the numerous technological inventions have also quickly eradicated the mysteries of each kind of weapon. In the past, all that was needed was the invention of a few weapons or pieces of equipment, such as the stirrup and the Maxim machine gun [3], and that was sufficient to alter the form of war, whereas today upwards of 100 kinds of weapons are needed to make up a certain weapons system before it can have an overall effect on war. However, the more weapons are invented, the smaller an individual weapon's role in war becomes, and this is a paradox that is inherent in the relationship between weapons and war. Speaking in that sense, other than the all-out use of nuclear weapons, a situation which is more and more unlikely and which may be termed nuclear war, none of the other weapons, even those that are extremely revolutionary in nature, possesses the right to label future warfare.

Perhaps it is precisely because people recognize this point that we then have formulations such as "high-tech warfare" and "information warfare" [4], whose intent is to use the broad concept of technology to replace the concept of specific weapons, using a fuzzy-learning approach to resolve this knotty problem. However, it seems that this still is not the way to resolve the problem. When one delves deeply into this, the term "high technology" [5], which first appeared in the architectural industry in the United States, is in fact a bit vague. What constitutes high technology? What does it refer to? Logically speaking, high and low are only relative concepts.
However, using an extremely mutable concept in this irrational manner to name warfare, which is evolving endlessly, in itself constitutes a considerable problem. When one generation's high technology becomes low technology with the passage of time, are we still prepared to again dub the new toys that continue to appear as being high tech? Or is it possible that, in today's technological explosion, this may result in confusion and trouble for us in naming and using each new technology that appears? Not to mention the question of just what should be the standard to determine whether something is high or not. With regard to technology itself, each technology has specific aspects, which therefore means that each has its time limits. Yesterday's "high" is very possibly today's "low," while today's "new" will in turn become tomorrow's "old." Compared to the M-60 tank, the "Cobra" helicopter, and the B-52, the main battle weapons of the 60s-70s, the "Abrams" tank, the "Apache" helicopter gunship, the F-117, the "Patriot" missiles, and the "Tomahawk" cruise missiles are high tech. However, faced with the B-2, the F-22, the "Comanche" helicopter, and the "J-STARS" joint surveillance target attack radar system, they in turn seem outmoded. It is as if the concept of high-tech weapons is a variable throughout, which naturally becomes the title of the "bride." Then, as the "flowers bloom each year, but the people change," all that is left is the empty shell of a name, which is continually placed on the head of the girl who is becoming the next "bride." Thus, in the chain of warfare with its continuous links, each weapon can go from high to low and from new to old at any time and any place, with time's arrow being unwilling to stop at any point; nor can any weapon occupy the throne of high technology for long. Since this is the case, just what kind of high technology does this so-called high-tech warfare refer to? High technology, as spoken of in generalities, cannot become a synonym for future warfare, nor is information technology — which is one of the high technologies of the present age and which seems to occupy an important position in the makeup of all modern weapons — sufficient to name a war. Even if in future wars all the weapons have information components embedded in them and are fully computerized, we can still not term such war information warfare, and at most we can just call it computerized warfare [6].
This is because, regardless of how important information technology is, it cannot completely supplant the functions and roles of each technology per se. For example, the F-22 fighter, which already fully embodies information technology, is still a fighter, and the "Tomahawk" missile is still a missile, and one cannot lump them all together as information weapons, nor can war which is conducted using these weapons be termed information warfare [7]. Computerized warfare in the broad sense and information warfare in the narrow sense are two completely different things. The former refers to the various forms of warfare which are enhanced and accompanied by information technology, while the latter primarily refers to war in which information technology is used to obtain or suppress information. In addition, the contemporary myth created by information worship has people mistakenly believing that it is the only rising technology, while the sun has already set on all the others. This kind of myth may put more money in the pockets of Bill Gates, but it cannot alter the fact that the development of information technology similarly relies on the development of other technologies, and the development of related materials technology is a direct constraint on information technology breakthroughs. For example, the development of biotechnology will determine the future fate of information technology [8]. Speaking of bio-information technology, we may as well return to a previous topic and again make a small assumption: if people use information-guided bio-weapons to attack a bio-computer, should this be counted as bio-warfare or information warfare? I fear that no one will be able to answer that in one sentence, but this is something which is perfectly capable of happening. Actually, it is basically not necessary for people to rack their brains over whether or not information technology will grow strong and unruly today, because it is itself a synthesis of other technologies, and its first appearance and every step forward are all a process of blending with other technologies, so that it is part of them, and they are part of it, and this is precisely the most fundamental characteristic of the age of technological integration and globalization. Naturally, like the figures from a steel seal, this characteristic may leave its typical imprint on each modern weapon.
We are by no means denying that, in future warfare, certain advanced weapons may play a leading role. However, as for determining the outcome of war, it is now very difficult for any one weapon to occupy an unmatched position. It may be leading, but it will not be alone, much less never-changing. This is also to say that no one can unblushingly stamp his own name on a given modern war.


"Fighting the Fight that Fits One's Weapons" and "Making the Weapons to Fit the Fight"

These two sentences, "fight the fight that fits one's weapons" and "build the weapons to fit the fight," show the clear demarcation line between traditional warfare and future warfare, as well as pointing out the relationship between weapons and tactics in the two kinds of war. The former reflects the involuntary or passive adaptation of the relationship of man to weapons and tactics in war which takes place under natural conditions, while the latter suggests the conscious or active choice that people make regarding the same proposition when they have entered a free state. In the history of war, the general unwritten rule that people have adhered to all along is to "fight the fight that fits one's weapons." Very often it is the case that only after one first has a weapon does one begin to formulate tactics to match it. With weapons coming first, followed by tactics, the evolution of weapons has a decisive constraining effect on the evolution of tactics. Naturally, there are limiting factors here involving the age and the technology, but neither can we say that there is no relationship between this and the linear thinking in which each generation of weapons-making specialists thinks only about whether the performance of the weapon itself is advanced, and does not consider other aspects. Perhaps this is one of the reasons why a weapons revolution invariably precedes a revolution in military affairs. Although the expression "fight the fight that fits one's weapons" is essentially negative in nature, because what it leaves unsaid reflects a kind of helplessness, we have no intention of belittling the positive meaning that it has today, and this positive meaning is seeking the optimum tactics for the weapons one has. In other words, it means seeking the combat mode which represents the best match for the given weapons, thereby ensuring that they perform up to their peak values.
Today, those engaged in warfare have, either consciously or unconsciously, completed the transition of this rule from the negative to the positive. It is just that people still wrongly believe that this is the only initiative that can be taken by backward countries in their helplessness. They hardly realize that the United States, the foremost power in the world, must face this kind of helplessness as well. Even though she is the richest country in the world, it is not necessarily possible for her to fight an expensive modern war using uniformly new and advanced-technology weapons [9]. It is just that she has more freedom when it comes to the selection and pairing of new and old weapons. If one can find a good point of agreement, which is to say the most appropriate tactics, the pairing and use of new and older generation weapons not only makes it possible to eliminate the weakness of uniform weaponry, it may also become a "multiplier" to increase the weapons' effectiveness. The B-52 bomber, which people have predicted on many occasions was long since ready to pass away peacefully, has once again become resplendent after being coupled with cruise missiles and other precision-guided weapons, and its wings have not yet rested to date. By the use of external infrared-guided missiles, the A-10 aircraft now has night-attack capabilities that it originally lacked, and when paired with the Apache helicopter the two complement each other nicely, so that this weapons platform which appeared in the mid-70s is very imposing. Obviously, "fight the fight that fits one's weapons" by no means represents passive inaction. For example, today's increasingly open weapons market and multiple supply channels have provided a great deal of leeway with regard to weapons selection, and the massive coexistence of weapons spanning multiple generations has provided a broader and more functional foundation for trans-generation weapons combinations than at any age in the past, so that it is only necessary to break with our mental habit of treating the weapons' generations, uses, and combinations as fixed to be able to turn something rotten into something miraculous. If one thinks that one must rely on advanced weapons to fight a modern war, being blindly superstitious about the miraculous effects of such weapons, it may actually result in turning something miraculous into something rotten.
We find ourselves in a stage where a revolutionary leap forward is taking place in weapons, going from weapons systems symbolized by gunpowder to those symbolized by information, and this may be a relatively prolonged period of alternating weapons. At present we have no way of predicting how long this period may last, but what we can say for sure is that, as long as this alternation has not come to an end, fighting the kind of battle that fits one's weapons will be the most basic approach for any country in handling the relationship between weapons and combat, and this includes the United States, the country which has the most advanced weapons. What must be pointed out is that the most basic thing is not the thing with the greatest future. Taking aggressive initiatives under negative preconditions is only a specific approach for a specific time, and by no means constitutes an eternal rule. In man's hands, scientific progress has long since gone from passive discovery to active invention, and when the Americans proposed the concept of "building the weapons to fit the fight," it triggered the greatest single change in the relationship between weapons and tactics since the advent of war. First determine the mode of combat, then develop the weapons: in this regard, the first stab that the Americans took at this was "AirLand Battle," while the currently popular "digitized battlefield" and "digitized units" [10], which have given rise to much discussion, represent their most recent attempt. This approach indicates that the position of weapons as invariably preceding a revolution in military affairs has now been shaken; now tactics come first and weapons follow, or the two encourage one another, with advancement in a push-pull manner becoming the new relationship between them. At the same time, weapons themselves have produced changes with epoch-making significance, and their development no longer looks only to improvements in the performance of individual weapons, but rather to whether or not the weapons have good characteristics for linking and matching with other weapons. The F-111, which was in a class by itself at the time, was too advanced: there was no way to pair it up with other weapons, so all they could do was shelve it.
That lesson has now been absorbed, and the thinking that tries to rely on one or two new and advanced-technology weapons to serve as "killer weapons" which can put an end to the enemy is now outmoded. "Building the weapons to fit the fight," an approach which has the distinctive features of the age and the characteristics of the laboratory, may not only be viewed as a kind of active choice; it can also be taken as coping with shifting events by sticking to a fundamental principle, and in addition to being a major breakthrough in the history of preparing for war, it also implies the potential crisis in modern warfare: customizing weapons systems to tactics which are still being explored and studied is like preparing food for a great banquet without knowing who is coming, where the slightest error can lead one far astray. Viewed from the performance of the U.S. military in Somalia, where they were at a loss when they encountered Aidid's forces, the most modern military force does not have the ability to control public clamor, and cannot deal with an opponent who does things in an unconventional manner. On the battlefields of the future, the digitized forces may very possibly be like a great cook who is good at cooking lobsters sprinkled with butter: when faced with guerrillas who resolutely gnaw corncobs, they can only sigh in despair. The "generation gap" [11] in weapons and military forces is perhaps an issue that requires exceptional attention. The closer the generation gap is, the more pronounced are the battle successes of the more senior generation, while the more the gap opens, the less each party is capable of dealing with the other, and it may reach the point where neither can wipe out the other. Looking at the specific examples of battles that we have, it is difficult for high-tech troops to deal with unconventional warfare and low-tech warfare, and perhaps there is a rule here, or at least it is an interesting phenomenon which is worth studying [12].

Weapons of New Concepts and New Concepts of Weapons

Compared to new-concept weapons, nearly all the weapons that we have known so far may be termed old-concept weapons. The reason they are called old is that the basic functions of these weapons were their mobility and lethal power. Even things like precision-guided bombs and other such high-tech weapons really involve nothing more than the addition of the two elements of intelligence and structural capabilities. From the perspective of practical applications, no change in appearance can alter their nature as traditional weapons, that is, their control throughout by professional soldiers and their use on certain battlefields. All these weapons and weapons platforms that have been produced in line with traditional thinking have, without exception, come to a dead end in their efforts to adapt to modern warfare and future warfare. The desire to use the magic of high technology to work some alchemy on traditional weapons so that they are completely remade has ultimately fallen into the high-tech trap involving the endless waste of limited funds and an arms race. This is the paradox that must inevitably be faced in the process of the development of traditional weapons: to ensure that the weapons are in the lead, one must continue to up the ante in development costs; the result of this continued raising of the stakes is that no one has enough money to maintain the lead. The ultimate result is that the weapons meant to defend the country actually become a cause of national bankruptcy. Perhaps the most recent examples are the most convincing. Marshal Ogarkov, the former chief of the Soviet general staff, was acutely aware of the trend of weapons development in the "nuclear age," and when, at an opportune time, he proposed the brand-new concept of the "revolution in military technology," his thinking was clearly ahead of that of his generation. But being ahead of his time in his thinking hardly brought his country happiness, and actually brought about disastrous results [13]. As soon as this concept — which against the backdrop of the Cold War was seen by his colleagues as setting the pace for the time — was proposed, it further intensified the arms race which had been going on for some time between the United States and the Soviet Union. It was just that, at the time, no one could predict that it would actually result in the breakup of the Soviet Union and its complete elimination from the superpower contest. A powerful empire collapsed without a single shot being fired, vividly corroborating the lines of the famous poem by Kipling: "When empires perish, it is not with a rumble, but a snicker."
Not only was this true for the former Soviet Union; today the Americans seem to be following in the footsteps of their old adversary, providing fresh proof of the paradox of weapons development that we have proposed. As the outlines of the age of technology integration become increasingly clear, they are investing more and more in the development of new weapons, and the cost of the weapons is getting higher and higher. The development of the F-14 and F-15 in the 60s-70s cost one billion dollars, while the development of the B-2 in the 80s cost over $10 billion, and the development of the F-22 in the 90s has exceeded $13 billion. Based on weight, the B-2 [14], which runs $13-$15 billion each, is some three times more expensive than an equivalent weight of gold [15]. Expensive weapons like that abound in the U.S. arsenal, such as the F-117A bomber, the F-22 main combat aircraft, and the Comanche helicopter gunship. The cost of each of these weapons exceeds or approaches $100 million, and this massive amount of weaponry with unreasonable cost-effectiveness has covered the U.S. military with increasingly heavy armor, pushing it step by step toward the high-tech weapons trap where the cost stakes continue to be raised. If this is true even for the rich and brash United States, then how far can the other countries, which are short of money, continue down this path? Obviously, it will be difficult for anyone to keep going. Naturally, the way to extricate oneself from this predicament is to develop a different approach. Therefore, new-concept weapons have emerged to fill the bill. However, what seems unfair to people is that it is again the Americans who are in the lead in this trend. As early as the Vietnam war, the silver iodide powder released over the "Ho Chi Minh trail" that resulted in torrential rains and the defoliants scattered over the subtropical forests put the "American devils" in the sole lead with regard to both the methods and the ruthlessness of new-concept weapons. Thirty years later, with the dual advantages of money and technology, others are unable to hold a candle to them in this area. However, the Americans are not necessarily in the sole lead in everything. The new concepts of weapons, which came after the weapons of new concepts and which cover a wider area, were a natural extension of the latter. However, the Americans have not been able to get their act together in this area.
This is because proposing a new concept of weapons does not require relying on the springboard of new technology; it just demands lucid and incisive thinking. However, this is not a strong point of the Americans, who are slaves to technology in their thinking. The Americans invariably halt their thinking at the boundary where technology has not yet reached. It cannot be denied that man-made earthquakes, tsunamis, weather disasters, or subsonic-wave and new biological and chemical weapons all constitute new-concept weapons [16], and that they have tremendous differences from what we normally speak of as weapons, but they are still all weapons whose immediate goal is to kill and destroy, and which are still related to military affairs, soldiers, and munitions. Speaking in this sense, they are nothing more than nontraditional weapons whose mechanisms have been altered and whose lethal power and destructive capabilities have been magnified several times over. However, a new concept of weapons is different. This and what people call new-concept weapons are two entirely different things. While it may be said that new-concept weapons are weapons which transcend the domain of traditional weapons, which can be controlled and manipulated at a technical level, and which are capable of inflicting material or psychological casualties on an enemy, in the face of the new concept of weapons, such weapons are still weapons in a narrow sense. This is because the new concept of weapons is a view of weapons in the broad sense, which views as weapons all means which transcend the military realm but which can still be used in combat operations. In its eyes, everything that can benefit mankind can also harm him. This is to say that there is nothing in the world today that cannot become a weapon, and this requires that our understanding of weapons must have an awareness that breaks through all boundaries. With technological developments striving to increase the types of weapons, a breakthrough in our thinking can open up the domain of the weapons kingdom at one stroke. As we see it, a single man-made stock-market crash, a single computer-virus invasion, or a single rumor or scandal that results in a fluctuation in the enemy country's exchange rates or exposes the leaders of an enemy country on the Internet can all be included in the ranks of new-concept weapons.
A new concept of weapons provides direction for newconcept weapons, while the new-concept weapons give fixed forms to the new concept of weapons. With regard to the flood of new-concept weapons, technology is no longer the main factor, and the true underlying factor is a new concept regarding weapons.


What must be made clear is that the new concept of weapons is in the process of creating weapons that are closely linked to the lives of the common people. Let us assume that the first thing we say is: The appearance of new-concept weapons will definitely elevate future warfare to a level which is hard for the common people — or even military men — to imagine. Then the second thing we have to say should be: The new concept of weapons will cause ordinary people and military men alike to be greatly astonished at the fact that commonplace things that are close to them can also become weapons with which to engage in war. We believe that some morning people will awake to discover with surprise that quite a few gentle and kind things have begun to have offensive and lethal characteristics.

The Trend to "Kinder" Weapons

Before the appearance of the atom bomb, warfare was always in a "shortage age" with respect to lethal power. Efforts to improve weapons have primarily been to boost their lethal power, and from the "light-kill weapons" represented by cold steel weapons and single-shot firearms to the "heavy-kill weapons" represented by various automatic firearms, the history of the development of weapons has almost always been a process of continuing to boost the lethal power of weapons. Prolonged shortages resulted in a thirst among military men for weapons of even greater lethal power that was difficult to satisfy. With a single red cloud that arose over the wasteland of New Mexico in the United States, military men were finally able to obtain a weapon of mass destruction that fulfilled their wishes, as this could not only completely wipe out the enemy, it could kill them 100 or 1,000 times over. This gave mankind lethal capabilities that exceeded the demand, and for the first time there was some room to spare with regard to lethal power in war. Philosophical principles tell us that whenever something reaches an ultimate point, it will turn in the opposite direction. The invention of nuclear weapons, this "ultra-lethal weapon" [17] which can wipe out all mankind, has plunged mankind into an existential trap of its own making.


Nuclear weapons have become a sword of Damocles hanging over the head of mankind which forces it to ponder: Do we really need "ultra-lethal weapons"? What is the difference between killing an enemy once and killing him 100 times? What is the point of defeating the enemy if it means risking the destruction of the world? How do we avoid warfare that results in ruin for all? A "balance of terror" involving "mutually assured destruction" was the immediate product of this thinking, but its by-product was to provide a braking mechanism for the runaway express of improving the lethal capabilities of weapons, which was continually picking up speed, so that the development of weapons was no longer careening crazily down the light-kill — heavy-kill — ultra-lethal weapons expressway, with people trying to find a new approach to weapons development which would not only be effective but which could also exercise control over the lethal power of the weapons. Any major technological invention has a profound human background. The "Universal Declaration of Human Rights" passed by the United Nations General Assembly in 1948, and the more than 50 subsequent pacts related to it, established a set of international rules for human rights in which it is recognized that the use of weapons of mass destruction — particularly nuclear weapons — is a serious violation of the "right to life" and represents a "crime against mankind." Influenced by human rights and other new political concepts, plus the integration trend in international economics, the interlocking demands and political positions involving the interests of various social and political forces, and the proposal of the concept of "ultimate concern" for the ecological environment, and particularly for the value of human life, have resulted in misgivings about killing and destruction, forming a new value concept for war and new ethics for warfare.
The trend to "kinder" [18] weapons is nothing other than a reflection, in the production and development of weapons, of this great change in man's cultural background. At the same time, technological progress has given us the means to strike at the enemy's nerve center directly without harming other things, giving us numerous new options for achieving victory, and all these make people believe that the best way to achieve victory is to control, not to kill. There have been changes in the concept of war and the concept of weapons, and the approach of using uncontrolled slaughter to force the enemy into unconditional surrender has now become the relic of a bygone age. Warfare has taken leave of the meat-grinder age of Verdun-like campaigns. The appearance of precision-kill (accurate) weapons and non-lethal (non-fatal) weapons is a turning point in the development of weapons, showing for the first time that weapons are developing in a "kinder," not a "stronger," direction. Precision-kill weapons can hit a target precisely, reducing collateral casualties, and like a gamma knife which can excise a tumor with hardly any bleeding, they have led to "surgical" strikes and other such new tactics, so that inconspicuous combat actions can achieve extremely notable strategic results. For example, by merely using one missile to track a mobile-telephone signal, the Russians were able to silence forever Dudayev, who had been a headache for them, and at the same time ease the enormous trouble that had been stirred up by tiny Chechnya. Non-lethal weapons can effectively eliminate the combat capabilities of personnel and equipment without loss of life [19]. The trend that is embodied in these weapons shows that mankind is in the process of overcoming its own extreme thinking, beginning to learn to control the lethal power that it already possesses but which is increasingly excessive. In the massive bombing that lasted more than a month during the Gulf War, the loss of life among civilians in Iraq numbered only in the thousands [20], far less than in the massive bombing of Dresden during World War II. Kinder weapons represent the latest conscious choice of mankind among various options in the weapons arena, by which, after the weapons are infused with the element of new technology, the human component is then added, thereby giving warfare an unprecedented kind-hearted hue.
However, a kinder weapon is still a weapon, and the demand for kindness does not mean reducing the battlefield effectiveness of the weapon. To take away a tank's combat capabilities, one can use cannons or missiles to destroy it, or a laser beam can be used to destroy its optical equipment or blind its crew. On the battlefield, someone who is injured requires more care than someone who is killed, and unmanned weapons can eliminate increasingly expensive protective facilities. Certainly those developing kinder weapons have already done cold cost-effectiveness calculations of this sort. Casualties can strip away an enemy's combat capabilities, causing him to panic and lose the will to fight, so this may be considered an extremely worthwhile way to achieve victory. Today, we already have enough technology, and we can create many methods of causing fear which are more effective, such as using a laser beam to project the image of injured followers against the sky, which would be sufficient to frighten those soldiers who are devoutly religious. There are no longer any obstacles to building this kind of weapon; it just requires that some additional imagination be added to the technical element. Kinder weapons represent a derivative of the new concept of weapons, while information weapons are a prominent example of kinder weapons. Whether it involves electromagnetic-energy weapons for hard destruction or soft strikes by computer logic bombs, network viruses, or media weapons, all are focused on paralyzing and undermining, not on personnel casualties. Kinder weapons, which could only be born in an age of technical integration, may very well be the most promising development trend for weapons, and at the same time they will bring about forms of war or revolutions in military affairs which we cannot imagine or predict today. They represent the change with the most profound implications in the history of human warfare to date, and are the watershed between the old and the new forms of war, because their appearance has been sufficient to relegate all the wars in the age of cold and hot weapons to the "old" era.
Nonetheless, we still cannot indulge in romantic fantasies about technology, believing that from this point on war will become a confrontation like an electronic game. Even simulated warfare in a computer room must similarly be premised upon a country's actual overall capabilities, and if a colossus with feet of clay comes up with ten plans for simulated warfare, it will still not be sufficient to deter an enemy who is more powerful with regard to actual strength. War is still the ground of death and life, the path of survival and destruction, and even the slightest innocence is not tolerated. Even if some day all weapons have been made completely humane, a kinder war in which bloodshed may be avoided is still war. It may alter the cruel process of war, but there is no way to change the essence of war, which is one of compulsion, and therefore it cannot alter its cruel outcome, either.

Footnotes: [1] Engels said, "In the age of barbarism, the bow and arrow was still a decisive weapon, the same as the iron sword in an uncivilized age and firearms in the age of civilization." (Collected Works of Marx and Engels, Vol. 4, People's Press, 1972, p. 19) With regard to how stirrups altered the mode of combat, we can refer to the translation and commentary by Gu Zhun [7357 0402] of an article entitled "Stirrups and Feudalism — Does Technology Create History?": "Stirrups...immediately made hand-to-hand combat possible, and this was a revolutionary new mode of combat...very seldom had there been an invention as simple as the stirrup, but very seldom did an invention play the kind of catalytic role in history that this one did." "Stirrups resulted in a series of military and social revolutions in Europe." (Collected Works of Gu Zhun, Guizhou People's Press, 1994, pp. 293-309). [2] "Compared to the development of any advanced new weapons technology, the invention of the rifle and the conical bullet between 1850 and 1860 had the most profound and immediate revolutionary impact. The impact on their age of high-explosive bombs, airplanes, and tanks, which appeared in the 20th century, certainly does not compare to that of the rifle at the time." For details, see T. N. Dupuy's The Evolution of Weapons and Warfare, part 3, section 21, "Rifles, Conical Bullets, and Dispersed Formations" (Military Science Publishing House, 1985, pp. 238-250). [3] In the engagement at the Somme river in World War I, on 1 July 1916 the English forces launched an offensive against the Germans, and the Germans used Maxim machine guns to strafe the English troops, who were in a tight formation, resulting in 60,000 casualties in one day. From that point, mass formation charges gradually began to retreat from the battlefield.
(Weapons and War — The Historical Evolution of Military Technology, Liu Jifeng [0491 2060 6912], University of Science and Technology for National Defense Publishing House, 1992, pp. 172-173). [4] If Wiener's views on war game machines are not taken as the earliest discussion of information weapons, then a comment by Tom Luona [as published 5012 6719] in 1976 to the effect that information warfare is a "struggle among decision-making systems" makes him the first to come up with the term "information warfare" (U.S., Military Intelligence magazine, 1997, Jan-Mar issue, Douglas Dearth, "Implications, Characteristics, and Impact of Information Warfare"). Through independent research, in 1990 Shen Weiguang [3088 0251 0342], a young scholar in China with over ten years of military service, published Information Warfare, which is probably the earliest monograph on information warfare. On the strength of his Third Wave, Toffler gave information warfare a global look in another best-seller, Power Shift, while the Gulf War happened along to become the most splendid advertisement for this new concept of combat. At that point, discussing "information warfare" became fashionable. [5] Foreign experts hold that "high technology" is not a completely fixed concept but a dynamic one, with different countries emphasizing high technology differently. Military high technology mainly includes military microelectronic device technology, computer technology, optoelectric technology, aerospace technology, biotechnology, new materials technology, stealth technology, and directed-energy technology. The most important characteristic of military high technology is "integration," i.e., each military high technology is made up of various technologies to form a technology group. (For details, see "Foreign Military Data," Academy of Military Sciences, Foreign Military Research Dept., No. 69, 1993).
[6] Regarding the definition of "information warfare," to date opinions still vary. The definition by the U.S. Department of Defense and the Joint Chiefs of Staff is: Actions taken to interfere with the enemy's information, information processing, information systems, and computer networks to achieve information superiority over the enemy, while protecting one's own
information, information processing, information systems, and computer networks. According to U.S. Army Field Manual FM100-6, "the DOD's understanding of information warfare leans toward the effects of information in actual conflicts," while the Army's understanding is that "information has already permeated every aspect, from peacetime to military actions in global warfare" (Military Science Publishing House, Chinese translation, pp. 24-25). "In a broad sense, information warfare constitutes actions which use information to achieve national goals." That is the definition given to information warfare by George Stein, a professor at the U.S. Air University, reflecting a somewhat broader vision than that of the Army. In an article in the 1997 summer edition of Joint Force Quarterly, Col. Brian Fredericks proposed that "information warfare is a national issue that goes beyond the scope of national defense," and perhaps this is the most accurate description of information warfare in the broad sense. [7] Running precisely counter to the situation in which the implications of the concept of "information warfare" are getting broader and broader, some of the smart young officers in the U.S. military are increasingly questioning the concept. Air Force Lt. Col. James Rogers points out that "information warfare really isn't anything new" and asks "whether or not those who assert that information warfare techniques and strategies will inevitably replace 'armed warfare' are a bit too self-confident" (U.S., Marines magazine, April 1997). Navy Lieutenant Robert Guerli [as published 0657 1422 0448] proposed that "the seven areas of misunderstanding with regard to information warfare are: (1) the overuse of analogous methods; (2) exaggerating the threat; (3) overestimating one's own strength; (4) historical relevance and accuracy; (5) avoiding criticism of anomalous attempts; (6) totally unfounded assumptions; and (7) nonstandard definitions" (U.S., Events magazine, September 1997 issue). Air Force Major Yulin Whitehead wrote in the fall 1997 issue of Airpower Journal that information is not all-powerful, and that information weapons are not "magic weapons." Questions about information warfare are definitely not limited to individuals, as the U.S. Air Force document "The Foundations of Information Warfare" makes a strict distinction between "warfare in the information age" and
"information warfare." It holds that "warfare in the information age" is warfare which uses computerized weapons, such as using a cruise missile to attack a target, whereas "information warfare" treats information as an independent realm and a powerful weapon. Similarly, some well-known scholars have also issued their own opinions. Johns Hopkins University professor Eliot Cohen reminds us that "just as nuclear weapons did not result in the elimination of conventional forces, the information revolution will not eliminate guerilla tactics, terrorism, or weapons of mass destruction." [8] Macromolecular systems designed and produced using biotechnology represent the production materials for even higher-order electronic components. For example, protein molecule computers have computation speeds and memory capabilities hundreds of millions of times greater than those of our current computers. (New Military Perspectives for the Next Century, Military Science Publishing House, 1997 edition, pp. 142-145). [9] Even in the Gulf War, which has been termed a testing ground for new weapons, there were quite a few old weapons and conventional munitions which played important roles. (For details, see "The Gulf War — U.S. Department of Defense Final Report to Congress — Appendix") [10] Starting with "Air-Land Battle," weapons development by the U.S. military has mainly been divided into five stages: propose requirements, draft a plan, proof of concept, engineering development and production, and outfitting the units. Development regarding the equipping of digitized units is following this same path. (U.S., Army magazine, October 1995). In March 1997, the U.S. Army conducted a brigade-size high-level combat test, testing a total of 58 kinds of digitized equipment. (U.S., Army Times, 31 March, 7 April, 28 April 1997). According to John E. Wilson, commander of the U.S. Army's Materiel Command, his mission is to cooperate with the Training and Doctrine Command, conceiving and developing for them bold and novel advanced-technology equipment that meets their needs. (U.S., Army magazine, October 1997).
[11] Slipchenko [si li pu qin ke 2448 0448 2528 3830 4430], chairman of the Dept. of Scientific Research at the Russian General Staff Academy, believes that war and weapons have already gone through five ages, and we are now heading toward the sixth. (Zhu Xiaoli, Zhao Xiaozhuo, The New U.S. and Russian Military Revolution, Military Science Publishing House, 1996 edition, p. 6). [12] The Journal of the National Defense University, No. 11, 1998, carried an article on Chen Bojiang's interview of Philip Odeen, chairman of the U.S. National Defense Panel. Odeen mentioned "asymmetrical warfare" several times, believing that this is a new threat to the United States. Antulio Echevarria published an article in Parameters magazine in which he proposed that "in the post-industrial age, the thing that will still be most difficult to deal with will be a 'people's war.'" [13] U.S. defense specialists believe that Ogarkov already saw that electronic technology would result in a revolution in conventional weapons, and that they would replace nuclear weapons with respect to their effects. However, Ogarkov's foresight and wisdom with regard to the issue of a revolution in military affairs ran aground because of structural problems. "If, in keeping up with the extremely high costs of the revolution in military affairs, a country exceeds the limits that can be borne by its system and material conditions, but it keeps engaging in military power contests with its opponents, the only outcome can be that it will fall further behind with regard to the military forces that it can use. This was the fate of Russia during the czarist and Soviet eras: the Soviet Union undertook military burdens that were difficult to bear, while in turn the military was unwilling to accept the need for strategic retrenchment." (See U.S., Strategic Review magazine, spring 1996, Steven Blank, "Preparing for the Next War: Some Views on the Revolution in Military Affairs"). [14] In 1981, the U.S.
Air Force estimated that it could produce 132 B-2s with an investment of $22 billion. However, eight years later, this money had only produced one B-2. Based on its
value per unit weight, one B-2 is worth three times its weight in gold. (See Modern Military, No. 8, 1998, p. 33, and Zhu Zhihao's Analysis of U.S. Stealth Technology Policy.) [15] The U.S. Dept. of Defense conducted an analysis of the 13 January 1993 air attack on Iraq and concluded that there are numerous limitations to high-tech weapons, and that the effect of the combined-effect bombs was at times better than that of precision bombs. (U.S., Aviation Week and Space Technology, 25 January 1993). [16] New-concept weapons primarily include kinetic-energy weapons, directed-energy weapons, subsonic weapons, geophysical weapons, meteorological weapons, solar energy weapons, and gene weapons, etc. (New Military Perspectives for the Next Century, Military Science Publishing House, 1997 edition, p. 3). [17] The point in substituting the concept of "ultra-lethal weapons" for the concept of "weapons of mass destruction" is to stress that the lethal power of such weapons exceeds the needs of warfare and represents a product of man's extremist thinking. [18] The "kind" in "kinder weapons" mainly refers to the fact that such weapons reduce slaughter and collateral casualties. [19] The April 1993 issue of the British journal International Defense Review revealed that the United States was energetically researching a variety of non-lethal weapons, including optical weapons, high-energy microwave weapons, acoustic beam weapons, and pulsed chemical lasers. The 6 March 1993 issue of Jane's Defense Weekly reported that a high-level non-lethal weapons steering committee at the Dept. of Defense had formulated a policy regulating the development, procurement, and use of such weapons. In addition, according to the 1997 World Military Yearbook (pp. 521-522), the U.S. Dept. of Defense has established a "non-lethal weapons research leading group," whose goal is to see that non-lethal weapons appear on the weapons inventory as soon as possible.
[20] See Military Science Publishing House Foreign Military Data, 26 March 1993, No. 27, p 3.

Chapter 8: Essential Principles [pp 223-240 in original]

"Principles are a code of conduct, but not an absolute one." - George Kennan

In the history of warfare, the first person credited with using principles to regularize methods of fighting should be Sun Tzu. Principles which he advocated, such as "know the enemy and yourself and in a hundred battles you will never be defeated," "strike where the enemy is not prepared, take him by surprise," and "avoid the solid and strike the weak," are still articles of faith for modern strategists. But in the West, 2,400 years later, Napoleon would reveal his real desire to the world-famous Saint-Cyr Military Academy, which would one day emblazon his name above its main doorway: "To write a book, describing the principles of war precisely, and provide it to all soldiers." Unfortunately, when he was fighting and winning wars he had no time to write, and after he was defeated he was no longer in the mood. For a marshal who achieved nearly 100 victories in his lifetime, this should be neither too big nor too small a regret. But having been born a great man, it was enough for him to leave behind a brilliant record of victories for posterity to scour in search of his path to victory. A hundred years afterwards, from the wars directed by this old enemy who inspired dread in the British both in life and after death, a British general by the name of J.F.C. Fuller derived five principles for directing modern wars. [1] All of the West's principles of modern warfare are descended from these. Although the later military regulations of quite a few countries and several military theorists proposed this or that as a principle of war, all of those things differ only in minor ways from those originated by Fuller. [2] This is because, from the beginning of the Napoleonic wars until the Gulf War, apart from the continual increase in lethality and destructiveness, there was no reason for an essential change in the nature of war itself. Now the situation has changed, because of all that happened during and after the Gulf War.
The introduction of precision-guided weapons, non-lethal weapons, and non-military weapons has
derailed warfare from its mad dash down the track toward increased lethality and destructiveness. Events have set in motion the first change of course since the dawn of history. This has laid a new track for war in the next century, and given rise to principles with which professional military people are unfamiliar. No principle can rest on a flimsy platform waiting to collapse. This is even more true of principles of war. Regardless of which military thinker produced them, or whichever military headquarters regulations they come from, the principles are all undoubtedly the product of repeated tempering in the furnace and on the anvil of war. If there had been no wars in the Spring and Autumn period, there would be no principles of Sun Tzu. If there had been no Napoleonic wars, there would be no principles of Fuller. In the same way, if there had been no large and small military, quasi-military, and even non-military wars throughout the world before and after the Gulf War, then there would be no proposals for new concepts such as the Americans' "full-dimensional operations" and our "beyond-limits combined war." And of course, the principles of war which emerge with these concepts would be out of the question. While we are truly sorry that "full-dimensional operations" theory died on the vine, we are resolved that "beyond-limits combined war" will not be confined to the level of theoretical speculation. Instead, we want to see it incorporated into combat methods with practical application. Even though the intent of the "beyond limits" ideology which we advocate is to break through all restrictions, nevertheless there is one constraint which must be strictly observed, and that is to abide by essential principles when carrying out combat actions. Only in some exceptional situations should a principle itself be broken. When deep thought about the rules of warfare congeals into some type of combat method, a principle is born along with it.
Whether or not these combat methods and principles, as yet untested in a new round of wars, can become road signs pointing the way to the next victory is still very hard to say. But the proposal of essential principles is no doubt an indispensable theoretical process for perfecting a combat method. Here is a gyroscope; let it dance for us.

Let's have a look at the principles below and see what they can bring to "beyond-limits combined war":

Omnidirectionality
Synchrony
Limited objectives
Unlimited measures
Asymmetry
Minimal consumption
Multidimensional coordination
Adjustment and control of the entire process

Omnidirectionality—360° Observation and Design, Combined Use of All Related Factors

"Omnidirectionality" is the starting point of "unrestricted war" ideology and is a cover [fugai mian 6010 5556 7240] for this ideology. As a general principle of war, the basic demands it makes on the prosecutor of a war are to give all-round consideration to all factors related to "this particular" war, and, when observing the battlefield or a potential battlefield, designing plans, employing measures, and combining the use of all war resources which can be mobilized, to have a field of vision with no blind spots, a concept unhindered by obstacles, and an orientation with no blind angles. In terms of beyond-limits warfare, there is no longer any distinction between what is or is not the battlefield. Spaces in nature, including the ground, the seas, the air, and outer space, are battlefields, but social spaces such as the military, politics, economics, culture, and the psyche are also battlefields. And the technological space linking these two great spaces is even more so the battlefield over which all antagonists spare no effort in contending. [3] Warfare can be military, or it can be quasi-military, or it can be non-military. It can use violence, or it can be nonviolent. It can be a confrontation between professional soldiers, or one between newly
emerging forces consisting primarily of ordinary people or experts. These characteristics of beyond-limits war are the watershed between it and traditional warfare, as well as the starting line for new types of warfare. As a very strong principle applicable to actual warfare, omnidirectionality applies to each level of beyond-limits combined war [described in Chapter 7]. At the war policy level, it applies to the combined use of a nation's entire combat power, up to supra-national combat power, in an intercontinental or worldwide confrontation. At the strategic level, it applies to the combined use in warfare of national resources which relate to military objectives. At the operational level, it applies to the combined use on a designated battlefield of various kinds of measures, and mainly an army or force of that scale, to achieve campaign objectives. And at the tactical level, it applies to the combined use of various kinds of weapons, equipment, and combat methods, and mainly one unit or a force of that scale, to execute a designated mission in a battle. It must be kept in mind that all of the above combinations must also include intersecting combinations among the respective levels. Finally, it must be made clear that the scope of combat operations in each specific war will not always expand over all spaces and domains, but the first principle of beyond-limits combined war is to ponder omnidirectionality and grasp the combat situation.

Synchrony—Conducting Actions in Different Spaces within the Same Period of Time

The technical measures employed in modern warfare, and in particular the spread of information technology; the emergence of long-range warfare technology; the increased ability to transform the battlefield; the linking together of battlefields which stretch forever, are scattered, or are different in nature; and the introduction of various military and non-military forces on an equal footing into the war — all these things greatly shrink the course of warfare. Many objectives which in the past had to be accomplished in stages, through an accumulation of battles and campaigns, may now be accomplished quickly under conditions of simultaneous occurrence,
simultaneous action, and simultaneous completion. Thus, stress on "synchrony" in combat operations now exceeds the stress on "phasing." [4] Taking as a given the requirement for thorough planning, beyond-limits war brings key factors of warfare which are dispersed in different spaces and different domains to bear in the same, designated space of time. These factors revolve around the objectives of the war, executing a well-arranged team-effort and combined attack to achieve surprise, secrecy, and effectiveness. A single full-depth, synchronized action may be just one short beyond-limits combat operation, but it may be enough to decide the outcome of an entire war. What we mean by "synchrony" here is not "simultaneity," differing by not even a second, but rather "within the same time period." In this sense, beyond-limits war is worthy of the name "designated time warfare." Using this as a standard, the armed force whose military capabilities most nearly reach this level is that of the Americans. Given its current equipment and technology, one of the U.S. military's information campaign systems [xinxi zhanyi xitong] can within one minute provide data on 4,000 targets to 1,200 aircraft. In addition to this is the extensive use of long-range attack weapons systems. This has led to a proposal for a "full-depth simultaneous attack" operations ideology. In terms of space, the U.S. military is starting to abandon the pattern of actions with a gradual push from the periphery towards the depth, and in terms of time, it is abandoning the obsolete combat model of sequential actions. However, judging from some documents openly published by the military, the Americans' line of thought in this regard so far is still confined to the scope of military action, and they have been unable to expand it to battlefields beyond the military sphere. [5]

Limited Objectives—Set a Compass to Guide Action within an Acceptable Range for the Measures [Available]

Limited objectives means limited in relation to measures used. Thus, the principle of setting limited objectives means that objectives must always be smaller than measures.
When setting objectives, give full consideration to the feasibility of accomplishing them. Do not pursue objectives which are unrestricted in time and space. Only with limits can they be explicit and practical, and only with limits can there be functionality. In addition, after accomplishing an objective, one will then have the resilience to go on and pursue the next. [6] When setting objectives, one must overcome the mentality of craving great successes, and instead consciously pursue limited objectives and eliminate objectives which are beyond one's abilities, even though they may be proper. This is because every objective which is achievable is limited. No matter what the reason, setting objectives which exceed the allowable limits of the measures available will only lead to disastrous consequences. The most typical illustration of expanding objectives is the mistake which MacArthur made in the Korean War. Subsequent to that are the similar mistakes committed by the Americans in Vietnam and the Soviets in Afghanistan, which prove that no matter what sort of action it is and no matter who is executing it, when objectives are greater than measures, defeat is certain. Not all of today's statesmen and strategists are clear on this point. The 1996 U.S. Department of Defense Report contains this premise from President Clinton: "As the world's most powerful nation, we have a leadership obligation, and when our interests and sense of values are subject to great danger we will take action." When he spoke those words, obviously even Clinton was unaware that national interests and sense of values are strategic objectives of two completely different scales. If we say that the former is an objective which American power can protect through action, the latter is neither an objective that its power can achieve nor one which the United States should pursue outside its own territory.
"World's number one," an ideology corresponding to "isolationism," always makes the Americans tend to pursue unlimited objectives as they expand their national power. But this is a tendency which in the end will lead to tragedy. A company which has limited resources but which is nevertheless keen to take on unlimited responsibilities is headed for only one possible outcome, and that is bankruptcy.

Unlimited Measures—The Trend is Toward Unrestricted Employment of Measures, but Restricted to the Accomplishment of Limited Objectives

We speak of unlimited measures as related to limited objectives. [7] The trend toward no limits is a trend toward continual enlargement of the range of selection and the methods of use of measures. It is not intemperate use of measures, and even less is it absolutist use of measures, or the use of absolute measures. Unlimited measures to accomplish limited objectives is the ultimate boundary. Measures are inseparable from objectives. For a measure to be unlimited means that, to accomplish some designated objective, one can break through restrictions and select among various measures. This is not to say that a measure can be separated from objectives and used however one likes. Atomic weapons, which can annihilate mankind, came to be viewed as absolute measures precisely because they violated the principle that a measure must serve to accomplish an objective. In the end, people laid them aside. The employment of unrestricted measures can only be, as Confucius put it, "as one pleases, but not beyond the rules." Here, "rules" means objectives. Beyond-limits ideology expands "as one pleases" the range of selection and the methods of use of measures, but this certainly does not mean expansion of objectives "as one pleases." It only means to employ measures beyond restrictions, beyond boundaries, to accomplish limited objectives. Conversely, a smart general does not make his measures limited just because his objectives are limited. That would very likely lead to failure on the verge of success. Thus, the limited must be pursued by way of the unlimited. Sherman's advance toward Savannah in the American Civil War was not in search of combat; it was to burn and plunder all along the way.
It was a measure used to destroy the economy in the southern army's rear area, to make the southern populace and the southern army lose the ability to resist, thus accomplishing the north's war objective. This is an example of the successful use of unlimited measures to achieve a limited objective.

In contrast to this example, in the fourth Mideast War [the Yom Kippur War, 1973], to accomplish the combat objective designated by its front-line generals, which was the occupation of the Sinai Peninsula, the battle plan of the Egyptian Army's Supreme Command was just to break through the Bar Lev Line and consolidate control of the Sinai. Egypt attempted to use limited measures to achieve a limited objective. The results are well known. Egypt lost its hold on victory when victory was in its very grasp. [8]

Asymmetry—Seek Nodes of Action in the Opposite Direction from the Contours of the Balance of Symmetry

"Asymmetry" [fei junheng 7236 0971 5899] as a principle is an important fulcrum for tipping the normal rules in beyond-limits ideology. Its essential point is to follow the train of thought opposite to the balance of symmetry [junheng duicheng 0971 5899 1417 4468], and develop combat action along that line. From force disposition and employment, to selection of the main combat axis and the center of gravity for the attack, all the way to the allocation of weapons, in all these things give two-way consideration to the effect of asymmetrical factors, and use asymmetry as a measure to accomplish the objective. No matter whether it serves as a line of thought or as a principle guiding combat operations, asymmetry manifests itself to some extent in every aspect of warfare. Understanding and employing the principle of asymmetry correctly allows us always to find and exploit an enemy's soft spots. The main fighting elements of some poor countries, weak countries, and non-state entities have all used "mouse toying with the cat"-type asymmetrical combat methods against much more powerful adversaries. In cases such as Chechnya vs. Russia, Somalia vs. the United States, Northern Ireland guerrillas vs. Britain, and Islamic Jihad vs. the entire West, without exception we see the consistent, wise refusal to confront the armed forces of the strong country head-to-head. Instead, the weaker side has contended with its adversary by using guerrilla war (mainly urban guerrilla war) [9], terrorist war, holy war, protracted war, network war, and other
forms of combat. Mostly the weaker side selects as its main axis of battle those areas or battle lines where its adversary does not expect to be hit. The center of mass of the assault is always a place which will deliver a huge psychological shock to the adversary. This use of asymmetrical measures, which create power for oneself and make the situation develop as you want it to, is often hugely effective. It often leaves an adversary which uses conventional forces and conventional measures as its main combat strength looking like a big elephant charging into a china shop: at a loss as to what to do, and unable to make use of the power it has. Apart from the effectiveness it displays when used, asymmetry in itself is a rule of action suggested by the golden rule. Of all rules, this is the only one which encourages people to break rules so as to use rules. It is also an effective prescription for the chronic illness of methodical and well-balanced thought.

Minimal Consumption—Use the Least Amount of Combat Resources Sufficient to Accomplish the Objective

The principle of minimal consumption is: first, that rationality is more important than thrift [10]; second, that the size of combat consumption is decided by the form of combat [11]; and third, that one uses "more" (more measures) to pursue "less" (lower consumption). Rationality involves two aspects, the rational designation of objectives and the rational use of resources. Rational designation of objectives, besides specifying objectives that fall within the circle of the measures to be used, also refers to the need to compress the objectives' load, making them as simple and concise as possible. Rational use of resources obviously means using the most appropriate method to accomplish an objective, and not just imposing a single-minded requirement to economize. Economizing, that is, using the minimum amount of resources, has meaning only if the prerequisites for accomplishing an objective are met. More important than perfect familiarity with principles is how the principles are applied. Whether or not the minimum amount of combat resources is used to accomplish an objective
depends on what form of combat operation is selected. The Verdun campaign is called a meat grinder by war historians, because both sides waged a senseless war of attrition. By contrast, the reason Germany was able to sweep away the joint British-French force after crossing the Maginot Line was that it combined the shortest length of time, the optimum route, and the most powerful weapons in a blitzkrieg. So it can be seen that the key to truly achieving "minimal consumption" is to find a combat method which makes rational use of combat resources.

Today, with objectives and the measures to accomplish them assuming many complex forms as never before, confronting a complex objective in just one sphere and with just one measure will definitely fall short of the mark. The result of a mismatch between measures and objectives is inevitably high consumption and low effectiveness. The line of thought leading out of these difficulties is to use "more" to attain "less." That is, to combine the superiorities of several kinds of combat resources in several kinds of areas to create a completely new form of combat, accomplishing the objective while at the same time minimizing consumption.

Multidimensional Coordination—Coordinating and Allocating All the Forces which can be Mobilized in the Military and Non-Military Spheres Covering an Objective

"Multidimensional" here is another way of saying multiple spheres and multiple forces. It has nothing to do with the definition of dimensionality in the sense of mathematics or physics. "Multidimensional coordination" refers to coordination and cooperation among different forces in different spheres in order to accomplish an objective. On the face of it, this definition is not at all novel. Similar explanations are to be found in many combat regulations, both obsolete and newly published. The only difference between it and similar explanations is, and this is a great difference, the introduction of non-military and non-war factors into the sphere of war directly rather than indirectly. In other words, since any sphere can become a battlefield, and any force can be used under combat conditions, we should be more inclined to understand multidimensional coordination as the coordination of the military dimension with various other
dimensions in the pursuit of a specific objective. It is not the case that in all wars military action must be considered the primary form of action. With warfare facing the equalization of the various dimensions, this concept will become a formula for addressing the questions of future wars. [12]

The concept of multidimensional coordination can only be established within the context of a specific objective. Without an objective, we cannot speak of multidimensional coordination. But the size of an objective determines the breadth and depth of the coordination of each dimension. If the set objective is to win a war at the war policy level, the spheres and forces needing coordination may involve the entire country, or may even be supra-national. From this we can generalize that in any military or non-military action, no matter what the depth of the spheres and the quantity of forces it involves, coordination among the various dimensions is absolutely necessary. This certainly does not imply that in each action the more measures mobilized the better. Instead, the limit is what is necessary. The employment of an excessive or an insufficient amount in each dimension will only cause the action to sway between edema and shriveling, and finally the objective itself will be in jeopardy. The bit of Eastern wisdom, "going beyond the limit is as bad as falling short," is helpful to our understanding and our application of this principle.

In addition, we urgently need to expand our field of vision regarding forces which can be mobilized, in particular non-military forces. Besides paying attention, as in the past, to conventional material forces, we should also pay particular attention to the employment of intangible "strategic resources" such as geographical factors, the role of history, cultural traditions, sense of ethnic identity, dominating and exploiting the influence of international organizations, etc. [13] But this is still not enough.
In applying this principle we must also come up with beyond-limits action, and to the greatest extent possible make multidimensional coordination a commonplace move in ordinary operations, and bring about interlocking, gradational combinations at every level from war policy to tactics.


Adjustment and Control of the Entire Process—During the Entire Course of a War, from its Start, through its Progress, to its Conclusion, Continually Acquire Information, Adjust Action, and Control the Situation

Warfare is a dynamic process full of randomness and creativity. Any attempt to tie a war to a set of ideas within a predetermined plan is little short of absurdity or naivete. Therefore, it is necessary to have feedback and revisions throughout the entire course of a war while it is actually happening, in order to keep the initiative within one's grasp. This is what is meant by "adjustment and control of the entire process." Because of the addition of the principle of synchrony, we cannot understand the adjusted and controlled "entire course" to be a prolonged one. With modern, high-tech measures, this process may take the blink of an eye. As we said before, the time it takes to fight one battle can be sufficient to wind up a whole war. This may make the entire course of a war extremely short, and incidentally make adjusting and controlling it much more difficult. Today, with information technology welding the entire world together into a network, the number of factors involved in a war is much, much greater than in past wars. The ability of these factors to cloud the issues of war, and their intense influence on war, means that loss of control over any one link can be like the proverbial loss of a horseshoe nail which led to the loss of an entire war. [14] So, faced with modern warfare and its bursts of new technology, new measures, and new arenas, adjustment and control of the entire process is becoming more and more of a skill, not a kind of technology. What is needed to grasp the ever-changing battlefield situation is greater use of intuition, rather than mathematical deduction.
More important than constant changes in force dispositions and continual updating of weapons is the whole set of combat rules which are the result of the shift of the battlefield to non-military spheres. The outcome of all this is that one will be sent to an unexplored battlefield to wage an unfamiliar war
against an unknown enemy. Nevertheless, one must adjust and control this entire unfamiliar process if he is to win. "Beyond-limits combined war" is precisely this use of strange, completely new methods of combat to wage war.

All of the above principles are applicable to any beyond-limits combined war. Victory is certainly not in the bag just because a side adheres to the above principles, but violating them no doubt leads to defeat. Principles are always essential conditions for victory in war, but they are not the only conditions. There is no principle that guarantees victory; there are only essential principles. We should always remember this point.

Footnotes

[1] The five principles which Fuller summarized from the Napoleonic wars are attack, maneuver, surprise, concentration, and support. Besides this, following the views of Clausewitz, Fuller also derived seven principles similar to those of the Napoleonic wars: maintain the objective, security of action, mobile action, exhaust the enemy's offensive capability, conserve forces, concentrate forces, and surprise. These principles became the foundation of modern military principles. (See "The Writings of Fuller" in Zhanzheng Zhidao (Combat Command), Liberation Army Publishing House, pp. 38-60.)

[2] An example is the U.S. Army's nine main military principles: objective, offensive, concentration, economy of force, mobility, security, surprise, simplicity, and unity [of command]. These are very similar to the principles of war of the Napoleonic era.

[3] The battlefield of beyond-limits war differs from those of the past in that it encompasses all natural spaces, such as the social realm, and the continually developing sphere of technology where space is now measured in nanometers. Today, these spaces are interlocked with each other. For example, outer space can be seen as a natural space, and also as a technological space,
because each step in the militarization of outer space requires a technological breakthrough. In the same way, the interdynamics between society and technology are to be seen constantly. There is no more typical example of this than the effect of information technology on society. From these things we can see that the battlefield is ubiquitous, and we can only look upon it with "omnidirectionality."

[4] Wars in the past involved, in terms of space, forces charging from boundary areas into depths, and in terms of time, division into phases. By contrast, in terms of space, beyond-limits war instead goes straight to the core, and in terms of time it is "synchronous" and will often no longer be characterized by phases.

[5] [Footnote not marked in original text, but assumed to belong here] There is no more typical example of this than the four principles in the U.S. military's Joint Vision 2010: "dominant maneuver, precision engagement, focused logistics, full-dimensional protection." All of these proposed new principles are for military warfare.

[6] Setting limited objectives is not a matter of whether or not one is constrained subjectively, but rather whether or not restricted measures are exceeded. Measures are "restrictions" which cannot be exceeded when setting objectives.

[7] For details, see How Great Generals Win by Bevin Alexander, pp. 101-125.

[8] Before the Fourth Mideast War, the Egyptian "Baierde Plan" [inaccurate Chinese phonetic for "Badr"? (the war began on the anniversary of the Battle of Badr, 626 A.D.)] was divided into two steps. The first step consisted of forced crossings of the Suez Canal, breaking through the Bar Lev Line, and taking control of a 15-20 km [deep] area of the east bank of the canal. The second step was to attack and capture a line running from the Mitla Pass to the Giddi Pass to the Khatima Pass, guarantee the security of the east bank of the canal, and then expand into the enemy's depth as the situation warranted.
But in actual combat, as soon as the Egyptian Army crossed the canal it went on the defensive. It was five days before it resumed its offensive, and this gave the Israeli Army an opportunity to catch its breath.


[9] The famous researcher of the development of capitalist society, Buluodaier [Fernand Braudel? 1580 5012 0108 1422], placed particular emphasis on the "organizational usefulness" of large cities in the capitalist world. Despite its big size, this world nevertheless has a number of fulcrums, central cities such as New York, London, Tokyo, Brussels, and maybe Hong Kong. If these were attacked simultaneously or if guerrilla war broke out there simultaneously, it would leave the world in chaos. (The Motive Force of Capitalism, Buluodaier [Fernand Braudel?], Oxford Press)

[10] Military principles have always included [the concept] "economize," mainly referring to the need to pay attention to controlling the consumption of manpower and materiel during wartime. In beyond-limits warfare, "rational usage" is the only correct [way to] economize.

[11] Beyond-limits war allows for a great deal of leeway in the selection of the forms of combat. Naturally there is a big difference between the cost of conventional military warfare and warfare in which finance plays the leading role. Therefore, the cost of a future war depends mainly on what form of warfare is selected.

[12] The most important [step toward] equality among various dimensions is to overcome the concept that "the military is supreme." In future wars, military measures will only be [considered] one of the conventional options.

[13] In this regard, China is richly endowed by nature. A long cultural tradition, peaceful ideology, no history of aggression, the strong economic power of the Chinese people, a seat on the United Nations Security Council, etc., all these things are important "strategic resources."

[14] In modern warfare, fortuitous factors influence the outcome of wars just as they did in antiquity. If a fuse in a command center's computer were to get too hot and burn out at a critical moment, this could lead to a disaster. (This is entirely possible.
It was a factor in a mistaken attack by an F-16 over the Gulf. It happened because the electrical circuit in the "friend or foe device" aboard a Blackhawk helicopter frequently overheated, and the aviators would occasionally switch it off to lower the temperature.) This is perhaps the modern version of the
loss-of-a-horseshoe story. For this reason, then, "adjustment and control" must continue "through the entire course."


Conclusion [pp. 241-247 in original]

"Computerization and globalization...have produced several thousand global enterprises and tens of thousands of international and inter-government organizations." — E. Laszlo

"Mankind is making progress, and no longer believes that war is a potential court of appeals." — Bloch

At a time when man's age-old ideal of "the family of man" is used by IBM in an advertisement, "globalization" is no longer the prediction of futurists. An era in which we are impelled by the great trend of technological integration that is plastered all over with information labels, agitated by the alternately cold and warm ocean currents from the clash and fusion of civilizations, troubled by local wars rising first here then there and by domino-like financial crises and the ozone hole over the South Pole, and which causes everyone, including the futurists and visionaries, to feel strange and out of place - [such an era] is in the process of slowly unfolding between the dusk of the 20th century and the dawn of the 21st century.

Global integration is comprehensive and profound. Through its ruthless enlightenment, those things which must inevitably be altered or even dispelled are the positions of authority and interest boundaries in which nations are the principal entities. The modern concept of "nation states" which emerged from the Peace of Westphalia [1] in 1648 is no longer the sole representative occupying the top position in social, political, economic and cultural organizations. The emergence of large numbers of meta-national, trans-national, and non-national organizations, along with the inherent contradictions between one nation and another, are presenting an unprecedented challenge to national authority, national interests, and national will. [2]

At the time of the emergence of the early nation states, the births of most of them were assisted by blood-and-iron warfare. In the same way, during the transition of nation states to
globalization, there is no way to avoid collisions between enormous interest blocs. What is different is that the means that we have today to untie the "Gordian Knot" [3] are not merely swords, and because of this we no longer have to be like our ancestors who invariably saw resolution by armed force as the last court of appeals. Any of the political, economic, or diplomatic means now has sufficient strength to supplant military means.

However, mankind has no reason at all to be gratified by this, because what we have done is nothing more than substitute bloodless warfare for bloody warfare as much as possible. [4] As a result, while constricting the battlespace in the narrow sense, at the same time we have turned the entire world into a battlefield in the broad sense. On this battlefield, people still fight, plunder, and kill each other as before, but the weapons are more advanced and the means more sophisticated, so while it is somewhat less bloody, it is still just as brutal.

Given this reality, mankind's dream of peace is still as elusive as ever. Even speaking optimistically, war will not be wiped out rapidly within the foreseeable future, whether it is bloody or not. Since things which should happen will ultimately come to pass, what we can and must focus on at present is how to achieve victory.

Faced with warfare in the broad sense that will unfold on a borderless battlefield, it is no longer possible to rely on military forces and weapons alone to achieve national security in the larger strategic sense, nor is it possible to protect these stratified national interests. Obviously, warfare is in the process of transcending the domains of soldiers, military units, and military affairs, and is increasingly becoming a matter for politicians, scientists, and even bankers. How to conduct war is obviously no longer a question for the consideration of military people alone.
As early as the beginning of this century, Clemenceau stated that "war is much too serious a matter to be entrusted to the military." However, the history of the past 100 years tells us that turning over warfare to the politicians is not the ideal way to resolve this important issue, either. [5] People are turning to technical civilization, hoping to find in technological developments a valve which will control war. But what makes people despair is that the entire century is just about gone, and while technology has made great strides, war still remains an unbroken mustang. People still
expect wonders from the revolution in military affairs, hoping that high-tech weapons and nonlethal weapons can reduce civilian and even military casualties in order to diminish the brutality of war. However, the occurrence of the revolution in military affairs, along with other revolutions, has altered the last decade of the 20th century. The world is no longer what it was originally, but war is still as brutal as it has always been. The only thing that is different is that this brutality has been expanded through differences in the modes in which two armies fight one another. Think about the Lockerbie air disaster. Think about the two bombs in Nairobi and Dar es Salaam. Then think about the financial crisis in East Asia. It should not be difficult to understand what is meant by this different kind of brutality. This, then, is globalization. This is warfare in the age of globalization. Although it is but one aspect, it is a startling one.

When the soldiers standing at the crossroads of the centuries are faced with this aspect, perhaps each of them should ask himself, what can we still do? If those such as Morris, bin Laden, and Soros can be considered soldiers in the wars of tomorrow, then who isn't a soldier? If the likes of Powell, Schwartzkopf, Dayan, and Sharon can be considered politicians in uniform, then who isn't a politician? This is the conundrum that globalization and warfare in the age of globalization have left for the soldiers.

Although the boundaries between soldiers and non-soldiers have now been broken down, and the chasm between warfare and non-warfare nearly filled up, globalization has made all the tough problems interconnected and interlocking, and we must find a key for that. The key should be able to open all the locks, if these locks are on the front door of war.
And this key must be suited to all the levels and dimensions, from war policy, strategy, and operational techniques to tactics; and it must also fit the hands of individuals, from politicians and generals to the common soldiers. We can think of no other more appropriate key than "unrestricted warfare."


Footnotes

[1] The general term for the European agreement of 1648. This brought an end to the Eighty Years' War between Spain and Holland and the Thirty Years' War in Germany, and it is also seen as laying the foundation for all the treaties concluded up to the breakup of the Holy Roman Empire in 1806.

[2] The state's position as the ultimate entity is being challenged from various quarters, and the thing that is most representative, as well as most worrisome, is that the state's monopoly on weapons is being seriously challenged. According to the views of Earnest Jierna [as published 0679 1422 4780] in Nationality and Nationalism, a state is defined as the only entity that can use force legally. According to a 1997 public opinion survey by Newsweek magazine in the United States regarding "where the threat to security will come from in the 21st century," 32 percent believed it would come from terrorism, 26 percent believed that it would be international crime and drug trafficking groups, 15 percent believed it would be racial hatred, with nation states only coming in fourth. In a small pamphlet that the U.S. Army has put on the Web, but which has not been published (TRADOC PAMPHLET 525-5: FORCE XXI OPERATIONS), the non-nation forces are clearly listed as "future enemies," saying that "non-nation security threats, using modern technologies that give them capabilities similar to those of nation states, have become increasingly visible, challenging the traditional nation state environment. Based on the scope involved, these can be divided into three categories. (1) Subnational. Subnational threats include political, racial, religious, cultural, and ethnic conflicts, and these conflicts challenge the defining features and authority of the nation state from within. (2) Anational. Anational threats are unrelated to the countries they belong to. These entities are not part of a nation state, nor do they desire to establish such a status. Regional organized crime, piracy, and terrorist activities comprise these threats.
(3) Metanational. Metanational threats transcend nation state borders, operating on an interregional or even global scale. They include religious movements, international criminal organizations, and informal economic organizations that facilitate weapons proliferation. See The World Map in the Information Age, Wang Xiaodong, Chinese People's University Press, 1997, pp. 44-46. The U.S. military does not treat transnational companies which seize monopolistic profits as security threats; in addition to their deeply-rooted awareness of economic freedom, this is also related to the fact that they still limit threats to the military arena. Transnational companies such as Microsoft and Standard Oil-Exxon, whose wealth rivals that of nations, may also constitute real threats to national authority, and can even have a serious impact on international affairs.

[3] Legend has it that after Alexander the Great led his army into the interior of Asia Minor, he went to worship in the temple of Zeus in the city of Gordium. In the temple there was a wagon which had formerly belonged to Midas, king of Phrygia. It was secured very tightly by a jumbled cord, and it was said that no one had been able to untie it. Faced with this, Alexander pondered for a moment, then suddenly pulled out his sword and severed it at one stroke. From this, "Gordian knot" has come to be another term for intractable and complex problems.

[4] In future wars, there will be more hostilities like financial warfare, in which a country is subjugated without spilling a drop of blood. Think about it for a moment. What would the disastrous impacts have been on the economies of Hong Kong and even China if the August 1998 battle to protect Hong Kong's finances had failed?
Furthermore, such situations are by no means impossible, and if it had not been for the collapse of the Russian financial market, which caused the financial speculators to be under attack from the front and the rear, it is still hard to predict how things would have turned out. [5] Regardless of whether we are talking about Hitler, Mussolini, Truman, Johnson, or Saddam, none of them have successfully mastered war. This also includes Clemenceau himself.


Afterword [pp 253-254 in original]

[FBIS Translated Text] The motives for writing this book originated from military maneuvers which caught the attention of the world. Three years ago, due to our participation in the maneuvers, Xiangsui and I met in a small city in Fujian called Zhao An. At the time, the situation was becoming daily more tense on the Southeast coast, both sides of the straits were all set for a showdown, and even a task force of two American aircraft carriers rushed a long way to add to the trouble. At that time, the storm was brewing in the mountains and the military situation was so pressing that people were suddenly moved to "think up strategies when facing a situation." We therefore decided to write this book, a book which would be able to concentrate the concerns and thoughts each of us had over the past several decades, and especially during the last ten years, concerning military issues.

There is no way of relating in detail how many telephone calls we made, how much mail was sent, and how many nights we stayed awake over the next three years, and the only thing which can serve as evidence for all of this is this small and thin book.

We must first apologize to readers for the fact that, even though we were very conscientious and toiled painstakingly in the writing of this book, after the words reflecting our ideas were set down, much like shooting stars traveling across the sky and cooling into meteorites, all of you (including ourselves) will still be able to find many mistakes and places which are inappropriate. We shall not employ the apologetic words of "We request your kind solicitude" to seek forgiveness but shall rather only make corrections in the second edition (if there is one).
Upon the occasion of the publication of this book, we would like to sincerely thank Chief-of-Staff Cheng Butao and Assistant Chief-of-Staff Huang Guorong of the PLA Literature and Arts Publishing House for their unswerving support, thanks to which this book was able to be published within such a short period of time. We would also like to thank Xiang Xiaomi, Director of the First Book Editing Department. She has carefully and rigorously proofread the
entire book as she had done with the other four books which we have edited, and provided many very valuable recommendations. We do not know any better way of expressing our thanks aside from the deep gratitude which we feel. Lastly, we would also like to thank our families for the sacrifices they made towards the completion of this book, and this is again something which cannot be expressed in words.

The entire book was completed in manuscript form between March 2 and December 8 of 1998 in Gongzhufen - Baizhifang in Beijing. [Written on February 1, 1999]


AUTHORS' BACKGROUND

Qiao Liang [0829 5328], whose ancestors came from Hunan Province, was born in Xin [1823] County, Shanxi Province, to a military family in 1955. He is a member of the Chinese Writers' Union. Presently, he is assistant director of the production office of the air force's political department and holds the rank of senior colonel in the air force, along with being a grade one [yi ji 0001 4787] writer. His most important works include Gate to the Final Epoch [Mori Zhi Men 2608 2480 0037 7024]; Spiritual Banner [Ling Qi 7227 4388]; and Great Glacial River [Da Ring He 1129 0393 3109]. He has repeatedly won national and military awards. In addition to his literary creations, he has applied himself over a long period of time to the research of military theory and joined with other writers to pen A Discussion of Military Officer Quality [Junguan Suzhi Lun 6511 1351 4790 6347 6158]; Viewing the Global Military Big Powers [Shijie Junshi Lieqiang Bolan 0013 3954 6511 0057 0441 1730 0590 6031]; and A Listing of the Rankings of Global Military Powers [Quanqiu Junli Paihang Bang 0356 3808 6511 0500 2226 5887 2831].

Wang Xiangsui [3769 3276 4482] was born in Guangzhou to a military family in 1954. He joined the army at the end of 1970. He successively assumed the positions of political instructor, group political commissar, section deputy head, regiment political commissar, and division deputy political commissar. Presently, he works in the Guangzhou Military Region Air Force Political Unit and holds the rank of senior colonel. He has cooperated with other authors to write the books A Discussion of Military Officer Quality; Viewing the Global Military Powers; and A Record of Previous Major Global Wars [Shijie Lici Dazhan Lu 0013 3954 2980 2945 1129 2069 6922].



[5] INTERNATIONAL HUMANITARIAN LAW AND THE CHALLENGES OF CONTEMPORARY ARMED CONFLICTS International Committee of the Red Cross

TABLE OF CONTENTS

I. Introduction

II. The Notion and Typology of Armed Conflicts

III. The Interplay between IHL and Human Rights Law

IV. The Protective Scope of IHL: Select Issues
1) Humanitarian Access and Assistance
2) The Law of Occupation
3) IHL and Multinational Forces
4) Private Military and Security Companies

V. Means and Methods of Warfare
1) New Technologies of Warfare
2) The Use of Explosive Weapons in Densely Populated Areas
3) Direct Participation in Hostilities
4) The Arms Trade Treaty

VI. The Conflation of IHL and the Legal Framework Governing Terrorism


The Applied Ethics of Emerging Military and Security Technologies 31IC/11/5.1.2

EXECUTIVE SUMMARY

This is the third report on "International Humanitarian Law (IHL) and the Challenges of Contemporary Armed Conflicts" prepared by the ICRC for an International Conference of the Red Cross and Red Crescent, the first two having been submitted to the 28th and 30th International Conferences in Geneva, in December 2003 and November 2007, respectively. These reports aim to provide an overview of some of the challenges posed by contemporary armed conflicts for IHL, to generate broader reflection on those challenges and to outline ongoing or prospective ICRC action, positions and interest.

This report, like its predecessors, can only address a part of the ongoing challenges to IHL. The ICRC therefore selected a number of issues which were not addressed in previous reports but which are increasingly the focus of interest among States and for the ICRC, such as new technologies in warfare or the drafting of the new Arms Trade Treaty. At the same time, the report seeks to give an update on some of the issues that had been addressed in the previous reports and remain of ongoing interest.

Another report submitted to the Conference, entitled "Strengthening Legal Protection of Victims of Armed Conflict", summarizes the results of a process of research and reflection conducted by the ICRC since 2008 on the adequacy of existing IHL to protect the victims of contemporary armed conflicts. The analysis identifies four areas of IHL in which, in the view of the ICRC, humanitarian concerns are not adequately addressed by existing IHL and where IHL should therefore be strengthened - namely the protection of detainees, of internally displaced persons and of the environment in armed conflict, and the mechanisms of compliance with IHL.

The introduction to this report provides a brief overview of current armed conflicts and of their humanitarian consequences, and thus of the operational reality in which challenges to IHL arise.

Chapter II focuses on the notion and typology of armed conflicts, issues that have been the subject of ongoing legal debate over the past several years. It addresses, inter alia, the question of criteria for the determination of an international armed conflict (IAC) and of whether the IHL classification of armed conflicts into international and non-international is sufficient to encompass the types of conflicts taking place today. It also provides a typology of non-international armed conflicts (NIAC) governed by Common Article 3 of the 1949 Geneva Conventions, and examines the application, as well as the applicability, of IHL to contemporary forms of armed violence.

Chapter III is devoted to the interplay of IHL and human rights law, which is an area of abiding legal interest because of the practical consequences that this relationship may have on the conduct of military operations. It first provides a general overview of some of the differences between IHL and human rights law, highlighting, in particular, the differences in the binding nature of IHL and human rights law on organized non-state armed groups. It then discusses the specific interplay between these two branches of international law in relation to detention, and the use of force, in international and non-international conflicts, respectively. The extraterritorial targeting of persons is also briefly dealt with.

The first section of Chapter IV, on the protective scope of IHL, aims to highlight a range of issues related to humanitarian access and assistance, including the legal framework applicable to humanitarian action, as well as the practical constraints that may hamper the



delivery of humanitarian relief.

The section on the law of occupation discusses salient legal questions that have arisen in the application of this part of IHL, such as when occupation begins and ends, the rights and duties of an occupying power, the use of force in occupied territory, and the applicability of occupation law to UN forces. These questions were, among others, explored in an expert process organized by the ICRC between 2007 and 2009, and resulted in the preparation of a report that will be published before the end of 2011.

The next section is devoted to IHL and multinational forces, whether under UN auspices or otherwise, and examines the legal challenges posed in the spectrum of operations in which such forces may be involved. Relevant queries pertain, inter alia, to the applicability of IHL to such forces, the legal classification of situations in which they take part, detention by multinational forces, interoperability, and others. It is submitted that multinational forces, regardless of their specific mandate, are bound by IHL when the conditions for its application have been met.

The last section of the Chapter sheds light on the humanitarian challenges posed by the use of private military and security companies and surveys recent and ongoing international initiatives aimed at ensuring that their activities comply with IHL and other relevant bodies of international law.

Chapter V, on means and methods of warfare, first addresses new technologies of warfare, including, in the first section, "cyber warfare". This section discusses the specificity of cyberspace as a potential war-fighting domain and the particular challenges posed by cyber operations to the observance of the IHL prohibitions of indiscriminate and disproportionate attacks, as well as the obligation to take feasible precautions in attack. The section also reviews some of the legal challenges posed by remote-controlled weapons systems, as well as automated and autonomous weapons systems.
It recalls that new technologies must abide by existing IHL rules, while recognizing that existing norms do not respond to all the legal and practical challenges posed by new technologies.

The use of explosive weapons in densely populated areas is the focus of the next section of the Chapter, which outlines both the human costs and the challenges of respecting IHL rules involved in the use of such weapons. As a result of these factors it is believed that explosive weapons with a wide impact area should generally not be used in densely populated areas.

The section on direct participation in hostilities recapitulates the process leading to, and the main recommendations contained in, the ICRC's Interpretive Guidance on the Notion of Direct Participation in Hostilities under IHL, which was published in 2009 and reflects solely the ICRC's views. Different positions expressed in relation to some of the recommendations made in the Interpretive Guidance are also briefly discussed.

The last section of the Chapter examines the process leading to current work within the UN on drafting an Arms Trade Treaty, one of the most important objectives of which should be to reduce the human cost of the availability of weapons by setting clear norms for the responsible transfer of conventional arms and their ammunition. The ICRC supports the elaboration of a comprehensive, legally binding Arms Trade Treaty that establishes common international standards in this area.

Chapter VI addresses the current conflation of armed conflict and terrorism by unpacking the distinctions between the legal frameworks governing these types of violence. It elaborates on both the legal and policy effects of blurring armed conflict and terrorism, as well as the disadvantages caused by such blurring, particularly for the observance of IHL by non-state parties to NIACs. The practical effects of the conflation, that is, the potential for curtailing the work of humanitarian organizations in NIACs, are also discussed.


INTERNATIONAL HUMANITARIAN LAW AND THE CHALLENGES OF CONTEMPORARY ARMED CONFLICTS

I. INTRODUCTION

This is the third report on "International Humanitarian Law (IHL) and the Challenges of Contemporary Armed Conflicts" prepared by the ICRC for an International Conference of the Red Cross and Red Crescent, the first two having been submitted to the 28th and 30th International Conferences in Geneva, in December 2003 and November 2007, respectively. These reports aim to provide an overview of some of the challenges posed by contemporary armed conflicts for IHL, to generate broader reflection on those challenges and to outline ongoing or prospective ICRC action or interest. The goal of this section is to briefly outline the operational reality in which those challenges arise.

In the last four years, well over 60 countries were the theatre of armed conflicts - whether inter-state or non-international - with all the devastation and suffering that these entailed, chiefly among civilian populations. Indeed, civilians continued to be the primary victims of violations of IHL committed by both state parties and non-state armed groups. Recurring violations in hostilities include deliberate attacks against civilians, destruction of infrastructure and goods indispensable to their survival, and forcible displacement of the civilian population. Civilians have also suffered from indiscriminate methods and means of warfare, especially in populated environments. Fighters have not taken all feasible precautions - both in attack and against the effects of attack - as required by IHL, with the consequent unnecessary loss of civilian life and destruction of civilian property. Individuals deprived of their liberty have also been the victims of serious violations of IHL such as murder, forced disappearance, torture and cruel treatment, and outrages upon their personal dignity.
Women in particular have been victims of rape and other forms of sexual violence, in some contexts on a massive scale. Health care providers, services and facilities have come under direct attack or been severely obstructed in attempting to carry out their duties. Abuses of the protective emblems have also occurred, ultimately endangering all Movement actors in the accomplishment of their humanitarian mission. General insecurity in the field and the ensuing lack of access to non-state armed groups to gain acceptance and security guarantees, and often deliberate targeting or kidnapping of aid workers or of aid convoys, have prevented humanitarian assistance from reaching those in need, leaving the fate of tens of thousands of civilians uncertain.

Against this backdrop, some governments continue to deny that there are NIACs occurring within their territory and therefore that IHL applies, rendering difficult or impossible a dialogue with the ICRC on respect for their obligations under IHL. Certain governments have also been reluctant to acknowledge the need for the ICRC and other components of the Movement to engage non-state armed groups on issues relating to their security and access to victims, as well as to disseminate IHL and humanitarian principles, on the grounds that the armed groups in question are "terrorist organisations" or are otherwise outlaws.

In the intervening years since the last report, the ICRC has observed two main features of armed conflicts. The first is the diversity of situations of armed conflict, which range from contexts where the most advanced technology and weapons systems were deployed in asymmetric confrontations, to conflicts characterised by low technology and a high degree of fragmentation of the armed groups involved. While the last years have seen the emergence of a number of new IACs, including the recent conflict between Libya and a multinational coalition under NATO command, NIACs remained the predominant form of conflict.
This has been generated primarily by state weakness that



has left room for local militias and armed groups to operate, leading to environments where looting and trafficking, extortion and kidnapping have become profitable economic strategies sustained by violence and national, regional and international interests, with all the consequent suffering for civilians. These low-intensity conflicts are often characterised by brutal forms of predation and violence primarily targeting civilians, to instil fear, ensure control and obtain new recruits. Direct clashes between the armed groups and the governmental forces tend to be occasional.

Hostilities pitting non-state armed groups operating within populated areas against government forces using far superior military means were also a recurring pattern, exposing civilians and civilian dwellings to the confrontations taking place in their midst. The comingling of armed groups with civilians, in violation of IHL, has been used by some armies as a justification to bypass the taking of all possible precautions to minimise risks to civilians, as required by IHL. Urban warfare has posed particular challenges to government forces, which often continue to employ means and methods of warfare designed for use in an open battlefield and ill-adapted to populated environments, such as certain forms of air power and artillery. In this regard, the effects of the use of explosive force in populated areas on civilians and civilian structures, which in such environments have borne the brunt of the hostilities, have been of increasing concern.

Another notable trend of contemporary NIACs is that the lines of distinction between ideological and non-ideological confrontations have gradually blurred, with non-state armed groups arising from organised criminal activity. It must be recalled that, despite some views to the contrary, the underlying motivations of these groups are irrelevant to the legal determination of whether they are involved in a NIAC as defined by IHL.
The recent situations of civil unrest in North Africa and the Middle East have in contexts such as Libya degenerated into NIACs, pitting government forces against organised armed opposition movements. In other contexts such as Iraq and Yemen, civil unrest has occurred against the backdrop of pre-existing armed conflicts, thereby raising questions regarding which international legal framework - IHL or human rights rules and standards - governs particular events of violence. This crucial question has also been recurrent in many other situations of armed conflict around the globe.

The second main feature of armed conflicts in recent years has been their duration. In this regard, it is worth noting that the majority of ICRC operations are taking place in countries where the organisation has been present for two, three or four decades, such as, for example, Afghanistan, Colombia, DR Congo, Israel and the occupied territories, the Philippines, Somalia and Sudan. These enduring situations of armed conflict, which are often fuelled by economic motivations linked to access to natural resources, fluctuate between phases of high and low intensity and instability, without solutions for lasting peace. Some armed conflicts, such as that in Sri Lanka, have ended with the military victory of one party over the other, but this has been by far the exception rather than the rule. Few if any armed conflicts have been definitively resolved through peace negotiations, and in several cases armed conflicts have started up again between old foes despite ceasefires and peace agreements in place. Moreover, unresolved inter-state disputes have led to enduring situations of occupation governed by the Fourth Geneva Convention and customary IHL, although few if any occupying powers acknowledge the application to them of the law of occupation.
Unless political solutions are found to address the underlying causes of these prolonged occupations, they will continue to inflict dispossession, violence, and consequent suffering on the affected civilian populations.


II. THE NOTION AND TYPOLOGY OF ARMED CONFLICTS

Along with the increasing complexity of armed conflicts in practice - as described in the previous section - legal issues related to the notion and typology of armed conflicts have arisen over the past several years as well. In particular, questions have been asked about the adequacy: 1) of the current criteria for determining the existence of an IAC, 2) of existing armed conflict classifications, in particular the criteria for determining the existence of a NIAC, and 3) of the applicable IHL, as well as its applicability in certain cases.

1) Criteria for the Determination of an International Armed Conflict

Under IHL, IACs are those waged between states (or between a state and a national liberation movement provided the requisite conditions have been fulfilled1). Pursuant to Common Article 2 of the 1949 Geneva Conventions, they apply to all cases of declared war, or to "any other armed conflict which may arise" between two or more state parties thereto even if the state of war is not recognized by one of them.2 As explained by Jean Pictet in his commentaries to the four Conventions: "any difference arising between two States and leading to the intervention of armed forces is an armed conflict within the meaning of Article 2, even if one of the Parties denies the existence of a state of war. It makes no difference how long the conflict lasts, or how much slaughter takes place".3 In the decades since the adoption of the Conventions, duration or intensity have generally not been considered to be constitutive elements for the existence of an IAC.
This approach has recently been called into question by suggestions that hostilities must reach a certain level of intensity to qualify as an armed conflict, the implication being that the fulfilment of an intensity criterion is necessary before an inter-state use of force may be classified as an IAC. Pursuant to this view, a number of isolated or sporadic inter-state uses of armed force that have been described as "border incursions", "naval incidents", "clashes" and other "armed provocations" do not qualify as IACs because of the low intensity of violence involved, as a result of which states did not explicitly designate them as such.

It is submitted that, in addition to prevailing legal opinion which takes the contrary view, the absence of a requirement of a threshold of intensity for the triggering of an IAC should be maintained because it helps avoid potential legal and political controversies about whether the threshold has been reached based on the specific facts of a given situation. There are also compelling protection reasons not to link the existence of an IAC to a specific threshold of violence. To give but one example: under the Third Geneva Convention, if members of the armed forces of a state in dispute with another are captured by the latter's armed forces, they are eligible for prisoner of war (POW) status regardless of whether there is full-fledged fighting between the two states. POW status and treatment are well-defined under IHL, including the fact that a POW may not be prosecuted by the detaining state for lawful acts of war. It seems fairly evident that captured military personnel would not enjoy equivalent legal protection solely under the domestic law of the detaining state, even when supplemented by international human rights law.

1 Additional Protocol I, articles 1(4) and 96(3).
2 Under IHL belligerent occupation is considered a species of international armed conflict. The challenges raised in relation to the criteria for determining the existence of an occupation are explored further in this report.
3 Commentary to the Third Geneva Convention, J. Pictet (ed.), ICRC, 1960, p. 23.



The fact that a state does not, for political or other reasons, explicitly refer to the existence of an IAC in a particular situation does not prevent its being legally classified as such. The application of the law of IAC was divorced from the need for official pronouncements many decades ago in order to avoid cases in which states could deny the protection of this body of rules. It is believed that that rationale remains valid today.

2) Classification of Armed Conflicts

Many queries have been raised in recent and ongoing legal debates about whether the current IHL dichotomy - under which armed conflicts are classified either as international or non-international - is sufficient to deal with new factual scenarios, and whether new conflict classifications are needed. It should be recalled that the key distinction between an IAC and a NIAC is the quality of the parties involved: while an IAC presupposes the use of armed force between two or more states,4 a NIAC involves hostilities between a state and an organized non-state armed group (the non-state party), or between such groups themselves. There does not appear to be, in practice, any current situation of armed violence between organized parties that would not be encompassed by one of the two classifications mentioned above. What may be observed is a prevalence of NIACs, the typology of which has arguably expanded, as will be discussed below.

By way of reminder, at least two factual criteria are deemed indispensable for classifying a situation of violence as a Common Article 3 NIAC:5 i) the parties involved must demonstrate a certain level of organization, and ii) the violence must reach a certain level of intensity.

i) Common Article 3 expressly refers to "each Party to the conflict", thereby implying that a precondition for its application is the existence of at least two "parties".
While it is usually not difficult to establish whether a state party exists, determining whether a non-state armed group may be said to constitute a "party" for the purposes of Common Article 3 can be complicated, mainly because of lack of clarity as to the precise facts and, on occasion, because of the political unwillingness of governments to acknowledge that they are involved in a NIAC. Nevertheless, it is widely recognised that a non-state party to a NIAC means an armed group with a certain level of organization. International jurisprudence has developed indicative factors on the basis of which the "organization" criterion may be assessed. They include the existence of a command structure and disciplinary rules and mechanisms within the armed group, the existence of headquarters, the ability to procure, transport and distribute arms, the group's ability to plan, coordinate and carry out military operations, including troop movements and logistics, its ability to negotiate and conclude agreements such as cease-fire or peace accords, etc. Differently stated, even though the level of violence in a given situation may be very high (in a situation of mass riots for example), unless there is an organised armed group on the other side, one cannot speak of a NIAC.

ii) The second criterion commonly used to determine the existence of a Common Article 3 armed conflict is the intensity of the violence involved. This is also a factual criterion, the assessment of which depends on an examination of events on the ground. Pursuant to international jurisprudence, indicative factors for assessment include the number, duration

4 Except as mentioned above, see note 1.
5 Given that NIACs under Additional Protocol II have to fulfil certain conditions not found in Common Article 3 and that they are not as common, they will not be further discussed in this section.


and intensity of individual confrontations, the type of weapons and other military equipment used, the number and calibre of munitions fired, the number of persons and types of forces partaking in the fighting, the number of casualties, the extent of material destruction, and the number of civilians fleeing combat zones. The involvement of the UN Security Council may also be a reflection of the intensity of a conflict. The International Criminal Tribunal for the former Yugoslavia (ICTY) has deemed there to be a NIAC in the sense of Common Article 3 whenever there is protracted (emphasis added) armed violence between governmental authorities and organized armed groups or between such groups within a state. The Tribunal's subsequent decisions have relied on this definition, explaining that the "protracted" requirement is in effect part of the intensity criterion. In this context it should be mentioned that a 2008 ICRC Opinion Paper6 defines NIACs as "protracted armed confrontations occurring between governmental armed forces and the forces of one or more armed groups, or between such groups arising on the territory of a State (party to the Geneva Conventions). The armed confrontation must reach a minimum level of intensity and the parties involved in the conflict must show a minimum of organization".

a) Typology of Common Article 3 NIACs

NIACs falling within the Common Article 3 threshold have involved different factual scenarios, particularly over the past decade. A key development has been an increase in NIACs with an extraterritorial element, due to which questions about the sufficiency of the current classification of armed conflicts have been posed. Provided below is a brief typology of current or recent armed conflicts between states and organized non-state armed groups, or between such groups themselves, which may be classified as NIACs.
While the first five types of NIAC listed may be deemed uncontroversial, the last two continue to be the subject of legal debate.

First, there are ongoing traditional or "classical" Common Article 3 NIACs in which government armed forces are fighting against one or more organized armed groups within the territory of a single state. These armed conflicts are governed by Common Article 3, as well as by rules of customary IHL.

Second, an armed conflict that pits two or more organized armed groups against each other may be considered a subset of "classical" NIAC when it takes place within the territory of a single state. Examples include both situations where there is no state authority to speak of (i.e. the "failed" state scenario), as well as situations where there is the parallel occurrence of a NIAC between two or more organized armed groups alongside an IAC within the confines of a single state. Here, too, Common Article 3 and customary IHL are the relevant legal regime for the NIAC track.

Third, certain NIACs originating within the territory of a single state between government armed forces and one or more organized armed groups have also been known to "spill over" into the territory of neighbouring states. Leaving aside other legal issues that may be raised by the incursion of foreign armed forces into neighbouring territory (violations of sovereignty and possible reactions of the armed forces of the adjacent state which could turn the fighting into an IAC), it is submitted that the relations between parties whose conflict has spilled over remain at a minimum governed by Common Article 3 and customary IHL. This position is based on the understanding that the spill over of a NIAC into adjacent territory cannot have the effect of absolving the parties of their IHL obligations simply because an international

6 "How is the term 'Armed Conflict' defined in international humanitarian law?" (17-03-2008), available at: http://www.icrc.org/eng/resources/documents/article/other/armed-conflict-article-170308.htm.



border has been crossed. The ensuing legal vacuum would deprive of protection both civilians possibly affected by the fighting, as well as persons who fall into enemy hands.

Fourth, the last decade, in particular, has seen the emergence of what may be called "multinational NIACs". These are armed conflicts in which multinational armed forces are fighting alongside the armed forces of a "host" state - in its territory - against one or more organized armed groups. As the armed conflict does not oppose two or more states, i.e. as all the state actors are on the same side, the conflict must be classified as non-international, regardless of the international component, which can at times be significant. A current example is the situation in Afghanistan (even though that armed conflict was initially international in nature). The applicable legal framework is Common Article 3 and customary IHL.

Fifth, a subset of multinational NIAC is one in which UN forces, or forces under the aegis of a regional organization (such as the African Union), are sent to support a "host" government involved in hostilities against one or more organized armed groups in its territory (see also infra section IV. 3)). This scenario raises a range of legal issues, among which is the legal regime governing multinational force conduct and the applicability of the 1994 Convention on the Safety of UN Personnel. It is submitted that if and when UN forces or forces belonging to a regional organization become a party to a NIAC, such forces are bound by the rules of IHL, i.e. Common Article 3 and customary IHL, an issue explored further in this report.

Sixth, it may be argued that a NIAC ("cross border") exists when the forces of a state are engaged in hostilities with a non-state party operating from the territory of a neighbouring host state without that state's control or support. The 2006 war between Israel and Hezbollah presented a particularly challenging case both factually and legally.
There was a range of opinion on the legal classification of the hostilities that occurred, which may be encapsulated in three broad positions: that the fighting was an IAC, that it was a NIAC, or that there were parallel armed conflicts going on between the different parties at the same time: an IAC between Israel and Lebanon and a NIAC between Israel and Hezbollah. The aim of the "double classification" approach was to take into account the reality on the ground, which was that the hostilities for the most part involved an organized armed group, whose actions could not be attributed to the host state, fighting across an international border with another state. Such a scenario was hardly imaginable when Common Article 3 was drafted, and yet it is submitted that this Article, as well as customary IHL, were the appropriate legal framework for that parallel track, in addition to the application of the law of IAC between the two states.

A final, seventh type of NIAC believed by some to currently exist is an armed conflict taking place across multiple states between Al Qaeda and its "affiliates" and "adherents" and the United States ("transnational"). It should be reiterated that the ICRC does not share the view that a conflict of global dimensions is or has been taking place. Since the horrific attacks of September 11th 2001 the ICRC has referred to a multifaceted "fight against terrorism". This effort involves a variety of counter-terrorism measures on a spectrum that starts with non-violent responses - such as intelligence gathering, financial sanctions, judicial cooperation and others - and includes the use of force at the other end. As regards the latter, the ICRC has taken a case-by-case approach to legally analyzing and classifying the various situations of violence that have occurred in the fight against terrorism.
Some situations have been classified as an IAC, other contexts have been deemed to be NIACs, while various acts of terrorism taking place in the world have been assessed as being outside any armed conflict. It should be borne in mind that IHL rules governing the use of force and detention for security reasons are less restrictive than the rules applicable outside of armed conflicts governed by other bodies of law. As noted in the ICRC's report on IHL and the Challenges of Contemporary Armed Conflicts submitted to the International Conference in 2007, it is




believed to be inappropriate and unnecessary to apply IHL to situations that do not amount to armed conflict.7

b) Classification of Situations of Violence Resulting from Organized Crime

The phenomenon of organized crime as such, which is complex and multifaceted, is beyond the scope of this report, but is mentioned because of the ongoing queries surrounding its legal nature. For the purposes of this report organized crime is understood to encompass the totality of unlawful activities carried out by criminal organizations and territorial gangs, including activities that result in resort to armed violence.8 The armed violence is in some cases caused by criminal groups fighting each other to gain control of markets and/or territory in order to pursue unlawful activities. It may in other cases be the result of actions undertaken by governments to suppress criminal organizations or to regain control of territory by means of police or military forces. In certain contexts both types of armed confrontations have been known to reach a high level of intensity, involving the use of heavy weapons and causing numerous casualties.

The question that arises is whether organized crime and the responses thereto may be deemed an armed conflict within the meaning of IHL and in particular whether armed groups engaged in organized crime can be parties to an armed conflict. The query should be answered by reliance on the two main criteria used to determine the existence of a NIAC outlined above, namely the level of organization of the forces involved and the intensity of violence. In many contexts, the first criterion may be said to be fulfilled. Criminal groups often have a command structure, headquarters, the ability to procure arms, to plan operations, etc. As for the criterion of intensity of violence, it is sometimes more difficult to establish in practice whether the required threshold for a NIAC has been reached.
This must be assessed on a case-by-case basis by weighing up a host of indicative data. Relevant elements are, for example, the collective nature of the fighting or the fact that the state is obliged to resort to its armed forces to deal with the situation. The duration of armed confrontations and their frequency, the nature of the weapons used, displacement of the population, territorial control by armed groups, the number of victims caused and other similar elements may also be taken into account.

Pursuant to some views, the specific characteristics of the groups involved in purely criminal activities militate against considering that organized crime and the responses thereto may be deemed a NIAC. Under these views, situations involving purely criminal organizations such as "mafias" or criminal gangs cannot be classified as a NIAC because only organized armed groups with explicit or implied political objectives could be a legitimate party to a NIAC. It should be pointed out that this position is not borne out by a strictly legal reading. Under IHL, the motivation of organized groups involved in armed violence is not a criterion for determining the existence of an armed conflict. Firstly, to introduce it would mean to open the door to potentially numerous other motivation-based reasons for denial of the existence of an armed conflict. Secondly, political objective is a criterion that would in many cases be difficult to apply as, in practice, the real motivations of armed groups are not always readily discernible, and what counts as a political objective would be controversial. Finally, the distinction between criminal and political organizations is not always clear-cut; it is not rare for organizations fighting for political goals to conduct criminal activities in parallel, and vice versa.

7 See ICRC Report on IHL and the Challenges of Contemporary Armed Conflicts presented to the 30th International Conference of the Red Cross and Red Crescent, Geneva, October 2007, 30IC/07/8.4, p. 8.
8 It should be noted that all acts of violence perpetrated in a NIAC by an organized non-state armed group are regularly prohibited under domestic law; it is thus difficult to make a distinction based on the unlawfulness of the activities.

Needless to say, the legal classification of violence has important consequences in practice, as it determines the applicable legal framework, in particular the rules to be observed in the use of force. If a situation is considered to reach the threshold of a NIAC, IHL governing the conduct of hostilities applies, and both governmental forces and criminal organizations party to the NIAC are bound by it. Below the level of NIAC, state authorities must respect human rights law norms governing law enforcement operations. Criminal organizations are not bound by these norms, but by domestic law, including the relevant criminal law. Further details on the differences between the rules on the use of force under IHL and human rights law are provided in the sections related to the interplay of IHL and human rights law.

3) Applicable Law

The adequacy of IHL has on occasion been challenged not only in terms of its ability to encompass new realities of organized armed violence within existing classifications, but also in terms of the existence of a sufficient body of substantive norms and its applicability in a given situation. The issue is more relevant in the case of non-international than international armed conflicts. It is generally uncontroversial that the Geneva Conventions of 1949 and Additional Protocol I for states party to it, as well as rules of customary IHL, remain a relevant frame of reference for regulating the behaviour of states involved in an IAC. As noted in the ICRC's 2007 Report to the International Conference on IHL and the Challenges of Contemporary Armed Conflicts, the basic principles and rules governing the conduct of hostilities and the treatment of persons in enemy hands (the two main areas of IHL) continue to reflect a reasonable and pragmatic balance between the demands of military necessity and those of humanity.
The rules are detailed and time-tested, and are also widely accepted, as evidenced by the fact that every country in the world today is a party to the Geneva Conventions and that the vast majority of states are also party to Additional Protocol I. The core treaties have continued to be supplemented by further codifications, particularly in the weapons area. This does not, of course, mean that the law governing IACs cannot be further improved by means of clarification and/or interpretation. Efforts to this end are being conducted by states, international organizations, the ICRC, expert groups and other bodies, including international and domestic courts and tribunals.

It is well known that treaty rules governing NIACs are far fewer than those governing IACs and that they cannot adequately respond to the myriad legal and protection issues that arise in practice. It has even been suggested that NIACs are not really substantively regulated because the application of Common Article 3 is geographically limited to the territory of the state party to the armed conflict. It is submitted that this view is not correct, given that the provisions of that article undoubtedly constitute customary law and because of the significant number of other customary IHL rules applicable in NIAC. The ICRC's 2005 Study on Customary International Humanitarian Law (CLS) - requested by the International Conference held a decade earlier - concluded that 148 customary rules out of a total of 161 identified applied in NIACs as well. These rules serve as an additional source for determining the obligations of both states and organized non-state armed groups. Customary IHL rules are of particular significance because they provide legal guidance for parties to all types of NIACs, including the NIACs with an extraterritorial element outlined above.
As a matter of customary law, the basic IHL principles and rules governing the conduct of hostilities are, with very few exceptions, essentially identical regardless of the conflict classification. The same may be said with respect to rules governing most aspects of detention, except for procedural safeguards in internment in NIACs, as will be explained below. The ICRC's views on how the law on detention may be strengthened are the subject
of a separate report on Strengthening Legal Protection for Victims of Armed Conflict that is being presented to the 31st International Conference, which identifies other areas of the law that should likewise be further elaborated.

While determining the applicable law is clearly important, it is even more important that states acknowledge its applicability when the requisite factual criteria have been fulfilled. In its 2007 report on IHL and the Challenges of Contemporary Armed Conflicts, the ICRC had noted a tendency by some states to broaden the application of IHL to situations that did not in fact constitute an armed conflict. Nowadays, another trend of equal concern is in evidence. This trend takes two forms. First, some states reject the applicability of IHL to situations that on the facts may be said to constitute a NIAC, designating them instead as "counter-terrorist" operations subject to other bodies of law. Second, in other cases, states that previously acknowledged they were engaged in a NIAC against a particular non-state armed group have repudiated that classification, likewise declaring that they are henceforth applying a counter-terrorist framework.

In both scenarios the approach seems to be based primarily on the assumption that recognizing the existence of a NIAC (or its continuation) legitimizes the non-state party by granting it a particular legal status. It must be stressed that this is not borne out by IHL, given that Common Article 3 clearly provides that the application of its provisions "shall not affect the legal status of the Parties to the [non-international armed] conflict".9 The purpose of Common Article 3 is to regulate the treatment of persons in the hands of the adversary, while, as just mentioned, other customary IHL rules applicable in NIAC govern the conduct of hostilities.
By denying the applicability of IHL in a NIAC, states are depriving civilians, and their own personnel who may be detained by a non-state party, of the protection of the only body of international law that unequivocally binds non-state armed groups and for whose violations they may be internationally sanctioned. As will be further discussed below, non-state armed groups are generally considered not to be bound by human rights law, and their unwillingness to apply domestic law as a practical matter may be inferred from their having taken up arms against the state. However, the applicability of IHL to a given situation in no way detracts from the fact that members of the non-state party remain legally subject to domestic law and prosecutable under it for any crimes they may have committed. That is what the drafters of Common Article 3 had in mind when they determined that the application of its provisions does not affect the legal status of the parties to the conflict, and that is what is overlooked when its applicability is denied, to the detriment of victims of armed conflict.

III. THE INTERPLAY BETWEEN IHL AND HUMAN RIGHTS LAW

The interplay between IHL and human rights law is an issue of abiding legal focus because, inter alia, of the practical consequences it may have on the conduct of military operations. This report can by no means provide an adequate overview of the relationship between these two branches of international law, but aims to highlight some salient points.

9 See also the Commentary to the Fourth Geneva Convention, Article 3, J. Pictet (ed.), ICRC, 1956, p. 44: "Consequently, the fact of applying Article 3 does not in itself constitute any recognition by the de jure Government that the adverse Party has authority of any kind; it does not limit in any way the Government's right to suppress a rebellion by all the means - including arms - provided by its own laws; nor does it in any way affect that Government's right to prosecute, try and sentence its adversaries for their crimes, according to its own laws. In the same way, the fact of the adverse Party applying the Article does not give it any right to special protection or any immunity, whatever it may be and whatever title it may give itself or claim."

1) On Interplay in General

There is no doubt that IHL and human rights law share some of the same aims, that is, to protect the lives, health and dignity of persons. It is also generally accepted that IHL and human rights law are complementary legal regimes, albeit with a different scope of application. While human rights law is deemed to apply at all times (and thus constitutes the lex generalis), the application of IHL is triggered by the occurrence of armed conflict (thus constituting the lex specialis). While the meaning and even the utility of the doctrine of lex specialis have been called into question, it is believed that this interpretive tool remains indispensable for determining the interplay between IHL and human rights law.

While these two branches of international law are complementary as a general matter, the notion of complementarity does not provide a reply to the sometimes intricate legal questions of interplay that arise on the ground in concrete cases. Situations of armed conflict cannot be equated to times of peace, and some IHL and human rights rules produce conflicting results when applied to the same facts because they reflect the different reality that each body of law was primarily developed for. Examples of this practical scenario, as well as of cases in which the application of IHL and human rights law produces similar results, will be outlined further below. There are, however, important differences of a general nature related to the interplay between IHL and human rights law that should be mentioned.
The first is that human rights law de iure binds only states, as evidenced by the fact that human rights treaties and other sources of human rights standards do not create legal obligations for non-state armed groups.10 Human rights law explicitly governs the relationship between a state and persons on its territory and/or subject to its jurisdiction (an essentially "vertical" relationship), laying out the obligations of states vis-à-vis individuals across a wide spectrum of conduct. By contrast, IHL governing NIACs expressly binds both states and organized non-state armed groups, as evidenced by Common Article 3, whose provisions enumerate the obligations of the "parties" to a NIAC. IHL establishes an equality of rights and obligations between the state and the non-state side for the benefit of all persons who may be affected by their conduct (an essentially "horizontal" relationship). This does not, of course, mean that the state and non-state side are equal under domestic law, as members of non-state armed groups, as already mentioned, remain bound by such law and may be prosecuted for any crime they may have committed pursuant to domestic law.

Aside from purely legal aspects, there are practical considerations that restrict the ability of non-state armed groups to apply human rights law. Most such groups do not have the capacity to comply with the full range of human rights law obligations because they cannot perform government-like functions on which the implementation of human rights norms is premised. In most NIACs the non-state party lacks an adequate apparatus for ensuring the fulfilment of human rights treaty-based and non-treaty standards ("soft law"). In any event, the vast majority - and probably all - of the human rights obligations that an unsophisticated non-state armed group would be capable of implementing in practice are already binding on it under a corresponding rule of IHL.
It should, however, be noted that the exception to what has just been said concerns cases in which a group, usually by virtue of stable control of territory,

10 While UN Security Council resolutions occasionally "call on" a particular non-state party to a NIAC to respect human rights, the legal effect of such calls cannot be to alter the edifice of human rights law, which explicitly imposes obligations only on states. It is submitted that the exact legal import of this aspect of the relevant Council resolutions remains unclear, also given that states are otherwise reluctant to recognize the applicability of human rights law to non-state armed groups.

has the ability to act like a state authority and where its human rights responsibilities may therefore be recognized de facto.

The second major difference between IHL and human rights law is the extraterritorial reach of the respective bodies of rules. It is not controversial that IHL governing IACs applies extraterritorially, given that its very purpose is to regulate the behaviour of one or more states involved in an armed conflict in the territory of another. It is submitted that the same reasoning applies in NIACs with an extraterritorial element: the parties to such conflicts cannot be absolved of their IHL obligations when a conflict reaches beyond the territory of a single state if this body of norms is to have a protective effect.

Despite the views of a few important dissenters, it is widely accepted that human rights law applies extraterritorially based, inter alia, on decisions by international and regional courts. The exact extent of such application, however, remains a work in progress. The jurisprudence is most developed within the European human rights system, but there too it is still evolving: while Council of Europe states have been determined to "carry" their obligations abroad when they engage in detention, based either on effective control over persons or the relevant territory, the case law is unsettled as regards the extraterritorial application of human rights norms governing the use of force. In this context it should be reiterated that the issue of the extraterritorial application of human rights law is relevant for states only. It has not been suggested that non-state armed groups have extraterritorial human rights obligations when they cross an international border, due to the legal and other reasons described above.

The third major difference between IHL and human rights law regards the issue of derogation.
While IHL norms cannot be derogated from, under the explicit terms of some human rights treaties states may derogate from their obligations provided therein, subject to the fulfilment of the requisite conditions.

2) Specific Interplay: Detention and the Use of Force

For the purposes of this report, the specific interplay of IHL and human rights law will be briefly examined in relation to two groups of norms that are of central interest in situations of armed conflict - rules on the detention of persons and on the use of force.

a) Detention

Detention is an inevitable and lawful incidence of armed conflict, regulated by a large number of IHL provisions that seek to give specific expression to the overarching principle of humane treatment. By way of simplification, these rules may be divided into four groups.

i) Rules on the treatment of detainees (in the narrow sense)

These are norms that aim to protect the physical and mental integrity and well-being of persons deprived of liberty, whatever the reasons may be. They include the prohibition of murder, torture,11 cruel, inhuman or degrading treatment, mutilation, medical or scientific experiments, as well as other forms of violence to life and health. All of these acts are prohibited under both IHL and human rights law.

11 Under IHL torture is prohibited whether committed by a state or a non-state party to an armed conflict, whereas under human rights law torture is defined as severe physical or mental pain or suffering committed by state agents or by persons whose actions can be attributed to the state. However, the Rome Statute of the International Criminal Court dropped the requirement of state involvement in torture in its definition of torture as a crime against humanity, but it still requires an organizational policy.

ii) Rules on material conditions of detention

The purpose of these rules is to ensure that detaining authorities adequately provide for detainees' physical and psychological needs: food, accommodation, health, hygiene, contacts with the outside world, religious observance, and others. Treaty and customary IHL provide a substantial catalogue of standards pertaining to conditions of detention, as do "soft law" human rights instruments. A common catalogue of standards could even be derived from both bodies of law.

iii) Fair trial rights

Persons detained on suspicion of having committed a criminal offence are guaranteed fair trial rights. The list of fair trial rights is almost identical under IHL and human rights law. Unlike the fair trial provisions of the Third and Fourth Geneva Conventions, Common Article 3 does not, admittedly, provide specific judicial guarantees, but it is generally accepted that Article 75(4) of Additional Protocol I - which was drafted on the basis of the corresponding provisions of the 1966 International Covenant on Civil and Political Rights (ICCPR) - may be taken to reflect customary law applicable in all types of armed conflict. IHL reinforces the relevant human rights provisions, as it allows no derogation from fair trial rights in situations of armed conflict.

iv) Procedural safeguards in internment

For the purposes of this report, internment is defined as the non-criminal detention of a person based on the serious threat that his or her activity poses to the security of the detaining authority in an armed conflict. It is in the area of procedural safeguards in internment that differences emerge between IHL applicable to international and non-international armed conflicts and the corresponding rules of human rights law, and where the question of interplay between the two branches of international law thus arises.

Outside armed conflict, non-criminal (i.e. administrative) detention is highly exceptional. In the vast majority of cases, deprivation of liberty occurs because a person is suspected of having committed a criminal offence. The ICCPR guarantees the right to liberty of person and provides that anyone detained, for whatever reason, has the right to judicial review of the lawfulness of his or her detention. This area of human rights law is based on the assumption that the courts are functioning, that the judicial system is capable of absorbing whatever number of persons may be arrested at any given time, that legal counsel is available, that law enforcement officials have the capacity to perform their tasks, etc. Situations of armed conflict constitute a different reality, due to which IHL provides for different rules.

Internment in IAC

In IAC, IHL permits the internment of prisoners of war (POWs) and, under certain conditions, of civilians. POWs are essentially combatants captured by the adverse party in an IAC. A combatant is a member of the armed forces of a party to an IAC who has "the right to participate directly in
hostilities". This means that he or she may use force against, i.e. target and kill or injure, other persons taking a direct part in hostilities, and destroy other enemy military objectives. Because such activity is obviously prejudicial to the security of the adverse party, the Third Geneva Convention provides that a detaining state "may subject prisoners of war to internment". It is generally uncontroversial that the detaining state is not obliged to provide review, judicial or other, of the lawfulness of POW internment as long as active hostilities are ongoing, because enemy combatant status denotes that a person is ipso facto a security threat.12 POW internment must end and POWs must be released at the cessation of active hostilities, unless they are subject to criminal proceedings or are serving a criminal sentence.

Under the Fourth Geneva Convention, internment and assigned residence are the most severe "measures of control" that may be taken by a state with respect to civilians whose activity is deemed to pose an imperative threat to its security. It is uncontroversial that civilian direct participation in hostilities falls into that category. (Despite the fact that only combatants are explicitly authorized under IHL to directly participate in hostilities, the reality is that civilians often do so as well, in both IACs and NIACs.13) Apart from direct participation in hostilities, other civilian behaviour may also meet the threshold of posing an imperative threat to the security of a detaining power. In terms of process, the Fourth Geneva Convention provides that a civilian interned in IAC has the right to submit a request for review of the decision on internment (to challenge it), that the review must be expeditiously conducted either by a court or an administrative board, and that periodic review is thereafter to be automatic, on a six-monthly basis. Civilian internment must cease as soon as the reasons which necessitated it no longer exist.
It must in any event end "as soon as possible after the close of hostilities". It is submitted that the interplay of IHL and human rights rules governing procedural safeguards in internment in IAC must be resolved by reference to the lex specialis, that is, the relevant provisions of IHL that were specifically designed for it.

Internment in NIAC

Common Article 3 does not contain rules on procedural safeguards for persons interned in NIAC, even though internment is practised by both states and non-state armed groups. Additional Protocol II explicitly mentions internment, thus confirming that it is a form of deprivation of liberty inherent to NIAC, but likewise lists neither the grounds for internment nor the procedural rights. Due to IHL's lack of specificity and to some of the unresolved issues related to the application of human rights law outlined below, a case-by-case analysis of the interplay of IHL and human rights is necessary. Only a few of the legal challenges that arise will be mentioned.

In a traditional NIAC occurring in the territory of a state between government armed forces and one or more non-state armed groups, domestic law, informed by the state's human rights obligations and IHL, constitutes the legal framework governing the procedural safeguards that must be provided by the state to detained members of such groups. It must be noted that, under some views, domestic law cannot allow non-criminal detention in armed conflict without derogation from the ICCPR, even if the relevant state provides judicial review as required under Article 9(4) of the Covenant. Pursuant to other views, derogation would be necessary if the state suspended the right to habeas corpus and provided only administrative review of internment in NIAC (as would be allowed under IHL). According to still other

12 Judicial review under the domestic law of the detaining state could, however, be sought to obtain the release of a POW who is detained despite the end of active hostilities (which is a grave breach of IHL).

13 See section on Direct Participation in Hostilities further below.

positions, the right to habeas corpus can never be derogated from, an approach, it is submitted, that is appropriate in peacetime, but cannot always accommodate the reality of armed conflict.14

Identifying the legal framework governing internment is even more complicated in NIACs involving states fighting alongside a host state's forces in the latter's territory (e.g. a "multinational" NIAC). In addition to the issue mentioned above, others arise: states members of a coalition may not all be bound by the same human rights treaties; the extent of the extraterritorial reach of human rights law remains unclear; and the question of whether the intervening states must derogate from their human rights obligations in order to detain persons abroad without habeas corpus review is unresolved (in practice, no state has ever done so).

Leaving aside state obligations, it should be recalled that the other party to a NIAC is one or more organized non-state armed groups. Domestic law does not allow them to detain or intern members of a state's armed forces (or anyone else), and human rights law likewise does not provide a legal basis for detention by non-state armed groups. As a result, a non-state party is not legally bound to provide habeas corpus to persons it may capture and detain/intern (nor could it do so in reality, except in cases in which a group, usually by virtue of stable control of territory, has the ability to act like a state authority and where its human rights responsibilities may therefore be recognized de facto). Thus, the suggestion that human rights law must be resorted to when IHL is silent on a particular issue - such as procedural safeguards in internment - overlooks the legal and practical limits of the applicability of human rights law to non-state parties to NIACs.

The legal and practical challenges posed by detention in NIACs remain the subject of much legal debate, as well as discussion on the way forward.
In order to provide guidance to its delegations in their operational dialogue with states and non-state armed groups, in 2005 the ICRC adopted an institutional position entitled "Procedural Principles and Safeguards for Internment/Administrative Detention in Armed Conflict and Other Situations of Violence". This document, which is based on law and policy, was annexed to the ICRC's report on IHL and the Challenges of Contemporary Armed Conflicts presented to the 2007 International Conference. The question remains, however, whether it might be necessary to elaborate rules governing detention, including those on procedural safeguards in internment in NIAC, by means of further IHL development. The ICRC believes this to be the case, as outlined in its report on Strengthening Legal Protection for Victims of Armed Conflict which has been submitted to the 31st International Conference.

b) Use of Force

Among the issues that are regulated by both IHL and human rights law, the greatest differences are found in the respective rules governing the use of force. IHL rules on the conduct of hostilities recognize that the use of lethal force is inherent to waging war. This is because the ultimate aim of military operations is to prevail over the enemy's armed forces. Parties to an armed conflict are thus permitted, or at least are not legally barred from, attacking each other's military objectives, including enemy personnel. Violence directed against those targets is not prohibited as a matter of IHL, regardless of whether it is inflicted by a state or a non-state party to an armed conflict. Acts of violence against civilians and civilian objects are, by contrast, unlawful because one of the main purposes of IHL is to spare them from the effects of hostilities.

The basic rules governing the conduct of hostilities were crafted to reflect the reality of armed conflict. First among them is the principle of distinction, according to which parties to an armed conflict must at all times

14 This is particularly the case in a "multinational NIAC"; see next paragraph in text.


distinguish between civilians, civilian objects and military objectives and direct their attacks only against the latter. In elaboration of the principle of distinction, IHL also, inter alia, prohibits indiscriminate attacks, as well as disproportionate attacks (see below), and obliges the parties to observe a series of precautionary rules in attack aimed at avoiding or minimizing harm to civilians and civilian objects.

Human rights law was conceived to protect persons from abuse by the state and does not rely on the notion of the conduct of hostilities between parties to an armed conflict, but on law enforcement. Rules on the use of force in law enforcement essentially provide guidance on how life is to be protected by the state when it is necessary to prevent crime, to effect or assist in the lawful arrest of offenders or suspected offenders, and to maintain public order and security. The bottom line, as regards the use of lethal force under law enforcement principles governed by human rights law, is that intentional lethal force may be used only as a last resort in order to protect life, when other means are ineffective or without promise of achieving the intended result (but such means must always be available). Human rights soft law standards and jurisprudence have also clarified that a "strict" or "absolute" necessity standard attaches to any use of lethal force, meaning that the intentional use of lethal force may not exceed what is strictly or absolutely necessary to protect life.

The principle of proportionality, whose observance is crucial to the conduct of both military and law enforcement operations, is differently conceived in IHL and human rights law. IHL prohibits attacks against military objectives that "may be expected to cause incidental death, injury to persons or damage to civilian objects, or a combination thereof, which would be excessive in relation to the concrete and direct military advantage anticipated".
The main distinction between the relevant IHL and human rights rules is that the aim of the IHL principle of proportionality is to limit incidental ('collateral') damage to protected persons and objects, while nevertheless recognizing that an operation may be carried out even if such damage may be caused, provided that it is not excessive in relation to the concrete and direct military advantage anticipated. By contrast, when a state agent is using force against an individual under human rights law, the proportionality principle measures that force taking into account the effect on the individual him or herself, leading to the need to use the smallest amount of force necessary and restricting the use of lethal force.

This very brief overview permits the conclusion that the logic and criteria governing the use of lethal force under IHL and human rights law do not coincide, due to the different circumstances that the respective norms are intended to govern. The key issue therefore is the interplay of these particular norms in situations of armed conflict. The answer is clearer in IAC than in NIAC and also turns on the issue of lex specialis.

i) Interplay in IAC

In its very first statement on the application of human rights in situations of armed conflict, the 1996 Advisory Opinion on the Legality of the Threat or Use of Nuclear Weapons, the International Court of Justice observed that the protection of the ICCPR does not cease in times of war and that, in principle, the right not to be arbitrarily deprived of one's life applies also in hostilities. The Court added that the test of what is an arbitrary deprivation of life is to be determined by the applicable lex specialis, namely, the law applicable in armed conflict which is designed to regulate the conduct of hostilities. It further explained that "whether a particular loss of life [...]
is to be considered an arbitrary deprivation of life contrary to Article 6 of the Covenant, can only be decided by reference to the law applicable in armed conflict and not deduced from the terms of the Covenant itself". The Court has not since indicated a change to its approach on this issue. It is submitted that IHL constitutes the lex specialis governing the assessment of the lawfulness of the use of force in an IAC - when, of course, lethal force is resorted to against
combatants and other persons directly participating in hostilities. This body of rules was specifically designed for the conduct of hostilities in such conflicts and regulates the use of force in sufficient detail. It should not, however, be implied from the above that determining whether a conduct of hostilities or a law enforcement framework should be resorted to in an IAC is an easy task. For example, the challenges posed to the application of the two frameworks in situations of occupation are dealt with further in this report. There are likewise instances of violence in IACs, such as riots or civil unrest, to which the application of an IHL conduct of hostilities framework would clearly not be appropriate.

ii) Interplay in NIAC

The interplay of IHL rules and human rights standards on the use of force is less clear in NIAC for a range of reasons, only some of which will be briefly mentioned here. The first is the existence and operation of the lex specialis principle in NIAC. While, as already mentioned, IHL applicable in IAC provides a range of rules on the conduct of hostilities, the general lack of corresponding treaty rules in NIAC has led some to argue that there is no lex specialis in NIAC and that human rights law fills the gap. This position, it is submitted, is not borne out by the facts. The great majority of IHL rules on the conduct of hostilities are customary in nature and are applicable regardless of conflict classification, as determined by the ICRC's 2005 Customary Law Study. Relevant IHL thus exists.

The issue of who may be targeted under IHL, i.e. how to interpret the rule that civilians are protected from direct attack unless and for such time as they take a direct part in hostilities, remains the subject of much legal debate, particularly as regards situations of NIAC.
The ICRC expressed its views on the subject with the issuance, in 2009, of an Interpretive Guidance on the Notion of Direct Participation in Hostilities under IHL (see further below). It should be recalled, however, that the Guidance deals with direct participation in hostilities under an IHL lens only, without prejudice to other bodies of law - particularly human rights law - that may concurrently be applicable in a given situation.

International and regional jurisprudence is not uniform in its approach to the relationship between IHL and human rights, particularly in respect of the scope of protection of the right to life in NIAC. Most cases have dealt with violations of the right to life of civilians in which application of either IHL or human rights law would have essentially produced the same result. Courts have yet to conclusively address the interplay of IHL and human rights law involving the targeting and killing of persons who were directly participating in hostilities.

Last, but by no means least, is the issue of the legal framework applicable to the use of force by non-state armed groups. What has been said above in relation to the (non)applicability of human rights law to organized armed groups is equally valid in this area and will not be repeated.

What can essentially be concluded from the above is that the use of lethal force by states in NIAC requires a fact-specific analysis of the interplay of the relevant IHL and human rights rules. For states, the legal result reached will depend on the treaties they are party to, customary law, and of course the relevant provisions of domestic law. It is also evident that in NIAC - as well as in IAC - state armed forces must be trained to distinguish and switch between a war-fighting and a law enforcement situation and be provided with clear rules of engagement on the use of force. As regards non-state armed groups, they are clearly legally bound by the relevant IHL rules.
The ICRC plans to further explore the challenges surrounding the interface of IHL and human rights law rules on the use of force in situations of armed conflict.


3) Extraterritorial Targeting of Persons

The extraterritorial targeting of persons has become a prominent legal and policy issue over the past several years due, inter alia, to questions that have been raised about the lawfulness of this practice. For the purposes of this report, extraterritorial targeting is understood as the use of lethal force against a specific person - or persons - by agents of one state in the territory of another (the "territorial" or "host" state). It cannot be emphasized enough that a large part of the difficulty in coming to appropriate legal and policy conclusions in most actual cases lies in the insufficiently known factual circumstances surrounding them and in the fact that states rarely, if ever, justify their extraterritorial actions in advance or provide accounts of operations after the fact.

From a legal point of view, the extraterritorial targeting of a person requires an analysis of the lawfulness of the resort to force by one state in the territory of another (under the ius ad bellum) and an analysis of the international legal framework governing the way in which force is used (under the ius in bello, i.e. IHL, or under human rights law, as the case may be). The latter determination will depend on whether the activities of the individual at issue i) take place within an ongoing armed conflict or ii) have no link to an armed conflict.

i) In a situation of armed conflict, the IHL rules on the conduct of hostilities mentioned above apply. This means that lethal force may be used against combatants, that is, persons who have the right to take a direct part in hostilities (a legal status inherent only to IAC), as well as against other persons taking a direct part in hostilities, including civilians when they do so.
Who is deemed to be a civilian taking a direct part in hostilities, and therefore not protected from direct attack during such time as he or she takes a direct part in hostilities, was elaborated in the ICRC's 2009 Interpretive Guidance on the Notion of Direct Participation in Hostilities under IHL (see further below). Pursuant to the Guidance:

- members of armed forces (as well as participants in a levée en masse; see the section on Direct Participation in Hostilities further below), or of organized armed groups of a party to the conflict who perform a continuous combat function, are not considered civilians for the purpose of the conduct of hostilities and are thus not protected against direct attack for the duration of their performance of such a function.
- civilians are persons who take a direct part in hostilities on a merely spontaneous, sporadic or unorganized basis; they are subject to targeting only for the duration of each specific act of direct participation.

It should be noted that the Interpretive Guidance provides the ICRC's view on the restraints applicable to the use of force in direct attack. Pursuant to Recommendation IX, "[T]he kind and degree of force which is permissible against persons not entitled to protection against direct attack must not exceed what is actually necessary to accomplish a legitimate military purpose in the prevailing circumstances". This does not imply a "capture rather than kill" obligation in armed conflict, which is a law enforcement standard, but is aimed at providing guiding principles for the choice of means and methods of warfare based on a commander's assessment of a particular situation. By way of reminder, the targeting of persons under IHL is subject to further important rules: the prohibition of indiscriminate and disproportionate attacks and the obligation to take feasible precautions in attack.

In practice, most questions have been raised about the lawfulness of the use of lethal force against persons whose activity is linked to an ongoing armed conflict, more specifically of individuals who are directly participating in an ongoing NIAC from the territory of a non-belligerent state. A non-belligerent state is one that is not involved in an ongoing armed
conflict itself against a non-state armed group in its territory and/or is not involved in a NIAC with such a group that has spilled over from the territory of an adjacent state.

Different legal opinions on the lawfulness of the targeting of a person directly participating in hostilities from the territory of a non-belligerent state may be advanced. Under one school of thought, a person directly participating in hostilities in relation to a specific ongoing NIAC "carries" that armed conflict with him to a non-belligerent state by virtue of continued direct participation (the nexus requirement) and remains targetable under IHL. In other words, provided the requisite ius ad bellum test has been satisfied, he or she can be targeted under IHL rules on the conduct of hostilities. These include the principle of proportionality, under which harm to civilians and damage to civilian objects, or a combination thereof, is not deemed unlawful if it is not excessive in relation to the direct and concrete military advantage anticipated from the attack.

Pursuant to other views, which the ICRC shares, the notion that a person "carries" a NIAC with him to the territory of a non-belligerent state should not be accepted. It would have the effect of potentially expanding the application of rules on the conduct of hostilities to multiple states according to a person's movements around the world as long as he is directly participating in hostilities in relation to a specific NIAC. In addition to possible ius ad bellum issues that this scenario would raise, there are others, such as the consequences that would be borne by civilians or civilian objects in the non-belligerent state(s). The proposition that harm or damage could lawfully be inflicted on them in operation of the IHL principle of proportionality because an individual sought by another state is in their midst (the result of a "nexus" approach) would in effect mean recognition of the concept of a "global battlefield".
It is thus believed that if and when the requisite ius ad bellum test is satisfied, the lawfulness of the use of force against a particular individual in the territory of a non-belligerent state would be subject to assessment pursuant to the rules on law enforcement (see also below).

ii) There have been cases in which states have extraterritorially targeted individuals whose activity, based on publicly available facts, was outside any armed conflict, whether international or non-international. Leaving aside ius ad bellum issues, it is clear that the lawfulness of such a use of force cannot be examined under an IHL conduct of hostilities paradigm, but under human rights law standards on law enforcement. As outlined above, the application of a law enforcement framework means inter alia that lethal force may be used only if other means are "ineffective or without promise of achieving the intended result" and that the planning and execution of any action has to be carried out pursuant to the human rights law principles of necessity and proportionality.

A legal issue that could be posed in this scenario is the extraterritorial applicability of human rights law, based on the fact that the state using force abroad lacks effective control over the person (or territory) for the purposes of establishing jurisdiction under the relevant human rights treaty. It is submitted that customary human rights law prohibits the arbitrary deprivation of life and that law enforcement standards likewise belong to the corpus of customary human rights law. It is important to underline that the application of law enforcement rules does not turn on the type of forces or equipment used in a given operation (police or military), but on the fact that human rights law is the governing legal regime, given the absence of armed conflict.
The Basic Principles on the Use of Force and Firearms reflect that approach: "The term 'law enforcement officials' includes all officers of the law, whether appointed or elected, who exercise police powers, especially the powers of arrest or detention. In countries where police powers are exercised by military authorities, whether uniformed or not, or by state security forces, the definition of law enforcement officials shall be regarded as including officers of such services".


IV. THE PROTECTIVE SCOPE OF IHL: SELECT ISSUES

1) Humanitarian Access and Assistance

Armed conflicts, whether international or non-international, generate significant needs for humanitarian assistance. As practice has unfortunately demonstrated, civilian populations are often deprived of essential supplies in war, including food, water and shelter, and are unable to access health care and other essential services. The reasons vary. Property may be destroyed as a result of combat operations and farming areas may be unusable due to the dispersion of landmines or other explosive remnants of war. Entire populations may be forced to leave their homes, thus abandoning habitual sources of income. In addition, economic and other infrastructure may be damaged or disrupted affecting, as a result, the stability of entire countries or regions for a prolonged period of time.

Under international law, states bear the primary responsibility for ensuring the basic needs of civilians and civilian populations under their control. However, if states are unable or unwilling to meet their responsibilities, IHL provides that relief actions by other actors, such as humanitarian organizations, shall be undertaken, subject to the agreement of the concerned state (the obligations of an occupying power are dealt with further below in this section). In order to perform their mission, humanitarian organizations must be granted rapid and unimpeded access to affected populations. Humanitarian access is a precondition for the conduct of proper assessments of humanitarian needs, for the implementation and monitoring of relief operations and for ensuring appropriate follow-up. In practice, however, humanitarian access remains a significant challenge for a number of reasons which, in certain cases, may overlap.

a) Constraints on Humanitarian Access

Constraints on humanitarian access may be of a political nature. When relief actions are perceived as a threat to the sovereignty of a state per se, or because of the perceived "legitimization" of a non-state group as a result of engagement with it for humanitarian purposes, or are perceived as a threat to the dominant position of a non-governmental armed group in a specific region, access by humanitarian organizations to civilian populations in need may be denied. In such cases the relevant authorities often argue that they have the capacity to handle the situation themselves, without external support. They may also claim that proposed relief actions do not meet the conditions of being exclusively humanitarian and impartial, and conducted without any adverse distinction, as required by IHL. This belief may on occasion be brought on by the involvement of military forces in relief operations and the consequent blurring of lines between humanitarian and military actors. If, however, the parties to armed conflicts unjustifiably perceive humanitarian operations as instruments of military or political agendas, access to populations in distress becomes difficult or impossible and the security of humanitarian workers is seriously jeopardized.

In some cases, denial of humanitarian access may also be part of a military strategy. When parties to an armed conflict believe that opposing forces are receiving support from the civilian population, they may seek to deprive the population of essential supplies in order to weaken their adversary's capacity to mount military operations.

Political constraints on humanitarian access are often complicated by administrative barriers and restrictions, as well as logistical problems. In some instances, access of humanitarian organizations is hampered by difficulties in obtaining visas for their personnel and import
authorisations for relief supplies. Complex procedures and repeated controls may also contribute to delaying the entry and distribution of humanitarian goods. In addition, essential infrastructure, such as roads or railways, may have been destroyed or damaged as a result of conflict, making it difficult to reach affected populations.

Security-related concerns are also among the main reasons limiting humanitarian access in practice. It may be extremely difficult for humanitarian actors to reach populations situated in areas where hostilities are ongoing. When the risk of casualties is assessed to be significant, relief operations have to be cancelled or suspended. In yet other cases, humanitarian actors have been deliberately threatened or attacked by armed actors, either for criminal purposes or political reasons, or both. This trend has certainly become more problematic in recent years, as many of today's armed conflicts are more fragmented and complex, involving multiple actors, including semi-organized armed groups and purely criminal organizations. It has therefore become more difficult to enter into contact and engage in a regular security dialogue with all those who have the capability of potentially disrupting humanitarian operations in order to prevent or eliminate the security risk. Due to their vulnerability to attack, many humanitarian organizations have on occasion either withdrawn from or scaled down their operations in specific contexts or have been obliged to hire security providers. The complexity and consequences of constraints on humanitarian access thus remain, with good reason, a growing focus of international concern.

b) Legal Framework Applying to Humanitarian Access and Assistance

While access constraints are often related to political, administrative, logistical, or security challenges, they are rarely the result of purely legal obstacles.
It should be noted that reliance on the relevant provisions of IHL may, in practice, prove to be a useful tool to secure access to affected populations and to conduct effective humanitarian operations. This implies that practitioners should have a clear understanding of this legal framework and be trained to use it in efforts to ensure that their activities are accepted and respected.

IHL rules on humanitarian access and assistance may be grouped depending on whether they relate to: a) IACs, other than occupied territories; b) NIACs; and c) occupied territories. In each case IHL establishes, first, that relief actions may be authorized - and in a situation of occupation shall be authorized - when civilian populations suffer from lack of adequate supplies. Second, it defines under what conditions such operations must be conducted, providing further prescriptions aimed at facilitating the delivery of humanitarian relief to affected populations. It may be observed that, on both counts, there would be benefit from clarification of some of the rules.

IHL applicable to humanitarian access and assistance is mainly based on the Fourth Geneva Convention relative to the Protection of Civilian Persons in Time of War and the 1977 Additional Protocols. The Fourth Convention regulates the humanitarian obligations of states parties in relation to the evacuation of or access to besieged or encircled areas (article 17) and the obligations of the parties to allow the free passage of medical supplies, as well as of certain other goods to groups of beneficiaries (article 23). It also enumerates the rights of aliens in the territory of a party to the conflict, including the right to individual and collective relief (article 38), and prescribes the obligations of an occupying power as regards relief schemes for the benefit of the population of an occupied territory (articles 59-62).
The provisions of the Fourth Convention were complemented and reinforced by Additional Protocol I (articles 68-71); for NIAC, Common Article 3 of the Geneva Conventions was supplemented by Additional Protocol II (article 18). In addition to treaty law, some obligations have also crystallized into international customary law. These include rules on the rapid and unimpeded passage of humanitarian relief and the freedom of movement of humanitarian relief personnel (CLS, rules 55-56). Rules of
customary law also provide protection applying specifically to humanitarian relief personnel and objects (CLS, rules 31-32).

c) Obligation to Undertake Relief Actions

It is not clear to what extent parties to both international and non-international armed conflicts are bound to accept the deployment of relief actions in territories under their control. While the relevant provisions of the two Additional Protocols stipulate that relief actions "shall be undertaken" when the population lacks supplies essential for its survival, thereby clearly establishing a legal obligation, they further provide that such an obligation is subject to the agreement of the state concerned (there is no requirement of consent in situations of occupation, as explained below). It would thus appear that a balance has to be found between two apparently contradictory requirements: a) that a relief action must be undertaken, and b) that the agreement of the state concerned must be obtained. The question therefore arises as to how to strike this balance in practice.

Part of the answer is to be drawn from the generally accepted view that consent cannot be arbitrarily refused, i.e. that any impediment(s) to relief action must be based on valid reasons. In extreme situations, where a lack of supplies would result in starvation, it should be deemed that there is no valid reason justifying a refusal of humanitarian assistance. IHL, applicable in both international and non-international armed conflicts, strictly prohibits starvation of civilians as a method of warfare. This is, of course, subject to the assumption that a relief operation meets the three conditions provided for under IHL, namely that it is humanitarian and impartial in character, and conducted without any adverse distinction.

With regard to occupied territories, there is no legal uncertainty as to the nature of the obligation of the occupying power to allow and facilitate relief operations. The Fourth Geneva Convention and Additional Protocol I explicitly impose on the occupying power the duty to ensure, to the fullest extent of the means available to it, the provision of food and medical supplies, clothing, bedding, means of shelter and other supplies essential to the survival of the civilian population, as well as objects necessary for religious worship. If the occupying power is not in a position to fulfil its duty, the Convention clearly provides that it is bound to accept humanitarian aid on behalf of the affected population. This obligation is not subject to its consent. Thus, in occupied territories, the obligation to accept relief operations is unconditional.

d) Delivery of Humanitarian Relief

The conditions for the delivery of humanitarian relief are also an area where further clarification is needed, especially with regard to NIACs, as there are very few rules of treaty or customary IHL that regulate this issue. With regard to IACs, the relevant legal framework is more detailed. For example, it defines the types of goods that may be distributed, allows for the prescription of technical arrangements, restricts the possibility of diverting relief consignments and regulates the participation of the personnel involved in relief operations. However, the concrete implications of the rights and obligations of the parties to an armed conflict, whether international or non-international, are not sufficiently defined. It would be helpful, for instance, to have a better understanding of the scope and limitations of the right of control that the parties are allowed to exercise on relief operations. While such control may include the search of relief consignments or the supervision of their delivery, it must not impede the rapid deployment of a relief operation. Linked to this, it would also be useful to further elaborate the concrete implications of the parties' obligation to "facilitate" the passage of humanitarian relief. Best practices in this regard could be shared among all those concerned as well.

It is submitted that the questions raised above should be examined with regard both to state and non-state parties to armed conflicts. It is an underlying principle of IHL that all belligerents are bound by the same obligations. Therefore, rules on humanitarian access and assistance applying in NIACs should be interpreted and applied in the same way for state and non-state parties. There is however an exception to this principle. Under Additional Protocol II, the consent required for undertaking a relief action is that of the state concerned, and not of the other party, or parties, to the conflict.

The rights and obligations of actors providing assistance are also an issue that warrants further analysis. For instance, the extent to which humanitarian organizations are entitled to enjoy freedom of movement in their activities and the correlative right of the parties to armed conflicts to temporarily restrict their freedom for reasons of imperative military necessity should be explored. Lastly, the role of third states, including states whose territory is used for the transit of relief operations, should also be examined. In an IAC, Additional Protocol I provides that "each High Contracting Party", meaning not only those participating in a conflict, must allow and facilitate rapid and unimpeded passage of all relief consignments, equipment and personnel. A similar obligation does not appear in the law governing NIACs.

Access to populations in need of assistance and protection in times of armed conflict depends first and foremost on the degree of acceptance of humanitarian and impartial relief actions by those exercising territorial control. Humanitarian organizations must be able to communicate with all parties involved in situations of armed conflict and explain the reasons and purpose of their activities in a consistent and coherent way. They should also be able to explain that these activities are based on IHL.
While the legal framework cannot be the only consideration that should be taken into account in the dialogue, it may certainly be used as a tool facilitating the deployment of humanitarian operations. It should therefore be known and disseminated by all those involved in such operations.

2) The Law of Occupation

As mentioned in the report submitted to the 30th International Conference, recent years have been characterized by an increase in extraterritorial military interventions. Along with the continuation of more classical forms of occupation, some of these interventions have given rise to new forms of foreign military presence in the territory of a state, on occasion consensual, but more often not. These new forms of military presence have - to a certain extent - refocused attention on occupation law. Outlined below are some legal questions that have been generated in relation to this specific area of IHL.

a) Beginning and End of Occupation

To begin with, it should be noted that renewed international attention on the law of occupation has essentially been focused on the substantive rules of occupation law rather than on issues surrounding the conditions that must be established for the beginning and end of occupation. In other words, relatively little attention has been paid to the standards on the basis of which the existence of a state of occupation may be determined. This is unfortunate given that the question of whether there is occupation or not is central to the application of the relevant body of IHL and needs to be answered before any substantive question of occupation law can be addressed. Practice has demonstrated that many states put forward claims of inapplicability of occupation law even as they maintain effective control over foreign territory or a part thereof,
due to a reluctance to be perceived as an occupying power. Their assertions are partly facilitated by the fact that IHL instruments do not provide clear standards for determining when an occupation starts and terminates. Not only is the definition of occupation vague under IHL, but other factual elements - such as the continuation of hostilities and/or the continued exercise of some degree of authority by local authorities, or by the foreign forces during and after the phase-out period - may render the legal classification of a particular situation quite complex.

In addition, recent military operations have underlined the necessity of more precisely defining the legal criteria on the basis of which a state of occupation may be determined to exist when it involves multinational forces. Are the criteria for the beginning and end of occupation the same in such a case? Who is the occupying power, or powers, when a coalition of states is involved? Can all the troop-contributing countries be considered occupying powers for the purposes of IHL?

Linked to the issue of the applicability of occupation law is the question of the legal framework applicable to invasion by, and the withdrawal of, foreign forces. It is submitted that a broad interpretation of the application of the Fourth Geneva Convention during both the invasion and withdrawal phases - with a view to maximizing the legal protection conferred on the civilian population - should be favoured. An issue that would benefit from elaboration in the invasion and withdrawal phases is the exact legal protections enjoyed by those who are in the power of a belligerent, but are neither on territory occupied by it nor on its own territory. It is submitted that the range of questions posed above raises important humanitarian challenges and deserves appropriate legal clarification.
b) The Rights and Duties of an Occupying Power The law of occupation has also faced recurrent challenges on the basis that it is ill-suited for contemporary occupation. The reluctance of some states to accept its application is often justified by claims that situations in which they are, or might be, involved differ considerably from the classical concept of belligerent occupation. In other words, it has been argued that current occupation law is not sufficiently equipped to deal with the specificities of the new features of occupation. Recent occupations have, in particular, triggered much legal commentary about the failure of occupation law to authorize the introduction of wholesale changes in the legal, political, institutional and economic structure of a territory placed under the effective control of a foreign power. It has been contended that occupation law places an undue emphasis on preserving the continuity of the socio-political situation of an occupied territory. It has also been claimed that the transformation of an oppressive governmental system or the rebuilding of a society that has completely collapsed could be achieved during an occupation, could be in the interest of the international community, and could even be authorized by the lex lata. The far-reaching political and institutional changes undertaken in recent occupations have thus generated tension between the requirement of occupation law that the occupying power respect the laws and institutions in place and the perceived need to fundamentally alter the institutional, social or economic fabric of an occupied territory. It has been contended that, to reduce this tension, IHL should permit certain transformative processes and recognize the occupying power's role in fostering them. 
Such a position, however, raises the question of the validity of limitations posed by IHL on an occupying power's rights and duties as reflected in article 43 of the 1907 Hague Regulations and article 64 of the Fourth Geneva Convention. Given that occupation law does not expressly give "carte blanche" for various transformations that might be desired by an occupying power, some contemporary



interpretations have aimed to achieve that result by granting an occupying power increasing leeway in the administration of an occupied territory. It is submitted that the limits to an occupying power's freedom - or not - to effect changes in an occupied territory need to be identified more clearly. Prolonged occupation raises a whole set of legal questions in itself. Even though IHL contemplates the possibility that occupation may be of a protracted nature, none of the relevant IHL instruments place limits on the duration of effective control over a foreign territory. However, prolonged occupations place IHL under considerable strain insofar as they call into question some of the underlying principles of occupation law, in particular the provisional character of occupation and the necessity to preserve the status quo ante. As neither the 1907 Hague Regulations nor the Fourth Geneva Convention specify any lawful deviations from existing law in this scenario, many have argued that prolonged occupation necessitates specific regulations in response to the practical problems arising in such cases. The other view is that occupation law is sufficiently flexible to accommodate the humanitarian and legal concerns arising in prolonged occupation. In addition to the issues raised above, it should be noted that human rights law may play an important role in delimiting an occupying power's rights and duties. This body of law is widely recognized as applicable in situations of occupation and, consequently, may impose formal obligations on an occupying power, or serve as a basis for altering existing local laws. 
The International Court of Justice has pointed to the relevance of human rights law in times of occupation and to an occupying power's legal obligation to take this body of norms into account in both its conduct and in the policies it develops in an occupied territory.18 It is therefore necessary to identify how, and to what extent, human rights law applies in occupied territory and to explore the interplay between human rights law and the law of occupation. c) The Use of Force in Occupied Territory Another challenge raised by recent examples of occupation is the identification of the legal framework governing the use of force by an occupying power. Occupation is often characterized by the continuation or resumption of hostilities between, on the one hand, the occupying forces and, on the other, the armed forces of the occupied territory and/or other organized armed groups more or less affiliated to the ousted government. Force might also be used by an occupying power within the framework of its obligation to restore and maintain public order in an occupied territory. Even though article 43 of the 1907 Hague Regulations has always been interpreted as a central provision of occupation law, its implementation still raises important operational and legal questions, particularly when it comes to the use of force by an occupying power. As some occupations have evidenced, regulation of the use of force in cases of civil unrest and in response to ongoing armed opposition (hostilities) is not clear-cut. Although an occupying power is meant to maintain security by means of law enforcement, uncertainty persists with regard to the applicable legal regime in situations where it is difficult to distinguish civil unrest from hostilities or where an occupying power is confronted by both at the same time in the entirety, or parts of, an occupied territory. 
The law of occupation is silent on the separation or interaction between law enforcement measures and the use of military force under a conduct of hostilities paradigm, thus leaving a significant degree of uncertainty regarding identification of the relevant legal regime(s) governing the use of force in occupied territory. This inevitably opens the door to different interpretations of how force may be resorted to in an occupied territory, in what circumstances and according to which body of law. Ultimately, uncertainty over the applicable legal regime may affect the protection afforded to an occupied population. It is believed that there is a need to clarify how the rules governing law enforcement and those regulating the conduct of hostilities interact in practice in the context of an occupation. d) The Applicability of Occupation Law to UN Operations Aside from the various challenges posed by contemporary occupations, another set of questions arises in relation to the applicability of occupation law to operations under the command and control of the UN. In the course of its field deployments, the UN may find itself in a position to assume governmental functions in lieu of the relevant territorial sovereign. In such cases, it is critical to determine whether occupation law is applicable, the precise conditions that must be fulfilled for its applicability and, if it is deemed applicable, whether occupation by an international organization is subject to the same legal constraints imposed on individual states exercising effective control over foreign territory. It may be observed that operations carried out under the auspices of the UN, such as the ones in Kosovo and East Timor, shared many similarities with traditional military occupation. Consequently, where UN operations imply the international administration of a territory - and particularly when the international authorities are vested with extensive executive and legislative powers - the rules governing occupation appear relevant, even if in most cases they are applicable only by analogy. In these situations, IHL might provide practical solutions to many of the problems that arise and could inform the policies undertaken by the international administration. It would thus appear that the applicability of IHL to internationally administered territories still needs to be more precisely delineated in light of the specific nature and objectives of such operations. 
18 ICJ, Case Concerning Armed Activities on the Territory of the Congo (Democratic Republic of the Congo v. Uganda), Judgment, 2005.
e) ICRC expert process The range of legal challenges raised by contemporary forms of occupation outlined above has been at the core of an exploratory process undertaken by the ICRC on "Occupation and Other Forms of Administration of Foreign Territory". The purpose of this initiative, which began in 2007, was to analyse whether and to what extent the rules of occupation law are adequate to deal with the humanitarian and legal challenges arising in contemporary occupations, and whether they might need to be reinforced or clarified. Three informal meetings involving some thirty experts drawn from states, international organizations, academic circles and the NGO community were organized in 2008 and 2009 with a view to addressing, in more detail, the legal issues raised above. The meetings focused, respectively, on legal questions related to: i) the beginning and end of occupation, ii) the delimitation of the rights and duties of an occupying power/the relevance of occupation law for United Nations administration of territory and, iii) the use of force in occupied territory. The experts participated in their personal capacity and the meetings were held under the Chatham House Rule. The publication of a report on the discussions that took place at the expert meetings is planned for the end of 2011. The report aims to give a substantive account of the main points discussed and the various positions expressed during the expert meetings. The report does not reflect the ICRC's views on the subject matter addressed, but provides an overview of the range of current legal positions on the three broad groups of questions identified. The ICRC believes that the report - which is the final outcome of the exploratory process - will serve to inform and nourish ongoing and future legal debates on the need for clarification of some of the most significant provisions of occupation law.



3) IHL and Multinational Forces Over the years the responsibilities and tasks assigned to multinational forces have transcended the traditional monitoring of ceasefires and the observation of fragile peace settlements. The spectrum of operations involving multinational forces (hereafter peace operations), whether conducted under UN auspices or under UN command and control, has grown increasingly broad and has come to include dimensions such as conflict prevention, peace-keeping, peace-making, peace-enforcement and peace-building. The role of multinational forces has changed, in particular, since the conflict in the former Yugoslavia in the 1990s. The missions of multinational forces in Afghanistan, the Democratic Republic of the Congo, Somalia or Libya are not limited to ensuring cease-fires or monitoring buffer zones but are characterized by their participation in hostilities. Today, the multifaceted nature of these operations and the ever more difficult and violent environments in which their personnel operate - sometimes requiring them to fight on the side of one party to a conflict against another - highlight how important it is for the international community to develop a coherent legal framework that embraces the complexity of peace operations. Insofar as the new features of such operations render it more likely that multinational forces will become involved in the use of force, the question of when and how IHL will apply to their actions becomes all the more relevant. While it may seem at first sight that everything on this issue has already been said, a number of legal questions relating to peace operations remain unsettled and, in light of their importance and consequences, deserve to be closely examined. a) Applicability of IHL to Multinational Forces One of the most sensitive issues in relation to multinational forces is the legal classification of the situation they may be involved in under IHL. 
As occasionally demonstrated in practice, certain states and international organizations engaged in peace operations have been reluctant to accept that IHL is applicable to their actions, even when the criteria for its applicability have been fulfilled. For a long time the very idea that IHL could be applicable to multinational forces was disregarded. It was often contended that multinational forces, in particular UN forces, could not be a party to an armed conflict, and consequently could not be bound by IHL. That position was justified by reference to the fact that multinational forces generally operate on behalf of the international community as a whole, thus precluding them from either being deemed a "party" to an armed conflict, or a "power" within the meaning of the Geneva Conventions. It was claimed that, based on their international legitimacy, multinational forces had to be considered to be impartial, objective and neutral given that their only interest in a particular armed conflict is the restoration and maintenance of international peace and security. It is submitted that the above position erases the distinction between the ius ad bellum and the ius in bello. The applicability of the latter to multinational forces, similar to any other actor, depends on the factual circumstances on the ground and on the fulfilment of specific legal conditions. The nature of a situation and the correlative applicability of IHL must be determined irrespective of the international mandate assigned to multinational forces by inter alia the UN Security Council, and of the designation given to the parties opposed to them. The mandate and the legitimacy of a mission entrusted to multinational forces are issues which fall within the province of ius ad bellum, and should have no effect on the applicability of IHL to peace operations, as is the case in respect of IHL application to other situations. 
The distinction between IHL and the ius ad bellum is also essential to preserving the aim of IHL, which is to ensure effective protection for all victims of armed conflict. Whether recourse to the use of force is legitimate or not cannot absolve a participant of his obligations under




IHL, nor deprive anyone of the protection provided by this body of rules.19 Maintaining the distinction is also important in order to preserve the principle of equality of belligerents, mentioned earlier, which lies at the very heart of IHL. Given that multinational forces are more often than not deployed in conflict zones, it becomes essential to determine when a situation amounts to an armed conflict in which IHL will constitute an additional legal framework governing a specific operation. The ICRC's view, which has been stated on various occasions, is that multinational forces are bound by IHL when the conditions for its applicability have been met. It is submitted that the criteria used to determine the existence of an armed conflict involving multinational forces do not differ from those applied to more "classic" armed conflicts, whether international or non-international. Some legal debates on IHL applicability to peace operations have nevertheless been characterized by recurrent attempts to raise the bar for its threshold of applicability. It has been contended, in particular, that when multinational forces are involved, a higher degree of intensity of violence should be required before an armed conflict may be said to exist. b) Conflict Classification in Multinational Operations It has often been argued that the involvement of multinational forces in an armed conflict necessarily internationalizes the latter and triggers the application of the law governing IAC. However, this opinion is not unanimously accepted.20 Even though attractive in terms of protective effect, as it means that persons affected would benefit from the full range of IHL rules governing IAC, it is not consistent with operational and legal reality. To give just one example, there is nothing to suggest that states involved in a NIAC would be willing to grant POW status to captured members of organized non-state armed groups, as would be required under IHL applicable in IAC. 
There is thus an enduring controversy over the material field of application of IHL in peace operations. The question whether the legal frame of reference should be the law governing IAC or that applicable to NIAC remains unsettled. While with regard to rules regulating the conduct of hostilities there is probably no difference in practice because, as has been explained above, most treaty based rules applicable in IAC are also generally accepted as applying in NIAC as a matter of customary law, the issue is important when it comes, for instance, to the status of persons deprived of liberty (or the legal basis for the ICRC's activities). In approaching this issue the ICRC has opted for an approach similar to that adopted by the International Court of Justice in its 1986 judgment in the case of Military and Paramilitary Activities in and against Nicaragua (Nicaragua v. USA). It involves examining and defining, for the purposes of IHL, each bilateral relationship between belligerents in a given situation. In accordance with this approach, when multinational forces are fighting against state armed forces, the legal framework will be IHL applicable to IAC. When multinational forces with the consent of a host government are opposed to an organized non-state armed group (or groups), the legal frame of reference will be IHL applicable to NIAC. c) Determining Who Is a Party to an Armed Conflict

19 This is not the approach taken in the 1994 UN Convention on the Safety of United Nations and Associated Personnel and its Optional Protocol of 2005. These instruments criminalize acts (such as attacks on members of United Nations operations or on other associated personnel involved in hostilities with a combat role) that are not prohibited under IHL, thereby, it may be argued, undermining the principle of equality of belligerents under IHL.
20 For the ICRC's view, see the section on the typology of non-international armed conflicts above.



The involvement of multinational forces in armed conflicts also raises a set of issues related to the determination of who should be considered a party to an armed conflict among the participants in a peace operation. Should it be argued that only troop contributing countries are parties to the conflict for the purposes of IHL? What about the international organization under whose command and control the multinational forces operate? How should member states of an international organization who are not participating in a military action be regarded under IHL? Can a presumption of being a party to an armed conflict be established for those participating in a coalition, irrespective of the functions they effectively perform therein? These and other similar questions do not seem to have attracted sufficient analysis so far. This presumably results from a reluctance to acknowledge that international organizations and/or troop contributing countries acting on behalf of the international community may themselves be parties to an armed conflict. Nonetheless, the issues are important and would need to be examined in more depth. d) Detention by Multinational Forces Peace operations today are also characterized by the recurrent involvement of multinational forces in the detention of individuals. One of the main challenges in that situation is to ensure that multinational forces meet their international obligations, including those based on IHL, when handling detainees. The challenges are particularly acute in relation to procedural safeguards for detention in NIACs, and also with regard to the transfer of detainees to local authorities or to other troop contributing countries. These and other issues are being examined within an intergovernmental project on the "Handling of Detainees in International Military Operations", known as the Copenhagen Process, launched by the Government of Denmark in 2007. 
The UN Department of Peacekeeping Operations has likewise been working on its Standard Operating Procedures on "Detention in United Nations Peace Operations". Both initiatives aim to draw up common legal and operational rules governing detention in multilateral operations. This is an important and difficult task, as one of the main challenges is to develop common standards that will adequately reflect states' obligations under the applicable bodies of international law. In light of the importance of the challenges raised by detention in armed conflict, particularly in NIAC, the ICRC - as already mentioned - believes that this is an area of IHL that should be strengthened, whether by means of treaty law or otherwise. It has been included in the ICRC's report on Strengthening Legal Protection for Victims of Armed Conflict submitted to the 31st International Conference. e) Legal Interoperability The participation of states and international organizations in peace operations not only gives rise to questions related to the applicable law, but also to its interpretation. This is because the "unity of effort" - in military parlance - sought in peace operations is often impacted by inconsistent interpretations and application of IHL by troop contributing countries operating on the basis of different legal standards. The concept of "legal interoperability" has emerged as a way of managing legal differences between coalition partners with a view to rendering the conduct of multinational operations as effective as possible, while respecting the relevant applicable law. An important practical challenge is to ensure that peace operations are conducted taking into consideration the different levels of ratification of IHL instruments and the different interpretations of those treaties and of customary IHL by troop contributing states.




Legal interoperability is not always easy to implement given the complexity of the legal framework in peace operations, which is made up of several layers. These include the UN Security Council mandate in a given situation, relevant treaty and customary law obligations, status of forces agreements, memoranda of understanding signed by coalition partners, standard operating procedures, and rules of engagement, to name the most important. The many legal sources that must be taken into account may make it objectively difficult for partners in a peace operation to reach a common understanding of their respective obligations as a precondition for enabling legal interoperability and for ensuring that a multinational operation is not carried out based on the lowest common legal denominator. Legal uncertainty, it hardly needs to be said, could ultimately impinge upon the protection afforded by IHL to the victims of armed conflicts. The ICRC is thus of the view that further analysis is needed in order to more precisely assess the overall effect of the legal interoperability question on IHL applicability, and application, in peace operations. f) Responsibility for Internationally Wrongful Acts It should finally be noted that the complexity inherent to peace operations has brought to the fore the question of where legal responsibility lies when internationally wrongful acts occur in the course of such operations. It has become increasingly difficult to provide an adequate legal answer. The issue remains of practical importance, however, as demonstrated by the growing number of cases being litigated in domestic and international courts, and has a direct bearing on the broader question of the relationship between state responsibility and the responsibility of international organizations. 
More clarity would, for example, be necessary regarding how to determine the attribution of wrongful acts possibly committed in the course of peace operations and the responsibility that arises as a result. Many related questions deserve answers, such as: does international responsibility for a wrongful act lie solely with one actor and, if so, with whom (the lead/framework state, the state whose armed forces committed the violation, the international organization under whose authority or command and control the troop contributing state operates)? Can states and international organizations bear concurrent responsibility? Under what conditions and circumstances can these responsibilities be established? Should it be considered that peace operations represent a situation in which troop contributing states have the obligation to ensure respect for IHL, as set forth in common article 1 to the Geneva Conventions, by preventing conduct contrary to IHL by co-belligerents? While itself engaged in continued reflection on these issues, the ICRC is also taking part in an expert process launched in 2009 by the Swedish National Defence College which aims to provide responses to some of the questions mentioned above. Its particular focus is on "Responsibility in Multinational Military Operations". 4) Private Military and Security Companies (PMSCs) a) Humanitarian Challenges of Increasing PMSC Presence The past decade has seen a marked trend towards the outsourcing of traditional military functions to PMSCs. The humanitarian challenges raised by the increasing presence of PMSCs in armed conflict situations were mentioned in the 2007 report to the International Conference. Most of the ongoing discussions surrounding PMSCs continue to focus on the issue of the legitimacy of the outsourcing of such functions and on whether there should be limits on the right of states to transfer their "monopoly of force" to private actors. 
Whatever the responses to the above dilemmas might be, it is realistic to assume that the presence of PMSCs in situations of armed conflict will continue to increase in the medium term. Many



states are in the process of downsizing their armed forces, while the growing complexity of weapons systems means that militaries are increasingly relying on outside technical expertise and training to operate them. Moreover, PMSC clients are not only states; international organizations, non-governmental organizations and transnational business corporations have contracted their services as well, and it cannot be excluded that multinational military operations or armed opposition groups will hire PMSCs to fight on their behalf in the future. In parallel with the marked increase in the number of PMSCs present in situations of armed conflict, their activities have also become more closely related to military operations and involve, among other things, the protection of military personnel and infrastructure, the training and advising of armed forces, the maintenance of weapons systems, and the guarding and interrogation of detainees. These activities have brought PMSCs into closer contact with persons protected by IHL and have also increased the exposure of their personnel to the dangers arising from military operations. b) International Initiatives for the Regulation of PMSCs In response to the increased presence of PMSCs, several international initiatives have been undertaken with a view to clarifying, reaffirming or developing international legal standards regulating their activities and, in particular, ensuring their compliance with standards of conduct reflected in IHL and human rights law. They are briefly mentioned below. i) Montreux Document In 2005 the Swiss Federal Department of Foreign Affairs and the ICRC launched a joint initiative to promote respect for IHL and human rights law in the context of PMSC operations in situations of armed conflict. The initiative involved not only governments, but also drew on the experience and expertise of industry representatives, academic experts and non-governmental organizations. 
It resulted, in 2008, in the endorsement of the Montreux Document on Pertinent International Legal Obligations and Good Practices for States Related to Operations of Private Military and Security Companies During Armed Conflict by 17 participating states.21 The Montreux Document refutes the misconception that private contractors operate in a legal vacuum. It does not create new law, but restates and reaffirms the existing legal obligations of states with regard to private military and security companies contracted by them, operating within their territory, or incorporated under their jurisdiction. It also recommends a catalogue of good practices for the practical implementation of existing legal obligations. As of April 2011 another 19 states had expressed their support for the Montreux Document, thus bringing the total number to 36.22 The Swiss Federal Department of Foreign Affairs and the ICRC have also published a brochure introducing and explaining the Montreux Document in a manner accessible to the wider public. The text was transmitted to the UN Secretary-General by the Swiss Government in 2008 and is now also a UN document available in English, French, Spanish, Arabic, Chinese and Russian.
21 Afghanistan, Angola, Australia, Austria, Canada, China, France, Germany, Iraq, Poland, Sierra Leone, South Africa, Sweden, Switzerland, the United Kingdom, Ukraine, and the United States of America.
22 With dates of endorsement: Macedonia (3 February 2009), Ecuador (12 February 2009), Albania (17 February 2009), Netherlands (20 February 2009), Bosnia and Herzegovina (9 March 2009), Greece (13 March 2009), Portugal (27 March 2009), Chile (6 April 2009), Uruguay (22 April 2009), Liechtenstein (27 April 2009), Qatar (30 April 2009), Jordan (18 May 2009), Spain (20 May 2009), Italy (15 June 2009), Uganda (23 July 2009), Cyprus (29 September 2009), Georgia (22 October 2009), Denmark (9 August 2010), Hungary (1 February 2011).




ii) International Code of Conduct for Private Security Service Providers In parallel to the Montreux process, the Swiss Federal Department of Foreign Affairs facilitated a multi-stakeholder initiative aiming at producing an industry-wide code of conduct articulating principles aimed at enabling private security service providers to operate in accordance with IHL and international human rights standards. In November 2010 an International Code of Conduct for Private Security Service Providers was adopted in Geneva by nearly 60 private security providers and, since then, numerous other companies have also expressed their adherence. The Code enunciates an industry commitment to strict standards of conduct both in the area of the use of force and in the treatment of detainees and other persons who find themselves in the power of, or otherwise exposed to, the activities of PMSCs. It was initially foreseen that the Code would include two parts: one describing standards of conduct, management and governance, and another providing an international governance and oversight mechanism ensuring compliance with the Code by signatory companies. As of this writing the finalized Code includes only the first part ("Principles Regarding Conduct of Personnel" and "Commitments Regarding Management and Governance"), whereas the planned oversight and governance mechanism, which should be integrated into the Code at a review conference, is in the process of being developed by a temporary steering committee composed of representatives of three stakeholder groups (industry, governments and civil society). The ICRC welcomed the Code of Conduct as an initiative aimed at ensuring adherence by PMSCs to recognized standards of IHL and human rights law, thereby contributing to the better protection of victims of armed conflict and situations of violence below that threshold. 
As a co-sponsor of the Montreux Document, the ICRC finds it important to recall that initiatives aimed at self-regulation by PMSCs, while undoubtedly important (particularly when they include an industry-run accountability mechanism), cannot replace the primary responsibility of states for ensuring respect for IHL by PMSCs in situations of armed conflict.

iii) UN Working Group (Draft Convention)

In July 2005 the UN Commission on Human Rights established a "Working Group on the Use of Mercenaries as a Means of Violating Human Rights and Impeding the Exercise of the Rights of Peoples to Self-determination", composed of five experts, each drawn from one of the world's geographic regions. The Working Group reported back to the UN Human Rights Council in September 2010 on the proposed elements of a possible new international convention to regulate the activities of PMSCs and to encourage respect for human rights by such companies. That same month the UN Human Rights Council adopted a resolution creating an Open-ended Intergovernmental Working Group with a mandate to "consider the possibility of elaborating an international regulatory framework on the regulation, monitoring and oversight of the activities of private military and security companies". The working group is to meet once annually for five-day sessions, the first of which was held in May 2011.

c) The Humanitarian Goal

The ICRC is not - and cannot be - involved in debates related to the legitimacy of the use of private military and security companies in situations of armed conflict. Its exclusively humanitarian goal is to foster observance of IHL by the personnel of such companies when they operate in armed conflict situations by, among other things, providing guidance on their legal obligations and, most importantly, by encouraging states to take measures to ensure respect for IHL by the companies and their personnel and to hold them accountable where necessary.
It is with this aim that the ICRC focuses in its operational activities on promoting existing legal obligations as reflected in the Montreux Document. The organization also


welcomes other initiatives aimed at ensuring that PMSCs and their personnel do not commit acts that would be contrary to IHL and other bodies of law. Each of the initiatives mentioned above may be deemed complementary and may serve to reinforce the protection of persons affected by armed conflict or situations of violence below that threshold.

V. MEANS AND METHODS OF WARFARE

1) New Technologies of Warfare

There can be no doubt that IHL applies to new weaponry and to the employment in warfare of new technological developments, as recognized, inter alia, in article 36 of Additional Protocol I.23 Nonetheless, applying pre-existing legal rules to a new technology raises the question of whether the rules are sufficiently clear in light of the technology's specific characteristics, as well as with regard to its foreseeable humanitarian impact.

In recent years a wide array of new technologies has entered the modern battlefield. Cyberspace has opened up a potentially new war-fighting domain. Remote controlled weapons systems such as drones are increasingly being used by parties to armed conflicts. Automated weapons systems are also on the rise, and certain autonomous systems such as combat robots are being considered for future use on the battlefield. Each of these technologies raises a host of legal issues, only some of which are briefly mentioned below.

a) "Cyber Warfare"

Over the last several years, interest in the legal issues generated by the possible conduct of hostilities in and via cyberspace has been particularly high. Cyberspace has opened up a potentially new war-fighting domain: a man-made theatre of war additional to the natural theatres of land, air, sea and outer space, and interlinked with all of them. It is a virtual space that provides worldwide interconnectivity regardless of borders. While these features are of great utility in peacetime, interconnectivity also means that whatever has an interface with the internet can be targeted from anywhere in the world. It also means that the effects of an attack may have repercussions on various other systems, given that military networks are in many cases dependent on commercial infrastructure.
Cyber operations can be broadly described as operations against or via a computer or a computer system through a data stream. Such operations can aim to do different things, for instance to infiltrate a system and collect, export, destroy, change or encrypt data, or to trigger, alter or otherwise manipulate processes controlled by the infiltrated computer system. By these means, a variety of "targets" in the real world can be destroyed, altered or disrupted, such as industries, infrastructures, telecommunications or financial systems. The potential effects of such operations are therefore of serious humanitarian concern. For instance, by tampering with the supporting computer systems, one can manipulate an enemy's air traffic control systems, oil pipeline flow systems or nuclear plants.

23 In accordance with article 36 of Additional Protocol I on the study, development or adoption of a new weapon or method of warfare, states parties are under an obligation to determine whether their employment would, in some or all circumstances, be prohibited by the Protocol or any other rule of international law applicable to them.

The fact that a particular military activity is not specifically regulated does not mean that it can be used without restrictions. In the ICRC's view, means and methods of warfare which


resort to cyber technology are subject to IHL, just as any new weapon or delivery system has been so far when used in an armed conflict by or on behalf of a party to that conflict. If a cyber operation is used against an enemy in an armed conflict in order to cause damage, for example by manipulating an air traffic control system so as to cause the crash of a civilian aircraft, it can hardly be disputed that such an attack is in fact a method of warfare and is subject to prohibitions under IHL. This being said, reconciling the emergence of cyberspace as a new war-fighting domain with the legal framework governing armed conflict is a challenging task in several respects and requires careful reflection. The following questions illustrate the issues being debated.

Firstly, the digitalization on which cyberspace is built ensures anonymity and thus complicates the attribution of conduct. In most cases it appears difficult, if not impossible, to identify the author of an attack. Since IHL relies on the attribution of responsibility to individuals and parties to conflicts, major difficulties arise. In particular, if the perpetrator of a given operation - and thus the link of the operation to an armed conflict - cannot be identified, it is extremely difficult to determine whether IHL is even applicable to the operation.

Secondly, there is no doubt that an armed conflict exists and IHL applies once traditional kinetic weapons are used in combination with cyber operations. However, a particularly difficult situation as regards the applicability of IHL arises when the first, or the only, "hostile" acts are conducted by means of a cyber operation. Can this be qualified as an armed conflict within the meaning of the Geneva Conventions and other IHL treaties? Does it depend on the type of operation, i.e. would the manipulation or deletion of data suffice, or is physical damage resulting from the manipulation required?
It would appear that the answers to these questions will probably be determined in a definitive manner only through future state practice.

Thirdly, the definition of the term "attack" is of decisive importance for the application of the various rules giving effect to the IHL principle of distinction. It should be borne in mind that Additional Protocol I and customary IHL contain a specific definition of the term which is not identical to that provided for in other branches of law. Under article 49(1) of Additional Protocol I, "attacks" means acts of violence against the adversary, whether in offence or in defence. The term "acts of violence" denotes physical force. Based on that interpretation, which the ICRC shares, cyber operations by means of viruses, worms, etc. that result in physical damage to persons, or damage to objects going beyond the computer program or data attacked, can be qualified as "acts of violence", i.e. as an attack in the sense of IHL.

It is sometimes claimed that cyber operations do not fall within the definition of "attack" as long as they do not result in physical destruction, or when their effects are reversible. If this claim implies that an attack against a civilian object may be considered lawful in such cases, it is unfounded under existing law in the view of the ICRC. Under IHL, attacks may only be directed at military objectives, while objects not falling within that definition are civilian and may not be attacked. The definition of military objectives is not dependent on the method of warfare used and must be applied to both kinetic and non-kinetic means; the fact that a cyber operation does not lead to the destruction of the attacked object is likewise irrelevant. Pursuant to article 52(2) of Additional Protocol I, only objects that make an effective contribution to military action and whose total or partial destruction, capture or neutralization offers a definite military advantage may be attacked.
By referring not only to the destruction or capture of the object but also to its neutralization, the definition implies that it is immaterial whether an object is disabled through destruction or in any other way.


Fourthly, when cyber operations constitute an attack, Additional Protocol I imposes: i) the obligation to direct attacks only against "military objectives" and not to attack civilians or civilian objects; ii) the prohibition of indiscriminate attacks, as well as of attacks that may be expected to cause excessive incidental civilian casualties or damage; and iii) the requirement to take the necessary precautions to ensure that the previous two rules are respected (in particular the requirement to minimize incidental civilian damage and the obligation to abstain from attacks if such damage is likely to be excessive in relation to the value of the military objective attacked). It is submitted that these rules operate in the same way whether the attack is carried out using traditional weapons or by reliance on a computer network. Problems that arise in applying these rules are therefore not necessarily unique to cyber operations. Still, some issues remain.

As already explained, IHL prohibits indiscriminate attacks. Based on what is publicly known about cyber operations thus far, ensuring compliance with this rule poses very serious challenges. The question that arises is whether cyber operations can be accurately aimed at the intended target and, even if that is the case, whether effects upon civilian infrastructure can be prevented, given the interconnectedness of military and civilian computer networks. An obvious example would be the release of a virus, or a range of viruses, into the computer systems of a target state. Even if introduced only into its military network, a sufficiently virulent virus could seep out into civilian systems, and even beyond the state's borders, and disrupt or destroy the infrastructure that relies on them. Such viruses would be considered indiscriminate under existing IHL because they cannot be directed against a specific military objective and would constitute a means or method of combat the effects of which cannot be limited as required by IHL.
Cyber operations pose not only the question of how to observe the prohibition of indiscriminate attacks, but also that of disproportionate attacks. A particular issue that arises, and requires careful reflection, is whether it is in practice possible to fully anticipate all the reverberating or knock-on effects on civilians and civilian objects of an attack otherwise directed at a legitimate military objective. Respect for the principles of distinction and proportionality means that certain precautions in attack, provided for in article 57 of Additional Protocol I, must be taken. This includes the obligation of an attacker to take all feasible precautions in the choice of means and methods of attack with a view to avoiding, and in any event minimizing, incidental civilian casualties and damage. Since in certain cases cyber operations might cause fewer incidental civilian casualties and less incidental civilian damage than the use of conventional weapons, it may be argued that in such circumstances this rule would require a commander to consider whether he or she can achieve the same military advantage by means and methods of warfare which resort to cyber technology, if that is practicable.

In sum, despite the newness of the technology, legal constraints apply to means and methods of warfare which resort to cyber technology. While there is no IHL provision that explicitly bans them, it is clear that cyber operations in armed conflict may only be undertaken to the degree, and in a way, that respects existing law. The ICRC has been following and will continue to closely follow developments related to the use of cyberspace for military purposes and to assess their potential humanitarian impact, with a view to contributing to ensuring that the relevant IHL rules are observed.

b) Remote Controlled Weapons Systems


One of the main features of remote controlled weapons systems is that they allow combatants to be physically absent from a zone of combat operations. This new technology, like certain other advances in military technology, can on the one hand help belligerents direct their attacks more precisely against military objectives and thus reduce civilian casualties and damage to civilian objects. On the other hand, it may also increase the opportunities for attacking an adversary and thus put the civilian population and civilian objects at correspondingly greater exposure to incidental harm. Despite the distance between the persons operating remote controlled weapons or weapons systems and the battlefield, the technology requires that a human operator activate, direct and fire the weapon concerned. The responsibility for respecting IHL, including the suspension of an attack if IHL rules cannot be respected, thus clearly belongs to the individual(s) concerned and the relevant party to the armed conflict.

Remote controlled drones are a conspicuous example of a remote controlled weapons system. They have greatly enhanced real-time aerial surveillance possibilities, thereby enlarging the toolbox of precautionary measures that may be taken in advance of an attack. But remote controlled weapons systems also entail risks. Studies have shown that disconnecting a person from a potential adversary, especially by means of distance (be it physical or emotional), makes targeting easier and abuses more likely. It has also been noted that challenges to the responsible operation of such systems include the limited capacity of an operator to process a large volume of data, including contradictory data, at a given time ("information overload"), and the supervision of more than one such system at a time, leading to questions about the operator's ability to fully comply with the relevant rules of IHL in those circumstances.
c) Automated Weapons Systems

An automated weapon or weapons system is one that is able to function in a self-contained and independent manner, although it may initially be deployed or directed by a human operator. Examples of such systems include automated sentry guns, sensor-fused munitions and certain anti-vehicle landmines. Although deployed by humans, such systems will independently verify or detect a particular type of target object and then fire or detonate. An automated sentry gun may fire, or not, following voice verification of a potential intruder based on a password. How the system would differentiate a civilian from a combatant, a wounded or incapacitated combatant from an attacker, or identify persons unable to understand or respond to a verbal warning from the system (possible if a foreign language is used), is unclear. Likewise, sensor-fused munitions programmed to locate and attack a particular type of military object (e.g. tanks) will, once launched, attack such objects on their own once the object type has been verified by the system. The capacity to discriminate, as required by IHL, will depend entirely on the quality and variety of the sensors and programming employed within the system. The central challenge with automated systems is to ensure that they are indeed capable of the level of discrimination required by IHL. Similarly, it is not clear how these weapons could assess the incidental loss of civilian lives, injury to civilians or damage to civilian objects, and therefore comply with the principle of proportionality.

d) Autonomous Weapons Systems

An autonomous weapons system is one that can learn or adapt its functioning in response to changing circumstances in the environment in which it is deployed. A truly autonomous system would have artificial intelligence capable of implementing IHL.
Such systems have not yet been weaponized, although there is considerable interest in the expert literature and considerable funding of relevant research. The deployment of such systems would reflect a paradigm shift and a major qualitative change in the conduct of hostilities. It also raises a range of fundamental legal, ethical and societal issues, which need to be considered before such systems are developed or deployed.


The development of a truly autonomous weapons system that can implement IHL represents a monumental programming challenge that may well prove impossible. Developing the capacity to distinguish a civilian from a combatant, an active combatant from a wounded or incapacitated one, or a combatant from a hunter would appear to be a formidable task. It is likewise hard to imagine how autonomous systems would be able to make determinations of military advantage, or judgments concerning proportionality or precautions in attack, in a variety of changing circumstances.

In theory, it may be possible to program an autonomous weapons system to behave more ethically and more cautiously on the battlefield than a human being. After all, emotion, the loss of colleagues and personal self-interest are not issues for a robot, and the record of respect for IHL by human soldiers is far from perfect, to say the least. While there is no evidence yet that this is possible, the potential use of autonomous weapons systems nonetheless evokes a number of difficult questions: Is the delegation to machines of life-and-death choices morally acceptable? If the use of an autonomous weapons system results in a war crime, who would be legally, morally or politically responsible for the choices made by the system: the programmer, the manufacturer, or the command that deployed it? If responsibility cannot be determined as required by IHL, is it legal or ethical to deploy such systems? These queries suggest that the debate about the legal and other implications of the use of autonomous weapons systems will be complex and will need to carefully examine, among other things, their potential humanitarian consequences.

In sum, it will evidently take some time before conclusive answers can be given to many of the legal and other questions to which the technological developments highlighted in this section give rise.
It may be noted that the crucial question does not seem to be whether new technologies are good or bad in themselves, but rather what the circumstances of their use are. Likewise, new technologies do not change existing law but must abide by it, taking into account that current norms do not sufficiently regulate some of the challenges posed and might need to be elaborated. For the ICRC, it is important to ensure informed discussion of the issues involved, to call attention to the necessity of assessing the potential humanitarian impact and IHL implications of new and developing technologies, and to ensure that they are not employed prematurely under conditions in which respect for IHL cannot be guaranteed.

2) Use of Explosive Weapons in Densely Populated Areas

Armed conflicts fought in densely populated areas have been known to cause tremendous human suffering. Civilians have paid a particularly high price, both directly, in terms of the death, injury and permanent disability caused, and indirectly, in terms of the widespread destruction of their homes, livelihoods and infrastructure. Many civilians have, in addition, suffered less measurable but long-term psychological effects as a result of the use of explosive weapons in densely populated areas over extended periods of time. Even without scientific data, it would appear safe to say that people who live in densely populated areas are likely to be more adversely affected by aerial bombardment, shelling and the use of other explosive weapons than those who live in rural areas and are not exposed in close proximity to the effects of explosive weapons in built-up or urban spaces.

The humanitarian and other consequences of military operations in densely populated areas were recalled by the ICRC President in a 2009 statement: "It is not only types of weapons that are changing, but also the environments in which they are often used.
The debate has been prompted in part by the growing number of military operations conducted in densely populated urban areas, often using explosive force delivered by heavy weapons, which can have devastating humanitarian consequences for


civilian populations in such environments. [...] But various crucial questions remain with regard to the conduct of hostilities. Are applicable IHL rules sufficient to identify under which circumstances explosive force delivered by heavy weapons might be used in densely populated areas, for example? Should a higher standard be required for the verification of targets and their surroundings or for the issuance of warnings to the civilian population? Perhaps further legal development is required, but if so, how can it feasibly be monitored and enforced?"

Legal Constraints on the Use of Explosive Weapons in Densely Populated Areas

There are many types of explosive weapons, ranging from grenades to aerial bombs weighing hundreds of kilos. A number of legal instruments define explosive devices, but the definitions tend to be tailored to the purposes of the relevant treaty. It may nevertheless be observed that a recurring element in all conventional definitions of explosive weapons is the requirement that such weapons be activated by the detonation of a high explosive substance creating a blast and fragmentation effect. The use of explosive weapons in densely populated areas exposes the civilian population and infrastructure to heightened - and even extreme - risks of incidental or indiscriminate death, injury or destruction. Their employment is not, however, prohibited by IHL as such.
The permissibility of reliance on them must therefore be determined on a case-by-case basis, taking into account the IHL rules prohibiting indiscriminate and disproportionate attacks and imposing obligations to take feasible precautions in attack.24 For the purposes of this discussion it must in particular be reiterated that indiscriminate attacks are those that are not directed at a specific military objective, that employ a method or means of combat which cannot be directed at a specific military objective, or that employ a method or means of combat the effects of which cannot be limited as required by IHL. Explicitly prohibited as a type of indiscriminate attack is "an attack by bombardment by any methods or means which treats as a single military objective a number of clearly separated and distinct military objectives located in a city, town, village, or other area containing a similar concentration of civilians or civilian objects". Equally prohibited is an attack that would violate the IHL principle of proportionality, i.e. "an attack which may be expected to cause incidental loss of civilian life, injury to civilians, damage to civilian objects, or a combination thereof, which would be excessive in relation to the concrete and direct military advantage anticipated".

It must be noted that the prohibition of indiscriminate attacks was "intended to take account of the fact that means or methods of combat which can be used perfectly legitimately in some situations could, in other circumstances, have effects that would be contrary to some limitations contained in the [First Additional] Protocol, in which event their use in those circumstances would involve an indiscriminate attack".25 A circumstance that could make the use of a certain weapon indiscriminate is certainly its use in a densely populated area.
To be sure, the concept of explosive weapons encompasses a great variety of weapons with varying impact areas, so that not all use of explosive weapons in a densely populated area is indiscriminate by definition. While the characterization of a weapon as "explosive" indicates the specific manner in which it affects its target, the decisive criterion for its lawfulness under the prohibition of indiscriminate

24 In this context it must be recalled that the co-mingling of combatants with civilians is a feature of warfare in densely populated areas, which means that the party under attack also has the obligation to take (to the maximum extent feasible) the precautions against the effects of attack - on civilians and civilian objects - prescribed by IHL.

25 Diplomatic Conference (1974-77), O.R. XV, p. 274, para. 55, CDDH/215/Rev.1, Annex.


attacks will be whether it is able, in light of its impact range and considering the density of the surrounding civilian population and infrastructure, to distinguish between the military objective targeted and civilian persons and objects, and to limit its effects as required under IHL. In the same vein, throughout the planning and conduct of military operations involving the use of explosive weapons in densely populated areas, the general obligation of belligerents to take all feasible precautions with a view to sparing the civilian population, civilians and civilian objects must be applied in a particularly careful manner. In particular, all feasible precautions must be taken to verify that targets are military objectives and in the choice of means and methods of attack, with a view to avoiding and, in any event, minimizing incidental harm. It also means that an attack must be cancelled or suspended if it may be expected to violate the principles of distinction or proportionality. In sum, due to the significant likelihood of indiscriminate effects, and despite the absence of an express legal prohibition for specific types of weapons, the ICRC considers that explosive weapons with a wide impact area should be avoided in densely populated areas.

3) The Notion of Direct Participation in Hostilities under IHL

As mentioned in the reports to the 28th and 30th International Conferences, the operational environment of contemporary armed conflict is changing. It is characterized, among other things, by a shift of military operations into civilian population centres, by ever greater involvement of civilians in military action (both on the side of states and of organized armed groups), and by increasing practical difficulties in distinguishing between fighters and civilians.
In light of this reality, from 2003 to 2008 the ICRC worked with a group of some 50 international legal experts - participating in their personal capacity - on a project aimed at clarifying the notion of "direct participation in hostilities" under IHL. Based on a thorough evaluation of the expert discussions and on further internal research and analysis, the ICRC finalized an outcome document entitled Interpretive Guidance on the Notion of Direct Participation in Hostilities under IHL, which reflects solely the ICRC's views. The primary purpose of the Interpretive Guidance is to enhance the protection of the civilian population by clarifying the distinction between civilians and combatants, as well as between civilians who are and, respectively, are not "directly participating in hostilities" under IHL. The Interpretive Guidance does not endeavour to change binding rules of IHL; rather, it presents the ICRC's recommendations as to how IHL relating to the notion of direct participation in hostilities should be interpreted in contemporary armed conflicts. It is not meant to be applied on the ground as such, but to be further operationalized by military commanders and others responsible for the conduct of military operations. The text was published in June 2009, along with the proceedings of the expert process, and has since been translated into French, Spanish, Chinese and Arabic. The ICRC has also engaged in a proactive dialogue with military, governmental, non-governmental, humanitarian and academic circles in order to explain and promote the Interpretive Guidance. Provided below is a very brief summary of the main questions posed in the Interpretive Guidance and the answers provided.

(i) Who is considered a civilian for the purposes of the principle of distinction?

The answer to this question determines the scope of persons protected against direct attack unless and for such time as they directly participate in hostilities. For the purpose of the conduct of hostilities it is important to distinguish members of organized armed forces or groups (whose continuous function is to conduct hostilities on behalf of a party to an armed
For the purpose of the conduct of hostilities it is important to distinguish members of organized armed forces or groups (whose continuous function is to conduct hostilities on behalf of a party to an armed


conflict) from civilians (who do not directly participate in hostilities, or who do so on a merely spontaneous, sporadic or unorganized basis).

In international armed conflict, all persons who are neither members of the armed forces of a party to the conflict nor participants in a levée en masse are entitled to protection against direct attack unless and for such time as they take a direct part in hostilities. Members of irregular armed forces (e.g. militia, volunteer corps, etc.) whose conduct is attributable to a state party to an armed conflict are considered part of its armed forces. They are not deemed civilians for the purposes of the conduct of hostilities even if they fail to fulfil the criteria required by IHL for combatant privilege and POW status.26

In non-international armed conflict (NIAC), all persons who are not members of state armed forces or organized armed groups of a party to the conflict are civilians and, therefore, entitled to protection against direct attack unless and for such time as they take a direct part in hostilities. In NIAC, organized armed groups constitute the armed forces of a non-state party to the conflict and consist only of individuals whose continuous function it is to directly participate in hostilities. The decisive criterion for individual membership in an organized armed group is whether a person assumes a continuous function for the group involving his or her direct participation in hostilities ("continuous combat function"). Continuous combat function does not imply de jure entitlement to combatant privilege, which in any case is absent in NIAC. Rather, it distinguishes members of the organized fighting forces of a non-state party from civilians who directly participate in hostilities on a merely spontaneous, sporadic or unorganized basis, or who assume exclusively political, administrative or other non-combat functions.
Armed violence that does not meet the requisite degree of intensity and organization to qualify as an armed conflict remains an issue of law and order, i.e. it is governed by international standards and domestic law applying to law enforcement operations. This is the case even when the violence takes place during an armed conflict, whether international or non-international, if it is unrelated to the armed conflict.

(ii) What conduct amounts to direct participation in hostilities?

The answer to this question determines the individual conduct that leads to the suspension of a civilian's protection against direct attack. The notion of direct participation in hostilities refers to specific hostile acts carried out by individuals as part of the conduct of hostilities between parties to an armed conflict. It should be interpreted synonymously in situations of international and NIAC. In order to qualify as direct participation in hostilities, a specific act must fulfil the following cumulative criteria:

1. The act must be likely to adversely affect the military operations or military capacity of a party to an armed conflict or, alternatively, to inflict death, injury, or destruction on persons or objects protected against direct attack (threshold of harm), and

2. There must be a direct causal link between the act and the harm likely to result either from that act, or from a coordinated military operation of which that act constitutes an integral part (direct causation), and

26 Membership in irregular armed forces belonging to a party to the conflict is to be determined based on the same functional criteria that apply to organized armed groups in non-international armed conflict.


3. The act must be specifically designed to directly cause the required threshold of harm in support of a party to the conflict and to the detriment of another (belligerent nexus).

Applied in conjunction, the three requirements of threshold of harm, direct causation and belligerent nexus permit a reliable distinction between activities amounting to direct participation in hostilities and activities which, although occurring in the context of an armed conflict, are not part of the conduct of hostilities and, therefore, do not entail loss of protection against direct attack. In addition, measures preparatory to the execution of a specific act of direct participation in hostilities, as well as the deployment to and the return from the location of its execution, constitute an integral part of that act.

(iii) What modalities govern the loss of protection against direct attack?

The answer to this question deals with the following issues: a) the duration of loss of protection against direct attack, b) the precautions and presumptions in situations of doubt, c) the rules and principles governing the use of force against legitimate military targets, and d) the consequences of regaining protection against direct attack.

a) As regards the temporal scope of loss of protection, civilians lose protection against direct attack for the duration of each specific act amounting to direct participation in hostilities, whereas members of organized armed groups belonging to a non-state party to an armed conflict cease to be civilians (see (i) above) and lose protection against direct attack for as long as they assume their continuous combat function.

b) In practice, civilian direct participation in hostilities is likely to entail significant confusion and uncertainty in the implementation of the principle of distinction.
In order to avoid the erroneous or arbitrary targeting of civilians entitled to protection against direct attack, it is therefore of particular importance that all feasible precautions be taken in determining whether a person is a civilian and, if so, whether he or she is directly participating in hostilities. In case of doubt, the person in question must be presumed to be protected against direct attack.

c) Loss of protection against direct attack, whether due to direct participation in hostilities (civilians) or continuous combat function (members of organized armed groups), does not mean that no further legal restrictions apply. It is a fundamental principle of customary and treaty IHL that "[t]he right of belligerents to adopt means of injuring the enemy is not unlimited". Even direct attacks against legitimate military targets are subject to legal constraints, whether based on specific provisions of IHL, on the principles underlying IHL as a whole, or on other applicable branches of international law. Thus, in addition to the restraints imposed by IHL on specific means and methods of warfare, and without prejudice to further restrictions that may arise under other applicable branches of international law, the kind and degree of force which is permissible against persons not entitled to protection against direct attack must not exceed what is actually necessary to accomplish a legitimate military purpose in the prevailing circumstances.

d) Finally, as has already been mentioned above, IHL neither prohibits nor privileges civilian direct participation in hostilities. When civilians cease to directly participate in hostilities, or when members of organized armed groups belonging to a non-state party to an armed conflict cease to assume their continuous combat function, they regain full civilian protection against direct attack, but are not exempted from prosecution for violations of domestic and international law that might have been committed.


It should be noted that certain aspects of the Interpretive Guidance have generated legal debates in government, academic and NGO circles since its publication. One issue that has proved controversial, for example, is the concept of continuous combat function as described above. While some consider that it has been drawn too narrowly, others believe, to the contrary, that it has been conceived too widely.

A similar range of opinions has been expressed with regard to the ICRC's view that civilians directly participating in hostilities on an unorganized or sporadic basis may be subject to attack only for the duration of each specific act of direct participation. While some believe that this approach is unacceptable because it recognizes the so-called "revolving door" of protection for persons who sporadically take part in hostilities, others believe that it should be applied to any civilian taking a direct part in hostilities, i.e. even those who do so on an organized basis.

The position enunciated in recommendation IX of the Interpretive Guidance, according to which "the kind and degree of force which is permissible against persons not entitled to protection against direct attack must not exceed what is actually necessary to accomplish a legitimate military purpose in the prevailing circumstances", has also been the subject of diverse opinions. The main criticism is that the introduction of an element of necessity into the targeting process against persons directly participating in hostilities is not supported by existing law. It is argued that IHL allows attacks against persons directly participating in hostilities regardless of whether, in the particular circumstances, means other than the use of lethal force would suffice to achieve a desired operational outcome.
The ICRC deliberated on each of these critiques, as well as others that have been expressed, in the process of preparing the final text of the Guidance which, in its view, presents a "package" of carefully balanced legal and operational considerations. The organization is closely following the reception of the Interpretive Guidance and the different positions expressed in relation to some of the recommendations made, and is ready to engage in further exchanges aimed at both clarifying particular aspects of the Guidance and explaining their interlinking nature.

4) The Arms Trade Treaty

Every year, because, inter alia, of the inadequately regulated availability and misuse of conventional weapons, hundreds of thousands of civilians are displaced, injured, or killed. In many parts of the world, weapons are so easy to obtain and armed violence is so prevalent that even after an armed conflict, civilians face many of the same threats as when it was ongoing.

States party to the Geneva Conventions first expressed concern at the rapid expansion of the arms trade and the unregulated proliferation of weapons during the 26th International Conference of the Red Cross and Red Crescent in 1995. The Conference mandated the ICRC to conduct a study on the implications of arms availability for IHL and the situation of civilians in armed conflicts. The study concluded that the widespread availability of arms facilitates violations of IHL and has harmful consequences both for civilians and for humanitarian assistance operations during and after armed conflicts. As long as weapons are too easily available, serious IHL violations will be made more likely and the provision of humanitarian assistance endangered.
Since the conclusion of its study in 1999, the ICRC has called for stricter regulation of international transfers of weapons and ammunition, and for evaluation of recipients' likely respect for IHL, as a means to reduce the suffering caused by the poorly regulated availability of weapons.

a) Elaborating the Arms Trade Treaty


Since 2006, the UN General Assembly has repeatedly recognized that the absence of common international standards for the transfer of conventional arms contributes to armed conflict, the displacement of people, crime and terrorism, which, in turn, undermine peace, reconciliation, safety, security, stability and sustainable social and economic development. In January 2010 the General Assembly decided to convene, in 2012, the UN Conference on the Arms Trade Treaty (ATT) to elaborate a legally binding instrument "containing the highest possible common international standards for the transfer of conventional arms".

For the ATT to be truly effective, its scope and transfer criteria will need to be consistent with the object and purpose of the ATT, which is to prevent the problems resulting from the unregulated trade in conventional weapons. As explained by the Chairman of the ATT process in his conclusion of a March 2011 Preparatory Meeting, one of the "Goals and Objectives" of the ATT is to: "Contribute to international and regional peace, security and stability by preventing international transfers of conventional arms that contribute to or facilitate: human suffering, serious violations of international human rights law and international humanitarian law, violations of United Nations Security Council sanctions and arms embargoes and other international obligations, armed conflict, the displacement of people, organized crime, terrorist acts and thereby undermining peace, reconciliation, safety, security, stability and sustainable social and economic development (...)".

To date, states' positions range from favouring a comprehensive treaty scope that regulates the trade of all conventional weapons and their ammunition, to supporting a scope that is limited to the seven categories of weapons under the UN Register of Conventional Arms.
Other states favour a scope that lies somewhere in between the two approaches: the seven UN Register categories plus small arms and light weapons (SALW); the seven UN Register categories plus SALW and ammunition; or a comprehensive range of conventional weapons, but not their ammunition. The ATT Chairman's July 14, 2011 draft text lists a broad range of categories of weapons, ammunition, components, technology and equipment. On the transactions to be covered in the ATT, the draft text covers import, export, transfer, brokering, manufacture under foreign license, and technology transfers of the items that will be covered.

States have also had discussions on arms transfer criteria, which are the standards that states should apply when determining whether to authorize a transfer of arms. The most commonly proposed criteria for an ATT relate to existing express international obligations prohibiting transfers, such as UN Security Council arms embargoes, and to likely post-transfer uses that states wish to prevent. This latter category of criteria would aim to ensure that the transferred weapons are not used to commit or facilitate violations of international law.

b) An IHL Criterion for Arms Transfers

The ICRC supports the elaboration of a comprehensive, legally binding ATT that establishes common international standards for the responsible transfer of all conventional weapons and their ammunition. The negotiation and eventual implementation of the ATT will create an historic opportunity to reduce the human cost of the widespread and poorly regulated availability of conventional arms. Under the Geneva Conventions and customary law, states have an obligation to ensure respect for IHL. This entails a responsibility to make every effort to ensure that the arms and ammunition they transfer do not end up in the hands of persons who are likely to use them in violation of IHL.
Both the Agenda for Humanitarian Action, adopted in resolution 1 at the 28th International Conference of the Red Cross and Red Crescent in 2003, and resolution 3, adopted at the 30th International Conference in 2007, stressed that, in light of the obligation of states to respect and ensure respect for IHL, strict control of the availability of arms and ammunition is required so that they do not end up in the hands of those who may be expected to use them in violation of IHL.

In the ICRC's view, the ATT should reflect states' obligation to ensure respect for IHL by requiring that they:

a) assess the likelihood that serious violations of IHL will be committed with the weapons being transferred, and

b) not authorize transfers when there is a clear risk that the arms will be used to commit serious violations of IHL.

If the future ATT were to permit measures short of denial where there is a clear risk that serious violations of IHL will be committed with the weapons being transferred, it is believed that its humanitarian goal would be seriously undermined.

Differences in states' respective IHL obligations are unlikely to cause a particular problem in the choice of IHL obligations to be assessed before deciding to transfer weapons. States transferring weapons would need to evaluate the risk of "serious" violations. These are the violations that states already have an obligation to investigate when committed by their nationals or on their territory, or over which there is universal jurisdiction under the grave breaches provisions of the 1949 Geneva Conventions (articles 50, 51, 130 and 147 of Conventions I, II, III and IV respectively) and of Additional Protocol I of 1977 (articles 11 and 85). According to customary law, serious violations of IHL constitute war crimes, which, in turn, have been listed under Article 8 of the Rome Statute of the International Criminal Court. While not all states are party to the Rome Statute, the list of war crimes under Article 8 serves as a useful reference for acts that states have generally considered serious violations of customary international law.
Specific guidelines on making systematic and objective risk assessments can be helpful tools in applying IHL criteria. In 2007 the ICRC published a Practical Guide to applying IHL criteria in arms transfer decisions. The Guide sets out a range of indicators that can be used for making risk assessments, suggests sources of pertinent information, and provides a list of grave breaches and war crimes.

c) Scope of Weapons and Activities

If one of the objectives of the ATT is to prevent international transfers of conventional arms that contribute to or facilitate human suffering, then it is difficult to imagine a conventional weapon or type of transfer that would not require regulation. Thus, in the ICRC's view, all conventional weapons and ammunition should be included in the scope of the treaty.

It is also important that the treaty cover transfers of ammunition if it is to meet its humanitarian goal effectively. Without ammunition, no use can be made of existing stocks of conventional arms, and supplies of ammunition need to be continuously renewed. According to the UN Secretary-General's April 2011 report on Small Arms, "expert panels monitoring Security Council arms embargoes have suggested that the popularity of certain types of weapons among armed groups corresponds to the availability of their ammunition (...). Conversely, reports have shown that, in some cases, lack of ammunition has prompted combatants to seek to resolve their disputes peacefully. Preventing resupply in situations of high risk to civilian populations should be a priority." In addition, research has shown that a very large majority of the countries that currently regulate arms transfers also regulate the transfer of ammunition, demonstrating that regulation of the transfer of ammunition is both practicable and desirable.

The ATT should also cover all types of transfer, as understood in existing international instruments.
Activities such as transit, trans-shipment, loans, leases, as well as brokering and closely related activities, should fall within the scope of the Arms Trade Treaty to ensure that it is truly comprehensive and effective.

In sum, one of the most important objectives of an international Arms Trade Treaty must be to reduce the human cost of the availability of weapons by setting clear norms for the responsible transfer of conventional arms and their ammunition. States, National Red Cross and Red Crescent Societies and civil society all have a role to play in advance of negotiations on an Arms Trade Treaty in 2012 by promoting public awareness of the human cost of poorly regulated arms transfers and by encouraging states to adopt a strong and comprehensive treaty. An effective ATT and its implementation could make a momentous contribution to reducing preventable human suffering during and after armed conflicts.

VI. THE CONFLATION OF IHL AND THE LEGAL FRAMEWORK GOVERNING TERRORISM

While armed conflict and acts of terrorism are different forms of violence governed by different bodies of law, they have come to be perceived as almost synonymous due to constant conflation in the public domain. The ICRC's views on the legal classification of what has been called the "war against terrorism" and on the legal status of persons detained were dealt with in previous reports on IHL and the Challenges of Contemporary Armed Conflicts prepared for the 2003 and 2007 International Conferences. This section aims to provide a brief outline of the legal, policy and practical reasons for which it is believed that it is not helpful to conflate armed conflict and terrorism, or the respective legal regimes governing these forms of violence.

1) Legal and Policy Effects

There are several important distinctions between the legal frameworks governing armed conflict and terrorism, based primarily on the different reality that each seeks to govern. The main divergence is that, in legal terms, armed conflict is a situation in which certain acts of violence are allowed (lawful) and others prohibited (unlawful), while any act of violence designated as terrorist is always unlawful.

As already mentioned, the ultimate aim of armed conflict is to prevail over the enemy's armed forces. For this reason, the parties are permitted, or at least are not prohibited from, attacking each other's military objectives. Violence directed at those targets is not prohibited as a matter of IHL, regardless of whether it is inflicted by a state or a non-state party. Acts of violence against civilians and civilian objects are, by contrast, unlawful because one of the main purposes of IHL is to spare civilians, as well as civilian objects, from the effects of hostilities. IHL thus regulates both lawful and unlawful acts of violence and is the only body of international law that takes such a two-pronged approach.
There is no similar dichotomy in the international norms governing acts of terrorism. The defining feature of any act legally classified as "terrorist" under either international or domestic law is that it is always penalized as criminal: no act of violence legally designated "terrorist" is, or can be, exempt from prosecution. The current code of terrorist offences comprises 13 so-called 'sectoral' treaties adopted at the international level that define specific acts of terrorism. There is also a draft Comprehensive Convention on International Terrorism that has been the subject of negotiations at the UN for over a decade.27 As has been calculated, the treaties currently in force define nearly fifty offences, including some ten crimes against civil aviation, some sixteen crimes against shipping or continental platforms, a dozen crimes against the person, seven crimes involving the use, possession or threatened use of "bombs" or nuclear materials, and two crimes concerning the financing of terrorism.

The legal regimes governing armed conflict and terrorism also differ in that only IHL is based on the notion of equality of rights and obligations of the parties to an armed conflict (by way of reminder, equality of rights and obligations under IHL does not mean that such equality exists between the parties to a NIAC under domestic law). Thus, any party to an armed conflict is equally prohibited from directly attacking enemy civilians, but is not prohibited from attacking the adversary's military objectives. The same principle obviously does not apply to acts of terrorism.

A crucial reason for not legally conflating armed conflict and acts of terrorism is that the legal framework governing armed conflict already prohibits the great majority of acts that would be designated as 'terrorist' if they were committed in peacetime. IHL both: i) prohibits, as war crimes, specific acts of terrorism perpetrated in armed conflict, and ii) prohibits, as war crimes, a range of other acts that would commonly be deemed 'terrorist' if committed outside armed conflict.

i) "Terrorism" is specifically prohibited in article 33 of the Fourth Geneva Convention, as well as in article 4(2)(d) of Additional Protocol II. In the first case, the prohibition aims to protect civilians who find themselves in the power of an adversary in an IAC. In the second case the prohibition relates to persons not or no longer participating directly in hostilities who similarly find themselves in the power of an adversary in a NIAC. The placement and scope of both provisions make it clear that the aim is to ensure that a party to an armed conflict is barred from terrorizing civilians under its control, particularly by means of inflicting collective punishments. In addition, articles 51(2) of Additional Protocol I and 13(2) of Additional Protocol II specifically prohibit acts of terrorism in the conduct of hostilities, providing that '[a]cts or threats of violence the primary purpose of which is to spread terror among the civilian population are prohibited'. The ICTY determined in the 2006 Galić judgment that this prohibition is binding not only as treaty law, but is of a customary law nature as well.

ii) Perhaps more important than the fact that IHL specifically prohibits certain acts of terrorism is that most of its "regular" rules on the conduct of hostilities prohibit acts that would be deemed 'terrorist' when committed outside armed conflict. As already mentioned, the principle of distinction informs the totality of the other rules on the conduct of hostilities under IHL. For the purpose of demonstrating why the legal regimes of armed conflict and terrorism need not be blurred it must be recalled that, based on the principle of distinction, IHL in both IAC and NIAC absolutely prohibits direct and deliberate attacks against civilians. This prohibition - of which the prohibition of terrorization discussed above is a specific expression - is also a norm of customary IHL and its violation constitutes a war crime. In addition to direct and deliberate attacks, IHL proscribes indiscriminate and disproportionate attacks, the definitions of which have already been discussed in other sections of this report.

27 The relationship between the definition of acts deemed terrorist under the draft Convention and acts committed in armed conflict is one of the points of disagreement holding up the conclusion of negotiations. The ICRC believes that it is important that the relevant exclusion clause does not undermine IHL.


Like civilians, civilian objects (defined under IHL as "all objects which are not military objectives") cannot be the target of direct and deliberate attacks. In case of doubt as to whether an object normally dedicated to civilian purposes - such as a house or school - is being used to make an effective contribution to military action - and has thus become a military objective - it must be presumed not to be so.

While, as mentioned above, one prong of IHL governs (prohibits) acts of violence against civilians and civilian objects in armed conflict, the other prong allows, or at least does not prohibit, attacks against combatants or military objectives. These acts constitute the very essence of armed conflict and, as such, should not be legally defined as "terrorist" under a different body of international law. To do so would imply that they are prohibited acts which must be subject to criminalization under that other international legal framework. This would stand at odds with the dichotomous regulation of acts of violence which is at the core of IHL.

It is important to note that the rules on the conduct of hostilities prohibiting attacks against civilians or civilian objects outlined above apply in NIAC as well. There is, however, a crucial legal difference between international and non-international armed conflicts. Under IHL, there is no "combatant" or "POW" status in NIAC. States' domestic law prohibits and penalizes violence perpetrated by private persons or groups, including all acts of violence that would be committed in the course of an armed conflict. A non-state party thus has no right under domestic law to take up arms and engage in hostilities against the armed forces of a government adversary (the essence of combatant status), nor can it expect to be granted immunity from prosecution for attacks against military targets (the essence of combatant privilege).
In other words, all acts of violence perpetrated in a NIAC by an organized non-state armed group are regularly prohibited and usually severely penalized under domestic law, regardless of their lawfulness under IHL. The interplay of IHL and domestic law in a NIAC thus leads to a situation in which members of non-state armed groups are likely to face stiff sentences under domestic law even for acts of violence that are not prohibited by IHL (for example, attacks against military objectives). This inherent contradiction between the two legal frameworks is part of the reason why non-state armed groups often disregard IHL norms, including those prohibiting attacks against civilians and civilian objects. They have no explicit legal incentive to abide by IHL norms, as they can be equally punished upon capture by the government whether they fought according to the laws and customs of war - and respected civilians and civilian objects - or violated the rules.

The drafters of IHL treaties were well aware of the problem and introduced certain provisions in Additional Protocol II aimed at remedying the imbalance between the belligerents in a NIAC that arises as a result of domestic law. Article 6(5) of the Protocol provides: "At the end of hostilities, the authorities in power shall endeavour to grant the broadest possible amnesty to persons who have participated in the armed conflict, or those deprived of their liberty for reasons related to the armed conflict, whether they are interned or detained". This is also a rule of customary law applicable in NIAC, based on the practice of a number of states that granted amnesties after NIACs either by special agreements, legislation, or other measures. The UN Security Council, the General Assembly, and other UN and regional bodies have likewise encouraged or welcomed amnesties granted by states at the end of armed conflicts.
By way of reminder, the amnesties referred to do not relate to war crimes (or other crimes under international law, such as genocide or crimes against humanity) that might have been committed in NIAC, as that would be contrary to the obligation of states to investigate and prosecute such acts.

The interface between international and domestic law thus results in a lopsided legal situation unfavourable to non-state armed group compliance with IHL. It is submitted that adding an additional layer of incrimination, that is, designating as "terrorist" acts committed in armed conflict that are not prohibited under IHL, reduces the likelihood of obtaining respect for its rules even further. As explained above, attacks against military objectives carried out by non-state actors are prohibited by domestic law. Granting amnesties, or otherwise acknowledging the behaviour of groups that attempted to fight according to the laws of war, becomes legally (and politically) very difficult once such acts are designated as "terrorist". As regards attacks against civilians and civilian objects, they are already prohibited under both IHL (as war crimes) and domestic law. It is thus not clear what legal advantage is to be gained from also charging them as "terrorist", given the sufficient proscriptions provided for under the existing two legal frameworks. If such labelling is the result of policy or political decisions aimed at disqualifying non-state adversaries by branding them "terrorists", this may prove to be an obstacle to the eventual peace negotiations or national reconciliation that are necessary in order to end an armed conflict and ensure peace.

In sum, it is believed that the term "terrorist act" should be used, in the context of armed conflict, only in relation to the few acts specifically designated as such under the treaties of IHL. It should not be used to describe acts that are lawful or not prohibited by IHL. While there is clearly an overlap in terms of the prohibition of attacks against civilians and civilian objects under both IHL and domestic law, it is believed that, overall, there are more disadvantages than advantages to additionally designating such acts as "terrorist" when committed in situations of armed conflict (whether under the relevant international legal framework or under domestic law).
Thus, with the exception of the few specific acts of terrorism that may take place in armed conflict, it is submitted that the term "act of terrorism" should be reserved for acts of violence committed outside of armed conflict.

2) Practical Effects

The designation of a non-state armed group party to a NIAC as "terrorist" means that it is likely to be included in lists of proscribed terrorist organizations maintained by the UN, regional organizations and states. This may, in practice, have a chilling effect on the activities of humanitarian and other organizations carrying out assistance, protection, and other activities in war zones. It potentially criminalizes a range of humanitarian actors and their personnel, and may create obstacles to the funding of humanitarian work. The legal avenues by which these effects may be produced are laws and policies adopted at both the international and domestic level aimed at suppressing the financing of terrorism.

UN Security Council resolution 1373 of 2001 is illustrative of the risks to humanitarian action posed by the unqualified criminalization of all forms of "support" or "services" to terrorists. The resolution requires states inter alia to: Prohibit their nationals or any persons and entities within their territories from making any funds, financial assets or economic resources or financial or other related services available, directly or indirectly, for the benefit of persons [involved in] terrorist acts or of entities controlled by such persons [...and also to...] refrain from providing any form of support, active or passive, to entities or persons involved in terrorist acts [...]

In implementing international requirements at the domestic level, some governments have made it a criminal offence to provide "support", "services" and/or "assistance" to entities or persons involved in terrorist acts, and to "intentionally associate with" such entities or persons.
The exact content and scope of the offences vary from one state to another. While some states circumscribe the crimes narrowly to exclude humanitarian action, others do not. In general, the relevant provisions tend to be broadly worded and can, as a result, be
interpreted to include within their ambit any humanitarian activity involving contact with "individuals or entities associated with terrorism". The prohibition in criminal legislation of unqualified acts of "material support", "services" and "assistance to" or "association with" terrorist organizations could thus in practice result in the criminalization of the core activities of humanitarian organizations and their personnel aimed at meeting the needs of victims of armed conflicts and of situations of violence below that threshold. These could include: visits and material assistance to detainees suspected of or condemned for being members of a terrorist organization; facilitation of family visits to such detainees; first aid training; war surgery seminars; IHL dissemination to members of armed opposition groups included in terrorist lists; assistance to provide for the basic needs of the civilian population in areas controlled by armed groups associated with terrorism; and large-scale assistance activities to internally displaced persons, where individuals associated with terrorism may be among the beneficiaries.

In addition, criminalization based on broad definitions of "support" or "services" to terrorism may have the effect of governments including "anti-terrorist" funding conditions or restrictions in donor agreements. The relevant funding clauses may impede the provision of humanitarian services such as those mentioned above and would thus be de facto contrary to the mandates and/or missions of humanitarian organizations. The potential criminalization of humanitarian action is of concern to the ICRC for the reasons mentioned above, but also for others particular to the organization's mandate and mission.
At a basic level, the potential criminalization of humanitarian engagement with organized armed groups designated as "terrorist organizations" may be said to reflect a non-acceptance of the notion of neutral and independent humanitarian action, an approach which the ICRC strives to promote in its operational work in the field.

In legal terms, potential criminalization may be said to be incompatible with the letter and spirit of IHL, which in Common Article 3 specifically allows the ICRC to offer its services to the parties to a NIAC. As has already been explained, that includes the non-state party to such a conflict. The ICRC is permitted, and must in practice be free, to offer its services for the benefit of civilians and other persons affected by an armed conflict who find themselves in the power of, or in the area of control of, the non-state party. Broad language, or broad interpretation of language, in criminal legislation prohibiting "services" or "support" to terrorism could prove to be a serious impediment to the ICRC's fulfilment of its IHL mandate in contexts in which armed groups party to a NIAC are designated "terrorist organizations". The fulfilment of the ICRC's mandate under the Statutes of the International Red Cross and Red Crescent Movement, which provide that it may also offer its humanitarian services in situations of violence other than armed conflicts, may likewise be effectively hampered in contexts in which such services would involve contacts with persons or entities associated with "terrorism".

Potential criminalization of humanitarian action may also be said to preclude respect for the Fundamental Principles of the International Red Cross and Red Crescent Movement, which bind the ICRC and other components of the Movement. The principle of neutrality means that the Movement "may not take sides in hostilities or engage at any time in controversies of a political, racial, religious or ideological nature".
The ICRC or the Movement could not abide, or be seen to abide, by this principle if they were directed, as a result of anti-terrorist legislation or other measures, to carry out their activities for the benefit only of persons on one side of the divide in an armed conflict or other situation of violence. ICRC visits to places of detention worldwide, required or allowed for in the universally ratified Geneva Conventions, illustrate an inherent tension between the
prohibition of "services" or "support" language in anti-terrorism legislation and the implementation of the principle of neutrality in the field. The ICRC endeavours to visit all persons detained in relation to an armed conflict, regardless of the side to which they belong, in order to ensure that they are humanely treated and that their other rights are respected. This role, which is widely supported by states, is at the crux of the organization's work in detention, and yet could be called into question due to the lack of exemptions for humanitarian activities in anti-terrorism measures.

Pursuant to the principle of impartiality, the ICRC and other components of the Movement may not discriminate based on "nationality, race, religious beliefs, class or political opinions" and are bound to "relieve the suffering of individuals being guided solely by their needs, and to give priority to the most urgent cases of distress". The ability of the ICRC and of National Red Cross and Red Crescent Societies to, for example, provide medical assistance to victims of armed conflict and other situations of violence in keeping with the principle of impartiality could be rendered difficult by the broad language of anti-terrorism legislation. A strict reading could imply that medical services to persons rendered hors de combat by wounds or sickness, as well as to other persons under the control of a non-state party designated as "terrorist", could be prohibited as support or services to "terrorism". This is a result that would call into question the very idea behind the creation of the ICRC - and subsequently of the National Red Cross and Red Crescent Societies - over 150 years ago.

In sum, there appears to be a need for greater awareness by states of the necessity to harmonize their policies and legal obligations in the humanitarian and anti-terrorism realms in order to properly achieve the desired aims in both. It is submitted that, to this end:

- Measures adopted by governments, whether internationally or nationally, aimed at criminally repressing acts of terrorism should be crafted so as not to impede humanitarian action. In particular, legislation creating criminal offences of "material support", "services" and "assistance" to or "association" with persons or entities involved in terrorism should exclude from the ambit of such offences activities that are exclusively humanitarian and impartial in character and are conducted without adverse distinction.

- In respect of the ICRC in particular, it should be recognized that humanitarian engagement of non-state armed groups is a task foreseen and expected of the ICRC under Common Article 3 to the Geneva Conventions, which allows the ICRC to offer its services to the parties to NIACs. Criminalization of humanitarian action would thus run counter to the letter and spirit of the Geneva Conventions; that is, broad language prohibiting "services" or "support" to terrorism could make it impossible for the ICRC to fulfil its conventional (and statutory) mandate in contexts where the armed groups party to a NIAC are designated "terrorist organizations".

[6] TECHNOLOGY AS DIALECTIC: UNDERSTANDING GAME CHANGING TECHNOLOGY

Noetic Corporation

Introduction

Technology has been a key enabler of military innovation since the beginning of armed conflict. Every now and then a new technology is applied to a critical challenge or opportunity in a way that radically changes the dynamics of power between adversaries; that is, it is a game changing technology. At the same time, however, many technologies are conceived that fall by the wayside, have a negative impact, simply maintain the status quo or improve on it by only a small margin. Why is this so, and how should the Department of Defense (DoD) consider game changing technology when trying to develop and maintain strategic and tactical advantage over current and future adversaries in a period of fiscal constraint? The NeXTech project, sponsored by the Office of the Secretary of Defense for Acquisition, Technology and Logistics, sought to answer this question. This paper serves as a companion to other papers and articles, published independently, that address specific technological or philosophical aspects of NeXTech. It draws solely on the interviews, conversations and war game findings associated with NeXTech. The views expressed herein represent the consensus of interviews, wargames and discussion groups and do not represent the position of the Department of Defense or any other organization.

Technology

Much time could be spent debating the semantic differences between innovative, disruptive, revolutionary and game changing technologies. Regardless of the label, and using "game changing" as a term of convenience, the underlying phenomenon requiring examination is as follows: a technology or collection of technologies applied to a problem space in a manner that radically alters the extant symmetry of power between adversaries. The use of this technology impacts all parties' policies, doctrines, organizations and cultures.

In understanding this phenomenon, it is important not to get too tightly bound to formal definitions or dogma. Truly game changing technology often creates new definitions and transgresses prior boundaries. It also opens up new ways of doing things and avenues of innovation. Unnecessarily rigid thinking often leads to missing or misunderstanding opportunities and threats. Also, not all technology can or should be game changing: in most instances, important investments are required to maintain the status quo or undertake incremental improvement. Furthermore, game changing technology also affects the organization using it, sometimes causing significant and unwanted challenges.

Finally, and perhaps most importantly, in understanding the development of game changing technology, it must be recognized that many of the decisive elements that make a technology game changing or not have little to do with the technical aspects of the technology itself. These other elements exist around the technology, influencing and being influenced by it. Understanding these aspects became so important to the NeXTech project that a framework was developed to explicitly identify the constituent elements of game changing technology and understand the complexity of the dialectic that begets them. That framework serves as the basis for the following sections.


Technology as Dialectic

Failing to recognize or understand the elements that make a technology game changing leads to multiple situations that are unacceptable for the DoD: an adversary may develop a technology affording them significant advantage without our understanding it; technology with game changing potential can fall by the wayside; large investments can be made in technology that was never going to have major impact; or the unanticipated consequences of a game changing technology can create significant surprise. Exposing and parsing the constituent components of this dynamic affords a better understanding of what is at play with a given technology, allowing more explicit assumptions, improved forecasts and better informed decisions. Considering the development and use of technology through the lens of dialectic (i.e. a method for resolving disagreement between people, or aspects of technology development, holding different points of view) affords us additional analytic tools for understanding the development and use of game changing technology. Fichtean Dialectics offers the most useful metaphor for this analysis and is based on four concepts:

- Everything is transient and finite, existing in the medium of time
- Everything is composed of contradictions (opposing forces)
- Gradual changes lead to crises, turning points when one force overcomes its opponent force
- Change is helical (spiral), not circular or linear

The first step in applying this philosophical approach to game changing technology is to identify the various aspects that are in dialogue. During the NeXTech project four primary aspects were identified:

- Core Elements - the most visible and easily recognizable elements of a capability: the technology itself, the purpose of the technology and the concept for its employment.
- Perspectives - technology is used differently by, and has different impacts on, friendly forces, enemy forces and broader society.
- Context - the myriad interconnected elements, including values, the nature of asymmetry, scope, organization and a variety of others, that inform the development and use of technology.
- Time - the length of time that a technology will take to be developed and reach maturity, and its lifespan before being countered or becoming the new status quo.

The dialectic of game changing technology occurs through the convergence of these aspects in various explicit and ineffable ways. The continually changing interplay of all these aspects further contributes to the complexity of understanding game changing technology. However, our ability to understand, extrapolate from and hypothesize about these interactions forms the basis for understanding the future of potentially game changing technologies.

WWW.NOETICGROUP.COM



[Figure: framework diagram of the aspects in dialogue, with panels including elements and perspectives]

The results of this ongoing argument change over time and can be seen on a continuum from radically advantageous, to maintaining the status quo, to radically disadvantageous. When viewed on this continuum, game changing technologies should be seen as the outliers: black swans that are difficult to predict or act on. However, they are outliers that still behave in the same manner as any other technology development; they just occur faster, have unexpected consequences or have greater impact than anticipated. Given that game changing technologies are black swans, it is important to approach any analysis with humility. Even if the various aspects related to the development of a technology can be accurately identified, the complexity of understanding the various interactions at any given point, let alone over time, defies detailed prediction. At the same time, significant improvements can be made over current analytic approaches to allow for better informed decisions by the DoD and others.


Core Elements

TECHNOLOGY

Even though the technology element is only one among many in the dynamic of game changing technology, it is far from simple. Aside from the complexity of the technology itself, most game changing technologies are, in fact, the combination of multiple technologies, or are dependent on the capability of other enablers. Understanding the series of interdependencies required to enable a particular technology in a system of systems is a key element of understanding the viability and availability of future technologies. In a national security context, it is also important to consider technologies outside of weapons systems: that is, both effects technology (e.g. laser energy systems or precision guided munitions) and enabling technology (e.g. Global Positioning Systems or information technology). Unmanned Aerial Vehicles (UAVs), for example, require sophisticated software, significant computer processing power and other hardware assets in addition to the technology on a regular aircraft. These systems are in turn reliant on Global Positioning Systems and communication networks to pilot themselves and to send and receive data.

PURPOSE

It is logical to recognize that a technology must serve a purpose, have a problem to solve or a game to change, in order to have significant impact. However, defining and gaining agreement on the actual purpose of a technology, and the value associated with that purpose, remains challenging, with various groups in any organization interpreting problems differently for reasons of philosophy, perspective or self-interest. This is especially true for national security organizations, which must guard against a range of contingencies of varying significance and likelihood, with millions of lives and trillions of dollars at stake. Additionally, an existing technology can be used for a new or different purpose with game changing results. The F-22 Raptor and F-35 Lightning II may well be game changing in a large scale conventional conflict against a near-peer adversary, which is their intended purpose. However, they are not game changing at all in a counterinsurgency, which was not their purpose. The issue, therefore, is not whether the technology itself has merit but whether the purpose is sufficiently likely to justify such heavy investment.

CONCEPT OF EMPLOYMENT

Having great technology and a well understood purpose does not mean that a viable solution is necessarily evident. A concept of employment that guides how the technology is used to solve the problem is a necessary element. The concept does not necessarily need to be formally documented, and in many cases it is not formally recognized or understood; however, it will always exist for a technology to be used in a game changing manner. In many instances game changing concepts of employment are not developed by the creators of the game changing technology; other groups take the technology and figure out its true potential. More than that, technology does not need to be new to become game changing. The Predator Unmanned Aerial Vehicle (UAV) was initially intended as an Intelligence, Surveillance and Reconnaissance (ISR) platform but has come to revolutionize direct action in counterterrorism operations.


The most frequently cited example of these elements being successfully brought together is Blitzkrieg. The initial purpose of Blitzkrieg was to overcome the problem posed to German forces by the Maginot Line. This was achieved by effectively combining improved, though not wholly new, technology (primarily tanks, aircraft and radios) into a combined arms maneuver concept.

Perspective

FRIENDLY FORCES

Implicit assumptions abound in analysis of game changing technology, including the fundamental, underlying perspectives of those developing and employing the technology. The United States military is the most potent and technologically advanced fighting force in history. This dominance has not just changed the nature of warfare; the science, engineering and industrial base that supports it has sparked technological innovations that have benefited broader society both at home and around the world. This history fundamentally informs the way in which the DoD thinks about and employs technology. Technology is employed with greater alacrity, faith and reliance than in many other countries. This can lead to technology being developed with an assumption of its unequivocal success and corresponding total dominance of an adversary, which can lead to miscalculation or unanticipated results. This technological advantage can also lead to choices not to employ advanced technology in certain contingencies, for fear of showing a capability and losing the element of surprise in future. Less-than-lethal, directed energy weapons, for example, are intended to provide tactical escalation-of-force options, avoiding the use of lethal force in an effort to minimize unnecessary casualties. That is, it is assumed that an adversary will, in many instances, be overwhelmed by unpleasant sensations and back down. However, if an adversary were to use these same weapons on U.S. forces, it would likely have a highly escalatory effect.

ENEMY FORCES

The enemies of the United States have a fundamentally different perspective on the use of technology. In part, this is due to different fiscal circumstances and different cultural, moral and legal views, as discussed later. Additionally, the reality of U.S. dominance in almost all areas of modern warfare and, in most instances, significant technological advantage forces a different perspective in the search for asymmetric advantage. These differences are also diverse, depending on the circumstances of each group. An adversary's use of technology will be inextricably linked to their strategic calculus. A small insurgent force fighting for its life will be far more motivated to innovate than a large, bureaucratic fighting force. It is therefore important to explicitly include these factors in any analysis of both parties' use of technology. At a NeXTech wargame a foreign exchange officer gave the following solution to a particular scenario: "Place all technology investment into electronic attack then train our forces rigorously in fighting off the grid. Force the Americans to fight in the dark, having trained ourselves to fight at night." Many Americans in the same group disregarded this solution as ineffective, claiming that blue force networks would be too well protected. The importance, however, lies not in the merits of the solution but in understanding the potential for, and significance of, radically different approaches by other forces.


SOCIETY

The majority of technological innovation exists outside of the military domain. The late 20th and early 21st centuries have been a particularly innovative period for personal and social technology, with multiple game changers conceived, delivered and now used as part of the status quo of everyday life. In fact, unlike the early part of the 20th century, when defense organizations led the uptake of new technologies, general society now leads the adoption of many new technologies. Technology designed with civilian users in mind and used predominantly for civilian purposes increasingly holds military significance. With technology that is game changing at a whole-of-society level, the military must learn to adapt and be responsive to change it has not instigated and cannot hope to control. Social media has been particularly difficult for government and military organizations to manage: oppressive regimes trying to quell uprisings in the Middle East and concerned western military services trying to avoid the inadvertent transmission of sensitive information by their own servicemen have both struggled to manage the changes caused by the use of this technology.

Context

VALUES

Values can have a greater influence on technology than almost any other element. "Values" refers to the various legal, ethical, moral, religious and cultural considerations that are naturally raised when something new comes into existence that might change the way we live our lives. Possessing a technical capability does not mean that using it is the right thing to do. Our values deeply affect the way in which technology is conceived, designed, built and used or not used. However, the exponential rate of technological change is both forcing changes to extant norms and outpacing our ability to conceive and agree on new norms. Additionally, there are various communication challenges within communities and between the various communities that are required to reach consensus. Even framing the terms of debate can be extremely challenging. A natural desire to maintain our humanity and values by ensuring responsibility and accountability often gets garbled in a context where basic definitions are rarely agreed upon. Furthermore, values are contingent: something that would ordinarily be unacceptable might become highly desired when dealing with a catastrophic threat. The various perspectives, pressures and compromises associated with this element often lead to apparently illogical, but incredibly human, decisions being taken related to game changing technology. This in turn can lead to inconsistent calculi, unintended consequences and technology that has been shaped more by exogenous factors than by the dictates of technical merit or operational need. Non-lethal weapons are an area of significant debate, driven largely by differing values. From a military perspective, these options sound preferable to killing people; however, it is not always so simple. As a very simple example, it is legally preferable for a Marine sniper to use a conventional sniper rifle with lethal force than to use a laser to blind an enemy combatant. Similarly, the use of millimeter wave technology to disperse violent crowds raises concerns about targeting civilians and unintended consequences.


ASYMMETRY

Game changing technology alters the symmetry of power between adversaries. The party with less power has significantly more to gain in this exchange, especially when using a new capability, regardless of whether the more powerful party possesses that capability or not. In analyzing the use of technology it is important to correctly identify the nature of the asymmetry being exploited. Is the weaker party taking advantage of an unwillingness to accept casualties, a moral abhorrence of the use of a particular technology, an asymmetric cost imposition or an asymmetric change in operational tempo? For the stronger party, the question remains how one retains an asymmetric advantage and a desire to continually innovate when the status quo is already in one's favor. During the NeXTech project, consensus could not be reached on whether Improvised Explosive Devices (IEDs) were game changing or not from the enemy's perspective. Certainly, however, it would not have been game changing for U.S. forces to employ similar technology against the insurgency, given the other asymmetric advantages we possess.

SCOPE

In many instances, debate about whether, or to what extent, a given technology is game changing relates to scope. A highly specific technology may be totally revolutionary within a small sphere, leading experts in that field to consider it game changing. For those in general society, however, it may have no impact. It is therefore important in analyzing game changing technology to be explicit about the scope and impact of changes. Software code generation tools, for example, could completely revolutionize aspects of the software development process, decreasing the need for human programmers and reducing time to deployment while increasing software quality. This may not, however, prove game changing for cyber warfare, which in turn may not prove game changing in terms of future power projection activities, even though cyber technology will likely be of critical significance.

ORGANIZATION

Organizational dynamics profoundly shape, and are impacted by, game changing technology. New technology tends to be applied to extant organizational constructs, often inappropriately. As the nature of the new technology becomes apparent, it is often at odds with the organization around it and will either change that organization or see its potential diminished. Technologies are also often developed in a context of organizational competition and rivalry, which can significantly alter the course of a given solution.

ETCETERA

The list of contextual elements that may impact game changing technology goes on and on: levels of funding, the vagaries of congressional oversight and politics, the education level of a military or society, access to markets, contracting and acquisition law, geography and historical circumstance all play roles in shaping the ways in which technology is developed and the impact it makes. Not all of these elements apply in all cases and some may be unique. When considering a particular technology it is important to explicitly identify the relevant contextual elements, their interrelationships and the impact they will have.


Time

Time impacts game changing technology from several perspectives. It takes time for a technology to become game changing. The various technology components and enablers must mature independently and in unison. The potential of the technology must be matched with an appropriate concept, and this must align, in time, with the problem or opportunity that gives the technology relevance. This helps explain why some technologies have existed for years before passing a tipping point that makes them game changing. Additive manufacturing technology, for example, has existed for over 30 years but has only recently captured the popular imagination as 3D printing.

If a technology becomes game changing, it can only remain so for a finite period of time. At a certain point that technology will simply become the status quo, adaptive enemies will develop counters or a work-around in a different domain, or the rest of the world may simply catch up. Conversely, if the effects of a game changing technology are not felt for a significant period of time, it may not be considered game changing at all.

The element of time also adds complexity. Understanding the dynamics of game changing technology is hard enough at a fixed point in time. When considered over time, the various interdependencies and potentialities expand exponentially. Adding further to this complexity are the ramifications of strategic shocks. Forecasts for the year 2020, for example, were altered substantially by the strategic shock of 9/11: the future looked very different from the vantage point of 2003 compared to 1999. This element is especially important for the DoD, where major platforms can take decades to build and field. All of the major platforms that have faced cancellations or cuts over the past five years were conceived in the 1980s or 1990s.

Convergence

Convergence refers to the dialogue or argument that forms the basis of the dialectic of game changing technology. The exact nature of the convergence and the dialectic is unique and changes over time. Identifying the relevant elements involved and the nature of their convergence forms the basis for understanding the dialectic, developing more accurate forecasts, mitigating key risks or planning remedial action to improve the outcomes of technology investments. This interaction is best seen through example.

AUTONOMOUS SYSTEMS

During NeXTech, the most quickly grasped and frequently applied technology was autonomous systems. The technology was sufficiently flexible that it could be used to solve multiple problems, ranging from ISR to sabotage, route clearance, deterrence, raiding, 'smart' mining and force-on-force combat. In fact, the technology was so flexible that in many instances a concept of employment was not really conceived beyond deploying a swarm of bespoke robots to solve a problem. Autonomous systems are highly appealing from a friendly force perspective for their flexibility, low cost (relative to deploying humans) and low risk of loss of life, providing even further asymmetric advantage for U.S. forces. Their use does, however, create significant values concerns that are not easily resolved. On the one hand, the lower risk of loss of life is highly appealing; on the other, allowing non-human entities to potentially make decisions to kill humans is deeply concerning. Additionally, the use of autonomous systems has caused, and will continue to
cause, major upheaval for existing organizational structures and incentives. In part, this is created through issues of time. The technology is evolving much more rapidly than we can develop agreed institutional policies, strategies and structures for it. All this change has been wrought while the technology remains limited in scope, largely focused on ISR and counter-terrorism roles. At this point, the enemies' perspectives can only be hypothesized, given how new the technology still is. From one perspective, drone strikes in Pakistan have been seen to add disproportionately to enemy grievances and recruiting, as well as providing the opportunity to claim that U.S. forces are scared and not willing to fight them like men. At the same time, Hezbollah has used drone technology. However, their purpose was very different from that of the U.S.; they wanted to be able to invade Israeli airspace for reciprocity's sake and possessed no other means to do so. Most foreign militaries of any sophistication have autonomous systems programs, and non-state actors are also developing capability or receiving it from state sponsors. This implies even more disruptive future scenarios where autonomous systems of varying sophistication are deployed in wildly varying manners, based on differing values for different objectives. As suggested by the values concerns for friendly forces, many in general society are deeply concerned about the use of autonomous systems in warfare. However, autonomous systems are being used extensively outside of the military realm. While still raising many concerns, the uptake of this technology in general society will likely alter perceptions of autonomous systems used in combat. As the technology continues to develop, becoming more flexible and specific, it will likely propagate further in all realms in spite of the many concerns it raises. Existing, deployed autonomous systems can therefore be seen as highly game changing.
Future autonomous systems are likely to be even more game changing but will be shaped significantly by legal, ethical, moral, policy, cultural and religious concerns. From a DoD perspective, the vagaries of organizational dynamics and incentives will likely play a major role in shaping the design and use of this technology, potentially leading to sub-optimal outcomes.

DIRECTED ENERGY - HIGH POWER LASERS

Directed energy weapons were also quickly grasped during NeXTech wargames. This capability takes an existing problem (the need to destroy or damage enemy resources) and applies a new technology (directed energy of varying power) through a similar concept (the application of kinetic force directly to the asset) in order to solve the problem. From a friendly force perspective, the technology is appealing in that it provides greater flexibility, nearly limitless magazines and lower costs per 'round'. High-powered lasers, designed to be used as lethal weapons, raise few legal, ethical, moral, policy, religious, or cultural concerns. The limits of current technology mean that the technology itself is large and must be deployed from platforms such as ships or heavy-lift aircraft. This means that the technology works neatly within existing organizational constructs and incentive structures. This minimizes organizational disruption and eases implementation but also limits the scope of the technology's impact. Directed energy weapons are similarly straightforward from an enemy perspective. They would provide some additional asymmetric advantage for air denial and would have significant propaganda value but, as currently conceived, would be employed in similar ways to existing weapons systems. The vast majority of enemy forces



also lack the research and development sophistication to develop these systems, the power generation capability to power them or the platforms required to deploy the technology into an operational environment. However, in time, if the energy output of these large systems is successfully miniaturized, this could all change. Miniaturized directed energy weapons, the size of a rifle or handgun, would make for a far more lethal tactical battle space and a new asymmetric advantage if only one side in a conflict possessed the capability. This would also lead to organizational changes based on an ability to generate strategic-level effects from tactical units. Based on current projections of the development of this technology over the next twenty years, directed energy weapons are not likely to be game changing but would provide further advantage to friendly forces with minimal disruption. This would be a definite improvement over the status quo. Interestingly, lower-powered directed energy weapons used for non-lethal purposes are already causing significant legal and ethical concerns (based primarily on public perceptions, health concerns and legal concerns about the targeting of civilians in the battle space), and their military utility continues to be questioned.

Game Changing Continuum

Ultimately, seeking to label specific technologies in tightly bound definitions is a task of limited utility. Binary definitions are limiting, and a highly innovative and effective technology is still worth investing in even if it is not quite game changing. More importantly, game changing technology is dynamic, continually changing, with each element being contingent on others and changing over time. Today's game changing technologies are tomorrow's status quo. It is therefore important to consider technology on a continuum and understand that any technology will move through this continuum in a non-linear fashion over time. The continuum ranges through several states:

+ Game changing technology that has a positive impact
+ Innovative technology that has a positive impact
+ Technology that maintains the status quo
+ Innovative technology that has a negative impact, either due to its effective use by an adversary or unanticipated consequences of its use by friendly forces
+ Game changing technology that has a negative impact

Viewed on a continuum, and through a dialectic framework, game changing technologies can be seen as radical outliers. They do not, however, exist on a different plane to other technologies or behave in fundamentally different ways. It is just that their impact occurs faster or has more significant implications than that of other technologies.


[7] New capabilities in warfare: an overview of contemporary technological developments and the associated legal and engineering issues in Article 36 weapons reviews

Alan Backstrom and Ian Henderson*

Alan Backstrom, BEng, MEngSci, is an automotive engineering quality manager. He has extensive experience working with original equipment manufacturers, system suppliers, subsystem suppliers, and component suppliers, with a particular focus on major design validation techniques, warranty analysis, and accident investigation.

Group Captain Ian Henderson, AM, BSc, LLB, LLM, PhD, is a legal officer with the Royal Australian Air Force.

Abstract

The increasing complexity of weapon systems requires an interdisciplinary approach to the conduct of weapon reviews. Developers need to be aware of international humanitarian law principles that apply to the employment of weapons. Lawyers need to be aware of how a weapon will be operationally employed and use this knowledge

* This paper was written in a personal capacity and does not necessarily represent the views of the Australian Department of Defence or the Australian Defence Force. Thank you to many friends and colleagues who generously provided comments on the draft.


The Applied Ethics of Emerging Military and Security Technologies A. Backstrom and I. Henderson - New capabilities in warfare: an overview of contemporary technological developments and the associated legal and engineering issues in Article 36 weapons reviews

to help formulate meaningful operational guidelines in light of any technological issues identified in relation to international humanitarian law. As the details of a weapon's capability are often highly classified and compartmentalized, lawyers, engineers, and operators need to work cooperatively and imaginatively to overcome security classification and compartmental access limitations.

Keywords: weapon, international humanitarian law, law of armed conflict, warfare, IHL, LOAC, Geneva, additional protocol, weapons review, autonomous, target recognition, reliability.

Article 36 of Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts provides: In the study, development, acquisition or adoption of a new weapon, means or method of warfare, a High Contracting Party is under an obligation to determine whether its employment would, in some or all circumstances, be prohibited by this Protocol or by any other rule of international law applicable to the High Contracting Party.1 As weapons become more technologically complex, the challenges of complying with this apparently simple requirement of international law become more daunting. If a lawyer were to conduct a legal review of a sword, there would be little need for the lawyer to be concerned with the design characteristics beyond those that can be observed by the naked eye. The intricacies of the production and testing methods would equally be legally uninteresting, and even a lawyer could grasp the method of employment in combat. The same cannot be said about some modern weapons, let alone those under development. The use of a guided weapon with an autonomous firing option requires an understanding of the legal parameters; the engineering design, production, and testing (or validation) methods; and the way in which the weapon might be employed on the battlefield.2 While somewhat tongue-in-cheek, there is some truth to the view that a person becomes a lawyer due to not understanding maths, another becomes an engineer due to not understanding English, and the third a soldier due to not understanding either!

1 Opened for signature 12 December 1977, 1125 UNTS 3, entered into force 7 December 1978 (API). See generally Justin McClelland, 'The review of weapons in accordance with Article 36 of Additional Protocol I', in International Review of the Red Cross, Vol. 85, No. 850, June 2003, pp. 397-415; Kathleen Lawand, 'Reviewing the legality of new weapons, means and methods of warfare', in International Review of the Red Cross, Vol. 88, No. 864, December 2006, pp. 925-930; International Committee of the Red Cross (ICRC), A Guide to the Legal Review of New Weapons, Means and Methods of Warfare: Measures to Implement Article 36 of Additional Protocol I of 1977, 2006. For a thorough discussion of what is and is not a 'weapon' for the purposes of legal review, see Duncan Blake and Joseph Imburgia, '"Bloodless weapons"? The need to conduct legal reviews of certain capabilities and the implications of defining them as "weapons"', in The Air Force Law Review, Vol. 66, 2010, p. 157.
2 See Michael Schmitt, 'War, technology and the law of armed conflict', in Anthony Helm (ed.), The Law of War in the 21st Century: Weaponry and the Use of Force, Vol. 82, International Law Studies, 2006, p. 142.


Volume 94 Number 886 Summer 2012

Our purpose in writing this article is to break down those barriers through a multidisciplinary approach that identifies the key legal issues associated with employing weapons, setting out important features of emerging weapons, and then analysing how engineering tests and evaluations can be used to inform the weapon review process. Through the combination of the above methods, we hope to provide a general framework by which the legal and engineering issues associated with weapon development and employment can be understood, regardless of the simplicity or complexity of the weapon. We commence with a brief review of the key legal factors for employing and reviewing weapons, followed by three substantive parts. The first part deals with the target authorization process, regardless of the choice of weapon to be employed. The second part looks at some emerging weapons and the legal issues associated with those weapons. The final part considers the engineering issues associated with weapon reviews and, in particular, how an understanding of engineering processes can assist when reviewing highly complex weapons.

Key legal factors

The key legal steps under international humanitarian law3 when conducting an attack can be summarized as:

1. collecting information about the target;
2. analysing that information to determine whether the target is a lawful target for attack at the time of the attack;
3. appreciating the potential incidental effects of the weapon and taking feasible precautions to minimize those effects;
4. assessing the 'proportionality' of any expected incidental effects against the anticipated military advantage of the overall attack (not just the particular attack of the individual weapon);4
5. firing, releasing, or otherwise using the weapon such that its effects are directed against the desired target;
6. monitoring the situation and cancelling or suspending the attack if the incidental effects are disproportionate.5
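The steps above form a sequential series of gates: failure at any step halts or modifies the attack. Purely as an illustrative sketch (the class, field names, and numeric threshold below are invented for this example; in law, proportionality is a qualitative judgement, not a numeric ratio), steps 2-4 might be modelled as go/no-go checks:

```python
from dataclasses import dataclass

@dataclass
class TargetAssessment:
    """Hypothetical record of the information gathered in step 1."""
    is_lawful_target: bool           # step 2: lawful target at the time of attack?
    expected_incidental_harm: float  # step 3: residual harm after feasible precautions
    military_advantage: float        # step 4: advantage of the overall attack

def may_engage(t: TargetAssessment, proportionality_threshold: float = 1.0) -> bool:
    """Steps 2-4 as gates. Steps 5-6 (weapon release and monitoring) would
    follow only if this returns True, and the attack must still be cancelled
    if incidental effects later prove disproportionate."""
    if not t.is_lawful_target:
        return False  # step 2 fails: not a lawful target, no attack
    # step 4: expected incidental harm weighed against anticipated advantage
    return t.expected_incidental_harm <= proportionality_threshold * t.military_advantage

print(may_engage(TargetAssessment(True, 0.2, 1.0)))   # True: lawful, low incidental harm
print(may_engage(TargetAssessment(False, 0.0, 5.0)))  # False: not a lawful target
```

The point of the sketch is structural, not quantitative: discrimination (step 2) is a hard gate, while proportionality (step 4) is a weighing exercise that any automated implementation would somehow have to encode.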

In addition, consideration must also be given to the type of weapon to be employed, and particularly relevant to this article is that there are also ways of employing (using) an otherwise lawful weapon that might result in a banned effect (e.g., indiscriminately firing a rifle). The key legal factors when conducting the review

3 Also known as the law of armed conflict.
4 See, for example, Australia's declaration of understanding to the effect that military advantage in Articles 51 and 57 of API, above note 1, means 'the advantage anticipated from the attack considered as a whole and not from isolated or particular parts of the attack' - reprinted in Adam Roberts and Richard Guelff, Documents on the Laws of War, 3rd edn, Oxford University Press, Oxford, 2000, p. 500.
5 See above note 1, Article 57(2)(b) of API.


of new weapons (including means and methods of combat) are whether the weapon itself is banned or restricted by international law;6 and if not, whether the effects of the weapon are banned or restricted by international law.7 Finally, the 'principles of humanity and the dictates of the public conscience' must also be kept in mind.8 From an operational point of view, the key points can be expressed as: achieving correct target-recognition, determining how to exercise weapon release authorization, and controlling (or limiting) the weapon effect. With weapons of relatively simple design, the associated legal issues are simple. With the sword example above, the only real issues are whether it is a 'banned weapon';9 and if not, whether the person who wields it does so with discrimination. Any design flaws (e.g., poorly weighted) or manufacturing defects (e.g., metal is too brittle) are unlikely to affect the legal analysis and are primarily the worry of the person using the sword. With more complex weapons like crossbows, the complexity of the weapon design introduces the potential for discrimination to be affected by:

• design errors (e.g., the weapon does not fire straight or consistent with any sighting mechanism as the design is flawed); or
• manufacturing errors (e.g., the weapon does not fire straight or consistent with any sighting mechanism as the weapon was not built, within tolerance, to the design).

These types of errors have the potential to be magnified with long-range weapons (such as artillery) and batch variation now also becomes a significant factor as any variations are magnified over the longer range of the weapon. Further, modern

6 Weapons can be banned outright, banned based on designed purpose or expected normal use, or the means of employment can be regulated (i.e., banned uses). A weapon may be totally banned through specific law (e.g., biological weapons are prohibited under the Convention on the Prohibition of the Development, Production, and Stockpiling of Bacteriological (Biological) and Toxin Weapons and on Their Destruction, opened for signature 10 April 1972, 1015 UNTS 163, entered into force 26 March 1975), or may be banned generally if in all circumstances it is a weapon that is 'of a nature to cause superfluous injury or unnecessary suffering', see above note 1, Article 35(2) of API, and associated customary international law. Contrast this with, for example, laser weapons, which are generally lawful but are prohibited when they are specifically designed, solely or as one of their combat functions, to cause permanent blindness to unenhanced vision (Protocol (IV) on Blinding Laser Weapons to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May be Deemed to be Excessively Injurious or to Have Indiscriminate Effects, opened for signature 13 October 1995, 35 ILM 1218, entered into force 30 July 1998). Finally, incendiary weapons are per se lawful, but, for example, may not be employed by air delivery against military objectives located within a concentration of civilians, see Article 2(2) of Protocol III on Prohibitions or Restrictions on the Use of Incendiary Weapons to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May be Deemed to be Excessively Injurious or to Have Indiscriminate Effects, opened for signature 10 April 1981, 1342 UNTS 137, entered into force 2 December 1983.
7 ICRC, A Guide to the Legal Review of New Weapons, Means and Methods of Warfare: Measures to Implement Article 36 of Additional Protocol I of 1977, above note 1, p. 11.
8 Ibid.
9 As there is no specific ban on swords, the issue would be a review under the general prohibition on weapons that cause unnecessary suffering pursuant to Article 35(2) of API, above note 1.


weapons have a variety of aiming mechanisms that are not solely dependent on the operator, such as inertial guidance, global positioning system (GPS), and electro-optical guidance. Finally, as discussed below, there is even the capacity for the weapon itself to select a target. Weapon technology is advancing in many different areas and there is limited public material available on the avenues of research and the capabilities of the weapons being developed.10 The following emerging weapons are, therefore, purely representative. In any event, the exact capabilities are of less importance to the discussion than are the general modes of operation.

Target recognition and weapon release authorization

The following discussion deals with weapons and weapon systems that have some level of functionality to discriminate between targets and, in appropriate circumstances, might attack a target without further human input. For example, a non-command-detonated landmine is a weapon that once placed and armed, explodes when it is triggered by a pressure plate, trip wire, etcetera. Such landmines have a very basic level of target recognition (e.g., a pressure plate landmine is triggered when a plate is stepped upon with a certain minimum amount of weight - e.g., 15 kilograms - and is clearly unlikely to be triggered by a mouse) and require no human weapon-release authorization.11 More complex weapon systems purport to distinguish between civilian trucks and military vehicles such as tanks.12 Automated and autonomous weapon systems need to be distinguished from remotely operated weapon systems. While there has been much discussion lately of unmanned combat systems, these are just remotely operated weapon platforms and the legal issues depend far more on the manner in which they are used than on anything inherent to the technology.13 The following discussion differentiates automated weapons from autonomous weapons, briefly reviews some key legal issues associated with each type of weapon system, and concludes by outlining some methods for the lawful employment of such weapon systems.

10 See Hitoshi Nasu and Thomas Faunce, 'Nanotechnology and the international law of weaponry: towards international regulation of nano-weapons', in Journal of Law, Information and Science, Vol. 20, 2010, pp. 23-24.
11 Of course, this can be the very problem with landmines. Non-command-detonated landmines placed in areas frequented by civilians cannot distinguish between a civilian and a combatant activating the trigger mechanism.
12 'Anti-vehicle mines, victim-activation and automated weapons', 2012, available at: http://www.article36.org/weapons/landmines/anti-vehicle-mines-victim-activation-and-automated-weapons/ (last visited 1 June 2012).
13 For discussions of how such remotely operated systems are, legally, just like any other weapon system and are not deserving of separate categorization or treatment under international humanitarian law, see generally Denver Journal of International Law and Policy, Vol. 39, No. 4, 2011; Michael Schmitt, Louise Arimatsu and Tim McCormack (eds), Yearbook of International Humanitarian Law 2010, Springer, Vol. 13, 2011.


Automated weapons

Automated weapon systems:14 are not remotely controlled but function in a self-contained and independent manner once deployed. Examples of such systems include automated sentry guns, sensor-fused munitions and certain anti-vehicle landmines. Although deployed by humans, such systems will independently verify or detect a particular type of target object and then fire or detonate. An automated sentry gun, for instance, may fire, or not, following voice verification of a potential intruder based on a password.15 In short, automated weapons are designed to fire automatically at a target when predetermined parameters are detected. Automated weapons serve three different purposes. Weapons such as mines allow a military to provide area denial without having forces physically present. Automated sentry guns free up combat capability and can perform what would be tedious work for long hours and without the risk of falling asleep.16 Sensor-fused weapons enable a 'shoot and scoot' option and can be thought of as an extension of beyond-visual-range weapons.17 The principal legal issue with automated weapons is their ability to discriminate between lawful targets and civilians and civilian objects.18 The second main concern is how to deal with expected incidental injury to civilians and damage to civilian objects.19 Starting with the issue of discrimination, it is worth noting that automated weapons are not new. Mines, booby traps, and even something as simple as a stake at the bottom of a pit are all examples of weapons that, once in place, do not require further control or 'firing' by a person. Some of these weapons also have an element of discrimination in the way they are designed.
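The sentry-gun arrangement described in the text (and in note 16) keeps a human in the firing decision: sensors only raise an alert, an operator assesses the target, and the weapon fires solely on an explicit operator order. A minimal sketch of that flow, with all names and signals invented for illustration:

```python
from enum import Enum, auto

class Decision(Enum):
    IGNORE = auto()          # no detection: do nothing
    ALERT_OPERATOR = auto()  # detection: escalate to a human, do not fire
    FIRE = auto()            # human-confirmed threat and explicit fire order

def sentry_step(motion_detected: bool, heat_detected: bool,
                operator_confirms_threat: bool,
                operator_orders_fire: bool) -> Decision:
    """Detection alone never releases the weapon; only the combination of a
    human threat confirmation and a human fire order does."""
    if not (motion_detected and heat_detected):
        return Decision.IGNORE
    if operator_confirms_threat and operator_orders_fire:
        return Decision.FIRE
    return Decision.ALERT_OPERATOR

print(sentry_step(True, True, False, False))  # Decision.ALERT_OPERATOR
print(sentry_step(True, True, True, True))    # Decision.FIRE
```

The design choice the sketch highlights is where the weapon-release authorization sits: here it remains entirely with the operator, which is what distinguishes this arrangement from the fully automated triggers discussed next.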
Anti-vehicle mines, for example, are

14 Not to be confused with automatic weapons, which are weapons that fire multiple times upon activation of the trigger mechanism - e.g., a machine gun that continues firing for as long as the trigger remains activated by the person firing the weapon.
15 Jakob Kellenberger, ICRC President, 'International humanitarian law and new weapon technologies', 34th Round Table on Current Issues of International Humanitarian Law, San Remo, 8-10 September 2011, Keynote address, p. 5, available at: http://iihl.org/iihl/Documents/JKBSan%20Remo%20Speech.pdf (last visited 8 May 2012). Various types of existing automated and autonomous weapons are briefly discussed, with further useful citations, in Chris Taylor, 'Future Air Force unmanned combat aerial vehicle capabilities and law of armed conflict restrictions on their potential use', Australian Command and Staff College, 2011, p. 6 (copy on file with authors).
16 South Korea is developing robots with heat and motion detectors to sense possible threats. Upon detection, an alert is sent to a command centre where the robot's audio or video communications system can be used to determine if the target is a threat. If so, the operator can order the robot to fire its gun or 40 mm automatic grenade launcher. 'S. Korea deploys sentry robot along N. Korea border', in Agence France-Presse, 13 July 2010, available at: http://www.defensenews.com/article/20100713/DEFSECT02/7130302/S-Korea-Deploys-Sentry-Robot-Along-N-Korea-Border (last visited 6 May 2012).
17 A sensor-fused weapon is a weapon where the arming mechanism (the fuse) is integrated with a target detection system (the sensor).
18 Issues such as fratricide are not, strictly speaking, a concern of international humanitarian law. In any event, other means and methods are adopted to reduce fratricide, such as 'blue-force trackers', safe corridors, and restricted fire zones.
19 See above note 1, Article 51(5)(b) and Article 57(2)(a)(iii) of API.


designed to explode only when triggered by a certain weight. Naval mines were initially contact mines, and then advanced to include magnetic mines and acoustic mines. Of course, the problem with such mines is that there is no further discrimination between military objectives or civilian objects that otherwise meet the criteria for the mine to explode.20 One way to overcome this is to combine various trigger mechanisms (sensors) and tailor the combination towards ships that are more likely to be warships or other legitimate targets than to be civilian shipping. As weapons have become more capable and can be fired over a longer range, the ability to undertake combat identification of the enemy at greater distances has become more important. Non-cooperative target recognition (also called automatic target recognition) is the ability to use technology to identify distinguishing features of enemy equipment without having to visually observe that equipment.21 A combination of technology like radar, lasers, communication developments, and beyond-visual-range weapon technology allows an ever-increasing ability to identify whether a detected object is friendly, unknown, or enemy and to engage that target. With each advance though, there is not 'a single problem but rather...
a continuum of problems of increasing complexity ranging from recognition of a single target type against benign clutter to classification of multiple target types within complex clutter scenes such as ground targets in the urban environment'.22 Significant work is underway to produce integrated systems where cross-cueing of intelligence, surveillance, and reconnaissance sensors allows for improved detection rates, increased resolution, and ultimately better discrimination.23 Multi-sensor integration can achieve up to 10 times better identification and up to 100 times better geolocation accuracy compared with single sensors.24 With something as simple as a traditional pressure-detonated landmine, the initiating mechanism is purely mechanical. If a weight equal to or greater than the set weight is applied, the triggering mechanism will be activated and the mine will explode. This type of detonation mechanism cannot, by itself, discriminate between civilians and combatants (or other lawful targets). The potential for incidental injury at the moment of detonation is also not part of the 'detonate/do-not-detonate' equation. While this equation can be considered with

20 Except where the mine is command-detonated.
21 One example is using laser beams (an alternative is millimetre wave radar) to scan an object and then use processing algorithms to compare the image to pre-loaded 3D target patterns. Target identification can be based on specific features with up to 15cm resolution at a distance of 1000 metres. See 'Lased radar (LADAR) guidance system', Defense Update, 2006, available at: http://defense-update.com/products/l/ladar.htm (last visited 8 May 2012).
22 'RADAR Automatic Target Recognition (ATR) and Non-Cooperative Target Recognition (NCTR)', NATO, 2010, available at: http://www.rto.nato.int/ACTIVITY_META.asp?ACT=SET-172 (last visited 8 May 2012).
23 See Andy Myers, 'The legal and moral challenges facing the 21st century air commander', in Air Power Review, Vol. 10, No. 1, 2007, p. 81, available at: http://www.raf.mod.uk/rafcms/mediafiles/51981818_1143_EC82_2E416EDD90694246.pdf (last visited 8 May 2012).
24 Covering memorandum, Report of the Joint Defense Science Board Intelligence Science Board Task Force on Integrating Sensor-Collected Intelligence, Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics, US Department of Defense, November 2008, p. 1.


command-detonated landmines, that is clearly a qualitatively different detonation mechanism. With pressure-detonated landmines, the two main ways of limiting incidental damage are either by minimizing the blast and shrapnel, or by placing the mines in areas where civilians are not present or are warned of the presence of mines.25 However, the triggering mechanisms for mines have progressively become more complex. For example, anti-vehicle mines exist that are designed to distinguish between friendly vehicles and enemy vehicles based on a 'signature' catalogue. Mines that are designed to initiate against only military targets, and are deployed consistent with any design limitations, address the issue of discrimination. Nevertheless, that still leaves the potential for incidental injury and damage to civilians and civilian objects. The authors are not aware of any weapon that has sensors and/or algorithms designed to detect the presence of civilians or civilian objects in the vicinity of 'targets'. So, while some weapons claim to be able to distinguish a civilian object from a military objective and only 'fire' at military objectives, the weapon does not also look for the presence of civilian objects in the vicinity of the military objective before firing. Take the hypothetical example of a military vehicle travelling in close proximity to a civilian vehicle. While certain landmines might be able to distinguish between the two types of vehicles and only detonate when triggered by the military vehicle, the potential for incidental damage to the civilian vehicle is not a piece of data that is factored into the detonate/do-not-detonate algorithm. This is not legally fatal to the use of such automated weapons, but does restrict the manner in which they should be employed on the battlefield. Along with discrimination, there is the second issue of the potential for incidental injury to civilians and damage to civilian objects.
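The hypothetical just described, a fuse that matches vehicle signatures against a catalogue but takes no input at all about nearby civilian objects, can be made concrete in a toy sketch (the signature names, weight threshold, and data fields are all invented for illustration, not drawn from any real system):

```python
from dataclasses import dataclass

# Invented signature catalogue: the profiles the fuse is tuned to initiate against.
MILITARY_SIGNATURES = {"tracked_vehicle_heavy", "wheeled_vehicle_armoured"}

@dataclass
class SensorReading:
    signature: str    # classification produced by the fuse's sensors
    weight_kg: float  # load sensed on the pressure plate
    # Note what is absent: no field describes civilian objects in the vicinity.

def should_detonate(reading: SensorReading, min_weight_kg: float = 500.0) -> bool:
    """Detonate only on a catalogued military signature above the weight
    threshold. There is no term for incidental damage in this decision,
    which is exactly the limitation discussed in the text."""
    return (reading.signature in MILITARY_SIGNATURES
            and reading.weight_kg >= min_weight_kg)

print(should_detonate(SensorReading("tracked_vehicle_heavy", 40000)))  # True
print(should_detonate(SensorReading("civilian_car", 1500)))            # False
```

Even with perfect signature discrimination, the civilian vehicle travelling alongside the military one never appears as an input to `should_detonate`, so the proportionality judgement must be supplied externally, through restrictions on where and how such mines are emplaced.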
The two main ways of managing this issue for automated weapons are controlling how they are used (e.g., in areas with a low likelihood of civilians or civilian objects) and/or retaining human overwatch. Both points are discussed further below under the heading 'Methods for the lawful employment of automated and autonomous weapons'. A third option is to increase the 'decision-making capability' of the weapon system, which leads us to autonomous weapons.

Autonomous weapons

Autonomous weapons are a sophisticated combination of sensors and software that 'can learn or adapt their functioning in response to changing circumstances'.26 An autonomous weapon can loiter in an area of interest, search for targets, identify suitable targets, prosecute a target (i.e., attack the target), and report the point of

25 Of course, history has shown that many anti-personnel landmines were either emplaced without adequate consideration of, or, worse, intentional disregard for, the risk to civilians. As a result, a majority of states have agreed to a complete ban on the use of non-command-detonated anti-personnel landmines. See ICRC, 'Anti-personnel landmines', 2012, available at: http://www.icrc.org/eng/war-and-law/weapons/antipersonnel-landmines/ (last visited 8 May 2012).
26 J. Kellenberger, above note 15, p. 5.

The Applied Ethics of Emerging Military and Security Technologies


Volume 94 Number 886 Summer 2012

weapon impact.27 This type of weapon can also act as an intelligence, surveillance, and reconnaissance asset. An example of a potential autonomous weapon is the Wide Area Search Autonomous Attack Miniature Munition (WASAAMM). The WASAAMM:

would be a miniature smart cruise missile with the ability to loiter over and search for a specific target, significantly enhancing time-critical targeting of moving or fleeting targets. When the target is acquired, WASAAMM can either attack or relay a signal to obtain permission to attack.28

There are a number of technical and legal issues with weapons such as the WASAAMM.29 While most of the engineering aspects of such a weapon are likely to be achievable in the next twenty-five years, the 'autonomous' part of the weapon still poses significant engineering issues. In addition, there are issues with achieving compliance with international humanitarian law, and resulting rules of engagement, that are yet to be resolved.30 Of course, if the WASAAMM operated in the mode where it relayed a signal to obtain permission to attack,31 that would significantly reduce the engineering and international humanitarian law (and rules of engagement) compliance issues - but it also would not be a true autonomous weapon if operating in that mode.

An area that is related to autonomous weapons is the development of artificial intelligence assistants to help humans shorten the observe, orient, decide, act (OODA) loop. The purpose of such decision-support systems is to address the fact that while 'speed-ups in information gathering and distribution can be attained by well-implemented networking, information analysis, understanding and decision making can prove to be severe bottlenecks to the operational tempo'.32 There is very

27 Chris Anzalone, 'Readying air forces for network centric weapons', 2003, slide 9, available at: http://www.dtic.mil/ndia/2003targets/anz.ppt (last visited 8 May 2012).
28 US Air Force, 'Transformation flight plan', 2003, Appendix D, p. 11, available at: http://www.au.af.mil/au/awc/awcgate/af/af_trans_flightplan_nov03.pdf (last visited 8 May 2012).
29 Myers also discusses some of the moral aspects, e.g., is it 'morally correct for a machine to be able to take a life'? See A. Myers, above note 23, pp. 87-88. See also ICRC, International Humanitarian Law and the Challenges of Contemporary Armed Conflicts, Report of the 31st International Conference of the Red Cross and Red Crescent, 2011, p. 40. Moral issues are also discussed in Kenneth Anderson and Matthew Waxman, 'Law and ethics for robot soldiers', in Policy Review (forthcoming 2012), available at: http://ssrn.com/abstract=2046375 (last visited 8 May 2012). See generally Peter Singer, 'The ethics of killer applications: why is it so hard to talk about morality when it comes to new military technology?', in Journal of Military Ethics, Vol. 9, No. 4, 2010, pp. 299-312.
30 Ibid.
31 For example, the UK 'Fire Shadow' will feature: 'Man In The Loop (MITL) operation, enabling a human operator to overrule the weapon's guidance and divert the weapon's flight path or abort the attack and return to loiter mode in conditions where friendly forces are at risk, prevailing conditions do not comply with rules of engagement, or where an attack could cause excessive collateral damage', see 'Fire Shadow: a persistent killer', Defense Update, 2008, available at: http://defense-update.com/20080804_nre-shadow-apersistent-killer.html (last visited 8 May 2012).
32 Shyni Thomas, Nitin Dhiman, Pankaj Tikkas, Ajay Sharma and Dipti Deodhare, 'Towards faster execution of the OODA loop using dynamic decision support', in Leigh Armistead (ed.), The 3rd International Conference on Information Warfare and Security, 2008, p. 42, available at: http://academicconferences.org/pdfs/iciw08-booklet-A.pdf (last visited 8 May 2012).



A. Backstrom and I. Henderson - New capabilities in warfare: an overview of contemporary technological developments and the associated legal and engineering issues in Article 36 weapons reviews

limited publicly available information on how such decision-support systems might operate in the area of targeting. The key issue is how to use 'computer processing to attempt to automate what people have traditionally had to do'.33 Using sensors and computer power to periodically scan an airfield for changes, and thereby cue a human analyst, has been more successful than using sensors such as synthetic aperture radar to provide automatic target recognition.34

A clear difficulty is that the law relating to targeting is generally expressed in broad terms covering a range of infinitely varying facts, rather than as precise formulas with constrained variables, which is why a commander's judgement is often needed when determining whether an object or person is subject to lawful attack.35 As Taylor points out, it is this 'highly contextual nature' of targeting that means there is no simple checklist of lawful targets.36 However, if a commander were prepared to forgo some theoretical capability, it is possible in a particular armed conflict to produce a subset of objects that are at any given time targetable. As long as the list is maintained and reviewed, at any particular moment in an armed conflict it is certainly possible to decide that military vehicles, radar sites, etcetera are targetable. In other words, a commander could choose to confine the list of targets that are subject to automatic target recognition to a narrow list of objects that are clearly military objectives by their nature - albeit thereby forgoing automatic target recognition of other objects that require more nuanced judgement to determine their status as military objectives through location, purpose, or use.37

The next step is to move beyond a system that is programmed to a system that, like a commander, learns the nature of military operations and how to apply the law to targeting activities.
As communication systems become more complex, not 'only do they pass information, they have the capacity to collate, analyse, disseminate... and display information in preparation for and in the prosecution of military operations'.38 Where a system is 'used to analyse target data and then provide a target solution or profile'39 then the 'system would reasonably

33 See above note 24, p. 47.
34 Ibid., pp. 47-48. Automatic target recognition systems have worked in the laboratory but have not proved reliable when deployed and presented with real data rather than 'unrealistic controlled data for assessing the performance of algorithms', ibid., pp. 47 and 53. While now somewhat dated, an article that explains how such target recognition works is Paul Kolodzy, 'Multidimensional automatic target recognition system evaluation', in The Lincoln Laboratory Journal, Vol. 6, No. 1, 1993, p. 117.
35 See C. Taylor, above note 15, p. 9. See generally Ian Henderson, The Contemporary Law of Targeting: Military Objectives, Proportionality and Precautions in Attack under Additional Protocol I, Martinus Nijhoff, Leiden, 2009, pp. 45-51.
36 See C. Taylor, ibid., p. 9; see also I. Henderson, ibid., pp. 49-50.
37 See above note 1, Art. 52(2) of API.
38 See J. McClelland, above note 1, p. 405. The technical issues (from as simple as meta-data standards for the sensor-collected data and available bandwidth for transmission of data, through to the far more complex) should not be downplayed, particularly with multi-sensor data. See generally, Report of the Joint Defense Science Board Intelligence Science Board Task Force on Integrating Sensor-Collected Intelligence, above note 24, pp. 1-9.
39 See J. McClelland, above note 1, p. 405.



fall within the meaning of "means and methods of warfare" as it would be providing an integral part of the targeting decision process'.40

What might a system look like that does not require detailed programming but rather learns? Suppose an artificial intelligence system scans the battlespace and looks for potential targets (let's call it the 'artificial intelligence target recognition system' (AITRS)). Rather than needing to be preprogrammed, the AITRS learns the characteristics of targets that have previously been approved for attack.41 With time, the AITRS gets better at excluding low-probability targets and better at cueing different sensors and applying algorithms to defeat the enemy's attempt at camouflage, countermeasures, etcetera.

In one example, the outcome of the process is that the AITRS presents a human operator with a simplified view of the battlespace where only likely targets and their characteristics are presented for human analysis and decision whether to attack. Importantly though, all of the 'raw information' (e.g., imagery, multispectral imagery, voice recordings of intercepted conversations, etcetera) is available for human review. In example two, while the AITRS still presents a human operator with a simplified view of the battlespace with likely targets identified for approval to attack, the human decision-maker is not presented with 'raw information' but rather analysed data.42 For example, the human might be presented with a symbol on a screen that represents a motor vehicle along with the following:

• probability of one human rider: 99 per cent
• probability of body-match to Colonel John Smith:43 75 per cent
• probability of voice-match to Colonel John Smith: 90 per cent.44
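For illustration only, 'example two' above might be sketched as follows. The fusion rule, field names, and the 0.9 display threshold are all assumptions invented for the sketch; only the three probabilities come from the text. The point is that the operator sees a summary, not raw sensor data, and the system cues a human rather than deciding to attack.

```python
# Hypothetical sketch of presenting analysed data (not raw data) to an
# operator. Names, thresholds, and the fusion rule are invented.

def summarise_contact(p_single_rider, p_body_match, p_voice_match, threshold=0.9):
    """Combine per-sensor identity probabilities into a display summary."""
    # Naive fusion for the sketch: take the strongest single channel.
    best = max(p_body_match, p_voice_match)
    return {
        "symbol": "motor_vehicle",
        "single_rider": p_single_rider,
        "identity_confidence": best,
        # Flag the contact for a human attack decision; never auto-attack.
        "cue_operator": best >= threshold,
    }

summary = summarise_contact(0.99, 0.75, 0.90)
# The 90 per cent voice match clears the (hypothetical) 0.9 display
# threshold, so the contact is cued to the operator for decision.
```

A weaker set of matches (say, 75 and 80 per cent) would leave the contact below the display threshold and uncued, under these assumed rules.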

And finally, in example three it is the AITRS itself that decides whether to prosecute an attack. Assuming the AITRS is also linked to a weapon system, then the combination is an autonomous weapon system.

It would seem beyond current technology to be able to program a machine to make the complicated assessments required to determine whether or not a particular attack would be lawful if there is an expectation of collateral

40 Ibid., p. 406.
41 See K. Anderson and M. Waxman, above note 29, p. 10.
42 'Automatically processing the sensor data to reduce critical information to a smaller data packet or to provide a go/no-go response could improve reaction time', in Report of the Joint Defence Science Board Intelligence Science Board Task Force on Integrating Sensor-Collected Intelligence, above note 24, p. 43.
43 Assume Colonel Smith is a person on the high-value target list and, issues such as hors de combat (e.g., wounded, sick, surrendering, or otherwise out of combat) and collateral damage aside, is otherwise subject to lawful attack. This type of attack is based on identifying a target as being Colonel Smith. Contrast this with attacks based on characteristics of the target that are associated with 'enemy forces' (such as unloading explosives, gathering at certain locations, and other patterns of behaviour) without knowing the actual identity of the target. The latter are becoming known as 'signature' strikes, while the former are 'personality' strikes. See Greg Miller, 'CIA seeks new authority to expand Yemen drone campaign', in The Washington Post, 19 April 2012, available at: http://www.washingtonpost.com/world/national-security/cia-seeks-new-authority-to-expand-yemen-drone-campaign/2012/04/18/glQAsaumRT_story.html (last visited 6 May 2012).
44 See also the example used by Myers, and his discussion of multi-sensor cueing. A. Myers, above note 23, p. 84.



damage.45 Indeed, one would wonder even where to start, as assessing anticipated military advantage against expected collateral damage is like comparing apples and oranges.46 For now, that would mean any such weapon system should be employed in such a manner as to reduce the risk of collateral damage being expected.47 However, a true AITRS that was initially operated with human oversight could presumably 'learn' from the decisions made by its human operators on acceptable and unacceptable collateral damage.48

As pointed out at footnote 46 above, collateral damage assessments are not just about calculating and comparing numbers - a function well suited to current computers. Instead, there is a clearly qualitative assessment, albeit one where the things being compared are not even alike. How could a machine ever make such judgements? Perhaps not through direct programming but rather by pursuing the artificial intelligence route. So, along with learning what are lawful targets, our hypothetical AITRS would also learn how to make a proportionality assessment in the same way humans do - through observation, experience, correction in the training environment (e.g., war games), and so on. An AITRS that failed to make reasonable judgements (in the view of the instructing staff) might be treated the same as a junior officer who never quite makes the grade (perhaps kept on staff but not given decision-making authority), whereas an AITRS that proved itself on course and in field exercises could be promoted, entrusted with increasing degrees of autonomy, etcetera.

Another technical problem is that the required identification standard for determining whether a person or object is a lawful target is not clear-cut.
The standard expressed by the International Criminal Tribunal for the Former Yugoslavia is that of 'reasonable belief'.49 In their rules of engagement, at least two states have adopted the standard of 'reasonable certainty'.50 A third approach,

45 ICRC, International Humanitarian Law and the Challenges of Contemporary Armed Conflicts, above note 29, pp. 39-40; William Boothby, Weapons and the Law of Armed Conflict, Oxford University Press, Oxford, 2009, p. 233.
46 See I. Henderson, above note 35, pp. 228-229. Many facets of military operations require commanders to exercise judgement, and this includes certain legal issues. Having determined what is the military advantage expected from an attack (not an exact quantity in itself) on a command and control node, and estimated the expected incidental civilian injury, death, and damage, somehow these two factors must be compared. The evaluation is clearly somewhat subjective and likely to differ from person to person, rather than objective and mathematical. In this respect, one can think of interpreting and complying with certain aspects of international humanitarian law as part art and not just pure science.
47 W. Boothby, above note 45, p. 233.
48 For a contrary view, see Markus Wagner, 'Taking humans out of the loop: implications for international humanitarian law', in Journal of Law Information and Science, Vol. 21, 2011, p. 11, available at: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1874039 (last visited 8 May 2012), who concludes that autonomous systems will never be able to comply with the principle of proportionality.
49 'The Trial Chamber understands that such an object [normally dedicated to civilian purposes] shall not be attacked when it is not reasonable to believe, in the circumstances of the person contemplating the attack, including the information available to the latter, that the object is being used to make an effective contribution to military action', ICTY, The Prosecutor v Galic, Case No IT-98-29-T, Judgement (Trial Chamber), 5 December 2003, para. 51.
50 International and Operational Law Department: The Judge Advocate General's Legal Centre & School (US Army), Operational Law Handbook 2012, 'CFLCC ROE Card', p. 103, available at: http://www.loc.gov/rr/frd/Military_Law/operational-law-handbooks.html (last visited 8 May 2012); ICRC, Customary IHL,



reflected in the San Remo Rules of Engagement Handbook, is to require identification by visual and/or certain technical means.51 The commander authorizing deployment of an autonomous weapon, and any operator providing overwatch of it, will need to know what standard was adopted to ensure that both international law and any operation-specific rules of engagement are complied with. It is also possible to combine the requirement for a particular level of certainty (e.g., reasonable belief or reasonable certainty) with a complementary requirement for identification to be by visual and/or certain technical means.

Presumably, for any identification standard to be able to be coded52 into a computer program, that standard would need to be turned into a quantifiable confirmation expressed as a statistical probability. For example, 'reasonable belief' would need to be transformed from a subjective concept into an objective and measurable quantity - for example, a '95 per cent degree of confidence'. This would then be used as the benchmark against which field experience (including historical data) could produce an empirical equation to profile a potential target. Then new battlespace data can be compared to quantify (assess) the strength of correlation to the required degree of confidence (in the current example, 95 per cent or greater correlation). However, the uncertainty of measurement associated with the battlespace feedback sensors would also need to be quantified as a distinctly separate acceptance criterion. For example, assume that in certain operational circumstances the uncertainty of measurement is plus or minus 1 per cent, whereas in other operational circumstances it is plus or minus 10 per cent. In the first circumstance, to be confident of 95 per cent certainty, the correlation would need to be not less than 96 per cent.
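The arithmetic of this example can be made concrete in a short sketch. The 95 per cent benchmark and the plus-or-minus 1 and 10 per cent uncertainties come from the text; treating uncertainty as a simple additive margin on the required correlation is an assumption made for illustration.

```python
# Illustrative only: a legal standard translated (hypothetically) into a
# 95% confidence benchmark, with sensor measurement uncertainty applied
# as a separate, additive acceptance criterion.

def required_correlation(confidence_pct, uncertainty_pct):
    """Minimum measured correlation needed, or None if unachievable (>100%)."""
    needed = confidence_pct + uncertainty_pct
    return needed if needed <= 100.0 else None

# +/-1% uncertainty: 95% confidence demands at least 96% measured correlation.
assert required_correlation(95.0, 1.0) == 96.0

# +/-10% uncertainty: 105% would be needed, so the standard can never be
# met; the system could then only cue other sensors or a human operator.
assert required_correlation(95.0, 10.0) is None
```

The second branch corresponds to the situation the authors describe next, where the required degree of confidence is unachievable given the measurement uncertainty.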
In the second case, though, the required 95 per cent degree of confidence could never be achieved, due to the measurement uncertainty.53

Methods for the lawful employment of automated and autonomous weapons

Most weapons are not unlawful as such - it is how a weapon is used and the surrounding circumstances that affect legality.54 This applies equally to automated and autonomous weapons, unless such weapons were to be banned by treaty

'Philippines: Practice Relating to Rule 16. Target Verification', 2012, available at: http://www.icrc.org/customary-ihl/eng/docs/v2_cou_ph_rule16 (last visited 8 May 2012).
51 See the sample rules at Series 31 'Identification of Targets', in International Institute of Humanitarian Law, Rules of Engagement Handbook, San Remo, 2009, p. 38.
52 Again, a non-coding method would be through artificial intelligence.
53 In this second case, the targeting system could provide cueing for other sensors or a human operator; it just would be programmed to not permit autonomous weapon release.
54 Philip Spoerri, 'Round table on new weapon technologies and IHL - conclusions', in 34th Round Table on Current Issues of International Humanitarian Law, San Remo, 8-10 September 2011, available at: http://www.icrc.org/eng/resources/documents/statement/new-weapon-technologies-statement-2011-09-13.htm (last visited 8 May 2012).



(e.g., like non-command-detonated anti-personnel landmines). There are various ways to ensure the lawful employment of such weapons.

[The] absence of what is called a 'man in the loop' does not necessarily mean that the weapon is incapable of being used in a manner consistent with the principle of distinction. The target detection, identification and recognition phases may rely on sensors that have the ability to distinguish between military and non-military targets. By combining several sensors the discriminatory ability of the weapon is greatly enhanced.55

One method of reducing the target recognition and programming problem is to not try to achieve the full range of targeting options provided for by the law. For example, a target recognition system might be programmed to only look for high-priority targets such as mobile air defence systems and surface-to-surface rocket launchers - objects that are military objectives by nature and, therefore, somewhat easier to program as lawful targets compared to objects that become military objectives by location, purpose, or use.56 As these targets can represent a high priority, the targeting software might be programmed to only attack these targets and not prosecute an attack against an otherwise lawful target that was detected first but is of lower priority.57 If no high-priority target is detected, the attack could be aborted or might be prosecuted against other targets that are military objectives by nature.
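The narrowed approach just described can be sketched as a simple filter. The target list and priority ordering below are illustrative inventions; the structural point is that anything not on the pre-approved list of objects that are military objectives by nature is never engaged, however lawful an attack on it might otherwise be.

```python
# Hypothetical sketch of priority-constrained automatic target recognition.
# The catalogue and priorities are invented for illustration.

TARGETS_BY_NATURE = {
    "mobile_air_defence_system": 1,          # highest priority
    "surface_to_surface_rocket_launcher": 2,
    # A tank, a truck, an ambulance, etc. are simply absent: anything
    # not listed is never engaged, even if detected first.
}

def select_target(detections):
    """Pick the highest-priority catalogued target; ignore everything else."""
    listed = [d for d in detections if d in TARGETS_BY_NATURE]
    if not listed:
        return None  # abort, or continue searching
    return min(listed, key=lambda d: TARGETS_BY_NATURE[d])

# A tank detected first is ignored in favour of the listed launcher.
assert select_target(["t72_tank", "mobile_air_defence_system"]) == "mobile_air_defence_system"
assert select_target(["t72_tank", "ambulance"]) is None
```

Returning `None` corresponds to the aborted attack or continued search mode described in the text (and in the LOCAAS example at footnote 57).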
Adopting this type of approach would alleviate the need to resolve such difficult issues as how to program an autonomous system to not attack an ambulance except where that ambulance has lost protection from attack due to location, purpose, or use.58

A further safeguard is to have the weapon '"overwatched" and controlled remotely, thereby allowing for it to be switched off if considered potentially dangerous to non-military objects'.59 Such overwatch is only legally (and operationally) useful if the operators provide a genuine review and do not simply trust the system's output.60 In other words, the operator has to add value. For example, if an operator is presented with an icon indicating that a hostile target has been identified, then the operator would be adding to the process if that person separately considered the data, observed the target area for the presence of civilians, or in some other way did more than simply authorize or prosecute an attack based on the analysis produced by the targeting software. In other words, the operator

55 J. McClelland, above note 1, pp. 408-409.
56 See Lockheed Martin, 'Low cost autonomous attack system', in Defense Update, 2006, available at: http://defense-update.com/products/1/locaas.htm (last visited 8 May 2012).
57 An example would be detecting a T-72 tank but ignoring it as a low-priority target and continuing in search mode until detecting and engaging an SA-8 mobile surface-to-air missile launcher, ibid.
58 The presumption being that the high-priority targets are all clearly military in nature and, therefore, it would be easier to program target recognition software to identify such targets. If the high-priority targets happened to be ambulances being misused as mobile command and control vehicles, programming issues would still remain. See above note 37 and the accompanying text.
59 J. McClelland, above note 1, pp. 408-409.
60 See Report of Defense Science Board Task Force on Patriot System Performance: Report Summary, Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics, 2005, p. 2.



is either double-checking whether the target itself may be lawfully attacked, or is ensuring that the other precautions in attack (minimizing collateral damage, assessing any remaining collateral damage as proportional, issuing a warning to civilians where required, etcetera) are being undertaken.

A problem arises where the operator is provided with large volumes of data,61 as his or her ability to provide meaningful oversight could be compromised by information overload.62 A way to manage this would be for the targeting software to be programmed in such a way that the release of a weapon is recommended only when the target area is clear of non-military objects.63 In other circumstances, the targeting software might simply identify the presence of a target and of non-military objects and provide not a weapon release recommendation, but only a weapon release solution. In other words, the targeting software is identifying how a particular target could be hit, but is neutral on whether or not the attack should be prosecuted, thereby making it clear to the operator that there are further considerations that still need to be taken into account prior to weapon release.

Two further legal aspects of automated and autonomous weapons (and remotely operated weapons) that require further consideration are the rules relating to self-defence64 and how the risk to own forces is considered when assessing the military advantage from an attack and the expected collateral damage.
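Returning to the overwatch safeguard discussed above, the distinction between a release recommendation and a mere release solution might be sketched as follows. All field names and the decision rule are invented for illustration; the point is that when non-military objects are sensed near the target, the software still computes how the target could be hit but stays neutral on whether it should be.

```python
# Hypothetical sketch: recommendation vs. solution. Fields are invented.

def release_output(target_identified, civilians_detected):
    """Return a firing solution, with a recommendation only when area is clear."""
    if not target_identified:
        return {"solution": None, "recommend_release": False}
    solution = {"aim_point": "computed", "weapon": "selected"}  # placeholder
    if civilians_detected:
        # Solution only: the operator must weigh the remaining precautions
        # in attack before authorizing release.
        return {"solution": solution, "recommend_release": False}
    # Target area assessed clear of non-military objects: a recommendation
    # may accompany the solution.
    return {"solution": solution, "recommend_release": True}

# With civilians sensed nearby, a solution is produced but not recommended.
assert release_output(True, True)["recommend_release"] is False
assert release_output(True, True)["solution"] is not None
assert release_output(True, False)["recommend_release"] is True
```

Under this design the absence of a recommendation itself signals to the operator that further considerations remain before weapon release.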
The issue of self-defence has two aspects: national self-defence (which is principally about what a state can do in response to an attack) and individual self-defence (which is principally about what an individual can do in response to an attack).65 Prior to an armed conflict commencing, the first unlawful use of force against a state's warships and military aircraft may be considered as amounting to an armed attack on that state, thereby allowing it to invoke the right of national self-defence. Would the same conclusion be reached if the warship or military aircraft were unmanned? Imagine an attack on a warship that for whatever reason had none of the ship's company on board at the time of the attack. What is it about attacks on warships that is of legal significance: the mere fact that it is a military vessel that is flagged to the state, the likelihood that any attack on the warship also imperils the ship's company, or a combination of the two?

Second, consider the different legal authorities for using lethal force. In broad terms, individual self-defence allows Person A to use lethal force against Person B when Person B is threatening the life of Person A.66 Whether Persons A and B are opposing enemy soldiers or not is an irrelevant factor. Compare this to international humanitarian law, which allows Soldier A to use lethal force against

61 This could be a single system that processes and displays large volumes of data or a single operator who is given multiple systems to oversee.
62 ICRC, International Humanitarian Law and the Challenges of Contemporary Armed Conflicts, above note 29, p. 39.
63 J. McClelland, above note 1, pp. 408-409.
64 Conversations between Patrick Keane and Ian Henderson, 2011-2012.
65 In this context, individual self-defence also encompasses the issue of defending another party against an unlawful attack.
66 Domestic criminal law varies from jurisdiction to jurisdiction and the issue is more nuanced than this simple explanation.



Soldier B purely because Soldier B is the enemy.67 Soldier B need not be posing any direct threat to Soldier A at all. Indeed, Soldier B may be asleep and Soldier A might be operating a remotely piloted armed aircraft. However, Soldier A must be satisfied, to the requisite legal standard, that the target is in fact an enemy soldier. Identification, not threat, is the key issue.

However, during rules of engagement briefings military members are taught that during an armed conflict not only can they fire upon identified enemy, but also that nothing in international humanitarian law (or other law for that matter) prevents them from returning fire against an unidentified68 contact in individual self-defence.69 This well-known mantra will require reconsideration when briefing operators of unmanned assets. In all but the most unusual of circumstances, the remote operator of an unmanned asset will not be personally endangered if that unmanned asset is fired upon. This issue will need to be carefully considered by drafters of rules of engagement and military commanders, as generally returning fire to protect only equipment (and not lives) would be illegal under the paradigm of individual self-defence.70 Compare this to the international humanitarian law paradigm, which arguably would allow use of lethal force to protect certain types of property and equipment from attack, based on an argument that whoever is attacking the property and equipment must be either (1) an enemy soldier, or (2) a civilian taking a direct part in hostilities.71

Similarly, how to treat an unmanned asset under international humanitarian law when considering the 'military advantage' to be gained from an attack is not straightforward. While risk to attacking forces is a factor that can be legitimately considered as part of the military advantage assessment,72 traditionally that has been thought of as applying to the combatants and not the military equipment.
While it is logical that risk of loss of military equipment is also a factor, it will clearly be a lesser factor compared with risk to civilian life.

In conclusion, it is the commander who has legal responsibility 'for ensuring that appropriate precautions in attack are taken'.73 Regardless of how remote in time or space from the moment of an attack, individual and state responsibility attaches to those who authorize the use of an autonomous weapon system.74 It should be noted that this does not mean a commander is automatically

67 Subject to Soldier B being hors de combat. It would also be lawful under international humanitarian law for Soldier A to fire upon Person B for such time as Person B was a civilian taking a direct part in hostilities, but space does not allow a further exploration of that point.
68 Unidentified in the sense of unaware whether the person firing is an enemy soldier, a civilian, etcetera. There is still a requirement to identify the source (i.e., the location) of the threat.
69 The concept of 'unit self-defence' adds little to the present discussion, being a blend of both national and individual self-defence.
70 The legal paradigm of individual self-defence can be invoked to protect equipment where loss of that equipment would directly endanger life.
71 As long as I am satisfied that I have at least one legal basis for using lethal force against a person (e.g., enemy combatant or civilian taking a direct part in hostilities), I do not have to determine which one is actually the case. Space does not allow a full discussion of this point, or the other interesting issue of using force to protect equipment as part of a national security interest under national self-defence outside of an armed conflict.
72 I. Henderson, above note 35, p. 199.
73 C. Taylor, above note 15, p. 12.
74 P. Spoerri, above note 54.



liable if something goes wrong. In war, accidents happen. The point under discussion is who could be found liable, not who is guilty.

The above discussion has focused on the intended target of a weapon. The following discussion deals with emerging weapons that highlight the legal issue of weapon effect even where the target is an otherwise lawful target.

Weapon effect

Directed energy weapons

Directed energy weapons use the electromagnetic spectrum (particularly ultraviolet through to infrared and radio-frequency (including microwave)) or sound waves to conduct attacks.75 As a means of affecting enemy combat capability, directed energy weapons can be employed directly against enemy personnel and equipment, or indirectly as anti-sensor weapons. For example, laser systems could be employed as 'dazzlers' against aided and unaided human eyesight, infrared sensors, and space-based or airborne sensors,76 and as anti-equipment weapons.77 High-powered microwaves can be employed against electronic components and communications equipment. Lasers and radars are also used for target detection, target tracking, and finally for providing target guidance for other conventional weapons.

When directed energy weapons are employed against enemy communication systems, the legal issues are not significantly different from those that would arise if kinetic means were used. Is the target (e.g., a communication system) a lawful military objective, and have incidental effects on the civilian population been assessed? As directed energy weapons have the clear potential to reduce the immediate collateral effects commonly associated with high-explosive weapons (e.g., blast and fragmentation),78 the main incidental effect to consider is the second-order consequences of shutting down a communication system such as air traffic control or emergency services. While it is common to state that second-order effects must be considered when assessing the lawfulness of an attack, a proper understanding of what is 'counted' as collateral damage for the purpose of proportionality assessments is required. It is a mistake to think that any inconvenience caused to the civilian population must be assessed.
75 Particle weapons are also being studied but currently appear to remain in the area of theory, see Federation of American Scientists, 'Neutral particle beam', 2012, available at: http://www.fas.org/spp/starwars/program/npb.htm (last visited 8 June 2012); Carlo Kopp, 'High energy laser directed energy weapons', 2012, available at: http://www.ausairpower.net/APA-DEW-HEL-Analysis.html (last visited 8 June 2012). For a good review of 'non-lethal' directed energy weapons (including acoustic weapons), see Neil Davison, 'Non-Lethal' Weapons, Palgrave Macmillan, Basingstoke, 2009, pp. 143-219.
76 Laser systems could be employed as 'dazzlers' against space-based or airborne sensors while high-powered microwaves can be employed against electronic components, see Defense Science Board Task Force on Directed Energy Weapons, Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics, US Department of Defense, December 2007, pp. 2, 11 and 13.
77 Particularly for use against missiles, mine-clearing and as anti-satellite weapons, ibid., p. 19.
78 As do other kinetic weapons such as inert concrete bombs.


The Applied Ethics of Emerging Military and Security Technologies
A. Backstrom and I. Henderson - New capabilities in warfare: an overview of contemporary technological developments and the associated legal and engineering issues in Article 36 weapons reviews

Along with death and injury, it is only 'damage' to civilian objects that must be considered.79 Therefore, a directed energy weapon attack on an air traffic control system that affected both military and civilian air traffic80 need only consider the extent to which civilian aircraft would be damaged, along with the associated risk of injury or death to civilians, and need not consider mere inconvenience, disruption to business, etcetera.81

Directed energy weapons are also being developed as non-lethal (also known as less-lethal) weapons to provide a broader response continuum for a controlled escalation of force.82 For a variety of operational and legal reasons, it is preferable to have an option to preserve life while still achieving a temporary or extended incapacitation of the targeted individual. However, the very terms used to describe these weapons can cause problems beyond any particular legal or policy constraints.83 The unintended consequences of the weapons (particularly due to the unknown health characteristics of the target) can lead to permanent injury or death. Such consequences are then used to stigmatize the concept of a non-lethal/less-than-lethal weapon. The important point to remember is that, as for any other combat capability (including kinetic weapons), use of directed energy weapons during an armed conflict is governed by international humanitarian law and by any applicable rules of engagement and directions from the combat commander.84

Non-lethal directed energy weapons can be used in combination with traditional, lethal weapons. For example, it is reported that:

Another weapon... can broadcast deafening and highly irritating tones over great distances. The long-range device precisely emits a high-energy acoustic beam as far as five football fields away. To a reporter standing across the airstrip from where it was set up in a hangar here, it sounded as if someone was shouting directly into his ear.
The device 'has proven useful for clearing streets and rooftops during cordon and search ... and for drawing out enemy snipers who are subsequently destroyed by our own snipers', the 361st Psychological Operations Company, which has tested the system in Iraq, told engineers in a report.85

79 See above note 1, Art. 51(5)(b) and Art. 57(2)(a)(iii) of API.
80 See ICRC, 'Cyber warfare and IHL: some thoughts and questions', 2011, available at: http://www.icrc.org/eng/resources/documents/feature/2011/weapons-feature-2011-08-16.htm (last visited 8 May 2012).
81 Space does not permit a full discussion of this point, but other factors warranting discussion are effects on neutrals and any third-order effects (e.g., the effect on emergency health-care flights), although query whether the 'ICRC might have a role in helping to generate international consensus on whether civilians have fundamental rights to information, electrical power, etc., in the same way as they have rights to life and property', ibid.
82 See generally, US Department of Defense, 'Non-lethal weapons program', available at: http://jnlwp.defense.gov/index.html (last visited 8 May 2012); James Duncan, 'A primer on the employment of non-lethal weapons', in Naval Law Review, Vol. XLV, 1998. See also Jürgen Altmann, 'Millimetre waves, lasers, acoustics for non-lethal weapons? Physics analyses and inferences', in DSF-Forschung, 2008, available at: http://www.bundesstiftung-friedensforschung.de/pdf-docs/berichtaltmann2.pdf (last visited 8 May 2012).
83 See Defense Science Board Task Force on Directed Energy Weapons, above note 76, p. xii.
84 Ibid., p. xiii.
85 Bryan Bender, 'US testing nonlethal weapons arsenal for use in Iraq', in Boston Globe, 5 August 2005, available at: http://www.boston.com/news/nation/articles/2005/08/05/us_testing_nonlethal_weapons_



This form of directed energy weapon demonstrates two key issues associated with non-lethal weapon technology. First, such weapons are likely to be used against a civilian population - in this case, to clear streets and rooftops.86 Second, the non-lethal weapon may be employed in conjunction with existing weapons to achieve a lethal effect.

Other directed energy weapons include active denial systems.87 One of the weapons that has been successfully tested is

a heat beam ... that can 'bake' a person by heating the moisture in the first one-64th of an inch of the epidural layer of the skin. It was originally developed for the Department of Energy to keep trespassers away from nuclear facilities.88

The 'irresistible heating sensation on the adversary's skin [causes] an immediate deterrence effect';89 because the heating sensation causes 'intolerable pain [the body's] natural defense mechanisms take over'.90 The 'intense heating sensation stops only if the individual moves out of the beam's path or if the beam is turned off'.91 Because flamethrowers and other incendiary weapons are only regulated and not specifically banned by international humanitarian law, there is no legal reason to deny the use of the active denial system in combat.92

Where active denial systems are being used as an invisible 'fence', then clearly it is a matter for the individual as to whether to approach the fence, and if so, whether to try to breach the perimeter.93 However, if active denial systems are being aimed at a person or group to clear an area,94 an issue that needs consideration with this type of weapon is how a person being subjected to this type of attack could either surrender or consciously choose to leave an area when they cannot see the beam,95 may be unaware that even this type of technology exists, and are reacting to intolerable pain like the 'feeling... [of] touching a hot frying pan'.96 Reacting

arsenal_for_use_in_iraq/?page=full (last visited 8 June 2012). The Long Range Acoustic Device is described in detail in J. Altmann, above note 82, pp. 44-53. As Altmann notes, while described as a hailing or warning device, it can potentially be used as a weapon, ibid., p. 52. For a discussion on attempts to avoid the legal requirement to review new 'weapons' by describing these types of acoustic devices by other names, see N. Davison, above note 75, pp. 102 and 205.
86 Concerns about using non-lethal weapons against the civilian population, or against 'individuals before it is ascertained whether or not they are combatants', are raised in Davison, above note 75, pp. 216-217.
87 Defense Science Board Task Force on Directed Energy Weapons, above note 76, pp. 33 and 38. For more details see 'Active denial system demonstrates capabilities at CENTCOM', United States Central Command, available at: http://www.centcom.mil/press-releases/active-denial-system-demonstrates-capabilities-at-centcom (last visited 8 May 2012).
88 B. Bender, above note 85. The active denial system is described in detail in J. Altmann, above note 82, pp. 14-28.
89 Defense Science Board Task Force on Directed Energy Weapons, above note 76, p. 38.
90 Ibid., p. 42.
91 Ibid.
92 J. Altmann, above note 82, p. 27.
93 Conversation between Patrick Keane and Ian Henderson, 14 April 2012.
94 As opposed to traditional kinetic weapons where the desired effect is to disable (through either wounding or killing).
95 See J. Altmann, above note 82, p. 28.
96 Defense Science Board Task Force on Directed Energy Weapons, above note 76, p. 42.




instinctively to intolerable pain seems likely to make a person incapable of rational thought.97 Employment of such weapons will need to be well regulated through a combination of tactics, techniques and procedures, and rules of engagement to ensure that unnecessary suffering is not caused through continued use of the weapon because a person has not cleared the target area.98

In this respect, and noting that the active denial system has 'successfully undergone legal, treaty and US Central Command rules of engagement reviews',99 it is worth recalling that as states' legal obligations vary, and as states may employ weapons differently, the legal review by one state is not determinative of the issue for other states.100 This may prove interesting in the sale of highly technical equipment, as the details of a weapon's capability are often highly classified and compartmentalized. The state conducting the review may not control access to the necessary data. As discussed below, this may require lawyers, engineers, and operators to work together cooperatively and imaginatively to overcome security classification and compartmental access limitations.

A similar directed energy weapon using different technology is 'a high-powered white light so intense as to send any but the most determined attackers running in the opposite direction'.101 Concepts for employment of the weapon appear to include using it as a means to identify hostile forces, as evidenced by the statement: 'If anyone appears willing to withstand the discomfort, "I know your intent", [Colonel Wade] Hall [a top project official] said. "I will kill you."'102 While initially such statements appear quite concerning, it is instructive to consider whether this is in reality any different from 'traditional' warnings and escalation of force scenarios such as 'stop or I will shoot' or the employment of flares and dazzlers to warn vehicles not to approach too close to military convoys.
Where directed energy weapons are used to counter (often improvised) explosive devices,103 the issue is primarily about consequences. If the directed energy weapon is causing a detonation at a safe range from friendly forces, there is a requirement to consider whether any civilians or other noncombatants are in the vicinity of the detonation and, therefore, at risk of injury or death.104

97 Email April-Leigh Rose/Ian Henderson, 24 April 2012.
98 Altmann also recommends investigating risk to eyesight due to potential damage to the cornea; see J. Altmann, above note 82, p. 28.
99 Ibid., p. 38.
100 See J. McClelland, above note 1, p. 411, who makes this point with respect to manufacturer's claims of legality.
101 B. Bender, above note 85.
102 Ibid.
103 See Defense Science Board Task Force on Directed Energy Weapons, above note 76, p. 40.
104 Space does not permit a full exploration of this point, but note that the issues are different if instead of causing a detonation the countermeasure prevents the explosive device from detonating.



Cyber operations

Cyber operations are:

operations against or via a computer or a computer system through a data stream.105 Such operations can aim to do different things, for instance to infiltrate a system and collect, export, destroy, change, or encrypt data or to trigger, alter or otherwise manipulate processes controlled by the infiltrated computer system. By these means, a variety of 'targets' in the real world can be destroyed, altered or disrupted, such as industries, infrastructures, telecommunications, or financial systems.106

Cyber operations are conducted via software, via hardware, or via a combination of software and personnel. A recent example of a cyber operation that was essentially conducted purely by software is the Stuxnet virus. Once in place, the Stuxnet virus appears to have operated independently of any further human input.107 Compare this to a software program that is designed to allow a remote operator to exercise control over a computer - allowing, among other things, the upload of data or modification of data on the target computer. Finally, a non-military example of a cyber operation that requires both hardware and software is credit card skimming.

The application of specific international humanitarian law rules to cyber warfare remains a topic of debate.108 However, for the purposes of this article, it is assumed that the key international humanitarian law principles of distinction, proportionality, and precaution apply, as a minimum, to those cyber attacks that have physical consequences (e.g., the Stuxnet virus altered the operating conditions for the Iranian uranium enrichment centrifuges, which ultimately resulted in physical damage to those centrifuges).109 Four particular legal aspects of cyber weapons are worth mentioning.
First, cyber weapons have the distinct possibility of being operated by civilians.110 The 'weapon' is likely to be remote from the battlefield, is technologically sophisticated, and does not have an immediate association with death and injury. The operation of the cyber weapon exposes a civilian operator to

105 Based on this definition, a kinetic attack to shut down a computer system (for example, by dropping a bomb on the building housing the computer) would not be a cyber operation.
106 ICRC, International Humanitarian Law and the Challenges of Contemporary Armed Conflicts, above note 29, p. 36.
107 See Angus Batey, 'The spies behind your screen', in The Telegraph, 24 November 2011; Jack Goldsmith, 'Richard Clarke says Stuxnet was a US operation', in LawFare: Hard National Security Choices, 29 March 2012, available at: http://www.lawfareblog.com/2012/03/richard-clarke-says-stuxnet-was-a-u-s-operation/ (last visited 18 April 2012).
108 See 'Tallinn Manual on the International Law Applicable to Cyber Warfare', 2012, pp. 17-22, available at: http://issuu.com/nato_ccd_coe/docs/tallinn_manual_draft/23 (last visited 8 June 2012).
109 ICRC, International Humanitarian Law and the Challenges of Contemporary Armed Conflicts, above note 29, pp. 36-37.
110 See Adam Segal, 'China's cyber stealth on new frontline', in the Australian Financial Review, 30 March 2012, available at: http://afr.com/p/lifestyle/review/china_cyber_stealth_on_new_frontline_z6YvFROmo3uC87zJvCEq6H (last visited 1 June 2012), referring to 'cyber-militias' at technology companies recruited by the People's Liberation Army.



lethal targeting (as a civilian taking a direct part in hostilities),111 as well as potential criminal prosecution for engaging in acts not protected by the combatant immunity enjoyed by members of the armed forces.112 These issues are discussed in detail in a recent article by Watts, who raises, among other things, the possibility of the need for a complete rethink of how the law on direct participation in hostilities applies in the area of cyber warfare.113 It could also be queried what training such civilian operators might have in the relevant rules of international humanitarian law.114

Second, cyber attacks can have consequences in the real world and not just the virtual world.115 Where those consequences affect the civilian population by causing loss of civilian life, injury to civilians, damage to civilian objects, or a combination thereof, those consequences must be considered under international humanitarian law.116 The discussion of this point for directed energy weapon attacks applies equally to cyber attacks. A further related consideration is that where it could reasonably be expected that a virus introduced into a military system might find its way into civilian systems and cause infrastructure damage, that collateral damage must also be considered.117 A common example of a possible cyber attack that would directly affect civilians is disabling a power station - either just by shutting it down, or by overloading or shutting down a fail-safe, thereby damaging hardware. This can potentially happen to any infrastructure maintained by software.
Third, cyber weapons need to be considered not only in relation to international humanitarian law, but also, very importantly, under jus ad bellum.118 As Blake and Imburgia point out, even if a cyber attack has no kinetic effects, the attack might still be contrary to the UN Charter specifically or international law generally119 and may, if amounting to an 'armed attack', legitimize the use of force by the affected state in self-defence.

111 See above note 1, Article 51(3) of API.
112 On both these points, see D. Blake and J. Imburgia, above note 1, pp. 195-196.
113 See Sean Watts, 'Combatant status and computer network attack', in Virginia Journal of International Law, Vol. 50, No. 2, 2010, p. 391.
114 See J. Kellenberger, above note 15, where this point was made with respect to remotely operated weapon systems.
115 ICRC, 'Cyber warfare and IHL: some thoughts and questions', above note 80.
116 See above note 1, Art. 51(5)(b) and Art. 57(2)(a)(iii) of API. It is a matter of policy whether to consider other consequences for the civilian population such as disruption, loss of amenities, etcetera.
117 See ICRC, International Humanitarian Law and the Challenges of Contemporary Armed Conflicts, above note 29, p. 38.
118 Put simply, jus ad bellum is the law regulating the overall resort to the use of force, compared to international humanitarian law (jus in bello) that regulates the individual instances of the application of force during an armed conflict. See Matthew Waxman, 'Cyber attacks as "force" under UN Charter Article 2(4)', in Raul Pedrozo and Daria Wollschlaeger (eds), International Law and the Changing Character of War, International Law Studies, Vol. 87, 2011, p. 43; Sean Watts, 'Low-intensity computer network attack and self-defense', in ibid., p. 59; Michael Schmitt, 'Cyber operations and the jus ad bellum revisited', in Villanova Law Review, Vol. 56, No. 3, 2011, pp. 569-605.
119 D. Blake and J. Imburgia, above note 1, pp. 184-189. Discussed in more detail in M. Schmitt, ibid., who also discusses the current 'fault lines in the law governing the use of force [that] have appeared because it is a body of law that predates the advent of cyber operations'.



Finally, the very nature of cyber warfare can make it hard to determine who initiated an attack, and issues of attribution go to the very heart of both state responsibility and individual accountability.120

Nanotechnology and weaponization of neurobiology

Nano-weapons are hard to define, but encompass not only objects and devices using nanotechnology that are designed or used for harming humans, but also those causing harmful effects at the nano-scale if those effects characterise the lethality of the weapon.121 An example of the latter is the Dense Inert Metal Explosive (DIME):

DIME involves an explosive spray of superheated micro shrapnel made from milled and powdered Heavy Metal Tungsten Alloy (HMTA), which is highly lethal within a relatively small area. The HMTA powder turns to dust (involving even more minute particles) on impact. It loses inertia very quickly due to air resistance, burning and destroying through a very precise angulation everything within a four-meter range - and it is claimed to be highly carcinogenic and an environmental toxin. This new weapon was developed originally by the US Air Force and is designed to reduce collateral damage in urban warfare by limiting the range of explosive force.122

The 'capacity [of DIME] to cause untreatable and unnecessary suffering (particularly because no shrapnel is large enough to be readily detected or removed by medical personnel) has alarmed medical experts'.123 The other concern with nanotechnology is that elements and chemicals that on a macro scale are not directly harmful to humans can be highly chemically reactive on the nano-scale. This may require a review of what international humanitarian law considers as chemical weapons.

Similarly, with the current advances in the understanding of the human genome and in neuroscience, there exists the very real possibility of militarization of this knowledge.124 One of the legal consequences is a need to reappraise maintaining

120 J. Kellenberger, above note 15; ICRC, International Humanitarian Law and the Challenges of Contemporary Armed Conflicts, above note 29, p. 37.
121 H. Nasu and T. Faunce, above note 10, p. 23.
122 Whether such a weapon has been used in actual combat appears to remain a matter of speculation - see generally Dense Inert Metal Explosive (DIME), Global Security, available at: http://www.globalsecurity.org/military/systems/munitions/dime.htm (last visited 8 May 2012).
123 H. Nasu and T. Faunce, above note 10, p. 22. Along with Art. 35(2) of API, above note 1, on unnecessary suffering, there is also Protocol I of the Convention on Certain Conventional Weapons on Non-Detectable Fragments (10 October 1980). Amnesty International is of the view that 'further studies are required before it can be determined whether the use of DIME munitions is lawful under international law'. Amnesty International, 'Dense Inert Metal Explosives (DIME)', in Fuelling conflict: foreign arms supplies to Israel/Gaza, 2009, available at: http://www.amnesty.org/en/library/asset/MDE15/012/2009/en/5be86fc2-994e-4eeb-a6e8-3ddf68c28b31/mde150122009en.html#0.12 (last visited 8 May 2012). For a discussion generally of Protocol I of the Convention on Certain Conventional Weapons on Non-Detectable Fragments, see W. Boothby, above note 45, pp. 196-199.
124 See generally Mark Wheelis and Malcolm Dando, 'Neurobiology: a case study for the imminent militarization of biology', in International Review of the Red Cross, Vol. 87, No. 859, 2005, p. 553. See also



a legal distinction between chemical and biological weapons. It may be that, based on the manner in which they can be used, we should legally view these weapons as part of a 'continuous biochemical threat spectrum, with the Chemical Weapons Convention and Biological and Toxin Weapons Convention (CWC and BTWC) overlapping in their coverage of mid-spectrum agents such as toxins and bioregulators'.125

There are competing tensions in this area. Quite understandably, chemical and biological weapons have a 'bad name'. At the same time, research is underway into non-lethal weapons such as incapacitating biochemical weapons. Although there is currently no universally agreed definition, incapacitating biochemical agents can be described as

substances whose chemical action on specific biochemical processes and physiological systems, especially those affecting the higher regulatory activity of the central nervous system, produce a disabling condition (e.g., can cause incapacitation or disorientation, incoherence, hallucination, sedation, loss of consciousness). They are also called chemical incapacitating agents, biotechnical agents, calmatives, and immobilizing agents.126

A key point to note is that while traditional biological and chemical agents were used against enemy soldiers or non-cooperative civilians, and clearly would be classified as weapons, modern agents may be used to 'enhance' the capability of a state's own military forces. In such cases, it is much less likely that the agents would amount to weapons.127 For example:

within a few decades we will have performance enhancement of troops which will almost certainly be produced by the use of diverse pharmaceutical compounds, and will extend to a range of physiological systems well beyond the sleep cycle.
Reduction of fear and pain, and increase of aggression, hostility, physical capabilities and alertness could significantly enhance soldier performance, but might markedly increase the frequency of violations of humanitarian law. For example, increasing a person's aggressiveness and hostility in conflict situations is hardly likely to enhance restraint and respect for legal prohibitions on violence.128

Similar concerns have already been expressed about remotely operated weapons. And in a manner similar to using directed energy weapons to disperse civilian

'Brain waves 3: neuroscience, conflict and security', in The Royal Society, available at: http://royalsociety.org/policy/projects/brain-waves/conflict-security (last visited 6 May 2012) for a discussion of, among other things, potential military applications of neuroscience and neurotechnology and current legal issues.
125 M. Wheelis and M. Dando, ibid., p. 560.
126 Michael Crowley and Malcolm Dando, 'Submission by Bradford Nonlethal Weapons Research Project to Foreign Affairs Select Committee Inquiry on Global Security: Non-Proliferation', 2008, pp. 1-2, available at: http://www.brad.ac.uk/acad/nlw/publications/BNLWRP_FAC071108MC.pdf (last visited 8 May 2012).
127 Body armour, for example, is not classified as a weapon.
128 M. Wheelis and M. Dando, above note 124, pp. 562-563.



Volume 94 Number 886 Summer 2012 crowds, there is also the potential to pacify civilians in occupied territories through chemicals included in food distributions.129 Perhaps of even more concern, as it goes directly to the ability to enforce international humanitarian law, particularly command responsibility, is the possibility of 'memories of atrocities committed [being] chemically erased in after-action briefings'.130

The need to understand the role of engineering in the weapon review process

The above overview of emerging weapons highlights that, as weapons become more complex, it becomes increasingly difficult for non-experts to understand the manner in which a weapon operates. This part of the article focuses on engineering issues and how an understanding of those issues can be factored into the legal review of weapons.

Why a weapon may not perform as intended

A weapon may not perform as intended or in accordance with the 'product design specification'131 for a variety of reasons. Those reasons include: inadequate technical specification, design flaws, or poor manufacturing quality control (batch variation). Other factors include 'age of the munition, storage conditions, environmental conditions during employment, and terrain conditions'.132

A simple example of specification failure, or at least of a specification that will not be 100 per cent reliable, is an anti-vehicle mine that is not intended to explode when stepped on by a human. For example, if it is a load-activated mine, the load might be set to 150 kg. However, biomechanical research:

shows very strong evidence that a human being can very easily exert an equivalent force close to and above such pressures. For example, an 8-year-old boy weighing 30 kg, running downhill in his shoes, exerts a ground force of 146 kg. A 9-year-old girl weighing 40 kg running downhill in her bare feet exerts 167 kg of force. An adult male running will exert 213 kg.133

Alternatively, the specification might be correct but the design, manufacturing process, or integration of systems does not consistently lead to the intended result. This may be an engineering quality issue where the implemented engineering

129 Ibid., p. 565.
130 Ibid., p. 565.
131 The product design specification is a step before the actual technical specifications for a product. The former is about what a product should do, while the latter is concerned with how the product will do it.
132 Defense Science Board Task Force, Munitions System Reliability, Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics, US Department of Defense, Washington, DC, September 2005, p. 15, available at: http://purl.access.gpo.gov/GPO/LPS72288 (last visited 8 May 2012).
133 'Anti-vehicle mines: discussion paper', Actiongroup Landmine.de, 2004, p. 5 (footnote omitted), available at: http://www.landmine.de/fileadmin/user_upload/pdf/Publi/AV-mines-discussion-paper.pdf (last visited 8 May 2012).
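The biomechanical figures quoted in the anti-vehicle mine example can be restated as a simple threshold check. The following is an illustrative sketch only, not drawn from the article: the 150 kg activation load and the ground-force figures are the ones quoted in the text, while the adult's body mass is an assumed value.

```python
# Illustrative sketch: dynamic ground forces (from the text's quoted study)
# compared against a nominal 150 kg activation load for a load-activated
# anti-vehicle mine. The adult body mass below is an assumption.

THRESHOLD_KG = 150  # nominal activation load of the hypothetical mine

# (description, body mass in kg, measured dynamic ground force in kg)
cases = [
    ("8-year-old boy running downhill in shoes", 30, 146),
    ("9-year-old girl running downhill barefoot", 40, 167),
    ("adult male running", 75, 213),  # mass assumed for illustration
]

for who, mass_kg, force_kg in cases:
    factor = force_kg / mass_kg      # dynamic amplification over static weight
    triggers = force_kg >= THRESHOLD_KG
    print(f"{who}: {force_kg} kg ground force "
          f"({factor:.1f}x body weight) -> triggers mine: {triggers}")
```

The amplification factors make the point: dynamic ground force can reach four to five times static body weight, so a threshold set comfortably above any plausible body weight can still be exceeded by a running child.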



processes were inadequately robust, leading to product flaws, and as such presents a reliability issue. Where a weapon does not perform as intended, two prime consequences are:

• The desired combat effect is not achieved. If the weapon fails to perform, own forces are put at risk.
• If the weapon does not perform to specification, civilians and civilian property are put at risk.134 Where civilians are injured or killed or civilian property damaged, liability may be incurred.135 State liability may be incurred for an internationally wrongful act (i.e., a breach of international humanitarian law) and criminal liability potentially attaches to the commander who authorized the use, or to the person who employed the weapon, or both.

As weapons systems become more complex, an understanding of reliability analysis will need to become part of the legal review process.

Reliability: test and evaluation

The purpose of test and evaluation is to provide an objective measurement of whether a system (or a component thereof) performs reliably to a specification. Reliability is the probability of correct functioning to a specified life (measured in time, cycles of operation, etcetera) at a given confidence level. Understanding that reliability is a key factor in weapon performance is intuitively simple, but in fact it has a level of complexity not always immediately grasped by those unfamiliar with reliability engineering.136 Quantifying reliability is not a 'yes' or 'no' proposition,137 nor can it be achieved by a single pass/fail test; rather, it 'is subject to statistical confidence bounds'.138 For example, to obtain an appropriate level of statistical confidence that the failure rate for a given weapon population is acceptable, a minimum number of tests is required. But as resources are always finite, the question for responsible engineering practice is how to optimize resources and determine the minimum resources required to assure acceptable reliability.

Suppose that undertaking the required number of tests will be too time-consuming or beyond the budget allocation. A naive approach would simply reduce the number of tests to meet budget requirements and presume that the test will still give some useful information. But that may not be the case. Arguably, the compromised test can only provide misleading conclusions if the result does not achieve the required level of confidence. For certification purposes, either a certain level of confidence is required or it is not. While the statistical confidence level may be set appropriately low

134 This has direct military effectiveness consequences, as well as affecting morale, domestic public support, international support, etcetera.
135 Liability may also arise where the means or method of warfare against combatants is unlawful, which may be the case in a defective weapon scenario, for example, firing on a combatant who is hors de combat. 136 See generally, Defense Science Board Task Force on Munitions System Reliability', above note 132. 137 'Just tell me whether it is reliable or not?' asks the hypothetical boss. 138 Defense Science Board Task Force on Munitions System Reliability, above note 132, p. 15. 508

The Applied Ethics of Emerging Military and Security Technologies

199

Volume 94 Number 886 Summer 2012 for non-lethal weapon components where a failure has a low-operational impact and minor to no safety implications (e.g., failure of a tracer bullet), the target recognition system on an autonomous weapon may require a very high statistical confidence to minimize lethal weapon deployment on civilians while still ensuring engagement of enemy targets. If a high statistical assurance is deemed necessary for civilian safety while budgetary constraints preclude the corresponding necessary development testing, then appropriate limits should be implemented regarding the approved applications for that weapon until field experience provides appropriate reliability confidence. How should this be applied in practice? The main steps of weapon acquisition are usefully outlined by McClelland, including the various testing stages during 'demonstration', 'manufacture', and 'in-service'.139 As McClelland notes, this is not a legal process but rather part of the acquisition process; but nonetheless these steps provide decision points that are 'important stages for the input of formal legal advice'.140 For testing to be meaningful, critical issues of performance must be translated into testable elements that can be objectively measured. While many smaller nations might be little more than purchasers of off-the-shelf weapons,141 other governments are involved in envisaging, developing, and testing emerging weapons technology. While the degree of that involvement will vary, that is a choice for governments.142 So, rather than being passive recipients of test results and other weapons data, one pro-active step that could be taken as part of the legal review process is for lawyers to input into the test and evaluation phases by identifying areas of legal concern that could then be translated into testable elements. 
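The test-count arithmetic discussed above can be made concrete with a minimal sketch. It assumes the standard zero-failure ('success run') binomial demonstration model, in which n consecutive successful tests demonstrate reliability R at confidence C when R^n <= 1 - C; the function name and the example numbers are illustrative, not drawn from the article.

```python
import math

def zero_failure_tests(reliability, confidence):
    """Minimum number of consecutive successful (zero-failure) tests needed
    to demonstrate `reliability` at the given statistical `confidence`,
    using the standard binomial success-run formula n >= ln(1-C)/ln(R)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

# Demonstrating 95% reliability at 90% confidence already requires 45
# failure-free firings; at 99% reliability the burden grows sharply.
print(zero_failure_tests(0.95, 0.90))  # 45
print(zero_failure_tests(0.99, 0.90))  # 230
print(zero_failure_tests(0.99, 0.95))  # 299
```

The steep growth in required tests as the reliability target rises is precisely why, as the text notes, a budget-driven reduction in test count does not yield a 'slightly weaker' certification: below the required sample size, the stated confidence level simply cannot be reached.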
This may be one way to at least partly address the security and compartmented-access difficulties associated with high-technology weapons that were raised above. For example, it is appropriate to require increased confidence in reliability for military applications involving higher risk factors for civilians. This could be cross-referenced against existing weapons system reliability data as an input to the decision-making process when determining whether a new targeting procedure may be considered lawful. To be effective, the legal requirements need to be expressed in terms that are 'testable, quantifiable, measurable, and reasonable'.143 Part of the challenge will be bridging the disconnect that often exists between the definitions of technical requirements and the desired operational performance. This disconnect can often be 'traced to the terminology used to define the level of performance required, under what conditions and how it is [to be] measured'.144 This is where lawyers working with systems engineers can influence the process so that the use of tests, demonstrations, and analysis can be adopted as valid methods to predict actual performance. Once a system is in service, further testing may also be conducted to gain additional insights into the capability and to ensure that the system is actually meeting the requirements of the user. This phase of test and evaluation is particularly critical as it is the only phase that truly relates to the 'real world' use of a system.145 By having lawyers provide meaningful legal criteria against which a class of weapons could be judged, the ongoing legal compliance of that weapon could be factored into an already existing process. Another area for useful input is evaluation and analysis of system and subsystem integration and interaction. When it comes to a system-of-systems, US military experience is that there is no 'single program manager who "owns" the performance or the verification responsibility across the multiple constituent systems, and there is no widely used adjudication process to readily assign responsibility for [system-of-systems] capabilities, with the exception of command and control systems'.146 Compare this to other industries, such as leading automotive companies, which have highly sophisticated design, production, testing, and quality-approval processes for every component that goes into a vehicle, and a resulting detailed assignment of responsibility by component, system, and whole product (comprising multiple systems).

139 J. McClelland, above note 1, p. 401. Or: during design, during initial acceptance, and as part of operational evaluation.
140 Ibid., p. 402.
141 Of course, purchasers of off-the-shelf weapon systems must still satisfy themselves of the legality of a weapon. Even with a fully developed and tested weapon, this can still prove difficult for purchasers of high-technology weapons. For example, a manufacturer may refuse to disclose sufficient information about a high-technology weapon that uses encrypted proprietary software for the end-user to make an informed judgement about the algorithms used, and so to be confident of the weapon's ultimate reliability.
142 See Report on the Defense Science Board Task Force on Developmental Test & Evaluation, Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics, US Department of Defense, May 2008, pp. 6-7, available at: www.acq.osd.mil/dsb/reports/ADA482504.pdf, wherein the recent decrease in US government involvement in design testing was highlighted and, perhaps more worryingly, government access to the contractor's test data was noted to be limited.
143 Ibid., p. 38, noting that this might initially be challenging. See, for example, ibid., p. 39, for a discussion of where this has not occurred for the operational requirements.
Working with systems engineers, layered quality control processes could identify the critical legal issues that require both testing and assignment of responsibility (for example, in case of non-compliance with international humanitarian law) among the weapon manufacturer and the various military stakeholders.

Reliability and automatic target recognition

Weapons that are designed to explode but fail to do so when used operationally, and that are left on the field after the cessation of hostilities, are known as explosive remnants of war.147 Indeed, munition reliability is even defined as 'a measure of the probability of successful detonation'.148 Due to the effects of unexploded ordnance on the civilian population, legal regulation already exists in this area.149 Less well

144 Ibid., p. 41.
145 For example, there is anecdotal evidence that some weapon failures arise due to 'operational factors that are not assessed as part of the developmental, acceptance and surveillance testing', Defense Science Board Task Force on Munitions System Reliability, above note 132, p. 17.
146 Report on the Defense Science Board Task Force on Developmental Test & Evaluation, above note 142, p. 43.
147 See Defense Science Board Task Force on Munitions System Reliability, above note 132, p. 10.
148 Ibid., p. 14.
149 For example, see the chapter on 'Unexploded and abandoned weapons' in W. Boothby, above note 45, pp. 297-317.
understood is that weapons reliability associated with automatic target recognition has another important aspect: it is not just about a weapon that fails to explode, but also about one that selects the wrong target. Here we are trying to determine whether, and when, it is reasonable to conclude from the analysis of reconnaissance data that the target possesses certain enemy properties or characteristics. Suppose the difference between the hypothesized enemy characteristic and the reconnaissance measurements is neither so large that we automatically reject the target, nor so small that we readily accept it. In such a case, a more sophisticated statistical analysis, such as hypothesis testing, may be required. Suppose that experience indicates that a 90 per cent match in reconnaissance data with existing information regarding an enemy target type has proven to be a reliable criterion for confirming an enemy target. If the data were a 100 per cent match or a 30 per cent match, we could probably come to an acceptable conclusion using common sense. Now suppose that the data match is 81 per cent, which may be considered relatively close to 90 per cent; but is it close enough to accept as a lawful target? Whether we accept or reject the data as indicating a lawful target, we cannot be absolutely certain of our decision, and we have to deal with uncertainty. The higher we set our data-match acceptance criterion, the less likely it is that an automatic target recognition system will identify non-targets as lawful targets, but the more likely it is that the system will fail to identify lawful targets as such.150 The desired level for whether or not a weapon explodes might be a 'reliable functioning rate of 95 per cent'.151 This corresponds to an autonomous weapon system that fires at an unlawful target, due to misclassification as 'lawful', one out of every twenty times.
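The threshold trade-off described above can be illustrated with a toy simulation. The match-score distributions below are invented purely for illustration (they are not drawn from any real system); only the qualitative effect is the point: raising the acceptance threshold suppresses false positives (non-targets accepted) at the cost of more false negatives (lawful targets missed).

```python
import random

random.seed(42)

# Hypothetical match-score populations (illustrative assumptions only):
# lawful enemy targets cluster near a 90% data match, non-targets lower.
lawful = [min(1.0, max(0.0, random.gauss(0.90, 0.05))) for _ in range(10_000)]
non_targets = [min(1.0, max(0.0, random.gauss(0.60, 0.12))) for _ in range(10_000)]

def rates(threshold):
    """False-positive rate (non-target accepted as lawful) and
    false-negative rate (lawful target rejected) at a given threshold."""
    fp = sum(s >= threshold for s in non_targets) / len(non_targets)
    fn = sum(s < threshold for s in lawful) / len(lawful)
    return fp, fn

for t in (0.70, 0.81, 0.90):
    fp, fn = rates(t)
    print(f"threshold {t:.2f}: false positives {fp:.3f}, false negatives {fn:.3f}")
```

There is no threshold that drives both error rates to zero; where to sit on the curve is exactly the legal and operational judgement the text describes.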
Would this be considered acceptable performance for discriminating between lawful and protected targets? When a weapon system is looked at in this way, the better definition of reliability is whether the weapon system 'performs its intended function',152 and as 'fuzing and guidance capabilities become more integrated, the reliability of target acquisition must be measured and assessed'.153 It has been suggested that what is required is a 'very high probability of correct target identification ... and a very low probability of friendly or civilian targets being incorrectly identified as valid (i.e., enemy) targets'.154 As there is an inherent trade-off between sensitivity and specificity, consideration also needs to be given to how a weapon will be employed. If a human provides go/no-go authorization based on an independent review, thereby providing an additional safeguard against false recognition, then a greater number of false positives generated by the automatic recognition system may be acceptable. However, if the weapon system is autonomous, combat effect (correct employment against

150 See Defense Science Board Task Force on Munitions System Reliability, above note 132, p. 28.
151 Ibid., p. 11. Even this level of reliability is based on controlled conditions; a lower level is allowed in operational conditions to account for 'environmental factors such as terrain and weather', ibid., Appendix III, DoD Policy Memo on Submunition Reliability, p. 1.
152 Ibid., p. 14.
153 Ibid., p. 16.
154 Ibid., p. 23.


identified enemy targets) must be balanced more carefully against risk to civilians. It should be noted that one of the purposes of automated and autonomous systems is to undertake high-volume observations that would overwhelm a human operator, and where 'observations [are] in the millions ... even very-low-probability failures could result in regrettable fratricide incidents'.155 Confidence in the ability of an autonomous system to work in the real world might be developed by deploying such systems in a semi-autonomous mode where a human operator has to give the final approval for weapons release.156 Rigorous post-mission analysis of data would allow, with time, a statistically significant assessment of the reliability of the system in correctly identifying lawful targets. A final point on testing:

Achieving these gains [capability increases, manpower efficiencies, and cost reductions available through far greater use of autonomous systems] will depend on development of entirely new methods for enabling 'trust in autonomy' through verification and validation (V&V) of the near-infinite state systems that result from high levels of adaptability and autonomy. In effect, the number of possible input states that such systems can be presented with is so large that not only is it impossible to test all of them directly, it is not even possible to test more than an insignificantly small fraction of them. Development of such systems is thus inherently unverifiable by today's methods, and as a result their operation in all but comparatively trivial applications is uncertifiable. It is possible to develop systems having high levels of autonomy, but it is the lack of suitable V&V methods that prevents all but relatively low levels of autonomy from being certified for use. Potential adversaries, however, may be willing to field systems with far higher levels of autonomy without any need for certifiable V&V, and could gain significant capability advantages over the Air Force by doing so. Countering this asymmetric advantage will require as-yet undeveloped methods for achieving certifiably reliable V&V.157

A distinctly separate consideration from weapons testing is weapons research. Should weapons research (as opposed to development) be limited or constrained by legal issues? Generally, there is no legal reason (budgets aside) why research cannot take potential weapons as far as the bounds of science and engineering will allow, not least because laws change.158 The time for imposing limits based on law is in the production and employment of weapons. Of course, some may, and

155 See Report of Defense Science Board Task Force on Patriot System Performance: Report Summary, above note 60, p. 2.
156 See A. Myers, above note 23, pp. 91-92.
157 US Air Force, 'Technology horizons', available at: http://www.af.mil/information/technologyhorizons.asp (last visited 6 May 2012).
158 See the examples of submarines and airplanes referred to in Anderson and Waxman, above note 29, pp. 6-7. While some aspects of international humanitarian law may change, this presumably does not extend to the cardinal principles of distinction, proportionality, and unnecessary suffering.


do, argue differently on moral and ethical lines.159 That is where such arguments are best made and debated.

Conclusion

With the ever-increasing technological complexity of weapons and weapon systems, it is important that, among others, computer scientists, engineers, and lawyers engage with one another whenever a state conducts a review of weapons pursuant to Article 36 of the Protocol Additional to the Geneva Conventions of 12 August 1949 and relating to the Protection of Victims of International Armed Conflicts (API).160 The reviews cannot be compartmentalized, with each discipline looking in isolation at its own technical area. Rather, those conducting legal reviews will require 'a technical understanding of the reliability and accuracy of the weapon',161 as well as of how it will be operationally employed.162 While that does not mean that lawyers, engineers, computer science experts, and operators each need to be multidisciplinary, it does mean that each must have enough understanding of the other fields to appreciate potential interactions, facilitate meaningful discussion, and understand their own decisions in the context of impacts on other areas of development. Those who develop weapons need to be aware of the key international humanitarian law principles that apply to the employment of weapons. Lawyers providing the legal input into the review of weapons need to be particularly aware of how a weapon will be operationally employed, and should use this knowledge to help formulate meaningful operational guidelines in light of any technological issues identified with the weapon in terms of international humanitarian law. Furthermore, all parties require an understanding of how test and validation methods, including measures of reliability, need to be developed and interpreted, not just in the context of operational outcomes, but also in terms of compliance with international humanitarian law.
As the details of a weapon's capability are often highly classified and compartmentalized, lawyers, engineers, and operators may need to work cooperatively and imaginatively to overcome security classification and compartmented-access limitations. One approach might be to develop clearly expressed legal

159 See Matthew Bolton, Thomas Nash and Richard Moyes, 'Ban autonomous armed robots', Article 36, 5 March 2012, available at: http://www.article36.org/statements/ban-autonomous-armed-robots/ (last visited 6 May 2012): 'Whilst an expanded role for robots in conflict looks unstoppable, we need to draw a red line at fully autonomous targeting. A first step in this may be to recognize that such a red line needs to be drawn effectively across the board - from the simple technologies of anti-vehicle landmines (still not prohibited) across to the most complex systems under development. This is not to ignore challenges to such a position - for example, consideration might need to be given to how automation functions in missile defence and similar contexts - but certain fundamentals seem strong. Decisions to kill and injure should not be made by machines and, even if at times it will be imperfect, the distinction between military and civilian is a determination for human beings to make'.
160 See P. Spoerri, above note 54.
161 K. Lawand, above note 1, p. 929.
162 ICRC, A Guide to the Legal Review of New Weapons, Means and Methods of Warfare: Measures to Implement Article 36 of Additional Protocol I of 1977, above note 1, pp. 17-18.


parameters that can be the subject of meaningful systems testing. Another approach may be to devise multi-parameter acceptance criterion equation sets. Such equation sets would allow for hypothesis testing while factoring in reliability data, confidence levels, and risk factors, using input data such as anticipated military advantage, weapon reliability data, reconnaissance measurement uncertainty, and civilian risk factors.
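The authors do not specify what such an equation set would look like, but a purely hypothetical sketch may make the idea concrete. Every function name, weight, and threshold below is invented for illustration; only the general approach — comparing a conservative confidence bound on demonstrated reliability against a threshold that varies with civilian risk and military advantage — is drawn from the text.

```python
import math

def wilson_lower_bound(successes, trials, z=1.96):
    """Lower bound of the Wilson score interval: a conservative estimate of
    a weapon's correct-identification rate from test or field data."""
    if trials == 0:
        return 0.0
    p = successes / trials
    denom = 1 + z**2 / trials
    centre = p + z**2 / (2 * trials)
    margin = z * math.sqrt(p * (1 - p) / trials + z**2 / (4 * trials**2))
    return (centre - margin) / denom

def engagement_permitted(successes, trials, civilian_risk, military_advantage):
    """Hypothetical acceptance criterion: the demonstrated reliability bound
    must exceed a threshold that rises with civilian risk (0 to 1) and falls
    slightly with anticipated military advantage. All weights are invented."""
    required = min(0.999, 0.90 + 0.09 * civilian_risk - 0.02 * military_advantage)
    return wilson_lower_bound(successes, trials) >= required

# 980 correct identifications in 1,000 trials: acceptable in a low-risk
# setting, but not where civilian risk is high.
print(engagement_permitted(980, 1000, civilian_risk=0.1, military_advantage=0.5))  # True
print(engagement_permitted(980, 1000, civilian_risk=0.9, military_advantage=0.5))  # False
```

The design choice of using a lower confidence bound rather than the raw success rate mirrors the article's earlier point that reliability claims are only meaningful at a stated confidence level: a 98 per cent observed rate from a small sample would clear far fewer thresholds than the same rate from a large one.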


Part II

Robots and Autonomous Systems


[8]
INTERNATIONAL GOVERNANCE OF AUTONOMOUS MILITARY ROBOTS

Gary E. Marchant, Braden Allenby, Ronald Arkin, Edward T. Barrett, Jason Borenstein, Lyn M. Gaudet, Orde Kittrie, Patrick Lin, George R. Lucas, Richard O'Meara, Jared Silberman1

The Columbia Science and Technology Law Review, Vol. XII (2011)

1 The authors are members of the Autonomous Robotics thrust group of the Consortium on Emerging Technologies, Military Operations, and National Security (CETMONS), a multi-institutional organization dedicated to providing the basis for the ethical, rational, and responsible understanding and management of the complex set of issues raised by emerging technologies and their use in military operations, as well as their broader implications for national security. Gary Marchant is the Lincoln Professor of Emerging Technologies, Law & Ethics and Executive Director of the Center for Law, Science & Innovation at the Sandra Day O'Connor College of Law, Arizona State University. Braden Allenby is the Lincoln Professor of Engineering and Ethics at Arizona State University and the Founding Chair of CETMONS. Ronald Arkin is Regents' Professor in the College of Computing, Director of the College of Computing Mobile Robot Laboratory, and Associate Dean for Research at the Georgia Institute of Technology. Edward T. Barrett is the Director of Research at the U.S. Naval Academy's Stockdale Center for Ethical Leadership and an ethics professor in the Department of Leadership, Ethics, and Law. Jason Borenstein is the Director of Graduate Research Ethics Programs and co-Director of the Center for Ethics and Technology at the Georgia Institute of Technology. Lyn M. Gaudet is the Research Director of the Center for Law, Science & Innovation at the Sandra Day O'Connor College of Law, Arizona State University. Orde Kittrie is the Director of the Washington D.C. Program for the Sandra Day O'Connor College of Law, Arizona State University. Patrick Lin is an Assistant Professor in the Department of Philosophy, California Polytechnic State University. George R. Lucas is the Class of 1984 Distinguished Chair in Ethics in the Stockdale Center for Ethical Leadership at the United States Naval Academy and a Professor of Ethics and Public Policy at the Graduate School of Public Policy at the Naval Postgraduate School. Richard M. O'Meara is a retired Brigadier General, USA, a former Fellow of the Stockdale Center for Ethical Leadership, USNA, and Adjunct Faculty in Global Affairs, Rutgers University. Jared Silberman is Associate Counsel for Arms Control and International Law at the United States Navy Office of Strategic Systems Programs (SSP) in Washington, DC.

New technologies have always been a critical component of military strategy and preparedness. One new technology on the not-too-
distant technological horizon is lethal autonomous robotics, which would consist of robotic weapons capable of exerting lethal force without human control or intervention. There are a number of operational and tactical factors that create incentives for the development of such lethal systems as the next step in the current development, deployment and use of autonomous systems in military forces. Yet, such robotic systems would raise a number of potential operational, policy, ethical and legal issues. This article summarizes the current status and incentives for the development of lethal autonomous robots, discusses some of the issues that would be raised by such systems, and calls for a national and international dialogue on appropriate governance of such systems before they are deployed. The article reviews potential modes of governance, ranging from ethical principles implemented through modifications or refinements of national policies, to changes in the law of war and rules of engagement, to international treaties or agreements, or to a variety of other "soft law" governance mechanisms.


I. INTRODUCTION

Military technology is a field driven by change - the constant pursuit to be better, faster, stronger. Certain technological achievements, like guns and planes, have happened in the purview of the public and have revolutionized the world of war as we know it. Yet many technological changes have occurred under the radar, in military labs and private test fields, with the majority of citizens unaware of the leaps and bounds of progress. Robotics is one such modern military technological advancement that has largely escaped public attention to date. Combining the most advanced electronic, computer, surveillance, and weapons technologies, the robots of today have extraordinary capabilities and are quickly changing the landscape of battle and the dynamics of war. One of the most important achievements has been the creation of robots with autonomous decision-making capability.2 In particular, the development of autonomous robots capable of exerting lethal force, known as lethal autonomous robots ("LARs"), has significant implications for the military and society. A variety of never-before-anticipated, complex legal, ethical, and political issues have been created - issues in need of prompt attention and action. There have recently been growing calls for the potential risks and impacts of LARs to be considered and addressed in an anticipatory and preemptive manner.
For example, in October 2010, a United Nations human-rights investigator recommended in a report to the United Nations that "[t]he international community urgently needs to address the legal, political, ethical and moral implications of the development of lethal robotic technologies."3 In September 2010, a workshop of experts on unmanned military systems held in Berlin issued a statement (supported by a majority, but not all, of the participants) calling upon "the international community to commence a discussion about the pressing dangers that these systems pose to peace and international security and to civilians."4 While there is much room for debate about what substantive policies and restrictions (if any) should apply to LARs, there is broad agreement that now is the time to discuss those issues. The recent controversy over unmanned aerial vehicles ("UAVs") that are nevertheless human-controlled (often referred to as "drones") demonstrates the importance of anticipating, and trying to address in a proactive manner, the concerns about the next generation of such weapons - autonomous, lethal robotics.5 This article seeks to provide a background of some of these issues and to start the much-needed legal and ethical dialogue related to the use of lethal autonomous robotic technologies in the military context. The next part (Part II) of this article provides a brief history and illustrations of autonomous robots in the military, including the pending development of LARs. Part III sets forth a number of important ethical and policy considerations regarding the use of robots in military endeavors. Part IV reviews the current patchwork of guidelines and policies that apply to the use of military robots. Part V considers the role that international treaties and agreements might play in the governance of LARs, while Part VI investigates the potential role of soft-law governance mechanisms such as codes of conduct.

2 See generally Ronald C. Arkin, Governing Lethal Behavior in Autonomous Robots (2009).
3 Patrick Worsnip, U.N. Official Calls for Study of Ethics, Legality of Unmanned Weapons, Wash. Post, Oct. 24, 2010.
4 The Statement of the 2010 Expert Workshop on Limiting Armed Tele-Operated and Autonomous Systems, Berlin, Sept. 22, 2010, available at http://www.icrac.co.cc/Expert%20Workshop%20Statement.pdf (last visited Nov. 14, 2010).
5 P.W. Singer, Military Robots and the Laws of War, New Atlantis, Winter 2009, at 25, 43.

II. BACKGROUND ON AUTONOMOUS MILITARY ROBOTICS

In the United States there has been a long tradition of applying innovative technology in the battlefield, which has often translated into military success.6 The Department of Defense ("DOD") naturally extended this approach to robotics. Primary motivators for the use of intelligent robotic or unmanned systems in the battlefield include:

Force multiplication - with robots, fewer soldiers are needed for a given mission, and an individual soldier can now do a job that previously took many.

Expanding the battle-space - robots allow combat to be conducted over larger areas than was previously possible.

Extending the warfighter's reach - robots enable an individual soldier to act deeper into the battle-space by, for example, seeing farther or striking farther.

Casualty reduction - robots permit removing soldiers from the most dangerous and life-threatening missions.

The initial generation of military robots generally operates under direct human control, such as the "drone" unmanned aerial vehicles being used by the U.S. military for unmanned air attacks in Pakistan, Afghanistan, and other theaters.7 However, as robotics technology continues to advance, a number of factors are pushing many robotic military systems toward increased autonomy. One factor is that, as robotic systems perform a larger and more central role in military operations, there is a need to have them continue to function, just as a human soldier would, if communication channels are disrupted. In addition, as the complexity and speed of these systems increase, it will be increasingly limiting and problematic for performance to have to interject relatively slow human decision-making into the process. As one commentator recently put it, "military systems (including weapons) now on the horizon will be too fast, too

6 Material from this section is derived with permission from Arkin, supra note 2.
7 See generally Peter W. Singer, Wired for War (2009); Peter Bergen & Katherine Tiedemann, Revenge of the Drones: An Analysis of Drone Strikes in Pakistan, New America Foundation, Oct. 19, 2009, available at http://www.newamerica.net/publications/policy/revenge_of_the_drones (last visited Nov. 14, 2010).

small, too numerous, and will create an environment too complex for humans to direct."8 Based on these trends, many experts believe that autonomous, and in particular lethal autonomous, robots are an inevitable and relatively imminent development.9 Indeed, several military robotic-automation systems already operate at the level where the human is still in charge and responsible for the deployment of lethal force, but not in a directly supervisory manner. Examples include: (i) the Phalanx system for Aegis-class cruisers in the Navy "capable of autonomously performing its own search, detect, evaluation, track, engage and kill assessment functions"10 (Fig. 1); (ii) the MK-60 encapsulated torpedo (CAPTOR) sea mine system - one of the Navy's primary antisubmarine weapons capable of autonomously firing a torpedo and cruise missiles (Fig. 2); (iii) the Patriot anti-aircraft missile batteries; (iv) "fire and forget" missile systems generally; and (v) anti-personnel mines or alternatively other, more discriminating classes of mines (e.g., anti-tank).11 These devices can each be considered to be robotic by some definitions, as they all are capable of sensing their environment and actuating, in these cases through the application of lethal force.

8 Thomas K. Adams, Future Warfare and the Decline of Human Decisionmaking, Parameters, U.S. Army War College Quarterly, Winter 2001-02, at 57-58.
9 Arkin, supra note 2, at 7-10; see generally George Bekey, Autonomous Robots: From Biological Inspiration to Implementation and Control (2005); Robert Sparrow, Building a Better WarBot: Ethical Issues in the Design of Unmanned Systems for Military Applications, 15 Sci. Eng. Ethics 169, 173-74 (2009) [hereinafter Sparrow, Building a Better WarBot].
10 U.S. Navy, "Phalanx Close-in Weapons Systems," United States Navy Factfile, available at http://www.navy.mil/navydata/fact_display.asp?cid=2100&tid=800&ct=2 (last visited Feb. 12, 2010).
11 Antipersonnel mines have been banned by the Ottawa Treaty on antipersonnel mines, although the U.S., China, Russia, and thirty-four other nations are currently not party to that agreement. Convention on the Prohibition of the Use, Stockpiling, Production and Transfer of Anti-Personnel Mines and on their Destruction (Ottawa Treaty), Sept. 18, 1997, 2056 U.N.T.S. 211. Recent developments, however, indicate that the U.S. is evaluating whether to become a party to the Ottawa Treaty. See Mark Landler, White House Is Being Pressed to Reverse Course and Join Land Mine Ban, N.Y. Times, May 8, 2010, at A9.

212

The Applied Ethics of Emerging Military and Security Technologies Vol. XII

The Columbia Science and Technology Law Review

2011

Figure 1. Phalanx Close-in Weapons System (United States Navy Photograph)

Figure 2. Tomahawk Cruise Missile on Display (United States Navy Photograph)

In 2001, Congress issued a mandate stating that by 2010 one-third of all U.S. deep-strike aircraft should be unmanned, and that by 2015 one-third of all ground vehicles should likewise be unmanned.12 More recently, in December 2007 the United States Department of Defense ("DOD") issued an Unmanned Systems Roadmap spanning twenty-five years, through 2032, that likewise anticipated and projected a major shift

12 Adams, supra note 8, at 57-58.


toward greater reliance on unmanned vehicles in U.S. military operations.13 As early as the end of World War I, the precursors of autonomous unmanned weapons appeared in a project on unpiloted aircraft conducted by the U.S. Navy and the Sperry Gyroscope Company.14 Multiple unmanned robotic systems that employ lethal force are already being developed or are in use, such as the ARV (Armed Robotic Vehicle), a component of the Future Combat System ("FCS"); Predator UAVs (unmanned aerial vehicles) equipped with Hellfire missiles, which have already been used in combat but under direct human supervision; and an armed platform under development for use in the Korean Demilitarized Zone, to name a few.15 The TALON SWORDS platform (Fig. 3) developed by Foster-Miller/QinetiQ has already been put to the test in Iraq and Afghanistan and is capable of carrying lethal weaponry (M240 or M249 machine guns, or a Barrett .50 caliber rifle). Three of these platforms served for over a year in Iraq and, as of April 2008, were still in the field, contrary to some unfounded rumors.16 A newer version, referred to as MAARS (Modular Advanced Armed Robotic System), is ready to replace the earlier SWORDS platforms in the field. The newer robot can carry a 40mm grenade launcher or an M240B machine gun in addition to various non-lethal weapons. The president of QinetiQ stated that the purpose of the robot is to "enhance the warfighter's capability and lethality, extend his situational awareness and provide all these capabilities across the spectrum of combat."17

13 U.S. Department of Defense, DOD Unmanned Systems Roadmap: 2007-2032 (2007), available at http://www.fas.org/irp/program/collect/usroadmap2007.pdf (last visited Sept. 21, 2010).

14 Adams, supra note 8, at 57.

15 See Arkin, supra note 2, at 10.

16 Foster-Miller Inc., Products & Service: TALON Military Robots, EOD, SWORDS, and Hazmat Robots (2008), available at http://www.foster-miller.com/lemming.htm (last visited Feb. 10, 2010).

17 QinetiQ, Press Release: QinetiQ North America Ships First MAARS Robot, June 5, 2008, available at http://www.qinetiq.com/home/newsroom/news_releases_homepage/2008/2nd_quarter/qinetiq_north_america0.html (last visited Feb. 12, 2010).


Figure 3. Foster-Miller TALON SWORDS Robot (Department of Defense Photograph)

It is interesting to note that soldiers have already surrendered to UAVs, even when the aircraft were unarmed. The first documented instance occurred during the 1991 Gulf War. An RQ-2A Pioneer UAV, used for battle damage assessment for shelling originating from the U.S.S. Wisconsin, was flying toward Faylaka Island when several Iraqis hoisted makeshift white flags to surrender, thus avoiding another shelling from the battleship.18 Anecdotally, most UAV units during this conflict experienced variations of attempts to surrender to the Pioneer. A logical assumption is that this trend will only increase as UAVs' direct-response ability and firepower increase. The development of autonomous, lethal robotics raises the question of whether, and how, these systems can adhere to the existing Laws of War as well as or better than our soldiers. This is no simple task, however. In the fog of war it is hard enough for a human to discriminate effectively whether or not a target is legitimate. It may nonetheless be anticipated, despite the current state of the art, that future autonomous robots will be able to perform better than humans under these conditions, for the following reasons:19

18 Rebecca Maksel, Predators and Dragons, Air & Space Magazine, July 1, 2008, available at http://www.airspacemag.com/history-of-flight/Predators_and_Dragons.html (last visited Sept. 21, 2010).

19 Arkin, supra note 2, at 29-30.


- The ability to act conservatively: i.e., they do not need to protect themselves in cases of low certainty of target identification. Autonomous, armed robotic vehicles do not need to have self-preservation as a foremost drive, if at all. They can be used in a self-sacrificing manner if needed and appropriate, without reservation, by a commanding officer.
- The eventual development and use of a broad range of robotic sensors better equipped for battlefield observations than humans currently possess.
- They can be designed without emotions that cloud their judgment or result in anger and frustration with ongoing battlefield events. In addition, "[f]ear and hysteria are always latent in combat, often real, and they press us toward fearful measures."20 Autonomous agents need not suffer similarly.
- Avoidance of the human psychological problem of "scenario fulfillment" is possible, a factor believed to have contributed in part to the downing of an Iranian airliner by the U.S.S. Vincennes in 1988.21 This phenomenon leads to distortion or neglect of contradictory information in stressful situations, where humans use new incoming information in ways that fit only their preexisting belief patterns, a form of premature cognitive closure. Robots can be developed so that they are not vulnerable to such patterns of behavior.
- Robots can integrate more information from more sources far faster than a human possibly could in real time before responding with lethal force. These data can arise from multiple remote sensors and intelligence (including human) sources, as part of the Army's network-centric warfare concept and the concurrent development of the Global Information Grid.22 "[M]ilitary systems (including weapons) now on the horizon will be too fast, too small, too numerous and will create environments too complex for humans to direct."23
- When working in a team of combined human soldiers and autonomous systems as an organic asset, robots have the potential capability of independently and objectively monitoring ethical behavior in the battlefield by all parties and reporting infractions that might be observed. This presence alone might lead to a reduction in human ethical infractions.
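The claims above about withholding fire under uncertainty while fusing many sensor reports can be made concrete with a short sketch. This is purely illustrative: the naive independence assumption, the sensor reliabilities, and the 0.999 engagement threshold are all invented for the example, not drawn from any fielded system.

```python
def fuse_reports(prior, reports):
    """Naive Bayesian fusion of independent sensor reports.

    Each report is a (reliability, says_hostile) pair: a sensor that is
    correct with the given probability, reporting hostile or not.
    Returns the posterior probability that the target is hostile.
    """
    p = prior
    for reliability, says_hostile in reports:
        # Likelihood of this report under each hypothesis.
        like_hostile = reliability if says_hostile else 1 - reliability
        like_benign = 1 - reliability if says_hostile else reliability
        evidence = like_hostile * p + like_benign * (1 - p)
        p = like_hostile * p / evidence
    return p


def may_engage(posterior, threshold=0.999):
    """A machine with no self-preservation drive can be held to a far
    stricter certainty bar than a soldier under fire."""
    return posterior >= threshold


# Two good-but-imperfect sensors both reporting "hostile" still fall
# short of the conservative threshold: the system must hold fire.
posterior = fuse_reports(0.5, [(0.95, True), (0.95, True)])
print(round(posterior, 4), may_engage(posterior))
```

A third corroborating report pushes the posterior past the bar, which is the sense in which integrating more sources faster can make a machine simultaneously more conservative and more capable than a human in the same time window.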

20 Michael Walzer, Just and Unjust Wars 251 (4th ed., 1977).

21 Scott D. Sagan, Rules of Engagement, in Avoiding War: Problems of Crisis Management 443, 459-61 (Alexander L. George ed., 1991).

22 DARPA (Defense Advanced Research Projects Agency), Broad Agency Announcement 07-52, Scalable Network Monitoring, Strategic Technology Office, Aug. 2007, available at https://www.fbo.gov/index?s=opportunity&mode=form&tab=Core&id=b524ff8d8f7390061d4c5d5444c9e620&tab=documents&tabmode=list (last visited Sept. 22, 2010).

23 Adams, supra note 8, at 58.


The trend is clear: warfare will continue, and autonomous robots will ultimately be deployed in its conduct. The ethical and policy implications of this imminent development are discussed next, followed by a discussion of governance options.

III. ETHICAL AND POLICY ASPECTS

There are numerous ethical, policy, and legal issues relating to the creation and deployment of lethal autonomous robots. Although an exhaustive list of these issues will not be offered here, a number of the key ones will be outlined. Rather than defending a particular point of view on the technology, the primary aim is to promote awareness of these issues and to encourage lawyers, policymakers, and other relevant stakeholders to consider what may be appropriate legal and regulatory responses to LARs as they are developed.

A. Responsibility and Risks

Australian philosopher Robert Sparrow has been a prominent voice in debates about the ethics of lethal autonomous robots. For instance, he examines the complexities of assigning ethical and legal responsibility to someone, or something, if an autonomous robot commits a war crime.24 He considers several possible candidates, including "the programmer," "the commanding officer," and "the machine" itself, but concludes that each option has its own profound share of difficulties.25 Whether or not Sparrow is correct, assigning responsibility for a LAR's behavior (or misbehavior) is an important matter that warrants further investigation. Along related lines, Peter Asaro doubts whether a robot can be punished in a meaningful way, since it is unlikely to possess any form of moral agency, observing that traditional notions from criminal law such as "rehabilitation" and "deterrence" do not seem applicable here.26 One of the principal justifications for relying on autonomous robots is that they would have access to a greater amount of information than any human soldier.
Yet this advantage raises key questions, including whether a robot should ever be permitted to refuse an order and, if so, under what conditions. Refusing an order on the battlefield is of course a serious matter, but should a robot be given more latitude to do so than a human soldier? Battlefield units containing human soldiers and LARs will raise additional ethical issues relating to risk and responsibility. According to Sparrow, "human beings will start

24 Robert Sparrow, Killer Robots, 24 J. Applied Phil. 66 (2007) [hereinafter Sparrow, Killer Robots].

25 Id. at 69-73.

26 Peter Asaro, Robots and Responsibility from a Legal Perspective, Proceedings of the IEEE 2007 International Conference on Robotics and Automation, Workshop on RoboEthics, April 14, 2007, Rome, Italy.


to be placed in harm's way as a result of the operations of robots."27 But will soldiers genuinely be aware of the kinds of risks they are being exposed to when working with robotic counterparts? If not, Sparrow fears that soldiers might place too much trust in a machine, assuming, for example, that it has completed its assigned tasks.28 And even if soldiers are fully aware of the risks associated with reliance on robotics, how much additional risk exposure is justifiable?

B. Legal Status of Civilians

In any treatment of LARs, the question of potential liability to civilians must be considered. Civilians initially responsible for empowering or placing LARs may not necessarily be absolved of legal responsibility should the LAR produce unintended consequences. For example, it is entirely possible that a failure to recognize and obey an attempt to surrender could constitute a violation of the Law of Armed Conflict ("LOAC"). If a civilian software writer failed to write code that recognized the right to surrender through a flag of truce or other means, then a "reach back" to civilian liability for the breach might be possible. Likewise, if the civilian software writer failed in attempts to program recognition of legally protected structures such as places of worship, hospitals, or civilian schools, the code writer may be subject to potential liability in those scenarios as well. The use of smaller yet lethal robots is gaining acceptance in battlefield operations. It is entirely possible that a small autonomous robot, three feet tall or less and equipped with lethal gun-firing capabilities, might be thrown or "launched" into an open building or window much like a whirling dervish.29 Should this whirling dervish either not be programmed to accept, or fail to recognize, internationally established symbols of surrender such as a white flag, would the forces responsible for the robot, probably including civilian software code writers, escape legal liability for these omissions?
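The "reach back" concern can be expressed concretely: recognizing a surrender signal is, at bottom, a branch in the engagement logic, and omitting it is a design decision someone made. The sketch below is hypothetical; the percept labels, confidence values, and rule set are invented for illustration and come from no real system.

```python
from dataclasses import dataclass

# Hypothetical labels a perception stack might emit for legally
# protected persons, symbols, and structures under the LOAC.
PROTECTED_MARKS = {"white_flag", "red_cross", "place_of_worship",
                   "hospital", "school"}


@dataclass
class Percept:
    label: str
    confidence: float


def authorize_engagement(percepts, hostile_confidence):
    """Deny engagement whenever any protected mark is perceived, no
    matter how hostile the target otherwise appears. Shipping this
    function *without* the protected-mark loop is precisely the kind
    of omission the text suggests could reach back to the civilian
    code writer."""
    for p in percepts:
        if p.label in PROTECTED_MARKS and p.confidence >= 0.5:
            return (False, f"protected mark detected: {p.label}")
    if hostile_confidence < 0.999:
        return (False, "insufficient certainty of hostility")
    return (True, "engagement criteria met")


# A makeshift white flag vetoes engagement even at high hostile confidence.
decision, reason = authorize_engagement([Percept("white_flag", 0.8)], 0.9999)
print(decision, reason)
```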
Similarly, well-intentioned and seemingly complete software programming might go astray in other ways. Under Law of War concepts, and specifically the Hague Convention (Hague VIII), it is forbidden to lay unanchored automatic contact mines unless they will become harmless at one hour or less after the mines are no longer under

27 Sparrow, Building a Better WarBot, supra note 9, at 172.

28 Id. at 172-73.

29 For example, the TALON robot has been cited by its manufacturer for its extensive use in military operations in Afghanistan and Iraq. It is small and is capable of a variety of uses including the ability to deliver weapons fire. The manufacturer's web site specifically states that "TALON's multi-mission family of robots includes one specifically equipped for tactical scenarios frequently encountered by police SWAT units and MPs in all branches of the military. TALON SWAT/MP is a tactical robot that can be configured with a loudspeaker and audio receiver, night vision and thermal cameras and a choice of weapons for a lethal or less-than-lethal response." QinetiQ, TALON Robots - TALON SWAT/MP, available at http://www.qinetiqna.com/Collateral/Documents/English-US/QDS09-049-TALON-SWAT-MP.pdf (last visited Feb. 28, 2010).


the control of the person (nation) who laid them.30 Consider unmanned vehicles, particularly unmanned underwater vehicles launched from a manned "mother ship": in the event that the mother ship becomes lost or disabled, the unmanned "robotic" vehicle must be able, on its own, to comply with international navigational regulations, protocols, and responsibilities. No matter the degree of robotic autonomy, a legal responsibility is likely to exist for the actions of the un-tethered "loose" unmanned vehicle. Consider also the situation described by P.W. Singer in "Wired for War," in which a robot programmed to perform sentry work failed to identify a threatening action.31 In that scenario, a bar patron suffers from the hiccups, and the bartender makes a furtive, well-intentioned handgun gesture so that the patron might be scared out of the hiccups. A robot trained to act as a sentry with accompanying lethal force could plausibly fail to identify the mimicked gesture as harmless; if the robot sentry mistakenly shot the bartender, "thinking" a lethal threat existed, it is conceivable that the writer of the software code might be responsible for the mistaken actions. The above examples illustrate how seemingly complete autonomous robotic systems may still impose legal liability on the civilians initially responsible for their use within battle space operations. Amidst these scenarios, the civilian software code writer's work and ultimate responsibilities may enjoy a much longer and unanticipated legal life.

C. Complexity and Unpredictability

Unfortunately, a full awareness of the risks from autonomous robots may be impossible.
Wallach and Allen discuss how predicting the relevant dangers can be fraught with uncertainty.32 For example, a semi-autonomous anti-aircraft gun accidentally killed several South African soldiers.33 Roger Clarke pointed out years ago that "[c]omplex systems are prone to component failures and malfunctions, and to

30 The Law of War in conjunction with the laying of contact underwater mines is covered in Article I of Hague Convention VIII of October 18, 1907. Article I states: "It is forbidden [] [t]o lay unanchored automatic contact mines, except when they are so constructed as to become harmless one hour at most after the person who laid them ceases to control them; [t]o lay anchored automatic contact mines which do not become harmless as soon as they have broken loose from their moorings; [and] [t]o use torpedoes which do not become harmless when they have missed their mark." Convention (VIII) Relative to the Laying of Automatic Submarine Contact Mines, Oct. 18, 1907, 36 U.S.T. 541.

31 Singer, supra note 7, at 81.

32 Wendell Wallach & Colin Allen, Moral Machines: Teaching Robots Right from Wrong 189-214 (2009).

33 Noah Shachtman, Robot Cannon Kills 9, Wounds 14, Wired.com, Oct. 18, 2007, available at http://blog.wired.com/defense/2007/10/robot-cannon-ki.html (last visited Feb. 8, 2010).


intermodule inconsistencies and misunderstandings."34 Blay Whitby echoes this point, arguing that computer programs often do not behave as predictably as software programmers would hope.35 Experts from computing, robotics, and other relevant communities need to continue weighing in on the matter so the reliability of LARs can be more thoroughly assessed. Perhaps robot ethics has not received the attention it needs, given a common misconception that robots will do only what we have programmed them to do. Unfortunately, such a belief is sorely outdated, harking back to a time when computers were simpler and their programs could be written and understood by a single person. Now, programs with millions of lines of code are written by teams of programmers, none of whom knows the entire program; hence, no individual can predict the effect of a given command with absolute certainty, since portions of large programs may interact in unexpected, untested ways. Even straightforward, simple rules such as Asimov's Laws of Robotics can create unexpected dilemmas.36 Furthermore, increasing complexity may lead to emergent behaviors, i.e., behaviors not programmed but arising out of sheer complexity.37 Major related research efforts are also being devoted to enabling robots to learn from experience. Learning may enable the robot to respond to novel situations, an apparent blessing given the impracticality, indeed impossibility, of the designer predicting every eventuality. But this capability raises the question of whether it can be predicted with reasonable certainty what the robot will learn. Arguably, if a robot's behavior could be adequately predicted, the robot would simply be programmed to behave in those ways in the first place instead of requiring learning.
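The point that simple, individually sensible rules can interact to produce behavior no one programmed can be shown with a toy simulation. The two rules below, staying near a convoy and keeping standoff from a threat, are invented for this example; neither rule alone is buggy, yet together they trap the robot in an endless dither.

```python
def step(pos, convoy, threat):
    """One control step for a toy escort robot on a number line."""
    if abs(pos - convoy) > 1:          # Rule 1: close up on the convoy
        return pos + (1 if convoy > pos else -1)
    if abs(pos - threat) < 5:          # Rule 2: back away from the threat
        return pos + (1 if pos >= threat else -1)
    return pos


# With the threat parked near the convoy, the rules fire alternately
# and the robot oscillates forever instead of settling anywhere.
pos, trace = 0, []
for _ in range(6):
    pos = step(pos, convoy=0, threat=3)
    trace.append(pos)
print(trace)
```

The printed trace alternates between two positions indefinitely, an "emergent" oscillation that neither rule's author specified, and a miniature of the intermodule inconsistencies Clarke describes.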
Thus, unpredictability in the behavior of complex robots is a major source of worry, especially if robots are to operate in unstructured environments rather than the carefully structured domain of a factory or test laboratory.

D. Just-War Theory

An overarching concern is whether the use of LARs is consistent with time-honored principles, rules, and codes that guide military operations, including just-war theory, the Law of Armed Conflict ("LOAC"), and the Rules of Engagement. Scholars are already starting to analyze whether LARs will be capable of fulfilling the requirements of just-war theory. Noel Sharkey, for example, doubts that robots will be

34 Roger Clarke, Asimov's Laws of Robotics: Implications for Information Technology - Part II, 27 Computer 57, 65 (1994).

35 Blay Whitby, Computing Machinery and Morality, 22 AI & Society 551, 551-63 (2008).

36 Isaac Asimov, I, Robot (1950).

37 See, e.g., Ray Kurzweil, The Age of Spiritual Machines: When Computers Exceed Human Intelligence (1999); Ray Kurzweil, The Singularity Is Near: When Humans Transcend Biology (2005).


capable of upholding the principle of discrimination, which involves being able to distinguish between combatants and non-legitimate targets such as civilians and surrendering soldiers.38 While discussing military robots and the principle of proportionality, Sparrow argues that "decisions about what constitutes a level of force proportionate to the threat posed by enemy forces are extremely complex and context dependent and it is seemingly unlikely that machines will be able to make these decisions reliably for the foreseeable future."39 However, at this point, it remains an open question whether the differences between LARs and existing military technology are significant enough to bar the former's use. Further, Asaro examines whether military robots may actually encourage wars by altering pre-conflict proportionality calculations and last resort efforts.40 A fundamental impediment to war is the loss of human life, especially the lives of fellow citizens; casualties are a significant reason why wars are not more common. Sending an army of machines to war—rather than friends and relatives—may not exact the same physical and emotional toll on a population.41 Assuming the existence of a just cause, one could celebrate this new calculus, which more readily permits legitimate self-defense. However, this reduced cost may, in turn, reduce the rigor with which non-violent alternatives are pursued and thus encourage unnecessary—and therefore unjust—wars. While this possible moral hazard obviously does not require us to maximize war costs, it does require efforts to inform and monitor national security decision-makers. 
Finally, Singer suggests that LARs could undermine counterinsurgency efforts, where indigenous respect and trust are crucial to creating a reasonable chance of success.42 Unmanned weapons may be perceived as indicative of flawed character and/or tepid commitment, and they are incapable of developing the necessary personal relationships with local citizens. And even if remote-controlled or autonomous weapons are more discriminate than soldiers, they are commonly perceived as less so.

E. Civilian Applications

Technology developed for military purposes frequently has civilian applications and vice versa. For instance, Singer notes how the REMUS, the Remote Environmental

38 Noel Sharkey, Cassandra or False Prophet of Doom: AI Robots and War, 23(4) IEEE Intelligent Systems 14, 16-17 (2008); Noel Sharkey, The Ethical Frontiers of Robotics, 322 Science 1800, 1800-01 (2008) ("[N]o computational system can discriminate between combatants and innocents in a close-contact encounter").

39 Sparrow, Building a Better WarBot, supra note 9, at 178.

40 Peter Asaro, How Just Could a Robot War Be?, in Adam Briggle, Katinka Waelbers & Philip Brey (eds.), Current Issues in Computing and Philosophy 1, 7-9 (2008).

41 Robert Sparrow, Predators or Plowshares? Arms Control of Robotic Weapons, IEEE Tech. Soc'y Magazine, Spring 2009, at 25, 26 [hereinafter Sparrow, Predators or Plowshares?].

42 Singer, supra note 7, at 299.


Monitoring Unit, was originally used by oceanographers, but an altered version of it was later deployed in Iraq.43 Further, many federal and state agencies have sought permission to use military technology such as UAVs.44 Consequently, once LARs are developed for military use, what might the implications be down the road for their use in civilian contexts?45

F. Broader Ethical and Social Considerations

Any decisions and policies regarding military development and use of LARs will impact, and be impacted by, broader technological and social considerations.46 The failure to acknowledge these considerations upfront, and to include them in the analysis, can lead to dysfunctional results and the unnecessary failure of legal and policy initiatives. Accordingly, we will highlight some relevant considerations and their potential implications. Consider first what an LAR actually is: one of many potential functions platformed on a generic technology base, which itself may be highly variable. Thus, a "lethal" function, such as a repeating kinetic weapon mounted on a robotic system, is one that is intentionally programmed to identify, verify, and eliminate a human target. At other times, the same basic robotic system might be fitted for cargo carrying capacity, for surveillance, or for many other functions, in either a military or a civil capacity. "Autonomous" means that the platform is capable of making the necessary decisions on its own, without intervention from a human. This, again, may involve the lethality function, or it may not: one might, for example, tell a cargo robot to find the best way to a particular destination, and simply let it go.47 Similarly, "robot" may sound obvious, but it is not. Tracked machines such as the Talon or PackBot, or UAVs such as the Predator or Raven, are fairly obvious robotic technologies, but what about "smart" land mines that are inert until they sense the proper

43 Id. at 37-38.

44 Anne Broache, Police Agencies Push for Drone Sky Patrols, CNET News, Aug. 9, 2007, available at http://news.cnet.com/Police-agencies-push-for-drone-sky-patrols/2100-11397_3-6201789.html (last visited Feb. 12, 2010).

45 See generally Gary Marchant and Lyn Gulley, National Security Neuroscience and the Reverse Dual-Use Dilemma, 1 Am. J. Bioethics Neuroscience 20, 20-22 (2010).

46 This consideration led the Lincoln Center for Applied Ethics at Arizona State University, the Inamori International Center for Ethics and Excellence at Case Western Reserve University, and the U.S. Naval Academy Stockdale Center for Ethical Leadership to found the Consortium for Emerging Technology, Military Operations, and National Security, or CETMONS. See CETMONS, http://cetmons.org (last visited June 17, 2010). See generally also Max Boot, War Made New (2006); John Keegan, A History of Warfare (1993).

47 See, e.g., DARPA, http://www.darpa.mil/grandchallenge/index.asp (last visited June 17, 2010), detailing the progress in autonomous vehicles as a result of DARPA's Grand Challenge initiative.


triggering conditions and are capable of jumping, or exploding, or doing whatever else they are built to do?48 What about bombs that, once deployed, glide above the battlefield until finding an enemy target, which they then attack while sparing cars, buses, and homes?49 And what about a grid of surveillance/attack cybersects (insect-size robots, or cyborgs consisting of biological insects with robotic functions integrated into them)? Each cybersect taken alone may be too insignificant and dumb to be considered a robot, but the cybersect grid as a whole may actually be quite intelligent.50 Going one step further, what should we call a weapons platform that is wirelessly connected directly into a remote human brain? (In recent experiments, a monkey at Duke University with a chip implanted in its brain, wirelessly connected to a robot in Japan, kept the Japanese robot running by thought, so that the robot was in essence an extension of the monkey's own physicality.)51 Even now, the Aegis computer fire control system deployed on Navy vessels comes with four settings: "semiautomatic," where humans retain control over the firing decision; "automatic special," where humans set the priorities but Aegis determines how to carry them out; "automatic," where humans are kept in the loop but the system works without their input; and "casualty," where the system does what it thinks necessary to save the ship.52 This brief digression raises serious questions about what an LAR actually is. Certainly, it has a technology component, but in some ways this is almost trivial compared to its social, political, ethical, and cultural dimensions.
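The four Aegis settings just described can be read as points on an autonomy spectrum. The sketch below models them as a simple gate on the firing decision; the mode names follow the text, but the gating logic is a simplified reading for illustration, not the actual fire-control behavior.

```python
from enum import Enum, auto


class Mode(Enum):
    SEMIAUTOMATIC = auto()      # humans retain control over the firing decision
    AUTOMATIC_SPECIAL = auto()  # humans set priorities; the system executes them
    AUTOMATIC = auto()          # humans in the loop, but their input is optional
    CASUALTY = auto()           # system does what it must to save the ship


def fire_permitted(mode, *, human_approved=False, human_vetoed=False,
                   matches_priority=False, ship_at_risk=False):
    """One plausible reading of how each mode gates a firing decision."""
    if mode is Mode.SEMIAUTOMATIC:
        return human_approved                       # every shot needs a human
    if mode is Mode.AUTOMATIC_SPECIAL:
        return matches_priority and not human_vetoed
    if mode is Mode.AUTOMATIC:
        return not human_vetoed                     # humans can only veto
    if mode is Mode.CASUALTY:
        return ship_at_risk or not human_vetoed     # survival overrides a veto
    return False


# The same engagement, gated differently: nothing fires in semiautomatic
# mode without approval, while casualty mode overrides even an explicit veto.
print(fire_permitted(Mode.SEMIAUTOMATIC))
print(fire_permitted(Mode.CASUALTY, human_vetoed=True, ship_at_risk=True))
```

The design point is that "autonomy" here is not a property of the hardware at all: the same platform moves along the spectrum with a configuration change, which is one reason regulation aimed at the technology alone may miss its target.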
In fact, one might well argue that in many important ways a LAR is more of a cultural construct than a technology, with its meaningful dimension being intent rather than the technology system.53 This is an important point, for it suggests that any legal or regulatory approach that focuses on technology may be misplaced; conversely, it means that the underlying technologies which come together in an LAR will continue to evolve irrespective of any direct controls on LARs - including the functionality of the physical hardware, the sophistication of the software, and the integrated technological capability

48 See, e.g., Dynamic Networking and Smart Sensing Enable Next-Generation Landmines, http://www.computer.org/portal/web/csdl/doi/10.1109/MPRV.2004.4 (last visited June 17, 2010).

49 See, e.g., The Calibration of Destruction, The Economist, Jan. 30, 2010, at 87, 87-88.

50 Gary Kitchener, Pentagon Plans Cyber-Insect Army, BBC News, Mar. 16, 2006, http://news.bbc.co.uk/2/hi/americas/4808342.stm (last visited June 17, 2010).

51 See, e.g., K. C. Jones, Monkey Brains in U.S. Make Robot Walk in Japan, InformationWeek, Jan. 16, 2008, available at http://www.informationweek.com/news/personal_tech/showArticle.jhtml?articleID=205801020 (last visited June 18, 2010).

52 See, e.g., Singer, supra note 7, at 124-25.

53 Similarly, commercial jets were understood to be transportation technologies until reconceptualized by Al-Qaeda terrorists into a weapon. Many individuals, not just engineers, underestimate the social and cultural dimensions of technology. See, e.g., Wiebe Bijker, Thomas P. Hughes & Trevor Pinch (eds.), The Social Construction of Technological Systems (1997).


we call "autonomy." It is because the technologies are separate from the use that the discussion of LARs is frequently confused: LARs are often discussed as if they were a "military technology," when in fact they are a set of technologies that can be integrated in ways that are effective and desirable given current military conditions.

Let us begin by identifying two levels at which technologies function: Level I, or the shop floor level; and Level II, or the institutional and social system level.54 Thus, for example, if one gives a vaccine to a child to prevent her from getting a particular disease, one is dealing with a Level I technology system: the desired goal, no disease, is inherent in use of the technology. On the other hand, if one starts a vaccine program in a developing country in order to encourage economic growth because of better health, it is a Level II system: use of vaccines may contribute to such a goal, but there are many intervening systems, pressures, policies, and institutions.

To return to LARs, then, one might begin by asking why deploy such a technology in any form? Here, one has serious coupling between Level I and Level II issues. The immediate Level I response is that deployment of LARs would save soldiers' lives on the side that deployed them; many explosive devices in Iraq and Afghanistan that might otherwise have killed and maimed soldiers have been identified, and eliminated, by robots. But this is in some ways only begging a serious Level II question. In World War I, for example, generals thought little of killing 100,000 men at a go by sending them into the teeth of concentrated machine gun fire. Consequently, simply avoiding casualties is an inadequate explanation. That world, however, has changed, especially for the U.S. military, which faces a particularly stark dilemma. It is charged by its citizens with being able to project force anywhere around the world, under virtually any conditions.
But, for a number of reasons, American civilians have become increasingly averse to any casualties. So the U.S. military finds itself in the dilemma of being required to project its power without American soldiers dying. Additionally, the long-term demographics are not good: the United States, like other developed countries, faces an aging population, with the immediate implication that there are fewer young people to fill boots on the ground.55 The institutional and social context of military operations for the United States is increasingly one where better military productivity becomes paramount, with productivity measured as mission accomplishment per soldier lost. And robots can potentially contribute significantly to achieving such productivity. It's not just about saving soldiers' lives, a Level I consideration. It's also about building the capability to continue to project power with fewer casualties, and to do so because culture and society are changing to make fatalities, whether soldier or civilian, less acceptable, which are Level II trends.

54

The analysis of technology systems also includes Level III, the earth systems level; see Braden R. Allenby, Earth Systems Engineering and Management: A Manifesto, 41 Envtl. Sci. & Tech. 7960, 7960-66 (2007). 55 See Noah Shachtman, Army Looks to Keep Troops Forever Young, Wired Danger Room, Apr. 24, 2009, http://www.wired.com/dangerroom/2009/04/army-looks-to-keep-troops-forever-young/ (last visited April 26, 2009).


In sum, LARs raise a broad range of complex ethical and social issues, which we have only begun to address here. Suffice it to say, though, that any attempt to regulate or govern such technology systems must address these issues in addition to the more concrete technological and legal issues. The various models available to attempt this task are discussed in the following sections.

IV. EXISTING GOVERNANCE MECHANISMS FOR MILITARY ROBOTS

At present, there are no laws or treaties specifically pertaining to restrictions or governance of military robots, unmanned platforms, or other technologies currently under consideration within the purview of this article. Instead, aspects of these new military technologies are covered piecemeal (if at all) by a patchwork of legislation pertaining to projection of force under international law; treaties or conventions pertaining to specific technologies and practices; international humanitarian law; and interpretations of existing principles of the Law of Armed Conflict (LOAC).56 There are, for example, multiple conventions in international law which purport to deal with specific technologies and practices, such as agreements pertaining to biological weapons,57 chemical weapons,58 certain types of ammunition,59 the hostile use of environmental modification,60 land mines,61 incendiary weapons,62 blinding laser

56 See generally Stephen E. White, Brave New World: Neurowarfare and the Limits of International Humanitarian Law, 41 Cornell Int'l L.J. 177 (2008); Mark Edward Peterson, The UAV and the Current and Future Regulatory Construct for Integration into the National Airspace System, 71 J. Air L. & Com. 521 (2006); Geoffrey S. Corn, Unarmed but How Dangerous? Civilian Augmentees, the Law of Armed Conflict, and the Search for a More Effective Test for Permissible Civilian Battlefield Functions, 2 J. Nat'l Sec. L. & Pol'y 257 (2008); Andrew H. Henderson, Murky Waters: The Legal Status of Unmanned Undersea Vehicles, 53 Naval L. Rev. 55 (2006); Jason Borenstein, The Ethics of Autonomous Military Robots, 2 Studies in Ethics, Law & Tech. Issue 1, Article 2 (2008); John J. Klein, The Problematic Nexus: Where Unmanned Combat Air Vehicles and the Law of Armed Conflict Meet, Air & Space Power J., Chronicles Online J. (2003), available at http://www.airpower.maxwell.af.mil/airchronicles/cc/klein.html (last visited Oct. 1, 2010); Anthony J. Lazarski, Legal Implications of the Uninhabited Combat Aerial Vehicle—Focus: Unmanned Aerial Vehicles, 16 Aerospace Power J. 74 (2002). 57

Convention on the Prohibition of the Development, Production and Stockpiling of Bacteriological (Biological) and Toxin Weapons and on Their Destruction, Apr. 10, 1972, 26 U.S.T. 583, 1015 U.N.T.S. 163. 58

Convention on the Prohibition of the Development, Production, Stockpiling and Use of Chemical Weapons and on Their Destruction, Jan. 13, 1993, 1974 U.N.T.S. 45. 59

The 1899 Hague Declaration Concerning Expanding Bullets, July 29, 1899, 1 Am. J. Int'l L. 157-59 (Supp.). 60 Convention on the Prohibition of Military or Any Other Hostile Use of Environmental Modification Techniques, May 18, 1977, 31 U.S.T. 333, 1108 U.N.T.S. 151. 61

Protocol on Prohibitions or Restrictions on the Use of Mines, Booby-Traps and Other Devices (Protocol II), Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, May 3, 1996, 2048 U.N.T.S. 133; Convention on the Prohibition of the Use, Stockpiling, Production and Transfer of Anti-Personnel Mines and on Their Destruction, Sept. 18, 1997, 2056 U.N.T.S. 211. 62 Protocol on Prohibitions or Restrictions on the Use of Incendiary Weapons (Protocol III), Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, Oct. 10, 1980, 1342 U.N.T.S. 171. 63 Protocol on Blinding Laser Weapons (Protocol IV), Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, Oct. 13, 1995, 35 I.L.M. 1218. 64 See generally International Committee of the Red Cross, International Humanitarian Law - Treaties & Documents, available at http://www.icrc.org/ihl.nsf/TOPICS?OpenView (last visited Nov. 3, 2009).

weapons,63 and numerous others.64 The United States is not a party to all of these conventions, and to the extent their requirements do not rise to the level of customary international law, it is not specifically bound by them. On the other hand, the United States has taken considerable interest in the articulation of standards which purport to regulate conduct generally on the battlefield, including how weapons are used. Thus, while no international agreements specifically regulate the use of LARs today, it is possible that such agreements might be negotiated and implemented in the future, as discussed later in this article. In the interim, it bears mention that there are a variety of other potential existing constraints found in military doctrines, professional ethical codes, and public "watchdog" activities (as well as in international law) that might pertain to the present governance dilemma regarding military robotics. These constraints were generally created to address a variety of issues which are not wholly consistent with or applicable to the challenges created by the development and use of robots for military and security purposes. Yet their existence does provide an architecture upon which to build a system of governance regarding the military use of robots on the battlefield.

As we contemplate employing this existing architecture toward the governance of military robotics, it bears noting that governance systems that are successful in obtaining compliance with a particular policy, rule, or directive share a number of important characteristics. Successful systems of "good governance" involve clearly defined and articulated expectations: that is, they identify the precise problems to be solved, changes to be made, or goals to be sought through governance in straightforward terms. The solutions proposed to these problems, moreover, are realistic: that is, they do not attempt to articulate ideal norms of what ought to be, but rather provide feasible norms describing what can, in fact, be accomplished under existing political, cultural and legal constraints. Successful systems of governance, moreover, are holistic and inclusive, in the sense that all stakeholders are identified and involved in some fashion in making the rules. Finally, they issue rules or principles that are subject to assessment: that is, the results are capable of measurement and evaluation of effectiveness, in a manner that allows for


subsequent amendment and improvement of the requirements when appropriate.65 If these principles of good governance are not adhered to, expectations and pronouncements often go unheeded.

In light of these canons of best practice for good governance, we argue that the goal of technological innovation governance should be to ensure that all technological innovation is accomplished within the framework of a culture that respects the long-term effects of such work, while considering, insofar as possible, the likely ramifications of the proposed innovation and development. Appropriate governance should also ensure that future end-users or consumers of the specified technological innovations are aware of those ramifications, ideally in the design phase, but at the very least well before development or application of the innovations in question. All this should be accomplished, moreover, without placing too heavy a legislative hand on, or otherwise discouraging, the creative and competitive energies that generate much-needed innovation.

Measured against the foregoing standards, contemporary governance architecture regarding the innovation and use of military robots would appear wholly inadequate to the task. And yet there is considerable professional, national and international infrastructure upon which to hang a regime of articulated goals and proscriptions. At the professional level, for example, there are multiple codes offering ethical guidance regarding both best practices and limits on acceptable professional practice for a wide range of academic and professional disciplines. These ethical codes might conceivably find themselves applied to innovation in the field of robotics, especially for participants from professions such as engineering, computer science, biology, medicine, law, and psychology.
As a general rule, these ethical codes or guidelines for professional practice are grounded in the traditional responsibilities of their individual professions, and do not contemplate the challenges which can be said to presently exist for innovation generally, or within the field of robotics specifically. Professions, for example, are often regulated at the state level, based upon varying degrees of oversight by private organizations and societies. Those codes speak primarily to issues of the professional's relationship and responsibilities toward clients and customers, as well as toward likely competitors; they likewise address important moral and legal issues such as privacy, intellectual property, and education, but often lack any concrete obligations relating to broader social responsibilities for technology development.66

65 There has been a good deal of discussion in recent years about the subject of good governance, especially in the development area. The United Nations, for example, lists eight characteristics of good governance: consensus oriented, participatory, adherence to the rule of law, effective and efficient, accountable, transparent, responsive, and equitable and inclusive. United Nations Economic and Social Commission for Asia and the Pacific, What is Good Governance?, United Nations, 2009, available at http://www.unescap.org/pdd/prs/ProjectActivities/Ongoing/gg/governance.asp (last visited Oct. 4, 2010). See also Sam Agere, Good Governance, Promotion of Good Governance Principles, Practices and Perspectives (2000). 66

A general review of various professional codes of ethics reveals a paucity of information which might be considered relevant to innovators of new technologies and practices. The American Psychological Association Code of Ethics is intended to protect "the welfare and protection of the individuals and groups with whom psychologists work" and to "improve the condition of individuals, organizations, and society." American Psychological Association, Ethical Principles of Psychologists and Code of Conduct, 2010 Amendments, Preamble, available at http://www.apa.org/ethics/code/index.aspx (last visited Oct. 4, 2010). Psychologists are instructed to "strive to benefit those with whom they work and take care to do no harm," although this is stated as a general principle that is aspirational and non-enforceable. Id., Principle A. The American Medical Association provides nine principles which "... define the essentials of honorable behavior for the physician." American Medical Association, AMA Code of Medical Ethics, Principles of Medical Ethics (2001), available at http://www.ama-assn.org/ama/pub/physician-resources/medical-ethics/code-medical-ethics/principles-medical-ethics.shtml (last visited Oct. 4, 2010). Interestingly, the principles do not contain the traditional do-no-harm proscription but do require the provision of competent medical care "... with compassion and respect for human dignity and rights." Id., Principle I. In most cases, except emergencies, physicians retain the right to "... choose whom to serve, with whom to associate, and the environment in which to provide medical care." Id., Principle VI. The American Society of Civil Engineers requires engineers to "... hold paramount the safety, health and welfare of the public" and to "... strive to comply with the principles of sustainable development in the performance of their professional duties." American Society of Civil Engineers, Code of Ethics, available at http://www.asce.org/inside/inside/codeofethics.cfm (last visited Nov. 24, 2009). A private professional association, the Information Systems Audit and Control Association (ISACA), which purports to serve "IT governance professionals," requires its members to "... support the implementation of, and encourage compliance with, appropriate standards, procedures and controls for information systems." Information Systems Audit and Control Association, ISACA Code of Professional Ethics, available at http://www.isaca.org/Certification/Code-of-Professional-Ethics/Pages/default.aspx (last visited Nov. 25, 2009). 67 University of California, Berkeley, Pledge of Ethical Conduct (1998), available at http://courses.cs.vt.edu/cs3604/lib/WorldCodes/Pledge.html (last visited Jan. 15, 2010). 68 Republic of Korea, Ministry of Information and Communication, quoted in Stefan Lovgren, Robot Code of Ethics to Prevent Android Abuse, Protect Humans, National Geographic News, Mar. 16, 2007, available at http://news.nationalgeographic.com/news/2007/03/070316-robot-ethics.html (last visited Nov. 26, 2009).

Some of these internal ethical codes also appear to contemplate the future contexts in which professionals will have to operate. For example, a "Pledge of Ethical Conduct" printed in the commencement program for the College of Engineering at the University of California, Berkeley in May 1998, reads:

I promise to work for a BETTER WORLD where science and technology are used in socially responsible ways. I will not use my EDUCATION for any purpose intended to harm human beings or the environment. Throughout my career, I will consider the ETHICAL implications of my work before I take ACTION. While the demands placed upon me may be great, I sign this declaration because I recognize that INDIVIDUAL RESPONSIBILITY is the first step on the path to PEACE.67

To date, the most relevant initiative relating to the ethics of military technologies such as robotics is a "Code of Ethics" for robots being proposed by the Republic of South Korea (although the terms of the Code have yet to be fleshed out).68 The main focus of the charter appears to be social problems, such as human control over robots and humans becoming addicted to robot interaction (e.g., using robots as sex toys).69 The document will purportedly also deal with legal issues, such as the protection of data acquired by robots and the establishment of clear identification and traceability of the machines.70

These internal professional codes and norms are complemented by a host of nongovernmental organizations ("NGOs") which contribute to the transparency of innovation programs, especially those performed on behalf of the State. The goals and agendas of these organizations are as varied as their names, but their methodologies generally help to educate the end-user or consumer about what is being developed and what the future may portend. Such NGOs often succeed in establishing a record of evidence and impact regarding a particular thread of innovation, placing this evidence before the public and state funders (legislatures, policy-makers, and appropriate government agencies), and providing news media with the expertise to report on the likely ramifications of proposed technological innovations.71 An NGO specifically focused on promoting arms control for military robots, the International Committee for Robot Arms Control ("ICRAC"), has recently been formed.72

At the national level in the United States, existing governance can be described as decentralized and, in one sense, reactionary. It reflects the push and pull of multiple constituencies and philosophies regarding the efficacy of support for technological innovation. U.S. federal law and regulation reflect the belief that innovation is best encouraged on the one hand by vigorous and unrestrained marketplace competition,73

70

Id.

71 An example is the International POPs Elimination Network (IPEN), "a unique global network of people and public interest organizations" that "share a common commitment to achieve a toxic-free future." Welcome to the International POPs Elimination Network, International POPs Elimination Network, available at http://www.ipen.org/ (last visited Oct. 4, 2010). IPEN describes itself as a global network of more than 700 public interest health and environmental non-governmental organizations working together in over 80 countries. Added to these groups is the literature coming out of think tanks and academic institutions that speaks to the intersection of ethics and technological innovation.

International Committee for Robot Arms Control, available at http://www.icrac.co.uk/ (last visited Oct. 4, 2010). 73 The President's Council on Bioethics recognized this fact in its 2003 report on the state of biotechnology: "Whether one likes it or not, progress in biology and biotechnology is now intimately bound up with industry and commerce. ... Whatever one finally thinks about the relative virtues and vices of contemporary capitalism, it is a fact that progress in science and technology owes much to free enterprise. The possibility of gain adds the fuel of interest to the fire of genius, and even as the profits accrue only to some, the benefits are, at least in principle, available to all. And the competition to succeed provides enormous incentives to innovations, growth, and progress. We have every reason to expect exponential increases in biotechnologies,


while recognizing, on the other hand, the need for the government to organize federal funding, encourage innovation, and regulate the more egregious results of commercialization.74 Within the U.S., for example, there appears to be no urgency regarding the coordination of governance of emerging technologies such as robotics within the federal government generally; nor is there any evidence of a prevailing belief that the present governance architecture requires any type of thorough overhaul to respond to the challenges of the 21st century. Indeed, the President's Council of Advisors on Science and Technology, in its April 2008 report on nanotechnology, concluded:

[T]here are no ethical concerns that are unique to nanotechnology today. That is not to say that nanotechnology does not warrant careful ethical evaluation. As with all new science and technology development, all stakeholders have a shared responsibility to carefully evaluate the ethical, legal, and societal implications raised by novel science and technology developments. However, the[re is] ... no apparent need at this time to reinvent fundamental ethical principles or fields, or to develop novel approaches to assessing societal impacts with respect to nanotechnology.75

Turning specifically to military uses of robotics, the development and use of robots for military purposes continues to be constrained, as mentioned above, by various restrictions regarding the projection of force found in international law that are translated into national laws and regulations. There are, as cited above, multiple conventions which purport to deal with specific technologies and practices. Even though the United States is not a party to all of these conventions, nor necessarily bound by all of them, it is nonetheless the case that the U.S. has taken considerable interest in the articulation of standards which purport to regulate conduct generally on the battlefield, including how weapons are used.
There are five principles which run through the language of the various humanitarian law treaties76 (the rules) which the United States acknowledges and

and, therefore, in their potential uses in all aspects of human life." The President's Council on Bioethics, Beyond Therapy: Biotechnology and the Pursuit of Happiness (2003), at 303, available at http://bioethics.georgetown.edu/pcbe/reports/beyondtherapy/beyond_therapy_final_webcorrected.pdf. 74 See, e.g., Harris-Kefauver Act, Pub. L. No. 87-781, 76 Stat. 780 (1962) (codified as amended at 21 U.S.C. § 301 et seq. (1998)) (commonly referred to as the 1962 Drug Amendments); National Research Act of 1974, Pub. L. No. 93-348, 88 Stat. 342 (1974); 21st Century Nanotechnology Research and Development Act, Pub. L. No. 108-153, 117 Stat. 1923 (2003). 75 President's Council of Advisors on Science and Technology, National Nanotechnology Initiative: Second Assessment and Recommendations of the NNAP (2008), available at http://www.nano.gov/PCAST_NNAP_NNI_Assessment_2008.pdf (last visited Jan. 15, 2009) (emphasis added). 76

Humanitarian law is that body of international law comprised of a set of rules which seek to limit


generally honors regarding the conduct of warfare. These are: (i) a general prohibition on the employment of weapons of a nature to cause superfluous injury or unnecessary suffering; (ii) military necessity; (iii) proportionality; (iv) discrimination; and (v) command responsibility. These principles, as discussed below, impose ethical and arguably legal restraints on at least some uses of lethal autonomous robots.

First, some weapons, it is argued, are patently inhumane, no matter how they are used or what the intent of the user is. This principle has been recognized since at least 1907,77 although consensus over which weapons fall within this category tends to change over time. The concept here is that some weapons are design-dependent: that is, their effects are reasonably foreseeable even as they leave the laboratory. In 1996, the International Committee of the Red Cross at Montreux articulated a test to determine whether a particular weapon would foreseeably cause superfluous injury or unnecessary suffering.78 The so-called "SIrUS" criteria would ban weapons when their use would result in:

• A specific disease, specific abnormal physiological state, a specific and permanent disability or specific disfigurement; or
• Field mortality of more than 25% or a hospital mortality of more than 5%; or
• Grade 3 wounds as measured by the Red Cross wound classification scale; or
• Effects for which there is no well-recognized and proven treatment.79
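Read mechanically, the SIrUS criteria form a purely disjunctive test: a weapon design is flagged if any single criterion is met, with no weighing of military necessity. The sketch below makes that logical structure explicit; the function name, parameter names, and the boolean/threshold encoding are illustrative assumptions, since the actual ICRC criteria call for medical judgment rather than numeric inputs.

```python
# Hedged sketch of the SIrUS criteria as a disjunctive test. A design is
# flagged if ANY one criterion is met. All names and encodings here are
# illustrative, not part of the ICRC's formulation.

def sirus_flagged(specific_effect: bool,
                  field_mortality: float,
                  hospital_mortality: float,
                  grade3_wounds: bool,
                  untreatable_effects: bool) -> bool:
    """Return True if a weapon's design-dependent effects meet any criterion."""
    return bool(
        specific_effect              # specific disease, disability, or disfigurement
        or field_mortality > 0.25    # field mortality of more than 25%
        or hospital_mortality > 0.05 # hospital mortality of more than 5%
        or grade3_wounds             # Grade 3 on the Red Cross wound scale
        or untreatable_effects       # no well-recognized, proven treatment
    )

# A hypothetical weapon: 10% field mortality, 2% hospital mortality,
# no specific effects, treatable sub-Grade-3 wounds.
print(sirus_flagged(False, 0.10, 0.02, False, False))  # False: no criterion met
```

The disjunctive form is exactly what draws the criticism discussed below: a single medical threshold can exclude a weapon without any consideration of military necessity.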

The operative term here is specific; the criteria speak to technology specifically designed to accomplish more than render an adversary hors de combat. This test for determining weapons exclusion is a medical test and does not take into consideration the issue of military necessity. For this reason, these SIrUS criteria have been roundly

the effect of armed conflict. Primary conventions include the Geneva Conventions of 1949, supplemented by the Additional Protocols of 1977 relating to the protection of victims of armed conflicts; the 1954 Convention for the Protection of Cultural Property in the Event of Armed Conflict and additional protocols; the 1972 Biological Weapons Convention; the 1980 Conventional Weapons Convention and its five protocols; the 1997 Ottawa Convention on anti-personnel mines; and the 2000 Optional Protocol to the Convention on the Rights of the Child on the involvement of children in armed conflict. International Committee of the Red Cross, What is International Humanitarian Law? (2004), available at http://www.icrc.org/Web/Eng/siteeng0.nsf/htmlall/section_ihl (last visited Nov. 25, 2009). 77 See Hague Convention (IV) Respecting the Laws and Customs of War on Land and Its Annex: Regulations Concerning the Laws and Customs of War on Land, Oct. 18, 1907, 36 Stat. 2277, available at http://www.unhcr.org/refworld/docid/4374cae64.html (last visited Oct. 9, 2010). 78

D. Holdstock, International Committee of the Red Cross: the medical profession and the effects of weapons, 12 Medicine, Conflict and Survival 254 (1996). 79

International Committee of the Red Cross, The SIrUS Project: Towards a Determination of Which Weapons Cause "Superfluous Injury or Unnecessary Suffering" (1997), available at http://www.loc.gov/rr/frd/Military_Law/pdf/SIrUS-project.pdf. See also Andrew Kock, Should War be Hell?, Jane's Defense Weekly, May 10, 2000, at 23.


criticized and rejected by the United States specifically, and by the international community generally, notwithstanding support for the general principle against the use of inhumane weapons.80

The second principle, military necessity, requires a different analysis. This principle "... justifies measures of regulated force not forbidden by international law which are indispensable for securing the prompt submission of the enemy, with the least possible expenditures of economic and human resources."81 Military necessity recognizes the benefit to friend and foe alike of a speedy end to hostilities. Protracted warfare, it assumes, creates more rather than less suffering for all sides. In order to determine the necessity for the use of a particular technology, then, one needs to know what the definition of victory is, and how to measure the submission of the enemy, in order to determine whether the technology will be necessary in this regard.

The third principle, proportionality, is of considerable concern to the developer and user of new technologies. The use of a particular technology is not proportional if the loss of life and damage to property incidental to attacks is excessive in relation to the concrete and direct military advantage expected to be gained.82 In order to make this determination, it can be argued, one must consider the military necessity of a particular use and weigh the benefits of that use in furtherance of a specific objective against the collateral damage that may be caused.

Discrimination, the fourth principle, goes to the heart of moral judgment. Indiscriminate attacks (uses) are prohibited under the rules. Indiscriminate uses occur whenever such uses are not directed against a specific military objective, or otherwise employ a method or means of combat whose effects cannot be directed at a specified military target (the indiscriminate bombing of cities, for example). Indiscriminate usage also encompasses any method or means of combat whose effects cannot be limited as required, or that are otherwise of a nature to strike military and civilian targets without distinction.

A final principle is command responsibility, the principle that exposes multiple levels of superiors to various forms of liability for failure to act in the face of foreseeable illegal activities. This is a time-honored principle, grounded on the contract between soldiers and their superiors, which requires soldiers to act and superiors to determine when and how to act. It has a long history reflective of the need for control on

80

See Donna Marie Verchio, Just Say No! The SIrUS Project: Well-intentioned, but Unnecessary and Superfluous, 51 A.F. L. Rev. 183 (2001). 81

See Roy Gutman & Daoud Kuttab, Indiscriminate Attack, in Crimes of War, the Book, What the Public Should Know (2007), available at http://www.crimesofwar.org/thebook/indiscriminate-attack.html (last visited Oct. 9, 2010). "Military objectives are limited to those objects which by their nature, location, purpose or use make an effective contribution to military action and whose total or partial destruction, capture or neutralization, in the circumstances ruling at the time, offers a definite military advantage." 82

Headquarters, Department of the Army, U.S. Army Field Manual 27-10, The Law of Land Warfare, change 1, para. 41 (1976), available at http://www.globalsecurity.org/military/library/policy/army/fm/27-10/CHANGE1.htm (last visited Oct. 9, 2010).


the battlefield.83

A 1977 Protocol to the Geneva Conventions requires that each State Party "determine whether the employment of any new weapon, means or method of warfare that it studies, develops, acquires or adopts would, in some or all circumstances, be prohibited by international law."84 The legal framework for this review is the international law applicable to the State, including international humanitarian law ("IHL"). In particular, this consists of the treaty and customary prohibitions and restrictions on specific weapons, as well as the general IHL rules applicable to all weapons, means and methods of warfare. General rules include the principles described above, such as protecting civilians from the indiscriminate effects of weapons and combatants from unnecessary suffering. The assessment of a weapon in light of the relevant rules will require an examination of all relevant empirical information pertinent to the weapon, such as its technical description and actual performance, and its effects on health and the environment. This is the rationale for the involvement of experts of various disciplines in the review process.85 Once again, the United States is not a signatory to this Protocol and thus is technically not bound by its requirements. Nonetheless, to the extent that it sets out reasonable requirements and methodologies for use by states fielding new and emerging technologies, this treaty could well set the standard in international law for what may be considered appropriate conduct.
A final constraint worth noting is the emerging trend in international law to hold accountable, through litigation based on the concept of universal jurisdiction, those responsible for fielding weapons which allegedly contravene the principles enunciated above.86 While litigation to date has revolved primarily

83 See generally Brandy Womack, The Development and Recent Applications of the Doctrine of Command Responsibility: With Particular Reference to the Mens Rea Requirement, in International Crime and Punishment: Selected Issues, Volume 1, 117 (Yee Sienho ed., 2003).

84 Protocol Additional to the Geneva Conventions of 12 Aug. 1949, and relating to the Protection of Victims of International Armed Conflicts, art. 36, 8 June 1977, 1125 U.N.T.S. 3, available at http://www.icrc.org/ihl.nsf/full/4707opendocument. 85 Kathleen Lewand, A Guide to the Legal Review of New Weapons, Means and Methods of Warfare, Measures to Implement Article 36 of Additional Protocol I of 1977, International Committee of the Red Cross (2006), available at http://www.icrc.org/eng/assets/files/other/icrc_002_0902.pdf (last visited Oct. 10, 2010). 86

The concept of universal jurisdiction is a customary international law norm that permits states to regulate certain conduct to which they have no discernible nexus. Generally, it is recognized as a principle of international law that all states have the right to regulate certain conduct regardless of the location of the offense or the nationalities of the offender or the victims. Piracy, slave trade, war crimes and genocide are all generally accepted subjects of universal jurisdiction. Belgium, Germany and Spain have all entertained such prosecutions. The issue of lawfare is also of concern. Lawfare is a strategy of using or misusing law as a substitute for traditional military means to achieve military objectives. Each operation conducted by the U.S. military results in new and expanding efforts by groups and countries to use lawfare to respond to military force. American military authorities are still grappling with many of these issues. See Council on Foreign Relations, Transcript, Lawfare, The Latest in Asymmetries (2003), available at http://www.cfr.org/national-security-and-defense/lawfare-latest-asymmetries/p5772 (last visited Dec. 10, 2009).

around allegations of practices such as genocide, torture, rendition, and illegal interrogation, there is no reason to believe that future prosecutions may not be justified where decisions regarding illegal innovation, adaptation, and use of weapons systems are made. These various principles and requirements of international humanitarian law and ethical rules of military conduct would clearly impose some limitations on the development and use of lethal autonomous robots. However, given the ambiguous meaning and uncertain binding legal status of these principles, they are unlikely to adequately constrain and shape the development and use of LARs on their own. Additional oversight mechanisms may therefore be warranted; these are further explored in the subsequent section.

V. LEGALLY BINDING INTERNATIONAL AGREEMENTS

A more formal and traditional approach for oversight of a new weapons category such as LARs would be some form of binding international arms control agreement.87 Under existing international law, there is no specific prohibition on lethal autonomous robots. In September 2009, robotics expert Noel Sharkey, physicist Jurgen Altmann, bioethicist Robert Sparrow, and philosopher Peter Asaro founded the International Committee for Robot Arms Control ("ICRAC") to campaign for limiting lethal autonomous robots through an international agreement modeled on existing arms control agreements such as those restricting nuclear and biological weapons.88 ICRAC called for military robots to be barred from space and for all robotic systems to be prohibited from carrying nuclear weapons.89 The ICRAC is a small group at this time and as of yet 87

Sparrow, supra note 41, at 27-29.

88 Nic Fleming, Campaign Asks for International Treaty to Limit War Robots, New Scientist, Sept. 30, 2009, available at http://www.newscientist.com/article/dn17887-campaign-asks-for-international-treaty-to-limit-war-robots.html (last visited Oct. 10, 2010). See also The Statement of the 2010 Expert Workshop, supra note 4 (statement from expert workshop organized by ICRAC calling for an arms control regime that would prohibit the further development, acquisition, deployment, and use of autonomous robot weapons). 89 Id. It is worth noting that there have long been discussions in other contexts of banning at least some weapons from space. For example, the Obama Administration has expressed an intent to seek a ban on weapons that "interfere with military and commercial satellites." See, e.g., Frank Morring, Jr., White House Wants Space Weapons Ban, Aviation Week, Jan. 27, 2009, available at http://www.aviationweek.com/aw/generic/story_channel.jsp?channel=space&id=news/Spacewea012709.xml&headline=White%20House%20Wants%20Space%20Weapons%20Ban (last visited Oct. 10, 2010). In contrast, the National Space Policy issued by the Bush White House in 2006 states in part that the "United States will oppose the development of new legal regimes or other restrictions that seek to prohibit or limit U.S. access to or use of space." See, e.g., Marc Kaufman, Bush Sets Defense As Space Priority, Wash. Post, Oct. 18, 2006, available at http://www.washingtonpost.com/wp-dyn/content/article/2006/10/17/AR2006101701484.html (last visited Oct. 10, 2010).

its campaign does not yet seem to have gained the momentum necessary to spark a new international legal regime. However, there is precedent for a non-governmental organization, the International Campaign to Ban Landmines, successfully leading the charge towards banning a weapons system.90

While ICRAC's work has raised the issue of limiting lethal autonomous robots through an international arms control agreement, the wisdom of such a course of action is far from clear. Do explicit international legal restrictions on lethal autonomous robots make sense? Are they feasible - both from a political and a technological perspective? Does the ICRAC's specific proposal make sense? The goal of this section is to make some preliminary points about what the options may be for international legal restrictions on lethal autonomous robots, if a policy choice is made to attempt to restrict them.

International law contains a significant number and diversity of precedents for restricting specific weapons. Existing legally binding arms control agreements and other instruments include a wide variety of different types of restrictions on targeted weapons, including prohibitions and limitations (restrictions that fall short of prohibition) on (i) acquisition, (ii) research and development, (iii) testing, (iv) deployment, (v) transfer or proliferation, and (vi) use. These various types of prohibitions and limitations form a kind of menu from which the drafters of an international legal instrument addressing lethal autonomous robots - or other emerging warfighting technologies - could choose in accordance with their goals and the parameters of political support for such restrictions. A similar menu could be created of the various types of monitoring, verification, dispute-resolution, and enforcement mechanisms that implement the prohibitions and limitations contained in existing international legal arms control instruments.
These prohibitions and limitations (as well as any accompanying monitoring/verification, dispute-resolution, and enforcement provisions) can be contained in any of a number of different types of international legal instruments. They are typically contained in legally binding, multilateral agreements, including in multilateral agreements primarily focused on arms control and also in the Rome Statute of the International Criminal Court. However, there are also examples of prohibitions and limitations contained in legally binding, bilateral agreements, as well as examples of prohibitions and limitations contained in legally binding resolutions of the United Nations Security Council or in customary international law (which consists of rules of law derived from the consistent conduct of states acting out of the belief that the law required them to act that way). As with the content of the restrictions and their implementing provisions, the choice of type of instrument depends on the drafters' goals and the parameters of political support for the desired restrictions and implementing provisions.

New international legal arms control instruments are typically freestanding. However, there is also at least one existing multilateral legal framework agreement with respect to which it is worth exploring whether that agreement could usefully be amended to itself provide a vehicle for some or all desired restrictions on lethal autonomous robots.

90 See, e.g., Raymond Bonner, How a Group of Outsiders Moved Nations to Ban Land Mines, N.Y. Times, Sept. 20, 1997, at A5.

This is the 1980 Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons which may be Deemed to be Excessively Injurious or to have Indiscriminate Effects (the CCW),91 which has been ratified by over 100 state parties.92 The operative provisions of the CCW are contained within its protocols. The five protocols currently in force contain rules for the protection of military personnel and, particularly, of civilians and civilian objects from injury or attack under various conditions by means of: fragments that cannot readily be detected in the human body by x-rays (Protocol I), landmines and booby traps (amended Protocol II), incendiary weapons (Protocol III), blinding lasers (Protocol IV), and explosive remnants of war (Protocol V).93 It is worth noting that the case that lethal autonomous robots should be restricted by the CCW could be made most effectively if it were argued that such robots are contrary to the "principle," cited in the CCW preamble, "that prohibits the employment in armed conflicts of weapons, projectiles and material and methods of warfare of a nature to cause superfluous injury or unnecessary suffering."94

A. Menu of Types of Restrictions Contained in Current International Legal Arms Control Instruments

Some international legal arms control instruments prohibit a full range of activities involving the targeted weapons. For example, state-parties to the Convention on the Prohibition of the Use, Stockpiling, Production and Transfer of Anti-Personnel Mines and on Their Destruction, typically referred to as the "Mine Ban Treaty," commit to not developing, producing, acquiring, retaining, stockpiling, or transferring antipersonnel landmines.95 The following menu contains additional examples of existing international legal instruments which adopt the specified types of restrictions on a narrower basis:

91 Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects (CCW), Oct. 10, 1980, 1342 U.N.T.S. 137, reprinted in 19 I.L.M. 1523 (1980), available at http://www.unog.ch/80256EDD006B8954/(httpAssets)/40BDE99D98467348C12571DE0060141E/$file/CCW+text.pdf. 92 See, e.g., Convention on Certain Conventional Weapons (CCW) At a Glance, available at http://www.armscontrol.org/factsheets/CCW. 93

Id.

94 This argument would of course be contrary to the contentions of some robotics experts that lethal autonomous robots are particularly unlikely "to cause superfluous injury or unnecessary suffering." 95

Convention on the Prohibition of the Use, Stockpiling, Production and Transfer of Anti-Personnel Mines and on Their Destruction, Sept. 18, 1997, 36 I.L.M. 1507 (1997).

1. Prohibitions and Limitations on Acquisition

Several international legal arms control instruments completely prohibit the acquisition of targeted weapons. For example, the Biological Weapons Convention ("BWC") prohibits all state-parties from acquiring, producing, developing, stockpiling, or retaining - and requires all state-parties to, within nine months, destroy or divert to peaceful purposes - 1) biological agents and toxins "of types and in quantities that have no justification for prophylactic, protective or other peaceful purposes"; and 2) weapons, equipment and delivery vehicles "designed to use such agents or toxins for hostile purposes or in armed conflict."96 The Convention on the Prohibition of the Development, Production, Stockpiling and Use of Chemical Weapons and on their Destruction ("CWC") prohibits all state-parties from producing or acquiring, as well as developing, stockpiling or retaining, chemical weapons.97

In contrast, the Treaty on the Non-Proliferation of Nuclear Weapons ("NPT") creates two classes of states with regard to nuclear weapons.98 Nuclear-weapon state-parties are those that had manufactured and exploded a nuclear weapon or other nuclear explosive device prior to January 1, 1967 (China, France, Russia, the United Kingdom, and the United States).99 The NPT does not require nuclear-weapon state-parties to give up their nuclear weapons, but does require those parties to "pursue negotiations in good faith on effective measures relating to cessation of the nuclear arms race at an early date and to nuclear disarmament."100 Non-nuclear-weapon state-parties to the NPT are prohibited from receiving, manufacturing, or otherwise acquiring nuclear weapons.101

The Inter-American Convention on Transparency in Conventional Weapons Acquisitions102 provides a very different model, with a focus on transparency rather than

96

Convention on the Prohibition of the Development, Production and Stockpiling of Bacteriological (Biological) and Toxin Weapons and on Their Destruction (BWC), Apr. 10, 1972, 26 U.S.T. 583, 1015 U.N.T.S. 163, available at http://www.unog.ch/80256EDD006B8954/(httpAssets)/C4048678A93B6934C1257188004848D0/$file/BWC-text-English.pdf (last visited Oct. 10, 2010). 97

Convention on the Prohibition of the Development, Production, Stockpiling and Use of Chemical Weapons and on Their Destruction (CWC), Jan. 13, 1993, 1974 U.N.T.S. 45, available at http://www.opcw.org/chemical-weapons-convention/articles/ (last visited Oct. 10, 2010). 98 Treaty on the Non-Proliferation of Nuclear Weapons (NPT), opened for signature July 1, 1968, 21 U.S.T. 483, 729 U.N.T.S. 161 (entered into force Mar. 5, 1970), available at http://www.iaea.org/Publications/Documents/Infcircs/Others/infcirc140.pdf (last visited Oct. 10, 2010). 99

Id. at Art. IX.

100

Id. at Art. VI.

101

Id. at Art. II.

102

Organization of American States, Inter-American Convention on Transparency in Conventional Weapons Acquisitions, June 7, 1999, available at

prohibition of acquisitions. The Convention does not prohibit any acquisitions but does require its state-parties to annually report on their imports of certain specified heavy weapons, as well as submit notifications within 90 days of their incorporation of certain specified heavy weapons into their armed forces inventory, whether those weapons were imported or produced domestically.103

2. Prohibitions and Limitations on Research and Development

Few, if any, international legal arms control instruments prohibit all research that could be useful for targeted weapons. Limitations on development differ from instrument to instrument. The CWC flatly prohibits the development of any chemical-weapon munition or device.104 In contrast, the BWC contains a more nuanced prohibition, banning the development, production, acquisition, and retention of 1) microbial or other biological agents or toxins "of types and in quantities that have no justification for prophylactic, protective or other peaceful purposes" and 2) weapons, equipment or means of delivery "designed to use such agents or toxins for hostile purposes or in armed conflict."105 It is important to note that restrictions based on quantities or intended use rather than the underlying nature of the technology can be exceptionally difficult to verify, at least without highly intrusive inspections.

3. Prohibitions and Limitations on Testing

Prohibitions and limitations on testing of targeted weapons are most prominent in the nuclear-weapons context.
For example, the Comprehensive Test Ban Treaty ("CTBT"), which has not yet entered into force, prohibits "any nuclear weapon test explosion or any other nuclear explosion."106 The CTBT's entry into force awaits ratification by nine key countries, including the United States.107 In contrast, the 1963 Treaty Banning Nuclear Weapon Tests in the Atmosphere, in Outer Space and Under Water (also known as the "Limited Test Ban Treaty") - which unlike the CTBT is in force - specifically prohibits nuclear-weapons tests "or any other nuclear explosion" only

http://www.oas.org/juridico/english/treaties/a-64.html (last visited Oct. 10, 2010).

104

CWC, supra note 58, at Arts. I-II.

105

BWC, supra note 57, at Art. I.

106

Comprehensive Nuclear Test Ban Treaty, opened for signature Sept. 24, 1996, 35 I.L.M. 1439 (1996), available at http://www.ctbto.org/the-treaty/treaty-text/ (last visited Oct. 10, 2010). 107

Comprehensive Nuclear Test Ban Treaty Organization Preparatory Commission, CTBTO Fact Sheet (2010), available at http://www.ctbto.org/fileadmin/user_upload/public_information/CTBT_FactSheet_02_2010.pdf (last visited Oct. 10, 2010).

in the atmosphere, in outer space, and under water.108 The Limited Test Ban Treaty also prohibits nuclear explosions in all other environments, including underground, if they cause "radioactive debris to be present outside the territorial limits of the State under whose jurisdiction or control" the explosions were conducted.109

4. Prohibitions and Limitations on Deployment

Some international legal arms control instruments focus on limiting deployment of the targeted weapons, such as with overall or regional numerical caps. For example, the Strategic Offensive Reductions Treaty, entered into by the U.S. and Russia in 2002, requires the two countries to reduce their operationally deployed, strategic nuclear forces to between 1,700 and 2,200 warheads by December 31, 2012.110 The Conventional Armed Forces in Europe Treaty, ratified by the United States in 1992, contains bloc and regional limits on deployment of certain weapons.111

5. Prohibitions and Limitations on Transfer/Proliferation

Many international legal arms control instruments include prohibitions or limitations on transfer or other proliferation of the targeted weapons. For example, the NPT prohibits parties that possess nuclear weapons from transferring the weapons to any recipient as well as from assisting, encouraging, or inducing any non-nuclear-weapon state to manufacture or otherwise acquire such weapons in any way.112

108 Treaty Banning Nuclear Weapon Tests in the Atmosphere, in Outer Space and Under Water, Aug. 5, 1963, 480 U.N.T.S. 43, 14 U.S.T. 1313, available at http://disarmament.un.org/treatystatus.nsf/44e6eeabc9436b78852568770078d9cO/35ea6a019d9eO58a852568770079dd94?OpenDocument (last visited Oct. 10, 2010). 109

Id.

110 Treaty on Strategic Offensive Reductions (SORT), May 24, 2002, U.S.-Russ., 41 I.L.M. 799 (2002), available at http://moscow.usembassy.gov/joint_05242002.html (last visited Oct. 10, 2010). See also Arms Control Association, The Strategic Offensive Reductions Treaty (SORT) At a Glance, available at http://www.armscontrol.org/factsheets/sort-glance (last visited Oct. 10, 2010). 111

Treaty on Conventional Armed Forces in Europe (CFE), Nov. 19, 1990, 30 I.L.M. 1, available at http://www.dod.gov/acq/acic/treaties/cfe/index.htm (last visited Oct. 10, 2010). The Adapted Conventional Armed Forces in Europe Treaty, which has not been ratified by the United States, would replace the original treaty's bloc and regional arms limits with national weapon ceilings. See Arms Control Association, The Adapted Conventional Armed Forces in Europe Treaty at a Glance, available at http://www.armscontrol.org/factsheets/adaptcfe (last visited Oct. 10, 2010). 112

NPT, supra note 98, at Art. I.

The CWC bans the direct or indirect transfer of chemical weapons.113 The CWC also bans assisting, encouraging, or inducing anyone to engage in CWC-prohibited activity.114 Similarly, the BWC bans the transfer to any recipient, directly or indirectly, of - and assisting any state, group of states, or international organization to manufacture or otherwise acquire - 1) biological agents and toxins "of types and in quantities that have no justification for prophylactic, protective or other peaceful purposes" and 2) weapons, equipment and delivery vehicles "designed to use such agents or toxins for hostile purposes or in armed conflict."115 In contrast, the Inter-American Convention on Transparency in Conventional Weapons Acquisitions does not prohibit exports but does require its state-parties to annually report on their exports of certain specified heavy weapons.116

6. Prohibitions and Limitations on Use

Several international legal arms control instruments include prohibitions or limitations on use of the targeted weapons. The International Court of Justice, in its 1996 advisory opinion on the Legality of the Threat or Use of Nuclear Weapons, ruled that "the threat or use of nuclear weapons would generally be contrary to the rules of international law applicable in armed conflict, and in particular the principles and rules of humanitarian law; However, in view of the current state of international law, and of the elements of fact at its disposal, the Court cannot conclude definitively whether the threat or use of nuclear weapons would be lawful or unlawful in an extreme circumstance of self-defense, in which the very survival of a State would be at stake."117 The Rome Statute of the International Criminal Court prohibits 1) employing poison or poisoned weapons, 2) employing poisonous gases, and 3) employing bullets which flatten or expand easily in the human body.118 This list is potentially expandable.
While the CWC bans chemical weapons use or military preparation for use,119 the BWC does not ban the use of biological and toxin weapons but reaffirms the 1925 Geneva Protocol, which

113

CWC, supra note 58, at Art. I.

BWC, supra note 57, at Art. III. 116

Inter-American Convention on Transparency in Conventional Weapons Acquisitions, supra note 102. 117

International Court of Justice, Legality of the Threat or Use of Nuclear Weapons, Advisory Opinion, 1996 I.C.J. (July 8). 118

Rome Statute of the International Criminal Court, July 17, 1998, 2187 U.N.T.S. 3 (1998). 119

CWC, supra note 58.

prohibits such use.120 Protocol IV of the CCW prohibits the use of lasers specifically designed to cause permanent blindness.121 It further obliges state-parties to make every effort to avoid causing permanent blindness through the use of other lasers.122 While prohibiting the use of blinding lasers, the convention does not rule out their development or stockpiling.123 However, it does outlaw any trade in such arms.124

B. Cautionary Note Regarding the Utility of International Legal Arms Control Instruments

It is worth noting that even the broadest and most aggressively implemented international legal arms control instruments suffer from certain inherent weaknesses. For example, existing international legal arms control instruments apply only to states. Their impact on non-state actors is at best indirect (for example, the CWC and BWC require state-parties to prohibit activities on their territory that are prohibited directly for them). Yet non-state actors, particularly transnational terrorist groups, may present a significant threat of utilizing lethal autonomous robots. In addition, these international legal arms control instruments typically require state consent - states can choose not to ratify these agreements and can withdraw from them if they do join.125

120

BWC, supra note 57.

121 Protocol IV on Blinding Laser Weapons, annexed to Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May be Deemed to be Excessively Injurious or to Have Indiscriminate Effects, Oct. 13, 1995, 35 I.L.M. 1218 (1996). 122

Id. at Art. 2.

123

Id. at Art. 1.

124 For details of how the ban came about, see Louise Doswald-Beck, New Protocol on Blinding Laser Weapons, 312 Int'l Rev. Red Cross (1996), available at http://www.icrc.ch/web/eng/siteeng0.nsf/html/57JN4Y, and Christina Grisewood, Limits of Lasers, Mag. Int'l Red Cross and Red Crescent Movement, available at http://www.redcross.int/EN/mag/magazine1996_2/18-19.html. 125 Arms control prohibitions or restrictions imposed by legally binding United Nations Security Council ("UNSC") resolutions, such as UNSCR 1540, differ from this model. A resolution approved by the UNSC under its Chapter VII authorities is legally binding on all UN member states, whether or not they supported the resolution. In addition, such a resolution typically cannot be rescinded or amended without the acquiescence of all five permanent members of the UNSC (China, France, Russia, the UK, and the U.S.). Customary international law is another form of international law that can in some circumstances bind states without their approval and from which states are in some circumstances not permitted to withdraw.

VI. SOFT LAW/GOVERNANCE APPROACHES

There have been a number of proposed strategies for managing the risks and regulating the uses of emerging military technologies.126 Those strategies vary in formality and scope. Proposed measures range from formal, binding agreements such as treaties to informal initiatives such as codes of conduct.127 Oversight of lethal autonomous robots is likely to lean towards the latter end of this spectrum, at least initially. Some form of coordinated international oversight is warranted, but at least in the short term, formal "hard law" treaties may not be practicable.

In a growing number of areas of international oversight, ranging from environmental to commercial to social to military issues, traditional "hard law" treaties and agreements are being supplemented or in some cases displaced by new "soft law" approaches.128 "Soft law" approaches seek to create and implement substantive principles or norms without creating enforceable legal requirements. The traditional model of binding international regulation that relies on formal treaties negotiated by government officials has a number of limitations, including the excessive resources and time needed to negotiate a formal international agreement, problems in enforcement of and compliance with such agreements, and the lack of flexibility and responsiveness in adapting such instruments to changing circumstances.129 Many new models of international oversight or harmonization have been developed to circumvent such problems. These new models tend to be more flexible and reflexive, capable of being launched relatively quickly and adapted easily to changing technological, political and security landscapes. These new soft law approaches have their own limitations, including perhaps most importantly that they are not as binding and often not as specific as traditional legal agreements.
Yet, their growing popularity is due to advantages such as the relative ease by which they can be adopted and updated, and the broader roles they create for stakeholders to participate in their substantive formulation.130 Some of the key soft law/governance approaches that have been applied in other areas of emerging technologies that could conceivably be adapted to apply to the military

126

Comm. on Advances in Tech. and the Prevention of Their Application to Next Generation Biowarfare Threats Staff, Nat'l Research Council, An International Perspective on Advancing Technologies and Strategies for Managing Dual-Use Risks: Report of a Workshop 73-74 (Nat'l Academies Press 2005). 127

Id.

128

Kenneth W. Abbott & Duncan Snidal, Hard and Soft Law in International Governance, 54 Int'l Org. 421, 434-50 (2000) (providing numerous examples of "soft law" approaches in international law); Richard L. Williamson, Jr., Hard Law, Soft Law, and Non-Law in Multilateral Arms Control: Some Compliance Hypotheses, 4 Chi. J. of Int'l L. 59, 63 (2003) (detailing the growing use of "soft law" in international arms control). 129

Abbott & Snidal, supra note 128, at 423.

130

Williamson, supra note 128, at 63.

use of robotics are briefly described below.

A. Codes of Conduct

Codes of conduct are non-binding and often somewhat general guidelines defining responsible, ethical behavior, which are intended to promote a culture of responsibility. They can be developed and implemented by a variety of different entities, including governmental agencies, industry groups, individual companies, professional or scientific societies, non-governmental organizations, or collaborative partnerships involving two or more of these entities.

One of the first, and possibly most successful, codes of conduct was developed at the outset of the field of genetic engineering. The Asilomar guidelines on recombinant DNA research were adopted in the 1970s in response to safety concerns about some early genetic engineering experiments.131 These guidelines were initially developed by scientists based on discussions at a 1975 conference held at the Asilomar Conference Center in Pacific Grove, California, and were subsequently adopted into more binding guidelines (at least for funding recipients) by the National Institutes of Health in 1976. The latter guidelines have been widely complied with by scientists around the world.132

More recently, codes of conduct have emerged at the forefront of discussions to restrict the use of genetic engineering to create new biological weapons.133 Although there are concerns that unenforceable codes of conduct will not provide strong enough assurances against the creation of new genetically engineered biological weapons, they may play an important bridging role in providing some initial protection and governance until more formal legal instruments can be negotiated and implemented.134 In the same way, codes of conduct may play a transitional role in establishing agreed-upon principles for the military use of robots. Codes of conduct are being created for other emerging technologies with potential military applications.
The areas of synthetic biology and nanotechnology are two examples. In synthetic biology, three different groups have recently proposed competing codes of conduct to manage security implications.135 The groups are the U.S. Government, the International Association Synthetic Biology (IASB), and the

131 Paul Berg, Asilomar 1975: DNA Modification Secured, 455 Nature 290, 290 (2008).

132 Id. at 290-91.

133 Jeanne Guillemin, Can Scientific Codes of Conduct Deter Bioweapons? 07-06 MIT Center for International Studies Audit of the Conventional Wisdom (2007) at 1-2, available at http://web.mit.edu/cis/pdf/Audit_04_07_Guillemin.pdf.

134 Id. at 3.

135 See Markus Fischer & Stephen M. Maurer, Harmonizing Biosecurity Oversight for Gene Synthesis, 28 Nature Biotechnology 20 (2010).

The Applied Ethics of Emerging Military and Security Technologies Vol. XII

The Columbia Science and Technology Law Review

243 2011

International Gene Synthesis Consortium (IGSC).136 This proliferation of competing codes flags a key question about who has the authority and influence to promulgate effective codes of conduct that relevant parties will comply with.

A number of codes of conduct have also been created in the field of nanotechnology. The first was developed by the Foresight Institute in the form of "guidelines," with a primary objective of discouraging the creation and deployment of autonomous replicating nanosystems.137 The Foresight Institute guidelines have since been updated six times,138 demonstrating the flexibility and adaptivity that are possible with codes of conduct, which can be updated relatively easily (at least compared with a treaty or other more formal instrument). The current Foresight guidelines are extremely thorough and address issues and implications of nanotechnology in professional, industry, military, health, policy and other contexts.139 The guidelines are based on the premise that professional ethics and soft law measures can be at least as effective as hard law in promoting safe practices.140 The drafters of the Foresight guidelines also recognize the value of promoting the least restrictive legal alternative while developing good practices in areas of emerging technology.141

The European Union recently adopted a code of conduct for nanotechnology researchers.142 The code promotes a responsible and open approach to research conducted within a "safe, ethical and effective framework."143 The code will be monitored and revised regularly to keep it current with advances in nanoscience and nanotechnology.144 Another nanotechnology code of conduct originating in Europe, but with

136 Id. at 20.

137 Neil Jacobstein, Foresight Institute, Foresight Guidelines for Responsible Nanotechnology Development, Draft Version 6: April, 2006, available at http://www.foresight.org/guidelines/ForesightGuidelinesV6.pdf, at 4.

141 Id. at 6.

142 Commission Recommendation of 07/02/2008 on a Code of Conduct for Responsible Nanosciences and Nanotechnologies Research (July 2008), available at http://ec.europa.eu/nanotechnology/pdf/nanocode-rec_pe0894c_en.pdf.

143 Id. at 5.

144 Id. at 6.


international applicability, is the Responsible NanoCode.145 The Responsible NanoCode is an example of a code of conduct developed through significant collaboration and designed to reach a wide target audience. Its creators are the United Kingdom's Royal Society, a nanotechnology industry trade group, and a public interest organization.146 The code is aimed at companies that handle nanomaterials, and its specific objective is to "establish a consensus of good practice in the research, production, retail and disposal of products using nanotechnologies."147 The code was developed to be universally applicable; it was devised "to be adopted by organisations in any part of the world, under any regulatory regime."148

Much of the discourse concerning codes of conduct tends to treat them as a single concept with a singular meaning or interpretation. But codes of conduct exist on a continuum with respect to their objectives, specificity, audience, and expectations of compliance. The goal of all codes is to affect behavior, but different types of codes seek to shape behavior in different ways.149 There are actually three primary types of codes: ethics, conduct, and practice.150 Codes of ethics entail professionalism; codes of conduct espouse guidelines of appropriate behavior; codes of practice embody practices to be enforced.151 Many codes commonly referred to as codes of conduct may in reality be a combination of the three types.

The principal benefit of codes of conduct may not be the codes themselves, but rather the educational and cooperation-building effects of developing them.152 So, notwithstanding the ultimate utility and efficacy (or lack thereof) of a code, its production may itself have expressive value.
The discussion and collaboration required to develop a code raises awareness of relevant issues and prompts dialogue between relevant

145 ResponsibleNanoCode, Information on the Responsible NanoCode Initiative (May 2008), available at http://www.responsiblenanocode.org/documents/InformationonTheResponsibleNanoCode.pdf.

146 Id. at 7.

147 Id. at 3.

148 Id. at 3.

149 Royal Society, The Roles of Codes of Conduct in Preventing the Misuse of Scientific Research, Royal Society Policy Document 03/05 (June 2005), available at http://royalsociety.org/The-roles-of-codes-of-conduct-in-preventing-the-misuse-of-scientificresearch-/, at 2.

150 Brian Rappert, Towards a Life Sciences Code: Countering the Threats from Biological Weapons, in Strengthening the Biological Weapons Convention 2004, at 3 (Briefing Paper 13, 2d Series, Dep't of Peace Studies, Univ. of Bradford, 2004), available at www.brad.ac.uk/acad/sbtwc/briefing/BP_13_2ndseries.pdf (last visited July 28, 2010).

151 The Royal Society, supra note 149, at 2.

152 Brian Rappert, Pacing Science and Technology with Codes of Conduct: Rethinking What Works (2010) (unpublished manuscript) (on file with author).


parties.153 For example, individuals, institutions and countries must be aware of their ethical obligations.154 Once a code is created, it can help achieve that end by serving as a valuable educational tool.155 It is critical that countries are aware of other countries' intentions and limits when it comes to the use of autonomous robots, and so it is reasonable to expect that the exercise of attempting to develop a code of conduct addressing these issues could have both practical and educational benefits.

Even though their tangible outcomes may be hard to identify, measure and quantify, codes of conduct possess unique features that make them attractive informal measures. The multiple codes of conduct that exist in the synthetic biology and nanotechnology industries highlight some of the salient benefits and drawbacks of codes of conduct. In terms of benefits, they can be created rather quickly compared to the time it would take to develop formal legal regulations. Codes can be drafted by interested parties who are knowledgeable in the area, and can be customized to address the unique properties of a technology. On the other hand, one drawback of codes is the difficulty of applying them. Because anyone can craft a code of conduct, when multiple codes are introduced into an area it is unclear whose code takes precedence. There is no hierarchical relationship among codes of conduct that would provide a clear sense of priority. The strengths of codes of conduct and other soft law mechanisms (being voluntary, cooperative, flexible measures) come at a price: they have no rank order; they are all on the same playing field.

Although it sounds like a straightforward concept, creating a code of conduct is not a simple process.
Drafting a comprehensive, appropriate and effective code requires thorough consideration of myriad issues, attention to detail, and a proper balancing of the policy interests of the parties involved.156 While a code of conduct alone will likely not be sufficient to ensure the appropriate and ethical use of lethal autonomous military robots, the process of creating and disseminating such a code is undeniably a step forward and can be an important piece of, or at least a starting point for, an eventual treaty or agreement in this area.157

153 Royal Society, supra note 149, at 1.

154 Margaret A. Somerville & Ronald M. Atlas, Ethics: A Weapon to Counter Bioterrorism, Science, Mar. 25, 2005, at 1881.

156 Rappert, supra note 152, at 3.

157 As discussed above, the most notable attempt to institute a code of conduct for military robotics to date is a "Code of Ethics" for robots being proposed by the Republic of South Korea, although the terms of the Code have yet to be fleshed out, and the code will apparently address issues relating to robots beyond just the military context. See supra note 68 and accompanying text.


B. Transgovernmental Dialogue

"Transgovernmental dialogue" refers to a growing number of informal and flexible arrangements under which governmental officials from different countries meet on a regular basis to discuss and coordinate policies. These opportunities provide a forum to share information and 'best practices,' to seek to harmonize policies and oversight mechanisms, to coordinate enforcement practices, and to help anticipate, prevent and resolve international disputes.158 Transgovernmental dialogue can greatly enhance cooperation and influence policy outcomes. It achieves that end through collaborative mechanisms and countries' shared desire to address a common problem or goal. These types of dialogues are beneficial to the nations involved and are becoming increasingly common in areas requiring international coordination, with national security issues being a prime example.159 They offer "a structure that is less threatening to democratic governance than private transnational action and less costly than inter-state negotiations, yet they can lay a firm foundation for harmonized national regulation and even, if appropriate, for international regulation."160

An example of a transgovernmental dialogue in the national security context is the Australia Group, an informal forum of officials from forty-one nations with a common interest in preventing the proliferation of materials that could be used for chemical or biological weapons.161 Arising out of the experiences of the Iraq/Iran war of the late 1980s, the Australia Group was chiefly concerned with the chemical and biological weapons deployed in that conflict, but the Group has subsequently developed a list of dual use162 items that each country agrees to control through national export regulations.
Since member countries do not have any legally binding obligations, achieving the goals set forth by the Australia Group depends completely on the voluntary good-faith commitment of the individual countries.163 The Group meets annually to discuss ways to prevent proliferation of chemical and biological agents through national export licensing policies and other measures.

158 Kenneth W. Abbott, Douglas J. Sylvester & Gary E. Marchant, Transnational Regulation: Reality or Romanticism?, in International Handbook on Regulating Nanotechnologies (Graeme Hodge, Diana Bowman and Andrew Maynard eds., 2010).

159 See Patryk Pawlak, From Hierarchy to Networks: Transatlantic Governance of Homeland Security, 1 Journal of Global Change and Governance 1, 3 (2007) (discussing the shift from hierarchical to transgovernmental governance networks over the past decade, the accompanying structural and cultural shifts and their impacts on national security).

160 Abbott et al., supra note 158.

161 The Australia Group, http://www.australiagroup.net/en/index.html (last visited Feb. 26, 2010).

162 "Dual use" simply refers to technology-based research and products that can be applied to both military and civilian use.

163 The Australia Group: An Introduction, The Australia Group, http://www.australiagroup.net/en/introduction.html (last visited Feb. 26, 2010).


Another example of an existing transgovernmental institution is the International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH), which brings together pharmaceutical regulators from the US, Europe, and Japan, along with pharmaceutical industry representatives from the same three jurisdictions, to coordinate pharmaceutical regulatory policy issues with a view to harmonization.164 In addition to increasing harmonization, another goal of the ICH is to reduce the need for duplicative testing of products, which is intended to reduce delays in the development and distribution of new medicines around the world.165

A third example is the Organization for Economic Co-operation and Development (OECD), an organization of 30 industrialized countries that has created two committees (the Working Party on Manufactured Nanomaterials (WPMN) and the Working Party on Nanotechnology (WPN)) to undertake a variety of informal harmonization activities.166 The objective of the WPN is to promote international co-operation that fosters the research, development, and responsible commercialization of nanotechnology.167 The WPN facilitates communication between governments, which promotes discussion, awareness and, ideally, coordination of policy responses.168 Meanwhile, the WPMN is an international effort to analyze the environmental health and safety risks posed by nanotechnology.169

Another example from the nanotechnology realm is the International Dialogue on Responsible Research and Development of Nanotechnology, a forum that has brought together regulators from almost 50 nations every two years (2004, 2006, 2008) to discuss nanotechnology regulation.170 The initial meeting of this forum was sponsored by an NGO (the Meridian Institute), but national governments volunteered to sponsor subsequent meetings (Japan in 2006 and the EU in 2008).
These various examples of transnational dialogue are effective in starting a discussion among policymakers from different nations, and are relatively easy and quick

164 Int'l Conference on Harmonization, Welcome to the Official Web Site for the ICH, available at http://www.ich.org/cache/compo/276-254-l.html (last visited Oct. 10, 2010).

165 I