Abstract
Technological developments in the sphere of artificial intelligence (AI) inspire debates about the implications of autonomous weapon systems (AWS), which can select and engage targets without human intervention. While a growing number of systems that could qualify as AWS, such as loitering munitions, are reportedly used in armed conflicts, the global discussion about a system of governance and international legal norms on AWS at the United Nations Convention on Certain Conventional Weapons (UN CCW) has stalled. In this article we argue for the necessity of adopting legal norms on the use and development of AWS. Without a framework for global regulation, state practices in using weapon systems with AI-based and autonomous features will continue to shape the norms of warfare and affect the level and quality of human control in the use of force. By examining the practices of China, Russia, and the United States in their pursuit of AWS-related technologies and their participation in the UN CCW debate, we acknowledge that their differing approaches make it challenging for states parties to reach an agreement on regulation, especially in a forum based on consensus. Nevertheless, we argue that global governance on AWS is not impossible. It will depend on the extent to which an actor or group of actors is ready to take the lead on an alternative process outside of the CCW, inspired by the direction of travel set by previous arms control and weapons ban initiatives.
Notes
In this article we generally refer to autonomous weapon systems (AWS). AWS are not a specific category of weapon. Rather, we understand AWS as any type of weapon system that uses machine autonomy to select and apply force without immediate human control or intervention. While some autonomous weapons integrate AI elements into their critical functions, not all of them are necessarily based on AI technologies. We employ the term lethal autonomous weapon systems (LAWS) only when specifically citing the discussion at the UN CCW, as this is the official term which states parties have used as part of this debate. The term “killer robots” is also commonly used in media reports on these technologies. Both terms, LAWS and “killer robots”, emphasize the lethality of AWS (Park, 2020, p. 396; see also Ekelhof, 2017; Taddeo & Blanchard, 2022).
References
Acheson, R. (2022a). We will not weaponise our way out of horror. CCW Report, 10(2). https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2022/gge/reports/CCWR10.2.pdf
Acheson, R. (2022b). Editorial: Weapons don’t save lives. CCW Report, 10(7), 1. https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2022/gge/reports/CCWR10.7.pdf
Altmann, J., & Sauer, F. (2017). Autonomous Weapon Systems and Strategic Stability. Survival, 59(5), 117–142. https://doi.org/10.1080/00396338.2017.1375263
Alwardt, C., & Schörnig, N. (2022). A necessary step back? Recovering the security perspective in the debate on lethal autonomy. Zeitschrift Für Friedens- Und Konfliktforschung. https://doi.org/10.1007/s42597-021-00067-z
Amoroso, D., & Tamburrini, G. (2020). Autonomous Weapons Systems and Meaningful Human Control: Ethical and Legal Issues. Current Robotics Reports, 1(4), 187–194. https://doi.org/10.1007/s43154-020-00024-3
Amoroso, D., & Tamburrini, G. (2021). In Search of the ‘Human Element’: International Debates on Regulating Autonomous Weapons Systems. International Spectator, 56(1), 20–38. https://doi.org/10.1080/03932729.2020.1864995
Asaro, P. (2012). On banning autonomous weapon systems: Human rights, automation, and the dehumanization of lethal decision-making. International Review of the Red Cross, 94(886), 687–709. https://doi.org/10.1017/S1816383112000768
Automated Decision Research. (2022). State Positions. https://automatedresearch.org/state-positions/
Badell, D., & Schmitt, L. (2021). Contested views? Tracing European positions on lethal autonomous weapon systems. European Security, 31(2), 242–261. https://doi.org/10.1080/09662839.2021.2007476
Barbrook, R. (2007). New York Prophecies: The Imaginary Future of Artificial Intelligence. Science as Culture, 16, 151–167. https://doi.org/10.1080/09505430701369027
Bellanova, R., Jacobsen, K. L., & Monsees, L. (2020). Taking the Trouble: Science, Technology and Security Studies. Critical Studies on Security, 8(2), 87–100. https://doi.org/10.1080/21624887.2020.1839852
Bendett, S. (2022, August 30). The Ukraine war and its impact on Russian development of autonomous weapons. Atlantic Council. https://www.atlanticcouncil.org/content-series/airpower-after-ukraine/the-ukraine-war-and-its-impact-on-russian-development-of-autonomous-weapons/
Bernstein, S., & Laurence, M. (2022). Practices and Norms: Relationships, Disjunctures, and Change. In A. Drieschova, C. Bueger, & T. Hopf (Eds.), Conceptualizing international practices (pp. 77–99). Cambridge University Press.
Bhuta, N., Beck, S., Geiss, R., Liu, H. Y., & Kress, C. (Eds.). (2016). Autonomous weapons systems: Law, ethics, policy. Cambridge University Press.
Biegon, R., & Watts, T. F. A. (2022). Remote Warfare and the Retooling of American Primacy. Geopolitics, 27(3), 948–971. https://doi.org/10.1080/14650045.2020.1850442
Bijker, W. E., & Law, J. (Eds.). (1992). Shaping Technology/Building Society: Studies in Sociotechnical Change. MIT Press.
Bo, M., Bruun, L., & Boulanin, V. (2022). Retaining Human Responsibility in the Development and Use of Autonomous Weapon Systems: On Accountability for Violations of International Humanitarian Law Involving AWS. Stockholm International Peace Research Institute. https://doi.org/10.55163/AHBC1664
Bode, I. (2019). Norm-making and the Global South: Attempts to Regulate Lethal Autonomous Weapons Systems. Global Policy, 10(3), 359–364. https://doi.org/10.1111/1758-5899.12684
Bode, I. (2022). Practice-based and Deliberative Normativity: Retaining Human Control over the Use of Force. Conference paper under review.
Bode, I., & Huelss, H. (2018). Autonomous weapons systems and changing norms in international relations. Review of International Studies, 44(3), 393–413. https://doi.org/10.1017/S0260210517000614
Bode, I., & Huelss, H. (2022). Autonomous Weapons Systems and International Norms. McGill-Queen’s University Press.
Bode, I., & Nadibaidze, A. (2022, April 4). AI and Drones in the Russian Invasion of Ukraine: Challenging the Expectations? The AutoNorms Blog. https://www.autonorms.eu/ai-and-drones-in-the-russian-invasion-of-ukraine-challenging-the-expectations/
Bode, I., & Watts, T. F. A. (2021). Meaning-less Human Control. The Consequences of Automation and Autonomy in Air Defence Systems. Drone Wars UK & Center for War Studies, University of Southern Denmark. https://dronewars.net/wp-content/uploads/2021/02/DW-Control-WEB.pdf
Bode, I., & Watts, T. F. A. (forthcoming). Loitering Munitions and Unpredictability: New Challenges to Human Control. Article 36 & Center for War Studies.
Borrie, J. (2014). Humanitarian reframing of nuclear weapons and the logic of a ban. International Affairs, 90(3), 625–646. https://doi.org/10.1111/1468-2346.12130
Boulanin, V. (2021, March 3). Regulating Military AI Will Be Difficult. Here’s a Way Forward. Bulletin of the Atomic Scientists. https://thebulletin.org/2021/03/regulating-military-ai-will-be-difficult-heres-a-way-forward/
Boulanin, V., Saalman, L., Topychkanov, P., Su, F., & Carlsson, M. P. (2020). Artificial Intelligence, Strategic Stability and Nuclear Risk. Stockholm International Peace Research Institute. https://www.sipri.org/publications/2020/other-publications/artificial-intelligence-strategic-stability-and-nuclear-risk
Boulanin, V., & Verbruggen, M. (2017). Mapping the Development of Autonomy in Weapons Systems. Stockholm International Peace Research Institute. https://www.sipri.org/sites/default/files/2017-11/siprireport_mapping_the_development_of_autonomy_in_weapon_systems_1117_1.pdf
Boutin, B. (2022). State Responsibility in Relation to Military Applications of Artificial Intelligence. Leiden Journal of International Law. https://doi.org/10.1017/S0922156522000607
Brehm, M. (2017). Defending the Boundary: Constraints and Requirements on the Use of Autonomous Weapon Systems Under International Humanitarian and Human Rights Law. In The Geneva Academy of International Humanitarian Law and Human Rights (Issue 9). Geneva Academy. https://doi.org/10.2139/ssrn.2972071
Carpenter, C. (2022, January 7). A Better Path to a Treaty Banning ‘Killer Robots’ Has Just Been Cleared. World Politics Review. https://www.worldpoliticsreview.com/articles/30232/a-better-path-to-a-treaty-banning-ai-weapons-killer-robots
C.A.S.E. Collective. (2006). Critical Approaches to Security in Europe: A Networked Manifesto. Security Dialogue, 37(4), 443–487. https://doi.org/10.1177/0967010606073085
CBC Radio. (2022, January 10). Quit using The Terminator as an example of AI gone wrong, argues BBC Reith Lecturer. https://www.cbc.ca/radio/ideas/quit-using-the-terminator-as-an-example-of-ai-gone-wrong-argues-bbc-reith-lecturer-1.6309630
CCW Review Conference. (2021, December 17). Report of main committee II. Reaching Critical Will. https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2021/RevCon/documents/MCII-rev3.pdf
Chandler, K. (2021). Does Military AI Have Gender? Understanding Bias and Promoting Ethical Approaches in Military Applications of AI. United Nations Institute for Disarmament Research. https://doi.org/10.37559/GEN/2021/04
Chengeta, T. (2022). Is the Convention on Conventional Weapons the Appropriate Framework to Produce a New Law on Autonomous Weapon Systems? In F. Viljoen, C. Fombad, D. Tladi, A. Skelton, & M. Killander (Eds.), A Life Interrupted: Essays in Honour of the Lives and Legacies of Christof Heyns (pp. 379–397). Pretoria University Law Press.
Chinese Delegation to the CCW. (2016, April 12, 10:34:19). CCW Meeting of Experts on Lethal Autonomous Weapons. UN Digital Recording System. https://conf.unog.ch/digitalrecordings/
Chinese Ministry of Foreign Affairs. (2018, April 11). Position Paper submitted by China to the Group of Governmental Experts of the High Contracting Parties to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons. Reaching Critical Will. https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2018/gge/documents/GGE.1-WP7.pdf
Chinese Ministry of Foreign Affairs. (2021, December 14). Position Paper of the People’s Republic of China on Regulating Military Applications of Artificial Intelligence (AI). https://www.fmprc.gov.cn/mfa_eng/wjdt_665385/wjzcs/202112/t20211214_10469512.html
Colby, E. A., & Mitchell, A. W. (2020). The Age of Great-Power Competition: How the Trump Administration Refashioned American Strategy. Foreign Affairs, 99(1), 118–130. https://www.foreignaffairs.com/united-states/age-great-power-competition
Crootof, R. (2015). The Killer Robots Are Here: Legal and Policy Implications. Cardozo Law Review, 36(5), 1837–1916.
Defense Advanced Research Projects Agency (DARPA). (n.d.). AI Next Campaign. https://www.darpa.mil/work-with-us/ai-next-campaign
Diehl, C., & Lambach, D. (2022). (K)ein „AI arms Race“? Technologieführerschaft im Verhältnis der Großmächte. Zeitschrift für Außen- und Sicherheitspolitik, 15(2–3), 263–282. https://doi.org/10.1007/s12399-022-00915-7
Docherty, B. (2022). An Agenda for Action: Alternative Processes for Negotiating a Killer Robots Treaty. Human Rights Watch and International Human Rights Clinic. https://www.hrw.org/report/2022/11/10/agenda-action/alternative-processes-negotiating-killer-robots-treaty
Ekelhof, M. A. C. (2017). Complications of a Common Language: Why It Is So Hard to Talk about Autonomous Weapons. Journal of Conflict and Security Law, 22(2), 311–331. https://doi.org/10.1093/jcsl/krw029
Ekelhof, M. A. C. (2018). Lifting the Fog of Targeting: “Autonomous Weapons” and Human Control through the Lens of Military Targeting. Naval War College Review, 71(3), 67–100. https://digital-commons.usnwc.edu/nwc-review/vol71/iss3/6/
Fritsch, S. (2011). Technology and Global Affairs. International Studies Perspectives, 12, 27–45. https://doi.org/10.1111/j.1528-3585.2010.00417.x
Gadinger, F. (2022). The Normativity of International Practices. In A. Drieschova, C. Bueger, & T. Hopf (Eds.), Conceptualizing International Practices. Directions for the Practice Turn in International Relations (pp. 100–121). Cambridge University Press.
Garcia, D. (2016). Future arms, technologies, and international law: Preventive security governance. European Journal of International Security, 1(1), 94–111. https://doi.org/10.1017/eis.2015.7
Garcia, D. (2017, December 13). Governing Lethal Autonomous Weapon Systems. Ethics & International Affairs. https://www.ethicsandinternationalaffairs.org/2017/governing-lethal-autonomous-weapon-systems/
Garcia, D. (2018). Lethal Artificial Intelligence and Change: The Future of International Peace and Security. International Studies Review, 20(2), 334–341. https://doi.org/10.1093/isr/viy029
Garcia, D. (forthcoming). Common Good Governance in the Age of Military Artificial Intelligence. Oxford University Press.
Geist, E. M. (2016). It’s already too late to stop the AI arms race—We must manage it instead. Bulletin of the Atomic Scientists, 72(5), 318–321. https://doi.org/10.1080/00963402.2016.1216672
Gillespie, T. (2014, June). Algorithm. Culture Digitally. https://culturedigitally.org/2014/06/algorithm-draft-digitalkeyword/
Gorenburg, D., Fink, A., Bendett, S., & Edmonds, J. (2022). A Technological Divorce: The impact of sanctions and the end of cooperation on Russia’s technology and AI sector (CNA Occasional Paper). CNA. https://www.cna.org/reports/2022/04/a-technological-divorce
Gray, M., & Ertan, A. (2021). Artificial Intelligence and Autonomy in the Military: An Overview of NATO Member States’ Strategies and Deployment. NATO Cooperative Cyber Defence Centre of Excellence. https://ccdcoe.org/library/publications/artificial-intelligence-and-autonomy-in-the-military-an-overview-of-nato-member-states-strategies-and-deployment/
Haas, M. C., & Fischer, S. C. (2017). The evolution of targeted killing practices: Autonomous weapons, future conflict, and the international order. Contemporary Security Policy, 38(2), 281–306. https://doi.org/10.1080/13523260.2017.1336407
Hagel, C. (2014, November 15). Reagan National Defense Forum Keynote. US Department of Defense. https://www.defense.gov/News/Speeches/Article/606635/
Haner, J., & Garcia, D. (2019). The Artificial Intelligence Arms Race: Trends and World Leaders in Autonomous Weapons Development. Global Policy, 10(3), 331–337. https://doi.org/10.1111/1758-5899.12713
Hawley, J. K., & Mares, A. L. (2012). Human Performance Challenges for the Future Force: Lessons from Patriot after the Second Gulf War. In P. Savage-Knepshield, J. Martin, J. Locket III, & L. Allender (Eds.), Designing Soldier Systems: Current Issues in Human Factors (pp. 3–34). Ashgate.
Heyns, C. (2016). Human Rights and the Use of Autonomous Weapons Systems (AWS) during Domestic Law Enforcement. Human Rights Quarterly, 38, 350–378. https://doi.org/10.1353/hrq.2016.0034
Holland Michel, A. (2021). Known Unknowns: Data Issues and Military Autonomous Systems. The United Nations Institute for Disarmament Research. https://unidir.org/known-unknowns
Horowitz, M. C., & Scharre, P. (2015). Meaningful Human Control in Weapon Systems: A Primer. Center for a New American Security. https://www.cnas.org/publications/reports/meaningful-human-control-in-weapon-systems-a-primer
International Committee of the Red Cross (ICRC). (2021, May 12). ICRC position on autonomous weapon systems. https://www.icrc.org/en/document/icrc-position-autonomous-weapon-systems
International Committee of the Red Cross (ICRC). (2022, July 26). What you need to know about autonomous weapons. https://www.icrc.org/en/document/what-you-need-know-about-autonomous-weapons
International Panel on the Regulation of Autonomous Weapons (iPRAW). (2018). Focus on Ethical Implications for a Regulation of LAWS (“Focus on” Report No.4). https://www.ipraw.org/publications/ethical-implications/
International Panel on the Regulation of Autonomous Weapons (iPRAW). (2020). A path towards the regulation of LAWS. IPRAW Briefing. https://www.ipraw.org/publications/briefing-a-path-towards-regulation/
Jasanoff, S. (1987). Contested Boundaries in Policy-Relevant Science. Social Studies of Science, 17(2), 195–230. https://doi.org/10.1177/030631287017002001
Jasanoff, S. (2004). The idiom of co-production. In S. Jasanoff (Ed.), States of Knowledge: The Co-Production of Science and the Social Order (pp. 1–12). Routledge.
Johnson, J. (2019). Artificial Intelligence & Future Warfare: Implications for International Security. Defense and Security Analysis, 35(2), 147–169. https://doi.org/10.1080/14751798.2019.1600800
Kahn, L. (2022, October 20). Russia is Lying About its AI Capabilities: How Russia is Using Emerging Technologies to Hide Human Rights Violations. The SAIS Review of International Affairs. https://saisreview.sais.jhu.edu/russia-ai-human-rights-violations-ukraine-syria/
Kallenborn, Z. (2021, June 4). If a killer robot were used, would we know? Bulletin of the Atomic Scientists. https://thebulletin.org/2021/06/if-a-killer-robot-were-used-would-we-know/
Konaev, M. (2021). Military Applications of Artificial Intelligence: The Russian Approach. In Advanced Military Technology in Russia: Capabilities and Implications (pp. 63–74). Chatham House. https://www.chathamhouse.org/2021/09/advanced-military-technology-russia
Leander, A. (2008). Thinking Tools. In A. Klotz, & D. Prakesh (Eds.), Qualitative Methods in International Relations: A Pluralist Guide (pp. 11–27). Palgrave Macmillan.
Li, J. (2014, November 20). Lijie: ’Zhinenghua zhanzheng’ zheng pumianerlai [Li Jie: ‘Intelligent war’ is coming]. Global Times. https://opinion.huanqiu.com/article/9CaKrnJFQVr
Liu, X. (2020, April 14). Robot warriors join Chinese military arsenal, will free soldiers from dangerous missions. Global Times. https://www.globaltimes.cn/content/1185595.shtml
Ma, V. (2016). The Ethics and Implications of Modern Warfare: Robotic Systems and Human Optimization. Harvard International Review, 37(4), 43–45. https://www.jstor.org/stable/26445617
Mahnken, T. G. (2010). Technology and the American Way of War Since 1945. Columbia University Press.
Maas, M. (2019). How Viable is International Arms Control for Military Artificial Intelligence? Three Lessons from Nuclear Weapons. Contemporary Security Policy, 40(3), 285–311. https://doi.org/10.1080/13523260.2019.1576464
Macfarlane, K., & Christie, L. (2022). Automation in Military Operations (POSTNOTE 681). The Parliamentary Office of Science and Technology. https://post.parliament.uk/research-briefings/post-pn-0681/
McDermott, R. (2021). Russian UAV Technology and Loitering Munitions. Eurasia Daily Monitor, 18(72). https://jamestown.org/program/russian-uav-technology-and-loitering-munitions/
Mizokami, K. (2021, November 2). Autonomous Drones Have Attacked Humans. This Is a Turning Point. Popular Mechanics. https://www.popularmechanics.com/military/weapons/a36559508/drones-autonomously-attacked-humans-libya-united-nations-report/
Morgan, F. E., Boudreaux, B., Lohn, A. J., Ashby, M., Curriden, C., Klima, K., & Grossman, D. (2020). Military Applications of Artificial Intelligence: Ethical Concerns in an Uncertain World. RAND Corporation. https://www.rand.org/pubs/research_reports/RR3139-1.html
Moyes, R. (2019). Target Profiles. Article 36. http://www.article36.org/wp-content/uploads/2019/08/Target-profiles.pdf
Moyes, R. (2022, August 9). Continued CCW failure makes an alternative process on autonomous weapons more likely. Article 36. https://article36.org/updates/continued-ccw-failure-makes-an-alternative-process-on-autonomous-weapons-more-likely/
Nadibaidze, A. (2021a, June 3). Russia’s Perspective on Human Control and Autonomous Weapons: Is the Official Discourse Changing? The AutoNorms Blog. https://www.autonorms.eu/russias-perspective-on-human-control-and-autonomous-weapons-is-the-official-discourse-changing-2/
Nadibaidze, A. (2021b, August 23). Can the UN GGE Go Beyond the Eleven Guiding Principles on LAWS? The AutoNorms Blog. https://www.autonorms.eu/can-the-un-gge-go-beyond-the-eleven-guiding-principles-on-laws/
Nadibaidze, A. (2022a). Great power identity in Russia’s position on autonomous weapons systems. Contemporary Security Policy, 43(3), 407–435. https://doi.org/10.1080/13523260.2022.2075665
Nadibaidze, A. (2022b). Russian Perceptions of Military AI, Automation, and Autonomy. Foreign Policy Research Institute. https://www.fpri.org/article/2022/01/russian-perceptions-of-military-ai-automation-and-autonomy/
National Security Commission on Artificial Intelligence (NSCAI). (2021, March). Final Report. https://www.nscai.gov/wp-content/uploads/2021/03/Full-Report-Digital-1.pdf
Nicolini, D. (2013). Practice Theory, Work, and Organization: An Introduction. Oxford University Press.
Noor, O. (2022a, March 15). Discussions at UN on Autonomous Weapon Systems Blocked by Russia, but States Indicate Way Forward. Campaign to Stop Killer Robots. https://www.stopkillerrobots.org/news/discussions-at-un-on-autonomous-weapon-systems-blocked-by-russia-but-states-indicate-way-forward/
Noor, O. (2022b, October 21). 70 states deliver joint statement on autonomous weapons systems at UN General Assembly. Campaign to Stop Killer Robots. https://www.stopkillerrobots.org/news/70-states-deliver-joint-statement-on-autonomous-weapons-systems-at-un-general-assembly/
Office for Democratic Institutions and Human Rights. (2022). ODIHR Interim Report on reported violations of international humanitarian law and international human rights law in Ukraine. https://www.osce.org/files/f/documents/c/d/523081_0.pdf
Pang, H. (2019). 21 shiji zhanzheng yanbian yu gouxiang: zhinenghua zhanzhen [Evolution and conception of War in the 21st Century: Intelligent War]. Shanghai Academy of Social Sciences Press.
Park, S. (2020). Analysis of the Positions Held by Countries on Legal Issues of Lethal Autonomous Weapons Systems and Proper Domestic Policy Direction of South Korea. Korean Journal of Defense Analysis, 32(3), 393–418. https://doi.org/10.22831/kjda.2020.32.3.004
PLA Daily (2015, July 31). Meiguo lujun jinyibu jiakuai gaige, jiemi ‘quyu dingxiang budui’ [The United States Ground Force further accelerated reform: revealing ‘armed forces orienteering’]. http://military.people.com.cn/n/2015/0731/c1011-27390390.html
Pratt, S. F. (2022). Normative Transformation and the War on Terrorism. Cambridge University Press.
Qiao-Franco, G., & Bode, I. (2023). Weaponised Artificial Intelligence and Chinese Practices of Human-Machine Interaction. Chinese Journal of International Politics, 1–23. https://doi.org/10.1093/cjip/poac024
Qiao-Franco, G., & Zhu, R. (2022). China’s Artificial Intelligence Ethics: Policy Development in an Emergent Community of Practice. Journal of Contemporary China. https://doi.org/10.1080/10670564.2022.2153016
Rainsford, S. (2022, July 15). Ukraine war: Four-year-old Liza killed by Russian attack on Vinnytsia. BBC News. https://www.bbc.com/news/world-europe-62181726
RIA Novosti. (2022, April 19). ВС России внедрят новые способы ведения боевых действий [The Russian Armed Forces will integrate new methods of warfare]. https://ria.ru/20220419/oborona-1784272753.html
Rosendorf, O. (2021). Predictors of Support for a Ban on Killer Robots: Preventive Arms Control as an Anticipatory Response to Military Innovation. Contemporary Security Policy, 42(1), 30–52. https://doi.org/10.1080/13523260.2020.1845935
Rosert, E., & Sauer, F. (2019). Prohibiting Autonomous Weapons: Put Human Dignity First. Global Policy, 10(3), 370–375. https://doi.org/10.1111/1758-5899.12691
Rosert, E., & Sauer, F. (2021). How (not) to stop the killer robots: A comparative analysis of humanitarian disarmament campaign strategies. Contemporary Security Policy, 42(1), 4–29. https://doi.org/10.1080/13523260.2020.1771508
Russell, S. (2021, December 8). AI in Warfare. The BBC Reith Lectures. https://www.bbc.co.uk/sounds/play/m00127t9
Russian Federation. (2018, April 4). Russia’s Approaches to the Elaboration of a Working Definition and Basic Functions of Lethal Autonomous Weapons Systems in the Context of the Purposes and Objectives of the Convention. Reaching Critical Will. https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2018/gge/documents/GGE.1-WP6-English.pdf
Russian Federation. (2021a, September 27). Considerations for the report of the Group of Governmental Experts of the High Contracting Parties to the Convention on Certain Conventional Weapons on emerging technologies in the area of Lethal Autonomous Weapons Systems. United Nations. https://undocs.org/ccw/gge.1/2021/wp.1
Russian Federation. (2021b, September 27). To understanding the characteristics of the LAWS relevant to the aims and objectives of the Convention. Group of Governmental Experts on Emerging Technologies in the Area of LAWS, UNODA. https://documents.unoda.org/wp-content/uploads/2021/09/To-understanding-the-characteristics-of-the-LAWS-.pdf
Sauer, F. (2020). Stepping back from the brink: Why multilateral regulation of autonomy in weapons systems is difficult, yet imperative and feasible. International Review of the Red Cross, 102(913), 235–259. https://doi.org/10.1017/S1816383120000466
Sauer, T., & Reveraert, M. (2018). The potential stigmatizing effect of the Treaty on the Prohibition of Nuclear Weapons. The Nonproliferation Review, 25(5–6), 437–455. https://doi.org/10.1080/10736700.2018.1548097
Scharre, P. (2021). Debunking the AI Arms Race Theory. Texas National Security Review, 4(3). https://tnsr.org/2021/06/debunking-the-ai-arms-race-theory/
Schmidt, E. (2022). AI, Great Power Competition & National Security. Daedalus, 151(2), 288–298. https://doi.org/10.1162/daed_a_01916
Schmidt, E., & Work, R. (2022, December 5). How to Stop the Next World War: A Strategy to Restore America’s Military Deterrence. The Atlantic. https://www.theatlantic.com/ideas/archive/2022/12/us-china-military-rivalry-great-power-war/672345/
Shane, S., & Wakabayashi, D. (2018, April 4). ‘The Business of War’: Google Employees Protest Work for The Pentagon. The New York Times. https://www.nytimes.com/2018/04/04/technology/google-letter-ceo-pentagon-project.html
Sharikov, P. (2018). Artificial intelligence, cyberattack, and nuclear weapons—A dangerous combination. Bulletin of the Atomic Scientists, 74(6), 368–373. https://doi.org/10.1080/00963402.2018.1533185
Sharkey, A. (2019). Autonomous Weapons Systems, Killer Robots and Human Dignity. Ethics and Information Technology, 21, 75–87. https://doi.org/10.1007/s10676-018-9494-0
Sharkey, N. (2016). Staying in the Loop: Human Supervisory Control of Weapons. In N. Bhuta, S. Beck, R. Geiss, H.-Y. Liu, & C. Kress (Eds.), Autonomous Weapons Systems: Law, Ethics, Policy (pp. 23–38). Cambridge University Press.
Sharkey, N. (2018, August 28). The Impact of Gender and Race Bias in AI. ICRC Humanitarian Law & Policy. https://blogs.icrc.org/law-and-policy/2018/08/28/impact-gender-race-bias-ai/
Stanley, A. (2021, May 30). The Age of Autonomous Killer Robots May Already Be Here. Gizmodo. https://gizmodo.com/flying-killer-robot-hunted-down-a-human-target-without-1847001471
State Council of China (2019, July 24). China’s National Defense in the New Era. http://english.www.gov.cn/archive/whitepaper/201907/24/content_WS5d3941ddc6d08408f502283d.html
Suchman, L. (2016). Situational Awareness and Adherence to the Principle of Distinction as a Necessary Condition for Lawful Autonomy. CCW Informal Meeting of Experts on Lethal Autonomous Weapons. Geneva, 12 April 2016, Panel ‘Towards a Working Definition of LAWS’.
Suchman, L. (2020). Algorithmic Warfare and the Reinvention of Accuracy. Critical Studies on Security, 8(2), 175–187. https://doi.org/10.1080/21624887.2020.1760587
Taddeo, M., & Blanchard, A. (2022). A Comparative Analysis of the Definitions of Autonomous Weapons Systems. Science and Engineering Ethics, 28(37). https://doi.org/10.1007/s11948-022-00392-3
TASS. (2021, September 13). Uran-9, Nerekhta Robots Used in Troops Formations for First Time at Zapad-2021 Drills. https://tass.com/defense/1337237
TASS. (2022a, August 17). В Минобороны РФ создали управление по работе с искусственным интеллектом [The Russian MoD created a department to work with AI]. https://tass.ru/armiya-i-opk/15492531
TASS. (2022b, August 25). В МО рассказали об интеллектуализации российских вооружений [The MoD talked about the intellectualization of Russian armaments]. https://tass.ru/armiya-i-opk/15557335?
Tavsan, S. (2021, June 20). Turkish defense company says drone unable to go rogue in Libya. Nikkei Asia. https://asia.nikkei.com/Business/Aerospace-Defense/Turkish-defense-company-says-drone-unable-to-go-rogue-in-Libya
The United States. (2013, November 14). U.S. Delegation Opening Statement as Delivered by Michael W. Meier. U.S. Mission to International Organizations in Geneva. https://geneva.usmission.gov/2013/11/15/u-s-opening-statement-at-the-meeting-of-parties-to-the-ccw/
The United States. (2014, November 13). U.S. Delegation Opening Statement as Delivered by Michael W. Meier. U.S. Mission to International Organizations in Geneva. https://geneva.usmission.gov/2014/11/13/u-s-statement-at-the-meeting-of-high-contracting-parties-to-the-convention-on-certain-conventional-weapons-ccw/
The United States. (2015, April 17). U.S. Delegation Closing Statement and the Way Ahead as Delivered by Michael W. Meier, Head of Delegation. U.S. Mission to International Organizations in Geneva. https://geneva.usmission.gov/2015/05/08/ccw-laws-meeting-u-s-closing-statement-and-the-way-ahead
The United States. (2018a, April 3). Humanitarian benefits of emerging technologies in the area of lethal autonomous weapon systems. U.S. Delegation to the CCW. https://ogc.osd.mil/Portals/99/Law%20of%20War/Practice%20Documents/US%20Working%20Paper%20-%20Humanitarian%20benefits%20of%20emerging%20technologies%20in%20the%20area%20of%20LAWS%20-%20CCW_GGE.1_2018_WP.4_E.pdf?ver=O0lg6BIxsFt57nrOuz3xHA%3D%3D
The United States. (2018b, April 9). Opening Statement as Delivered by Ian McKay. U.S. Mission to International Organizations in Geneva. https://geneva.usmission.gov/2018/04/09/ccw-u-s-opening-statement-at-the-group-of-governmental-experts-meeting-on-lethal-autonomous-weapons-systems/?_ga=2.26152510.2044898756.1665825878-233344413.1665825878
The United States. (2018c, April 11). U.S. Statement on Characterization of the systems under consideration in order to promote a common understanding on concepts and characteristics relevant to the objectives and purposes of the CCW. U.S. Mission to International Organizations in Geneva. https://geneva.usmission.gov/2018/04/11/ccw-u-s-statement-on-characterization-of-the-systems-under-consideration/?_ga=2.126230926.2044898756.1665825878-233344413.1665825878
The United States. (2018d, April 17). U.S. Statement on the Outcome of the GGE. U.S. Mission to International Organizations in Geneva. https://geneva.usmission.gov/2018/04/17/u-s-statement-on-the-outcome-of-the-gge/
The United States. (2018e, August 28). Human-Machine Interaction in the Development, Deployment and Use of Emerging Technologies in the Area of Lethal Autonomous Weapons Systems Submitted by the United States. Reaching Critical Will. https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2018/gge/documents/GGE.2-WP4.pdf
The United States. (2019, March 27). Consideration of the human element in the use of lethal force. U.S. Mission to International Organizations in Geneva. https://geneva.usmission.gov/2019/03/27/convention-on-certain-conventional-weapons-consideration-of-the-human-element-in-the-use-of-lethal-force/?_ga=2.24996537.2044898756.1665825878-233344413.1665825878
The White House. (2017, December). National Security Strategy of the United States of America. https://trumpwhitehouse.archives.gov/wp-content/uploads/2017/12/NSS-Final-12-18-2017-0905.pdf
The White House. (2022, October). National Security Strategy. https://www.whitehouse.gov/wp-content/uploads/2022/10
United Nations Office for Disarmament Affairs (UNODA). (2019). Report of the 2019 session of the Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems. https://documents.unoda.org/wp-content/uploads/2020/09/CCW_GGE.1_2019_3_E.pdf
United Nations Security Council. (2021). Final report of the Panel of Experts on Libya established pursuant to Security Council Resolution 1973 (2011). https://undocs.org/S/2021/229
United States Department of Defense. (2012). Department of Defense Directive Number 3000.09: Autonomy in weapon systems. https://www.esd.whs.mil/portals/54/documents/dd/issuances/dodd/300009p.pdf
Victor, D., & Nechepurenko, I. (2022, July 15). Russia Repeatedly Strikes Ukraine’s Civilians. There’s Always an Excuse. The New York Times. https://www.nytimes.com/article/russian-civilian-attacks-ukraine.html
Vilmer, J.-B. J. (2022). The forever-emerging norm of banning nuclear weapons. Journal of Strategic Studies, 45(3), 478–504. https://doi.org/10.1080/01402390.2020.1770732
Vincent, J. (2021, June 3). Have autonomous robots started killing in war? The Verge. https://www.theverge.com/2021/6/3/22462840/killer-robot-autonomous-drone-attack-libya-un-report-context
Wang, Y., & Chen, D. (2018). Rising Sino-U.S. competition in Artificial Intelligence. China Quarterly of International Strategic Studies, 4(2), 241–258. https://doi.org/10.1142/S2377740018500148
Warren, A., & Hillas, A. (2020). Friend or frenemy? The role of trust in human-machine teaming and lethal autonomous weapons systems. Small Wars & Insurgencies, 31(4), 822–850. https://doi.org/10.1080/09592318.2020.1743485
Watts, T. F. A., & Biegon, R. (2021). Revisiting the remoteness of remote warfare: US military intervention in Libya during Obama’s presidency. Defence Studies, 21(4), 508–527. https://doi.org/10.1080/14702436.2021.1994397
Work, R. (2016, April 28). Remarks by Deputy Secretary Work on Third Offset Strategy. U.S. Department of Defense. https://www.defense.gov/News/Speeches/Speech/Article/753482/remarks-by-deputy-secretary-work-on-third-offset-strategy/
Wyatt, A. (2020). Charting great power progress toward a lethal autonomous weapon system demonstration point. Defence Studies, 20(1), 1–20. https://doi.org/10.1080/14702436.2019.1698956
Zhang, M. (2020). Jiefangjun caigou xinxing zhineng daodan ke xingcheng zhanchang chixu yazhi faxian mubiao ji cuihui [The People’s Liberation Army purchases new smart missiles to form continuous domination on the battlefield]. Tencent News. https://new.qq.com/omn/20200312/20200312A0O
Acknowledgements
The authors are grateful to two anonymous reviewers and to the editors for their feedback on earlier drafts of this article.
Funding
This research is part of a project which has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement No. 852123). Dr. Tom F.A. Watts’ contribution to this paper was funded by a Leverhulme Trust Early Career Research Fellowship (ECF-2022-135).
Ethics declarations
Statements and declarations
The authors have no competing interests to declare that are relevant to the content of this article.
Cite this article
Bode, I., Huelss, H., Nadibaidze, A. et al. Prospects for the global governance of autonomous weapons: comparing Chinese, Russian, and US practices. Ethics Inf Technol 25, 5 (2023). https://doi.org/10.1007/s10676-023-09678-x