On the moral responsibility of military robots

  • Original Paper
  • Published in: Ethics and Information Technology

Abstract

This article discusses mechanisms and principles for the assignment of moral responsibility to intelligent robots, with special focus on military robots. We introduce autonomous power as a new concept and use it to identify the types of robots that call for moral consideration. It is furthermore argued that autonomous power, and in particular the ability to learn, is decisive for the assignment of moral responsibility to robots. As technological development leads to robots with increasing autonomous power, we should be prepared for a future in which people blame robots for their actions. It is therefore important, already today, to investigate the mechanisms that govern human behavior in this respect. The results may be used when designing future military robots, to control unwanted tendencies to assign responsibility to them. Independent of the responsibility issue, the moral quality of a robot's behavior should be seen as one of the many performance measures by which we evaluate robots. How to design ethics-based control systems should be investigated carefully, starting now. From a consequentialist view, it would indeed be highly immoral to develop robots capable of performing acts involving life and death without including some kind of moral framework.

Fig. 1
Fig. 2

Notes

  1. Between 50 and 80 countries either already use or are in the process of acquiring the technology to start using military robots (ABIresearch 2011).

  2. For instance, a regular aircraft like a Boeing 767 is not considered a highly lethal weapon, although it may be used to kill thousands of people. Likewise, a kilogram of water, with an equivalent mass-energy corresponding to one thousand Nagasaki bombs, has low lethality when used in the intended fashion.

  3. However, this can also be seen as an example of collective responsibility of the company’s employees and owners. See for instance (Young 2010, p. 67).

  4. Note that equating the internal decision process with the concept of intention is very much an observer's perspective, consistent with the previously adopted view of moral responsibility as a quality an observer assigns to an agent.

  5. Also see https://edocs.uis.edu/kmill2/www/TheRules/ (Accessed January 3, 2012).

  6. Basic forms of this type of reinforcement learning have already been developed and used in robotics (see e.g. Hertzberg and Chatila (2008) or Sutton and Barto (1998)), with inspiration from the “Law of Effect”, a model of human and animal learning introduced by Thorndike (1911).
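The comparison in note 2 can be checked with a few lines of arithmetic. This is a back-of-the-envelope sketch assuming a Nagasaki yield of roughly 21 kilotons of TNT and the standard conversion 1 kt TNT = 4.184e12 J; both figures are assumptions of this example, not values given in the paper.

```python
# E = m * c^2 for one kilogram of water, compared with an assumed
# Nagasaki yield of ~21 kilotons of TNT (1 kt TNT = 4.184e12 J).
C = 2.998e8              # speed of light, m/s
E_water = 1.0 * C ** 2   # mass-energy of 1 kg, in joules (~9.0e16 J)
E_bomb = 21 * 4.184e12   # assumed bomb yield, in joules (~8.8e13 J)

ratio = E_water / E_bomb
print(round(ratio))      # on the order of one thousand, as note 2 states
```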
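The "Law of Effect" mentioned in note 6 can be illustrated with a toy sketch: an action followed by reward becomes more likely to be repeated, and one followed by punishment less likely. The action names and all numeric values below are invented for illustration; this is not the control scheme of any actual robot, nor the specific algorithms of the cited works.

```python
import random

# Action-selection weights; both actions start out equally likely.
weights = {"comply": 1.0, "violate": 1.0}

def choose():
    """Pick an action with probability proportional to its weight."""
    r = random.uniform(0, sum(weights.values()))
    for action, w in weights.items():
        r -= w
        if r <= 0:
            return action
    return action  # numerical edge case: fall back to the last action

def reinforce(action, delta):
    """Law of Effect: strengthen or weaken the tendency to repeat an action."""
    weights[action] = max(0.01, weights[action] + delta)

random.seed(0)
for _ in range(100):
    a = choose()
    # Reward compliant behavior, punish violations.
    reinforce(a, 0.5 if a == "comply" else -0.5)

print(weights)  # the rewarded action "comply" now dominates "violate"
```

After repeated reinforcement, the rewarded action dominates selection, which is the behavioral pattern Thorndike's model describes.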

References

  • ABIresearch. (2011). Military robot markets to exceed $8 billion in 2016. Retrieved June 15, 2012, from http://www.abiresearch.com/press/3616-Military+Robot+Markets+to+Exceed+%248+Billion+in+2016.

  • Allen, C., Wallach, W., & Smit, I. (2006). Why machine ethics? IEEE Intelligent Systems, July/August 2006, 12–17.

  • Aristotle. (1985). The Nicomachean ethics (T. Irwin, Trans.). Hackett Publishing Co.

  • Arkin, R. C. (2009a). Governing lethal behavior in autonomous robots. London: Chapman & Hall/CRC.

  • Arkin, R. C. (2009b). Ethical robots in warfare. IEEE Technology and Society Magazine, 28(1), 30–33, Spring 2009.

  • Asaro, P. M. (2006). What should we want from a robot ethic? IRIE International Review of Information Ethics, 6 (12/2006).

  • Bechtel, W. (1985). Attributing responsibility to computer systems. Metaphilosophy, 16(4), 296–306.

  • Bone, E., & Bolkcom, C. (2003, April). Unmanned aerial vehicles: Background and issues for Congress. Retrieved January 3, 2012, from https://www.policyarchive.org/handle/10207/1698.

  • Connolly, W. (1974). The terms of political discourse. Princeton: Princeton University Press.

  • Dennett, D. C. (1973). Mechanism and responsibility. In T. Honderich (Ed.), Essays on freedom of action. Boston: Routledge & Keegan Paul.

  • Dennett, D. C. (1997). When HAL kills, who’s to blame? computer ethics. In D. G. Stork (Ed.), HAL’s Legacy: 2001′s computer as dream and reality. Cambridge: MIT Press.

  • Dodig-Crnkovic, G., & Persson, D. (2008). Sharing moral responsibility with robots: A pragmatic approach. In A. Holst, P. Kreuger, & P. Funk (Eds.), 10th Scandinavian Conference on Artificial Intelligence SCAI 2008 (Vol. 173). Frontiers in Artificial Intelligence and Applications.

  • Eshleman, A. (2009). Moral responsibility. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy (Winter 2009 ed.). Retrieved January 21, 2012, from http://plato.stanford.edu/archives/win2009/entries/moral-responsibility/.

  • Franklin, S., & Graesser, A. (1997). Is it an agent, or just a program?: A taxonomy for autonomous agents (pp. 21–35). Berlin: Intelligent Agents III.

  • Friedman, B. (1990). Moral responsibility and computer technology. ERIC Document Reproduction Services.

  • Friedman, B., & Millett, L. (1995). It’s the computer’s fault—reasoning about computers as moral agents. In Conference companion of the conference on human factors in computing systems (pp. 226–227). Denver, CO.

  • Friedman, B., & Millett, L. (1997). Reasoning about computers as moral agents: A research note. In B. Friedman (Ed.), Human values and the design of computer technology. Stanford/New York: CSLI Publications/Cambridge University Press.

  • GlobalSecurity. (2012). TALON small mobile robot. Retrieved January 22, 2012, from http://www.globalsecurity.org/military/systems/ground/talon.htm.

  • Grossman, N. (2007). Rehabilitation or revenge: Prosecuting child soldiers for human rights violations. Georgetown Journal of International Law, 38, 323–362.

  • Hertzberg, J., & Chatila, R. (2008). AI reasoning methods for robotics. In Springer handbook of robotics (pp. 207–223).

  • Hildebrand, A. (2009, March). Samsung Techwin's latest: A killing robot. Info4Security. Retrieved January 21, 2012, from http://www.info4security.com/story.asp?storycode=4121852.

  • Hinds, P., Roberts, T., & Jones, H. (2004). Whose job is it anyway? A study of human-robot interaction in a collaborative task. Human-Computer Interaction, 19, 151–181.

  • iRobot. (2012). Retrieved January 22, 2012, from http://www.irobot.com/gi/ground.

  • Johnson, D. G. (2006). Computer systems: Moral entities but not moral agents. Ethics and Information Technology, 8, 195–204.

  • Kim, T., & Hinds, P. J. (2006). Who should I blame? Effects of autonomy and transparency on attributions in human-robot interaction. In Proceedings of RO-MAN’06 (pp. 80–85).

  • Lin, P., Bekey, G., & Abney, K. (2008). Autonomous military robotics: Risk, ethics, and design, a US department of defense office of naval research-funded report. Retrieved June 16, 2012, from http://ethics.calpoly.edu/ONR_report.pdf.

  • Matarić, M. J., & Michaud, F. (2008). Behavior-based systems. In B. Siciliano, & O. Khatib (Eds.), Springer handbook of robotics (pp. 891–909). Springer.

  • Matthias, A. (2004). The responsibility gap: Ascribing responsibility for the actions of learning automata. Ethics and Information Technology, 6(3), 175–183.

  • Miller, K. W. (2011). Moral responsibility for computing artifacts: “The rules”. IT Professional, 13(3), 57–59.

  • Moon, Y., & Nass, C. (1998). Are computers scapegoats? Attributions of responsibility in human-computer interaction. International Journal of Human-Computer Studies, 49(1), 79–94.

  • Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, & Cybernetics, 30(3), 286–297.

  • QinetiQ. (2012). MAARS—Modular Advanced Armed Robotic System. Retrieved January 22, 2012, from http://www.qinetiq-na.com/products/unmanned-systems/maars/.

  • Raytheon. (2009). News release. Retrieved January 22, 2012, from http://www.raytheon.ca/rtnwcm/groups/rcl/documents/content/rcl_archive_phalanx_release.pdf.

  • Riedel, F. W., Hall, S. M., Barton, J. D., Christ, J. P., Funk, B. K., Milnes, T. D., et al. (2010). Guidance and navigation in the global engagement department. Johns Hopkins APL Technical Digest, 29(2).

  • Samsung. (2012). SGR-1. Retrieved January 22, 2012, from http://www.samsungtechwin.com/product/product_01_02.asp.

  • Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. USA: MIT Press.

  • Singer, P. W. (2009a). Wired for war: The robotics revolution and conflict in the 21st century. Penguin.

  • Singer, P. W. (2009b). Military robots and the laws of war. The New Atlantis, Winter 2009.

  • Singer, P. W. (2009c). Wired for war? Robots and military doctrine. JFQ: Joint Force Quarterly, 52 (1st Quarter 2009), 104–110.

  • Sofge, E. (2009). America's robot army: Are unmanned fighters ready for combat? Retrieved January 3, 2012, from http://www.popularmechanics.com/technology/military/robots/4252643.

  • Sparrow, R. (2007). Killer robots. Journal of Applied Philosophy, 24(1), 62–77.

  • Stahl, B. C. (2004). Responsible management of information systems. Hershey: Idea-Group Publishing.

  • Strawson, P. F. (1974). Freedom and resentment. In Freedom and resentment and other essays. London: Methuen.

  • Sutton, R. S., & Barto, A. G. (1998). Reinforcement learning: An introduction. Cambridge: MIT Press.

  • Thorndike, E. L. (1911). Animal intelligence (2nd ed.). New York: Hafner (reprinted by Transaction Publishers, 2000).

  • U.S. Air Force. (2006). ‘Reaper’ moniker given to MQ-9 unmanned aerial vehicle. In The official Web site of U.S. Air Force. Retrieved January 3, 2012, from http://www.af.mil/news/story.asp?storyID=123027012&page=2.

  • U.S. Air Force. (2009). Unmanned Aircraft Systems Flight Plan 2009–2047. Retrieved January 3, 2012, from http://www.globalsecurity.org/military/library/policy/usaf/usaf-uas-flight-plan_2009-2047.pdf.

  • U.S. Navy. (2011). MK 15—Phalanx Close-In Weapons System (CIWS), United States Navy Fact File. Retrieved June 14, 2012, from http://www.navy.mil/navydata/fact_display.asp?cid=2100&tid=487&ct=2.

  • Walzer, M. (2006). Just and unjust wars: A moral argument with historical illustrations. Basic Books.

  • Wezeman, S. (2007). UAVs and UCAVs: Developments in the European Union. European Parliament, October 2007. Retrieved January 3, 2012, from http://www.europarl.europa.eu/activities/committees/studies/download.do?file=19483.

  • Yamauchi, B. (2004). PackBot: A versatile platform for military robotics. In Proceedings of SPIE Vol. 5422: Unmanned ground vehicle technology VI, Orlando, FL.

  • Yamauchi, B., Pook, P., & Gruber, A. (2002). Bloodhound: A semi-autonomous battlefield medical robot. In Proceedings of the 23rd Army Science Conference, U.S. Army, Orlando, FL.

  • Young, I. M. (2010). Responsibility and global labor justice. In G. Ognjenovic (Ed.), Responsibility in context: Perspectives. Springer.

Acknowledgments

The author would like to thank several anonymous reviewers for their highly valuable comments and suggestions on this and earlier versions of the paper.

Author information

Correspondence to Thomas Hellström.

Cite this article

Hellström, T. On the moral responsibility of military robots. Ethics Inf Technol 15, 99–107 (2013). https://doi.org/10.1007/s10676-012-9301-2
