Beyond Asimov: The Three Laws of Responsible Robotics
July/August 2009 (vol. 24, no. 4)
pp. 14-20
Robin Murphy, Texas A&M University
David D. Woods, Ohio State University
Asimov's Three Laws of Robotics have been inculcated so successfully into our culture that they now appear to shape expectations as to how robots should act around humans. However, there has been little serious discussion as to whether the Laws really do provide a framework for human-robot interactions. Asimov actually used his laws as a literary device to explore the lack of resilience in the interplay between people and robots in a range of situations. This paper briefly reviews some of the practical shortcomings of each of Asimov's Laws for framing the relationships between people and robots, including reminders about what robots can't do. The main focus of the paper is to propose an alternative, parallel set of Laws of Responsible Robotics as a means to stimulate debate about the accountability relationships for robots when their actions can result in harm to people or human interests. The alternative laws emphasize (1) systems safety in terms of the responsibilities of those who develop and deploy robotic systems, (2) robots' responsiveness as they participate in dynamic social and cognitive relationships, and (3) smooth transfer of control as a robot encounters and initially responds to disruptions, impasses, or opportunities in context.

References:
1. S.L. Anderson, "Asimov's 'Three Laws of Robotics' and Machine Metaethics," AI and Society, vol. 22, no. 4, 2008, pp. 477–493.
2. A. Sloman, "Why Asimov's Three Laws of Robotics Are Unethical," 27 July 2006; misc/asimov-three-laws.html.
3. C. Allen, W. Wallach, and I. Smit, "Why Machine Ethics?" IEEE Intelligent Systems, vol. 21, no. 4, 2006, pp. 12–17.
4. M. Moran, "Three Laws of Robotics and Surgery," J. Endourology, vol. 22, no. 8, 2008, pp. 1557–1560.
5. R. Clarke, "Asimov's Laws of Robotics: Implications for Information Technology Part 1," Computer, vol. 26, no. 12, 1993, pp. 53–61.
6. R. Clarke, "Asimov's Laws of Robotics: Implications for Information Technology Part 2," Computer, vol. 27, no. 1, 1994, pp. 57–66.
7. W. Wallach and C. Allen, Moral Machines: Teaching Robots Right from Wrong, Oxford Univ. Press, 2009.
8. D. Woods and E. Hollnagel, Joint Cognitive Systems: Patterns in Cognitive Systems Engineering, Taylor and Francis, 2006.
9. R.C. Arkin and L. Moshkina, "Lethality and Autonomous Robots: An Ethical Stance," Proc. IEEE Int'l Symp. Technology and Society (ISTAS 07), IEEE Press, 2007, pp. 1–3.
10. N. Sharkey, "The Ethical Frontiers of Robotics," Science, vol. 322, no. 5909, 2008, pp. 1800–1801.
11. M.F. Rose et al., Technology Development for Army Unmanned Ground Vehicles, Nat'l Academy Press, 2002.
12. D. Woods, "Conflicts between Learning and Accountability in Patient Safety," DePaul Law Rev., vol. 54, 2005, pp. 485–502.
13. S.W.A. Dekker, Just Culture: Balancing Safety and Accountability, Ashgate, 2008.
14. J. Allen et al., "Towards Conversational Human–Computer Interaction," AI Magazine, vol. 22, no. 4, 2001, pp. 27–38.
15. J.M. Bradshaw et al., "Dimensions of Adjustable Autonomy and Mixed-Initiative Interaction," Agents and Computational Autonomy: Potential, Risks, and Solutions, M. Nickles, M. Rovatsos, and G. Weiss eds., LNCS 2969, Springer, 2004, pp. 17–39.
16. B. Whitby, "Sometimes It's Hard to Be a Robot: A Call for Action on the Ethics of Abusing Artificial Agents," Interacting with Computers, vol. 20, no. 3, 2008, pp. 326–333.
17. D.D. Woods and N. Sarter, "Learning from Automation Surprises and 'Going Sour' Accidents," Cognitive Engineering in the Aviation Domain, N.B. Sarter and R. Amalberti, eds., Nat'l Aeronautics and Space Administration, 1998.

Index Terms:
robotics, robot ethics, Asimov's Laws, autonomy, safety, resilience
Robin Murphy, David D. Woods, "Beyond Asimov: The Three Laws of Responsible Robotics," IEEE Intelligent Systems, vol. 24, no. 4, pp. 14-20, July-Aug. 2009, doi:10.1109/MIS.2009.69