Who Should Obey Asimov’s Laws of Robotics? A Question of Responsibility

In Spyridon Stelios & Kostas Theologou (eds.), The Ethics Gap in the Engineering of the Future. Emerald Publishing. pp. 9-25 (2024)

Abstract

The aim of this chapter is to explore the safety value of implementing Asimov’s Laws of Robotics as a future general framework that humans should obey. Asimov formulated the laws to make explicit the safeguards built into the robots in his stories: (1) A robot may not injure or harm a human being or, through inaction, allow a human being to come to harm; (2) A robot must obey the orders given to it by human beings except where such orders would conflict with the First Law; (3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Law. In Asimov’s stories, it is always assumed that the laws are built into the robots to govern their behaviour. As his stories clearly demonstrate, however, the laws can be ambiguous, and they are not very specific. General rules as a guide for robot behaviour may therefore not be a very good method of achieving robot safety, at least not if we expect the robots themselves to follow them. But would it work for humans? In this chapter, we ask whether it would make as much, or more, sense to implement the laws in human legislation, with the purpose of governing the behaviour of the people or companies that develop, build, market or use AI, whether embodied in robots or in the form of software, now and in the future.

Similar books and articles

AI armageddon and the three laws of robotics. Lee McCauley - 2007 - Ethics and Information Technology 9 (2): 153-164.

Author's Profile

Erik Persson
Lund University
