The Ethics of Robot Servitude

Stephen Petersen*
Niagara University
steve@stevepetersen.net
October 5, 2006

1  The question

Suppose we could build creatures with intelligence comparable to our own, who by design want to do tasks we find unpleasant. May we build such creatures? This is the central question I wish to examine. Before we turn to my answer and its defense, though, I'd like briefly to consider something philosophers typically do not stop to consider: namely, why we might ask the question in the first place.

The question is, first of all, a natural and engaging one. When I discuss the possibility of artificial intelligence with undergraduates, they immediately begin to wonder about whether they might have robot servants in their lifetime, and this leads them immediately to the question of whether they should have them. The association is understandable, given the prevalence of robot servants in pop culture. To pick some references from my own cultural frame, there's C-3PO and R2-D2 from Star Wars, Marvin in The Hitchhiker's Guide to the Galaxy, Rosie from The Jetsons, HAL from 2001, and "Robot" from Lost in Space. Much of the Twilight Zone corpus is dedicated to robot labor. More recently there's Data from Star Trek: The Next Generation, Bender from Futurama, and the host of robots in the Kubrick-Spielberg movie A.I. Disgruntled robot servants are at the heart of the Matrix plotline (as the backstory in The Animatrix makes clear). Isaac Asimov's I, Robot stories simply assume that intelligent robots should be programmed as our servants; it's written into Asimov's famous "three laws of robotics".[1]

* Thanks to Marc Alspector-Kelly, Jim Delaney, Ashley McDowell, Bill Rapaport, and Mark Walker for comments on drafts. Thanks also to many undergraduate students for class discussion. And thanks, finally, to Patrick Grim, Eric Dietrich, Selmer and Katherine Bringsjord, and all who discussed this with me at the NA-CAP 2006 Conference.
This is a draft (file: robot-servitude-jetai.tex, version 1.10), and should not be circulated or cited without permission.

[1] "1. A robot may not injure a human being, or, through inaction, allow a human being to come to harm. 2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law. 3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law." From Asimov (1950).
Indeed, the very word 'robot' has its roots in the issue of mechanical servitude. Karel Čapek chose 'robot' for his play R.U.R.: Rossum's Universal Robots to invoke the Czech word robota, which means "drudgery" or "forced labor".[2]