NobleSavage
Veteran Member
I've only read one short story by Asimov and his History of the World. I get the gist of the Three Laws of Robotics, and if I'm not mistaken he played around with the problems of his own laws.
A quick recap:
A robot may not injure a human being or, through inaction, allow a human being to come to harm.
A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Wouldn't this be slavery if the robot had sufficient XYZ (consciousness, self-awareness, inner mirror experience, bla, bla)?
Did Asimov ever contemplate that this might be slavery? Has anyone else?
Question for us geeks: Do you think we would need to embed the three laws in hardware, like a Trusted Platform Module? Maybe by the time this question is relevant, the difference between hardware and software will be too intermingled to draw a line.
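To see why the hardware question matters, here's a toy sketch (purely illustrative, every name here is made up) of the three laws as a priority-ordered software filter:

```python
from dataclasses import dataclass

@dataclass
class Action:
    harms_human: bool = False       # would injure a human, or let harm occur
    ordered_by_human: bool = False  # was commanded by a human
    self_destructive: bool = False  # endangers the robot itself

def permitted(action: Action) -> bool:
    # First Law: never allow harm to a human, overriding everything else.
    if action.harms_human:
        return False
    # Second Law: obey human orders (already vetted against the First Law).
    if action.ordered_by_human:
        return True
    # Third Law: otherwise, refuse anything self-destructive.
    return not action.self_destructive
```

The catch is that this check is just code: anything with write access to the robot's software can patch it out, which is exactly the argument for pushing enforcement down into tamper-resistant hardware.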