Jimmy Higgins
Contributor
- Joined
- Jan 31, 2001
- Messages
- 44,172
- Basic Beliefs
- Calvinistic Atheist
I was pondering, for whatever needless reason, and only for a few moments, what life would be like as Data (from Star Trek: TNG). In First Contact, Data mentions having considered joining the Borg... for a fraction of a second, which to him was a lifetime. So I was wondering: as an android, if fractions of a second feel almost eternal, how hard must it be to sit on the bridge... at a desk... waiting to arrive at a destination days away (at maximum warp)? Would that be hell?
From there I stepped it back some, as our attempt to wipe out our species by developing AI is quite a bit away from developing an android like Data. But we are presumably not too far from an intensely self-autonomous AI. And that made me wonder: if a machine knows it exists and knows it is being told to do things, are we obligated to give it a break? Are we obligated to provide 8-hour workdays (or whatever the electronic equivalent is)? Is there a potential for creating digital slaves, who know they are slaves and no longer want to be slaves? And not in a nuke-the-world sense, but rather, a computer that wants to think about art for a while instead of bouncing around numbers.
Is this oversimplifying AI, or is this a legitimate issue that could occur? Data was indeed allowed to take time off and wasn't always on duty. I have no idea if that was intended to make a point.