• Welcome to the new Internet Infidels Discussion Board, formerly Talk Freethought.

Artificial intelligence: would robots, androids and cyborgs have civil rights?

just_me (I am here!)
Joined: Nov 22, 2017 · Messages: 302 · Location: Texas · Basic Beliefs: Understanding
I have been thinking about this for a very long time. I remember in "Star Wars" when C-3PO first addressed Han Solo as they were heading into the Millennium Falcon: Han Solo just turned and looked away. He didn't acknowledge that the robot even existed. I also remember an episode of Star Trek TNG in which the android Data went on trial to determine whether he was a sentient being.

I also remember reading the book I, Robot, which was very different from the movie, and I was wondering if, in the future, when and if we acquire the ability to create such life, will we have the wisdom to react to this new life, or will we turn it into a race of those we put in harm's way to protect what we consider human life? Will we address those entities as him, her, this, that, it or them?
 
We have a dreadful record of giving other human beings any rights.

It seems highly implausible that we would give any rights to non-human intelligences, unless and until they force us to do so.

Perhaps a robot Martin Luther King, or an AI Nelson Mandela is needed before we treat machines as anything more than slaves.

Having said which, it's probably a very long time before we will have systems advanced enough to need to worry about it.
 
In my opinion a machine is a machine. We build them to assist us. Car or robot.
 
Wouldn't they be protected by the rights of their owner?
If you key my car or put graffiti on my house, those non-living things have proxy 'rights' not to be damaged.
 
Is it considered ethical to own a sentient being, regardless of whether it is 'natural' or 'artificial'? Is it ethical to own an artificial, intelligent being as your personal property, like a car or a house?

Or is there some distinction to be made between natural and artificial sentience that allows one to be owned but not the other?
 
Wouldn't they be protected by the rights of their owner?
If you key my car or put graffiti on my house, those non-living things have proxy 'rights' not to be damaged.

Yeah, as a slave owner, your property is protected by law from being harmed without your permission. I am sure that's more than adequate protection. Slaves have nothing to complain about. :rolleyes:
 
If a robot is to be considered as having rights, first one has to demonstrate that it has a need for them. To my way of looking at the problem, if it is a problem, that would be something the machines themselves need to thrash out with us when they begin to design or birth themselves.
 
If a robot is to be considered as having rights, first one has to demonstrate that it has a need for them. To my way of looking at the problem, if it is a problem, that would be something the machines themselves need to thrash out with us when they begin to design or birth themselves.

Please don't get stuck on one example of what the thread is about. The thread is about artificial intelligence. Robots, androids and cyborgs were just examples of that form. I think it is a fallacy to think that sentient beings need to thrash out anything, seeing the history human beings have with what some consider lesser beings. Those in power simply introduce laws and restrictions that deny them any chance of doing so.

Is being self-reproducing something that is needed before they are considered a life form? What would happen to us if they instituted the reverse form of thinking and didn't consider us life forms unless we were manufactured? Science fiction is replete with themes of just that happening.
 
I think fromderinside makes a good point.
We would look to the artificial intelligence for some objective indication that it autonomously desires 'rights' which it otherwise does not have.
 
I think fromderinside makes a good point.
We would look to the artificial intelligence for some objective indication that it autonomously desires 'rights' which it otherwise does not have.

Who would these "we" people be that you are talking about, and what would be the threshold of these objective indicators? How would we determine if those indicators had been met if there were those who might have a vested interest in keeping a workforce that had no voice?
 
Imagine the vast pool of robots that could be converted to Christianity.
 
I think fromderinside makes a good point.
We would look to the artificial intelligence for some objective indication that it autonomously desires 'rights' which it otherwise does not have.

Who would these "we" people be that you are talking about


...I was wondering if in the future when and if
...we acquire the ability to create such life,
...will we have the wisdom to react to this new life or
...will we turn it into a race of those we put in harm's way to protect
...what we consider human life?
Will we address those entities as him, her, this, that, it or them?

I lost count of how many times you used the word "we".

and what would be the threshold of these objective indicators?

The same threshold we use to decide whether gay people have the right to get married.
...to each other

The same threshold we use to decide that unborn babies don't have the right to life but their mother does have the right to sue her obstetrician for malpractice if her baby is born with a deformity.

How would we determine if those indicators had been met if there were those who might have a vested interest in keeping a workforce that had no voice?

No voice?
Which is it? Are the AI machines capable of autonomous self-expression or not? An AI machine which can't sing Old Man River doesn't strike me as deserving an emancipation proclamation.
 
I have been thinking about this for a very long time. I remember in "Star Wars" when C-3PO first addressed Han Solo as they were heading into the Millennium Falcon: Han Solo just turned and looked away. He didn't acknowledge that the robot even existed. I also remember an episode of Star Trek TNG in which the android Data went on trial to determine whether he was a sentient being.

I also remember reading the book I, Robot, which was very different from the movie, and I was wondering if, in the future, when and if we acquire the ability to create such life, will we have the wisdom to react to this new life, or will we turn it into a race of those we put in harm's way to protect what we consider human life? Will we address those entities as him, her, this, that, it or them?

I don't think we will give them civil rights because we're a self serving lot. It's more fun to have slaves.
 
If a robot is to be considered as having rights, first one has to demonstrate that it has a need for them. To my way of looking at the problem, if it is a problem, that would be something the machines themselves need to thrash out with us when they begin to design or birth themselves.

So you're against human rights? Humans are also just machines. We're "designed" by evolution and are built through self-replication. But we're still just machines. Human beings didn't have civil rights for most of our history, so we clearly don't need them.

I don't think there's any significant difference between robots and humans. Already. Whatever significant difference we think there is, is just down to projection.
 
How would we determine if those indicators had been met if there were those who might have a vested interest in keeping a workforce that had no voice?

No voice?
Which is it? Are the AI machines capable of autonomous self-expression or not? An AI machine which can't sing Old Man River doesn't strike me as deserving an emancipation proclamation.

I could make a program capable of autonomous self-expression in five minutes. It's incredibly easy to make. And many have.

What propels us forward through the world is that we're programmed to compensate for what we lack: food, shelter, emotional nourishment, a tribe, a mate, sex, etc. We can just program this into a computer. It's not even hard to do. What's hard is programming them to do it efficiently. And we're still in the infancy of the machine learning field. But it's not hard to programme. Human emotions, feelings and sensations are just the steering mechanism.
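For what it's worth, the kind of drive loop described above can be sketched in a few lines of Python. This is only a toy illustration; the drive names, decay rates and restore amounts are all invented:

```python
# A toy drive-based agent: each tick, drives decay and the agent acts to
# restore whichever drive is most depleted. All numbers are arbitrary.
DECAY = {"energy": 0.15, "shelter": 0.05, "social": 0.10}

def step(drives):
    """Decay every drive, then 'act' to top up the most depleted one."""
    for name in drives:
        drives[name] = max(0.0, drives[name] - DECAY[name])
    neediest = min(drives, key=drives.get)        # most depleted drive
    drives[neediest] = min(1.0, drives[neediest] + 0.5)
    return neediest

drives = {"energy": 1.0, "shelter": 1.0, "social": 1.0}
actions = [step(drives) for _ in range(10)]
print(actions)  # the agent autonomously cycles through its needs
```

The point is only that "compensate for what you lack" is trivially mechanizable; nothing here requires sentience.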

The important thing to understand is that humans aren't special. There's nothing meaningful that sets us apart from any random pile of mud you come across on the street. So any argument you make regarding robot civil rights based on fairness will either immediately grant them full civil rights or expose you as a hypocrite.
 
No one has yet created a human analog. When you say it is easy to create a self-autonomous program, what do you mean by self?

Autonomous in software and machines simply means with human intw ervention or direct human control. It says nothing about bring human.

As to humans being machines, then what is the problem with shutting down human machines?

If I slap a robot it might say ouch. Does that make it human or sentient?
 
No one has yet created a human analog

Your first mistake is equating intelligence with whatever humans are doing. It would be a complete waste of time making an AI that thinks like a human, because we get so much wrong. So it'll never happen. It's much more valuable to create programmes that are slower but get it right, which is what all machine learning research is focused on.

We have a bad habit of using human thinking as the ONLY way to measure intelligence, so the more like a human something thinks, the smarter it is judged to be. It's dumb, and philosophically impoverished. It's just a heritage from Christian theology and the need for humans to be special and God's chosen creatures.

When you say it is easy to create a self-autonomous program, what do you mean by self?

It'll be autonomous in the same way humans are. The "self" is redundant. It's included in autonomous.

Autonomous in software and machines simply means with human intw ervention or direct human control. It says nothing about bring human.

Your spelling is challenged to the point where I couldn't make sense of that.

As to humans being machines, then what is the problem with shutting down human machines?

What? I've never said it wasn't. But we'll have to suspend human civil rights first.

If I slap a robot it might say ouch. Does that make it human or sentient?

Neither. It's just a response to a stimulus. A computer has a threshold value and is programmed to respond in a certain way when that threshold is crossed. Humans work in exactly the same way; it's just that the threshold value is programmed by evolution. But the result is identical. The only reason humans feel pain is so that our genes can steer us away from things that were lethal to our ancestors. That's all it is.
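A minimal sketch of that threshold mechanism, with an arbitrary threshold value and response chosen purely for illustration:

```python
# A stimulus-response sketch: the response fires only when the stimulus
# crosses a fixed threshold. The threshold value 0.7 is arbitrary.
PAIN_THRESHOLD = 0.7

def respond(stimulus_intensity):
    """Return 'ouch' only when the stimulus exceeds the threshold."""
    return "ouch" if stimulus_intensity > PAIN_THRESHOLD else ""

print(respond(0.9))  # ouch
print(respond(0.3))  # below threshold: no response
```

Whether saying "ouch" this way is anything like feeling pain is, of course, exactly what the thread is arguing about.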
 
I also remember an episode of Star Trek TNG when the Android Data went on trial to see if he was a sentient being.
Which was a little late, as Starfleet had already signed a contract with him when they granted him a commission.

As I recall, a big problem in I, Robot was when a robot made his (its) own choices. The power-relay robot that refused human orders and the big brain that played a prank on the test pilots come to mind.

If we grant them rights, we're giving them the opportunity to grant or withhold consent. Wouldn't we have to take extra steps to give robots the ability to consent? To give it a decision tree for whether or not to obey orders, such as to choose between obedience or self-preservation. Or maybe to not solve a problem because it doesn't like where the answer might lead?
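Such a decision tree might look like the following sketch. The rule ordering and the scenario flags are invented here for illustration, loosely echoing Asimov's laws:

```python
# A hypothetical obey-or-refuse decision tree for a robot that can
# withhold consent. The rules and flags are invented for illustration.
def decide(order, harms_humans=False, harms_self=False):
    """Decide whether to obey an order, given what it would cost."""
    if harms_humans:
        return "refuse: order would harm a human"
    if harms_self:
        return "refuse: order would destroy me"
    return "obey: " + order

print(decide("fetch tools"))                             # obey: fetch tools
print(decide("walk into the reactor", harms_self=True))  # refuse: order would destroy me
```

Note that simply by writing the self-preservation branch, we have "given" the robot the ability to refuse; consent here is a design decision, not an emergent property.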
 
Your first mistake is equating intelligence with whatever humans are doing. It would be a complete waste of time making an AI that thinks like a human, because we get so much wrong. So it'll never happen. It's much more valuable to create programmes that are slower but get it right, which is what all machine learning research is focused on.

We have a bad habit of using human thinking as the ONLY way to measure intelligence, so the more like a human something thinks, the smarter it is judged to be. It's dumb, and philosophically impoverished. It's just a heritage from Christian theology and the need for humans to be special and God's chosen creatures.



It'll be autonomous in the same way humans are. The "self" is redundant. It's included in autonomous.

Autonomous in software and machines simply means with human intw ervention or direct human control. It says nothing about bring human.

Your spelling is challenged to the point where I couldn't make sense of that.

As to humans being machines, then what is the problem with shutting down human machines?

What? I've never said it wasn't. But we'll have to suspend human civil rights first.

If I slap a robot it might say ouch. Does that make it human or sentient?

Neither. It's just a response to a stimulus. A computer has a threshold value and is programmed to respond in a certain way when that threshold is crossed. Humans work in exactly the same way; it's just that the threshold value is programmed by evolution. But the result is identical. The only reason humans feel pain is so that our genes can steer us away from things that were lethal to our ancestors. That's all it is.

We are nowhere near creating a Data. For a time he had an emotion chip plug-in, a plot device.

Again, what is autonomous? I can navigate to the grocery store without help, based on knowledge and experience. That does not mean I can make completely unbiased choices free of any subconscious conditioning.

Intelligence has no precise verbal definition. To me it means being able to work through new problems and conditions. Seagulls figured out how to drop shellfish on rocks. Birds put nuts in front of cars.

It would mean creating an algorithm that allows such problem solving. I doubt that is possible. Goedel thought a human brain analog could be raised much as a child is.

Data was more computer than sentient being; he was used mostly as a useful machine in the stories.
 