People have tried to bring class action suits on behalf of pet dogs and cats. Part of the argument was that humans should not technically be allowed to own pets.
If a robot is to be considered as having rights, first one has to demonstrate it has needs for such. To my way of looking at the problem, if it is a problem, that would be something the machines themselves need to thrash out with us when they begin to design or birth themselves.
So you're against human rights? Humans are also just machines. We're "designed" by evolution and are built through self-replication. But we're still just machines. Human beings didn't have civil rights for most of our history, so we clearly don't need them.
I don't think there's any significant difference between robots and humans. Already. Whatever significant difference we think there is comes down to projection.
Your first mistake is equating intelligence with whatever humans are doing. It would be a complete waste of time making an AI that thinks like a human, because we get so much wrong. So it'll never happen. It's much more valuable to create programs that are slower but get it right, which is what machine learning research is focused on.
We have a bad habit of using human thinking as the ONLY yardstick for intelligence: the more human-like something's thinking is, the smarter we judge it to be. That's dumb, and philosophically impoverished. It's just heritage from Christian theology and the need for humans to be special, God's chosen creatures.
It'll be autonomous in the same way humans are. The "self" is redundant. It's included in autonomous.
Your spelling is challenged to the point where I couldn't make sense of that.
What? I've never said it wasn't. But we'll have to suspend human civil rights first.
If I slap a robot it might say ouch. Does that make it human or sentient?
Neither. It's just a response to a stimulus. A computer has a threshold value and is programmed to respond in a certain way when that threshold is crossed. Humans work in exactly the same way; it's just that the threshold value is programmed by evolution. But the result is identical. The only reason humans feel pain is so that our genes can steer us away from things that were lethal to our ancestors. That's all it is.
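The threshold picture above can be sketched in a few lines. This is a toy illustration only; the threshold value and the "ouch" response are invented for the example:

```python
# Toy sketch of a fixed-threshold stimulus response.
# The threshold value and the "ouch" string are invented for
# illustration; they stand in for whatever evolution (or a
# programmer) happens to have set.

PAIN_THRESHOLD = 5.0

def respond(stimulus_intensity: float) -> str:
    """React only once the stimulus crosses the threshold."""
    if stimulus_intensity >= PAIN_THRESHOLD:
        return "ouch"
    return ""

print(respond(2.0))  # below threshold: no reaction
print(respond(7.5))  # above threshold: prints "ouch"
```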
We are nowhere near creating a Data. For a time he had a plug-in emotion chip, a plot device.
Again, what is autonomous? I can navigate to the grocery store without help, based on knowledge and experience. That does not mean I can make completely unbiased choices free of any subconscious conditioning.
Intelligence has no precise verbal definition. To me it means being able to work through new problems and conditions. Seagulls figured out how to drop shellfish on rocks. Birds put nuts in front of cars.
It would mean creating an algorithm that allows such problem solving. I doubt that is possible. Gödel thought a human-brain analog could be raised much as a child is.
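That said, trial-and-error learning of the seagull kind can at least be sketched as an algorithm. Here is a minimal toy (the action names and reward values are invented, and this claims nothing about real animal cognition):

```python
import random

# Toy trial-and-error learner: try actions, track average payoff,
# and increasingly prefer whatever has worked best so far.
rewards = {"drop_on_rocks": 1.0, "drop_on_sand": 0.0, "peck_at_shell": 0.1}
scores = {action: 0.0 for action in rewards}   # running average payoff
counts = {action: 0 for action in rewards}     # times each action tried

random.seed(0)  # deterministic for the example
for _ in range(200):
    if random.random() < 0.1:                  # explore 10% of the time
        action = random.choice(list(rewards))
    else:                                      # otherwise exploit the best
        action = max(scores, key=scores.get)
    counts[action] += 1
    # Incrementally update the running average with the observed reward.
    scores[action] += (rewards[action] - scores[action]) / counts[action]

print(max(scores, key=scores.get))  # settles on "drop_on_rocks"
```

With the 10% exploration rate the learner samples every option but ends up preferring the action that pays off, which is the shape of what the seagulls did, if nothing like the mechanism.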
Data was more computer than sentient, he was used more as a useful machine in the stories.
No. As I wrote, '...one must demonstrate it (one) has needs for such...'. Couple that with the difference between how one designs another and how one is designed by one's consequences among others, and you have your demonstration mechanism in place. There is a difference between evolution and design.
...
It would mean creating an algorithm that allows such problem solving. I doubt that is possible.
And yet we observe that it IS possible. Or humans couldn't do it.
Unless we are to accept the crazy and repeatedly debunked religious idea of substance dualism.
We don't know what algorithms humans use to solve problems, but we do know that they must exist.
Just as heavier than air flight was known to be possible from the first time man saw birds fly - they didn't know how to do it, but to declare it impossible would have been (and was) insanity.
Steve's opinion is also based on obsolete ideas about machine learning. We've had all the necessary technical breakthroughs now. There's a reason machine learning has become the hottest field in computing. Right now AI is problem solving left and right. I have a friend who is an expert in both machine learning and crystallography. He couldn't figure out how certain proteins folded for 20 years. He built an AI; it solved the problem in hours. That's creative problem solving, done better than a human. We've already reached the goal.
We've already reached a point where the difference between machine thinking and human thinking is insignificant, at least if we discuss ability. What sets us apart now are human emotional (and physical) needs. Well... we're never going to program those into a computer. Why would we? Which means that Skynet will never arise, because no computer will ever feel the need to protect itself for survival.
Well, we hope not.
Are you Sarah Connor?
Another research field that is huge now is robot warriors: fully autonomous AI soldiers. Today every targeting system is manned by a human for legal reasons; a human has to push the button. But that was creatively reinterpreted in the Afghan and Iraq wars, down to: a human needs to be present and able to cancel attacks when necessary.
I think the next big war will feature human soldiers supported by a fleet of drones, autonomous tanks and autonomous artillery.
I wonder how our relationship with war will change when going to war has zero human cost to one's own side. How will that shift public opinion?
Well to a great extent we already saw that in Desert Storm. The allied casualties were lower than would have been expected in an exercise of similar scale. Most were blue-on-blue incidents due to units having advanced far further than their air cover expected.
Of course, nobody knew that casualties would be so low until after the event; and the subsequent occupation was a whole other story.
But it's certainly a good question - what happens when casualties on our side are essentially an impossibility?
And if the locals resist human occupation forces, how will they respond to robots and drones?
What? That was complete gibberish. Who cares if something is designed by humans or is a product of evolution? What possible difference could that make? You sound like a Christian.
Humans still don't understand evolution nor do they practice it as designers.
Just because we're products of a process does not mean our behavior conforms to that process. Rather, we conform to the conditions that led to our existence, not to the process of evolution. Who knows, one of us might have long tail feathers to attract females.
...
It would mean creating an algorithm that allows such problem solving. I doubt that is possible.
And yet we observe that it IS possible. Or humans couldn't do it.
Unless we are to accept the crazy and repeatedly debunked religious idea of substance dualism.
We don't know what algorithms humans use to solve problems, but we do know that they must exist.
Just as heavier than air flight was known to be possible from the first time man saw birds fly - they didn't know how to do it, but to declare it impossible would have been (and was) insanity.
It is possible in a limited sense.
A thought experiment.
You get into a type of car you have never been in before. You quickly see how to:
find the ignition
adjust the steering wheel
open and close the windows
recognize the door handles
find and operate the air conditioning, heating, and audio
work the windshield wipers
use the turn signals
Using a computer language, with a robot that has digital video and audio processing along with human-level dexterity and sensors, how would you code the same human capacity? Not in generalizations, but in detail.
Bear in mind that there can be a lot of visual clutter alongside the things you want to perceive. It would have to be a general solution.
Anyone who could accomplish that would gain international recognition, at least in engineering and science. Beyond a certain level of complexity, the logic becomes impossibly convoluted.
I found in manufacturing that beyond a certain point, a set of instructions cannot be reduced to discrete steps with conditional logic and jumps. It requires the human capacity to see, analyze, and put it all together.
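That rule explosion is easy to make concrete. In this toy sketch the car models and feature names are entirely invented: every new case needs another hand-written branch, which is why the approach collapses past a certain scale and why such mappings are learned from data instead.

```python
# Toy illustration of hand-written recognition rules multiplying.
# In a real system the feature set would number in the thousands
# and no one could enumerate the branches by hand.

def find_ignition(car_model: str, dashboard_features: set) -> str:
    if car_model == "ModelA" and "keyhole_right_of_wheel" in dashboard_features:
        return "keyhole right of the steering wheel"
    if car_model == "ModelB" and "start_button_console" in dashboard_features:
        return "start button on the center console"
    if car_model == "ModelC":
        return "keyless: press the brake and the start button"
    # ...and the branches only multiply with trim levels, years, regions.
    return "unknown: the rules do not cover this case"

print(find_ignition("ModelB", {"start_button_console"}))
```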