
How long until humanity creates an AI that is better at arguing than...

Computer software consists of instructions. That is a totally different matter.
I see what you mean (not literally, though): Computers are built by workers, i.e. human beings, and those don't follow instructions, so you never know. We're doomed.
EB

No, what he means is that computer software may just be a series of instructions (like our brains are just a series of neurons), but AI is an emergent property of the interactions between software modules: simple rules resulting in complex behaviors (like our minds and consciousness are emergent properties of active brains).
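A loose illustration of "simple rules resulting in complex behaviors": the sketch below is Conway's Game of Life in Python, purely illustrative and not a claim about how any actual AI is built. The whole "instruction set" is one local rule per cell, yet the five-cell glider it is given drifts diagonally across the grid forever, a behavior the rule never mentions.

from collections import Counter

# Simple rules -> complex behavior: Conway's Game of Life.
# Each cell follows one local rule; gliders and oscillators emerge
# only at the level of the whole grid.

def step(live_cells):
    """Advance one generation. live_cells is a set of (x, y) tuples."""
    # Count live neighbours for every cell next to a live cell.
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live_cells
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # The entire rule: a cell is alive next turn if it has exactly 3
    # live neighbours, or has 2 and is already alive.
    return {
        cell
        for cell, n in neighbour_counts.items()
        if n == 3 or (n == 2 and cell in live_cells)
    }

# A glider: the rule above says nothing about movement, yet this
# five-cell pattern travels one cell diagonally every four steps.
cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(8):
    cells = step(cells)
print(sorted(cells))  # the same shape, shifted by (2, 2)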
 
Derail.
EB
 
AI computers won't possess free will, that's for certain. Even if free will existed, we wouldn't know how to implement it. And it doesn't exist; we all agree on that here. So whatever AI machines do in the not-so-distant future, there won't be any metaphysical difference between us and them. So, I guess it doesn't really matter what happens next. May the best one win!
PS. In 2050, I shall be dead, I think.
EB
 
Actually, some of us disagree with the claim that there is no free will.
 
The existence of 'free will'? Some agree, some disagree, some say the term itself is vague and therefore of little use (just a common reference to decision making), and the controversy rolls on and on...
 
Actually, some of us disagree with the claim that there is no free will.

What free will would that be?
 
No, what he means is that computer software may just be a series of instructions (like our brains are just a series of neurons), but AI is an emergent property of the interactions between software modules: simple rules resulting in complex behaviors (like our minds and consciousness are emergent properties of active brains).
Derail.
EB

So that has become your standard tactic now? Every time you cannot counter, you try to dismiss the counter as "derail" or "wasting time".
 
The existence of 'free will'? Some agree, some disagree, some say the term itself is vague and therefore of little use (just a common reference to decision making), and the controversy rolls on and on...
You hit it... the term is vague. However, there are those who will engage in long, heated arguments over a free will they think is meaningful, without ever first agreeing on what they are arguing about: a specific definition of the term. Whether humans have free will and what the definition of free will is are two very different questions. The first can't be answered until there is agreement on the second.
 
Actually, some of us disagree with the claim that there is no free will.

You just had to say that.
 
Actually, some of us disagree with the claim that there is no free will.

What free will would that be?
What do you mean?
I'm writing this post of my own free will.
Error theorists (about FW, moral responsibility, moral properties, folk psychology, etc.) make a number of different arguments, but I don't find them persuasive.

- - - Updated - - -

Actually, some of us disagree with the claim that there is no free will.

You just had to say that.
But it's true (and Speakpigeon was making a mistaken assumption of agreement on the FW issue).
 
What is thought, felt and acted upon is a consequence of inputs interacting with neural architecture: processing of information, memory function, weighing of cost against benefit, what is gained at what expense, pleasure and satisfaction balanced against the problems associated with a decision... which makes the term 'free will' redundant for anything more than a casual reference to the ability to make decisions. That ability, given the conditions outlined above, is not a case of 'free will' but of information processing.

We are able to process information, think and decide because 'our' brain is sufficiently complex to enable that ability.
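Read that way, a decision can be pictured as nothing more than a scoring procedure over the available options. The toy sketch below (Python; the options, weights and numbers are all made up, and it is not offered as a model of any actual brain) just makes the "weighing of cost against benefit" idea concrete: feed in the same options and the same weights, and the same choice comes out.

from dataclasses import dataclass

# Toy picture of decision-making as information processing: score each
# option from its expected benefit and cost, pick the highest score.
# Purely illustrative; not a model of a real brain.

@dataclass
class Option:
    name: str
    benefit: float  # expected gain (pleasure, satisfaction, ...)
    cost: float     # expected downside (effort, risk, regret, ...)

def decide(options, risk_weight=1.5):
    """Return the option with the best benefit-versus-cost score.

    The same inputs and the same weight always yield the same choice;
    nothing extra has to be added to produce a decision.
    """
    return max(options, key=lambda o: o.benefit - risk_weight * o.cost)

choice = decide([
    Option("reply to the thread", benefit=3.0, cost=1.0),  # score 1.5
    Option("go to bed", benefit=2.0, cost=0.2),             # score 1.7
])
print(choice.name)  # -> go to bed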
 
What do you mean?
I'm writing this post of my own free will.

Which really doesn't make the issue clearer.

I'm writing this post of my own A.
This doesn't say a jot about what you mean by A.

Consider the following exchange:

Bob: Every one of us agrees hatred does not exist.
Alice: Some of us believe hatred does exist.
Jack: What hatred would that be?
Alice: What do you mean? Hitler had hatred towards the Jews.
Jack: That does not clarify anything. It would be like saying "Hitler had A towards the Jews".

Jack has failed to clarify what he meant by "What hatred would that be?", even though Alice asked what he meant.
In addition, "Hitler had A towards the Jews" has the problem that "A" does not have a common meaning in such a context, whereas "hatred" does.


I don't know how to further clarify. Could you clarify your question, please?
Still, I will try. What I mean by "free will" is what people usually mean in English when they say they or someone else did something of their own free will. I would give the same answer in the case of hatred/hate, cruelty, kindness, etc. If that does not make the matter more clear to you, I would need more details as to why not before I can say anything else to clarify.
 
What I mean by "free will" is what people usually mean in English when they say they or someone else did something of their own free will.

So you simply mean "not being coerced"?

As when an AI performs an action it has itself calculated, without having to follow some other agent's decision about what it should do?
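To make the contrast in that question concrete, here is a crude sketch (Python; everything in it is invented for illustration) of the difference between an agent that executes what another agent decided and one that acts on an action it has itself calculated from its own input.

# Crude illustration of "follows another agent's decision" versus
# "performs an action it has itself calculated". Invented example.

def own_calculation(sensor_readings):
    """The agent derives its action from its own evaluation of its input."""
    return "recharge" if sensor_readings["battery"] < 0.2 else "keep exploring"

def follow_order(command_from_operator):
    """The agent simply executes whatever another agent decided."""
    return command_from_operator

readings = {"battery": 0.15}
print(own_calculation(readings))   # "recharge" - chosen by the agent's own rule
print(follow_order("shut down"))   # "shut down" - decided elsewhere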
 
What I mean by "free will" is what people usually mean in English when they say they or someone else did something of their own free will.

So you simply mean "not being coerced"?

As when a AI performs an action itself has calculated without having to follow some other agents decision of what it should do?

I mean what the usual expression I mentioned means.
It is closely related to lack of coercion for sure, but it's very difficult to give definitions that capture the full meaning of common expressions (dictionaries usually approach the meaning, but fail to capture it), especially with folk psychology concepts (including morality). It's not the same as not being coerced, since (say) I wouldn't be inclined to think a cockroach acts of its own free will (I might be mistaken, but that seems unlikely), and also there is the issue of whether one counts internal states as coercion (e.g., unjustified fear, kleptomania, etc.).
So, I'm not going to take a stance on whether "I'm writing this post of my own free will" means the same as "I'm writing this post without being coerced", since I don't know whether that is the case.
On the other hand, I do take a stance on whether I'm writing this post of my own free will - I am.
If that's not clear enough, then I'm afraid that that's the best I can do. I still disagree with Speakpigeon's claim, unless he means something else by "free will", in which case I would ask him what he means by that expression.
 

So it's simply the feeling that it's you who made the choice...

Why you mention the cockroach beats me, since you cannot have any information on what the cockroach feels...
 
No, it's not just a feeling that I made that choice. I made that choice. And the assessment is a usual assessment. For example, if I say I'm annoyed, or angry, or happy, etc., usually those assessments are true. Why would the assessment that I'm writing this of my own free will be so suspect that you reply in such a fashion?

As for the cockroach, I have no such information (the "cannot" part goes a bit too far; we can probably tell it feels pain sometimes, but not the point). I was using the example to convey the idea that it may not be just about a lack of coercion. Free will might require some kind of mind, in addition to lack of coercion.
 
A brain makes a choice. Rather than 'free will' being an essential element, the choice that is made is determined by the information state of that brain in the moment that the decision is made.

The brain forms and generates the conscious experience of 'self' that is thinking and deciding. An experience that breaks down if the underlying structure and information processing fails due to physical damage or chemical imbalance, or just a glitch, a momentary synaptic failure.
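Put computationally (and only as an illustration, not as anyone's model of a brain): if the choice is a pure function of the brain's information state at that moment, then the same state always yields the same choice, and a momentary glitch, i.e. a slightly different state, can yield a different one. The Python sketch below invents all of its names and numbers just to show that relationship.

import hashlib

# Illustration only: "the choice is determined by the information state
# of the brain in the moment the decision is made", read as: the choice
# is a pure function of that state. All names and values are made up.

def choose(brain_state, options):
    """Identical state in -> identical choice out."""
    # Reduce the whole state to a repeatable number and use it to pick.
    digest = hashlib.sha256(repr(sorted(brain_state.items())).encode()).digest()
    return options[digest[0] % len(options)]

state = {"memories": 41, "inputs": "a thread about free will", "serotonin": 0.7}
options = ["post a reply", "lurk", "log off"]

print(choose(state, options) == choose(state, options))  # True: same state, same choice

# A momentary "glitch" is just a slightly different state, which may
# well flip the outcome.
glitched = dict(state, serotonin=0.6)
print(choose(glitched, options))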
 
No, it's not just a feeling that I made that choice. I made that choice. And the assessment is a usual assessment. For example, if I say I'm annoyed, or angry, or happy, etc., usually those assessments are true. Why would the assessment that I'm writing this of my own free will be so suspect that you reply in such a fashion?

As for the cockroach, I have no such information (the "cannot" part goes a bit too far; we can probably tell it feels pain sometimes, but not the point). I was using the example to convey the idea that it may not be just about a lack of coercion. Free will might require some kind of mind, in addition to lack of coercion.

What I am trying to convey is that "free will" doesn't say anything more than that the choice was made by your brain and you feel that you made that choice. There is nothing more to it.
 