Are words immaterial?

But a property itself is not material. If it were, then we would say that consciousness is an "emergent material" instead of an "emergent property".
Duh... No, we wouldn't. That is not proper usage of the word "material".

But we know that we can't detect the consciousness of someone else the way we can detect material.
I have no idea what significance "detect material" is supposed to have.

But detecting whether a person is conscious or not is not rocket science...
 
You know what I mean; don't try to pull that cheap trick.
 
It is no trick. If my answer is not correct, then please give a proper argument.

It's probably an unconscious choice, but many people on FT will take the most ridiculous interpretation of a post and use it. I don't know why, but this is quite common.

You know that I specifically put, "we can't detect the consciousness of someone else" in post #440. But for some reason you took it to mean that we can't detect if a person is conscious.
 
I think that an implication is that if consciousness exists, then the function is all that matters, not the componentry. So, for example, imagine that we replaced a person's molecules with, say, basketballs, soccer balls, tennis balls, etc., one kind of ball for each kind of molecule. Then, if this gigantic machine can function just like a human, maybe it will also have a consciousness.

The above is the part of functionalism that I'm leery of. If molecules are replaced with soccer balls, which have different functional properties (behaviors) than the molecules, it seems illogical to assume that they will behave the same way.

Say, for example, that my specific experiences and unique perspective have an effect on how I interact with others. Replace me with another human, or a giant blow-up doll, and you're not going to end up with the same properties. <-- yeah... I said that.

The point is that if the function is the same, then it doesn't matter what the parts are. So we have to assume that the soccer balls allow for the same function but on a much larger scale. Whether or not the soccer balls can replicate the functions of a brain is irrelevant to functionalism.
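
To put the same idea in programming terms (a rough analogy only; the class names below are invented for this sketch, not something anyone in the thread proposed): two components with completely different internals are interchangeable so long as they present the same behavior to whatever uses them.

class NeuronUnit:
    # one kind of internal machinery
    def fire(self, stimulus):
        return stimulus > 0.5

class SoccerBallUnit:
    # completely different "parts", same observable behavior
    def fire(self, stimulus):
        return stimulus > 0.5

def respond(unit, stimulus):
    # the caller only ever sees the function, never the parts
    return "active" if unit.fire(stimulus) else "quiet"

print(respond(NeuronUnit(), 0.7))      # active
print(respond(SoccerBallUnit(), 0.7))  # active, indistinguishable by function alone

If behavior is the only available test, the two units cannot be told apart; functionalism makes the analogous claim about minds.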
 

Well, you might end up with a fractal intelligence that is layered in certain ways. It's an interesting rabbit hole, but I don't know how valid it is in actual reality (above my pay grade).

I do have a problem with the "soccer balls" being able to function as many different types of neurotransmitters, etc. when they are comprised of different functional units than a human brain (at least the various atoms are arranged in a totally different way in them). So it's not possible for them to function in the exact same manner.

It just doesn't make sense to think that the functional behavior of a soccer ball equates to the functional behavior of a dopamine molecule, but perhaps it does.
 

You may be correct, but that is not relevant to whether or not two identical functions composed of different parts can produce the same consciousness.

Soccer balls was a really bad example; I just like taking ideas to the extreme.

Imagine we slowly replace your current cells with your own stem cells for every part of your body. They will not be composed of the exact same parts as your original cells, but we can be quite certain that you will still have a consciousness, provided everything goes well. So now we have the same function composed of different parts which will still have a consciousness. Now I know this is much different than soccer balls, but one has to wonder how different each body part can be in order to give rise to a consciousness.

Similarly, we can just look at how many consciousnesses there are in the world with different parts with each one having a similar function. It seems that function is all that matters even though the parts are all different.
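
Here is a toy sketch of that gradual-replacement idea in code (purely illustrative; every name in it is made up for this example): the parts get swapped one at a time, and the only thing ever checked is what the whole system does.

def make_old_part():
    # an "original cell": doubles its input
    return lambda x: 2 * x

def make_new_part():
    # a replacement built differently that does the same job
    return lambda x: x + x

def overall_function(parts, inputs):
    # the only thing we ever observe: what the whole system does
    return [sum(part(x) for part in parts) for x in inputs]

inputs = [1, 2, 3]
system = [make_old_part() for _ in range(10)]
reference = overall_function(system, inputs)

# replace the parts one at a time; the observable function never changes
for i in range(len(system)):
    system[i] = make_new_part()
    assert overall_function(system, inputs) == reference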

 
Eh? There really isn't. All conscious creatures have very similar neural systems.

Okay, but they are still different. So I wonder how different they can be. What does it matter what parts are used as long as it functions like a consciousness?

And there is no conscious agent today that is not an offspring of at least one conscious parent.

Well, I don't think it's much of a stretch to assume that a person and a consciousness could be made, given the right technology to do so.
 

?
We know this:
1) We actually have no idea whatsoever what subjective experience (what you misname as "consciousness") is.

2) There is only one process that exhibits subjective experience.

From this I find it ridiculous to say that it is possible to create subjective experience with any hardware/wetware.
 

Well then what if we can build something that mimics the process?

Imagine a really good 3-D organ printer. Now imagine that it prints a replica of you. Your clone will probably be able to experience. Why does it matter how we get made?
 
My clone's experience will not be my own.

It is already happening and always has; we just call these clones "twins". Same basic build, different experiences, ergo different subjectivity.
 
Oh, I did not mean that the subjectivities of the clones would become singular; I just meant subjectivity in general.
 
Imagine a really good 3-D organ printer.
Define "good" in this case. As it stands this just begs the question.

Now imagine that it prints a replica of you. Your clone will probably be able to experience.
Maybe, maybe not. How would you know?


How could we even start to think about it when we have no idea what SE even is?
 


Imagine we slowly replace your current cells with your own stem cells for every part of your body. They will not be composed of the exact same parts as your original cells, but we can be quite certain that you will still have a consciousness, provided everything goes well. So now we have the same function composed of different parts which will still have a consciousness. Now I know this is much different than soccer balls, but one has to wonder how different each body part can be in order to give rise to a consciousness.


I get what you're saying; I just don't think that the function of stem cells will precisely match that of the older neurons they replace. But maybe they will, like a new car is presumably just a nicer car than your old one.
 
Define "good" in this case. As it stands this just begs the question.

The conversation that you joined is not about whether or not it's possible to give rise to the same function using different parts. Functionalism of conscious experience assumes this, whether possible or not, and then goes on to make its case about what will happen. We must beg the question or start with a premise for everything that we claim.

We don't start every conversation that makes a claim with "assuming an evil demon is not tricking us ..." or "assuming I haven't been dreaming for 30 years in a universe much different from the one in my dream".

Your concern is legitimate, and I am not saying that it is or isn't possible to have an identical brain function from different parts.

However, we do know that identical functions are possible from different structures with different components. Take the chair, for example; we can define its function as something that enables a person to sit. This object could be a tree stump, a boulder, a sack of potatoes, etc. In other words, these objects have the property that allows people to use them as a chair; they function equally as something to sit on.

Obviously it is more complicated with the property of experience.
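
The chair example can be written down the same way (again just an illustration; these classes and thresholds are invented): "chair" is defined by what the object lets you do, not by what it is made of.

from dataclasses import dataclass

@dataclass
class TreeStump:
    supports_weight_kg: int = 300
    height_cm: int = 40

@dataclass
class SackOfPotatoes:
    supports_weight_kg: int = 120
    height_cm: int = 50

def works_as_a_chair(thing):
    # the functional definition: can a person sit on it?
    return thing.supports_weight_kg >= 100 and thing.height_cm >= 30

print(works_as_a_chair(TreeStump()))       # True
print(works_as_a_chair(SackOfPotatoes()))  # True: different parts, same function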

Now imagine that it prints a replica of you. Your clone will probably be able to experience.
Maybe, maybe not. How would you know?


How could we even start to think about it when we have no idea what SE even is?

And you don't know if anyone else is able to experience besides you. But there are some things that we have to assume if we are ever going to hope to make progress.

This thing that we know about the least (the ability to experience) is what we are most certain of, so I am confident that someone will figure it out one day. But until then, we can waste time starting with premises that may be wrong. :thinking:
 
I get what you're saying; I just don't think that the function of stem cells will precisely match that of the older neurons they replace. But maybe they will, like a new car is presumably just a nicer car than your old one.

This seems to be a major problem that I can't explain. But I suspect that further inquiry into functionalism will explain it.
 