• Welcome to the new Internet Infidels Discussion Board, formerly Talk Freethought.

Immortal But Damned to Hell on Earth

If the simulation is good enough, I don't see how it could not transfer your consciousness.

You mean COPY. That would not "transfer" consciousness... it might copy enough information for consciousness to assert itself... but that would then be two of you, each arguing that they themselves are the real you.

I'm thinking of Star Trek transporter technology... The science fiction of it is that you are transferred by way of matter to energy, back to matter. The science fact of it is that it is a murder machine. It vaporizes you and then builds a copy of you that does not know the difference... so the copy of you that is created by the transporter says, "it works!"... but you are actually dead and that is your clone.

You're assuming a non-destructive scanning process--something that at least at first I don't think we will have.
 
That question is a red herring though. It's like asking a little kid: "Hey, since you're just going to grow new ones, it's totally okay if I punch out your teeth, right?"

Of course I don't want my current me to die; but knowing that an exact copy (apart perhaps from the memory of dying) of myself would take my place would certainly be something of a reassurance. And more importantly, that copy wouldn't be lying awake at night worrying about being a copy; because he'd know he is "me", and that whether he's a copy or an original isn't relevant.

A reassurance is different from a guarantee that my consciousness would survive the experience. I can be reassured by a generous life insurance policy that my loved ones will be cared for, but that doesn't change the fact of my death.

The part I can't wrap my head around is why so many people have no trouble asserting that I can be subjected to something that undoubtedly would be lethal if no copy existed, but is not lethal if a copy exists.

It *is* lethal. It's just that death would no longer be absolute. Think of it another way: not a single cell in your body was there when you were born. You've replaced them many times. Every part of you that was born into this world has already died. Yet you are still alive. There is continuity, even though you have no physical connection to the you that was born.

Forgive me for saying so, but I find this repeated objection incredibly naive. Obviously, the gradual replacement of cells over time (not brain cells, I might add) is not destructive to the continuity of my conscious self. But why is this so? It could be that the gradual nature of this replacement process is what enables my first-person perspective to persist over time. In other words, at any given time, the vast majority of my cells are identical to what they were--for instance--last year. If you go back a year, the vast majority of my cells are identical to what they were the previous year, and so on. There is no reason to assume that the relevant fact about cell regeneration is that in 50 years, few of my cells will be the same as they are today, as that may only be the case under the non-trivial condition that the cells are replaced very slowly.

To be clear, my argument is not and never has been that the complete conservation of my specific cells or particles at any given time is necessary for my continued existence. We all seem to agree on that point. We also seem to agree that the unique arrangement of my particles is a necessary condition for preserving consciousness over time, but we diverge in that you and others seem to be saying it is a sufficient condition as well. I am not so sure about that, because it remains to be demonstrated.

With the copy scenario, the 'original' you that gets shot in the head is more akin to those dead cells from your infancy than anything else; since an exact copy of you will wake up somewhere else. That copy will be *you*. Of course, this really only applies if the copy is activated only after your death so that there's no divergence of experience.

This illustrates our disagreement. By my estimation, there is a glaringly obvious fact about being shot in the head while a copy is made in Paris, versus normal human development from infancy to adulthood; the gunshot breaks all the connections in the brain that are necessary for human consciousness, while growing from an infant to an adult does not. It's the ship of Theseus all over again. Pieces of a ship, or a human, can be replaced over time as long as the arrangement is preserved. No single replacement event is sufficient to interrupt the functioning of the whole, so consciousness may proceed within that substrate. Clearly, being shot point-blank in the head (or more to the point, disintegrated by a transporter) is utterly unlike this phenomenon in at least one crucial respect.

There is also the matter of substrate identity. We all agree that a physical brain is required for consciousness per se. We also agree that tiny parts of the brain can be replaced over time without interrupting the flow of consciousness. Where we disagree is over whether an individual consciousness can be preserved by creating a copy of its substrate and instantaneously destroying the original substrate. There is every reason to believe the copy will behave as if it were the original; that is, there will exist a being that believes it has survived destruction. You and others seem to take this belief as likely to be true, if not necessarily true. I do not come to this conclusion, because I don't believe we know enough about the internal, first-person perspective and its requirements for continuity over time.
 
What would be the difference between your consciousness on a hard drive, and your consciousness inside your skull but paralyzed from the neck down? You can still communicate fully with the world, and if all physical needs are met, can live quite a long time.
How independent is a machine with a soul?
 
A reassurance is different from a guarantee that my consciousness would survive the experience. I can be reassured by a generous life insurance policy that my loved ones will be cared for, but that doesn't change the fact of my death.

And like I said, the fact that *you* die doesn't mean that "you" don't keep on living. You're focused too much on individual instances of 'you'.

Of course you'd be stupid to be okay with dying just because a perfect copy of you will be reinstated after your death. But that doesn't mean the new you isn't exactly the same as the old you, and therefore *you*.


Forgive me for saying so, but I find this repeated objection incredibly naive. Obviously, the gradual replacement of cells over time (not brain cells, I might add) is not destructive to the continuity of my conscious self. But why is this so? It could be that the gradual nature of this replacement process is what enables my first-person perspective to persist over time. In other words, at any given time, the vast majority of my cells are identical to what they were--for instance--last year. If you go back a year, the vast majority of my cells are identical to what they were the previous year, and so on. There is no reason to assume that the relevant fact about cell regeneration is that in 50 years, few of my cells will be the same as they are today, as that may only be the case under the non-trivial condition that the cells are replaced very slowly.

Then compare it to sleep, if you must. There is no continuity of the conscious self; there is only the illusion thereof. The continuity is broken every time you go to sleep or pass out. Yet when you wake up, you feel like *you* again. That is the point of the comparison; the you that wakes up after you die is identical to you in every way. It is you. Just a different instance of you; much like the instance of you that woke up today is not the instance of you that went to sleep yesterday.

I do not come to this conclusion, because I don't believe we know enough about the internal, first-person perspective and its requirements for continuity over time.

Then, as others have pointed out, you *are* taking a dualistic position. You are inserting a requirement for "me"-ness that is separate from simple physical mechanics. You're inserting an abstract, a non-entity that somehow cannot be replicated. This is no different from inserting a soul into the equation.

I don't exactly understand what "first-person perspective" even means in this sense... the copy would have the exact same kind of first-person perspective. As for continuity over time: in a non-dualistic universe, such a requirement for consciousness should result in a particular imprint on the organization of the consciousness-granting system, and as such it should theoretically be possible to artificially create that same imprint. In other words, it isn't relevant, because if you can copy the brain right down to the sub-atomic level, you will also be able to copy whatever imprint is generated by this continuity-over-time process.
 

I believe my post in OPD addresses the concerns you raised, namely (a) focusing on individual instances of a person, (b) sleep as an example of continuity-breaking, and (c) the difference between "the same kind of first-person perspective" and "the same first-person perspective." If you'd like, I would be interested in hearing your views in response to that thread. Thanks!
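The distinction in (c) resembles the difference between equality and identity in programming: a perfect copy can be indistinguishable in state yet still be a different instance. A minimal sketch (purely illustrative; the `Mind` class and its contents are made up for this example):

```python
import copy

class Mind:
    def __init__(self, memories):
        self.memories = memories

    def __eq__(self, other):
        # Two minds are the same *kind* if their full state matches.
        return self.memories == other.memories

original = Mind(["first day of school", "learning to ride a bike"])
duplicate = copy.deepcopy(original)

print(original == duplicate)  # True  -- identical state: the same kind of mind
print(original is duplicate)  # False -- distinct instances: not the same mind
```

Whether "being me" tracks equality of state or identity of instance is, of course, exactly what the thread is disputing; the sketch only shows that the two notions can come apart.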
 
One real objection to this brain in a chip scenario just occurred to me.

Can a digital brain forget something? The human brain copes with the complexity of life by forgetting things. There is this great gray area between the wonderful and the horrible. We only remember the remarkable, so the mildly disturbing is lost.

Would the cyber you be able to delete or modify files? Sort of like home shock therapy?
 

I don't see why not. A digital consciousness would more than likely function through the simulation of a brain, so while you probably wouldn't have a Windows-type file structure, you would still be able to affect things like memory; especially since by the time we're capable of such a simulation, our understanding of how to manipulate memories will undoubtedly be more advanced than it is today. If anything, you'd stand to gain direct control over memories, allowing you to remember the awesome stuff in perfect fidelity while blocking off all the stuff you don't want to remember.
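As a toy sketch of that idea (entirely hypothetical; the `MemoryStore` class, its weights, and the threshold are invented for illustration), forgetting in a simulated brain might look less like deleting files and more like attenuating memory traces below a recall threshold, with "direct control" meaning the ability to deliberately boost or decay a trace:

```python
# Hypothetical model: memories as weighted traces rather than files.
# "Forgetting" attenuates a trace's weight; recall requires crossing a threshold.

RECALL_THRESHOLD = 0.5

class MemoryStore:
    def __init__(self):
        self.traces = {}  # label -> salience weight in [0, 1]

    def encode(self, label, salience):
        self.traces[label] = salience

    def decay(self, factor=0.5):
        # All traces fade together; nothing is ever crisply "deleted".
        for label in self.traces:
            self.traces[label] *= factor

    def reinforce(self, label, boost=0.3):
        # Direct control: deliberately strengthen a chosen memory.
        self.traces[label] = min(1.0, self.traces[label] + boost)

    def recall(self):
        return [label for label, w in self.traces.items() if w >= RECALL_THRESHOLD]

store = MemoryStore()
store.encode("wedding day", 0.9)
store.encode("mildly awkward meeting", 0.6)
store.decay()                    # both traces fade by half
store.reinforce("wedding day")   # the remarkable one is boosted back
print(store.recall())            # only "wedding day" clears the threshold
```

The design choice matters for the objection above: in a trace-weight model, the mildly disturbing fades on its own unless actively reinforced, which mirrors how biological forgetting is usually described, rather than requiring an explicit delete operation.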
 