
Immortal But Damned to Hell on Earth

We agree. But as far as he is concerned, the person who dies during the blink never sees his loved ones (or anyone, or anything) again.

But he does. Nothing is "lost" during the transport, so the guy who dies and comes back is still exactly the same guy as he was before. The new person isn't a different person from the original person.

If nothing is lost, why is such advanced technology required to recreate the original configuration of particles? Surely, that implies that everything is lost, otherwise people who get vaporized would go on living today! The fact that such an elaborate piece of science fiction is needed to counteract the severe entropic blow suffered by the original person's body doesn't support the notion that nothing is lost.

Are you arguing from some kind of dualistic perspective or something? If all that we are is the end result of our physical processes, then why would an interruption of those processes make them somehow different from if they weren't interrupted?

The continuity of the physical substrate of consciousness is not a dualistic argument. It applies equally to computers. Destroy a computer and re-constitute it down to the bit (of information) and you have a new computer that operates in the same manner and has the same stored data as the old one. The old one was destroyed. The analogy ends here, though, because as far as we know computers aren't conscious; there is no first-person sensation associated with being a computer. I'm not a dualist about the first-person sensation, but neither do I discount it as a factor that must be taken into account when a sufficient level of physical complexity is reached.
 
It wouldn't be like dying and coming back to life, though. It would be like dying. The computer would feel the sensation of dying and coming back to life, assuming the "recording" was made up to the point of brain death and it was switched on right after that. I find myself wondering, though, assuming for the moment we could transfer our "consciousness" into a computer, how would that affect your mind? Part of our consciousness stream is input from our senses. Not just the classic five, either; there's balance, the feeling of inhabiting a body, orientation in space, etc. How would the mind handle the lack or substitution of these various inputs?
 
I am making an unstated presumption that may have something to do with our disagreement: I believe that it is like something to be one person and not another person, even if that other person is your double. There is the experience of being fed the sensory input of one brain/nervous system, situating one at a vantage point at the level of one's eyeballs. This first-person perspective (which I am calling consciousness) is vulnerable to changes in the sensory apparatus, especially swift and destructive changes. Whether or not another consciousness exists, residing in the sensory apparatus of a manufactured being whose components were intentionally arranged to mimic the locus of the original consciousness, seems to me an entirely separate event, so I don't see the mechanism by which it affects what's going on from the first-person perspective of the guy entering the transporter.

Well, I guess that you could say that from his perspective, he dies and then comes back to life. The consciousness in the "manufactured being" is the same consciousness as was in the original being. A transportation machine is a murder/resurrection machine. If you get killed and resurrected, you're still the same you that you were before you died.

Also, there's no problem in getting resurrected in a new body while your old body is still alive. Both are you and there is no "prime".

Suppose this is the sequence of events:

1. I take a snapshot of your quantum state.

2. I make a perfect copy of you and introduce you to him.

In this scenario, if you were completely confident that the copy were exact, would you have an objection to being shot in the head with a hunting rifle? Why or why not?

Now consider this scenario:

1. I take a snapshot of your quantum state.

2. I shoot you in the head with a hunting rifle.

3. I make a perfect copy of you and introduce your remains to him.

You seem to be saying that, in the second situation, your consciousness would "pick up" where it left off before being shot, but in the first situation, it wouldn't know to "jump" from you to your copy upon your being shot. I find this a very idiosyncratic way of looking at life and death, since we can be pretty sure that hunting rifle blasts to the cranium usually end in death. My view is that you are dead in both cases, which explains why you (presumably) would not want to be executed at gunpoint in the first scenario. Your account appears unable to explain this unwillingness.
 
But he does. Nothing is "lost" during the transport, so the guy who dies and comes back is still exactly the same guy as he was before. The new person isn't a different person from the original person.

If nothing is lost, why is such advanced technology required to recreate the original configuration of particles? Surely, that implies that everything is lost, otherwise people who get vaporized would go on living today! The fact that such an elaborate piece of science fiction is needed to counteract the severe entropic blow suffered by the original person's body doesn't support the notion that nothing is lost.

Huh? People who get vaporized don't get reconstituted in the exact same configuration that they were in before they got vaporized. There's nothing that can actually do this, so we need to discuss fictional situations in order to have it occur. I really don't understand what you're saying here at all.

Are you arguing from some kind of dualistic perspective or something? If all that we are is the end result of our physical processes, then why would an interruption of those processes make them somehow different from if they weren't interrupted?

The continuity of the physical substrate of consciousness is not a dualistic argument. It applies equally to computers. Destroy a computer and re-constitute it down to the bit (of information) and you have a new computer that operates in the same manner and has the same stored data as the old one. The old one was destroyed. The analogy ends here, though, because as far as we know computers aren't conscious; there is no first-person sensation associated with being a computer. I'm not a dualist about the first-person sensation, but neither do I discount it as a factor that must be taken into account when a sufficient level of physical complexity is reached.

I just don't get why you feel that the "continuity" part is somehow essential. If I die and get resurrected identically ten feet in front of me, or if I walk those ten feet without anything happening to me, I don't see how there's a difference in the "me" at the end of those ten feet.
 
You seem to be saying that, in the second situation, your consciousness would "pick up" where it left off before being shot, but in the first situation, it wouldn't know to "jump" from you to your copy upon your being shot. I find this a very idiosyncratic way of looking at life and death, since we can be pretty sure that hunting rifle blasts to the cranium usually end in death. My view is that you are dead in both cases, which explains why you (presumably) would not want to be executed at gunpoint in the first scenario. Your account appears unable to explain this unwillingness.

What do you mean by "jump"? I don't get how the consciousness needs to jump into the new copy in a non-dualist situation. That's like saying that the light needs to know how to jump into a flashlight when you turn it on. The mechanism to create the consciousness/light is part of the body/flashlight.
 
You seem to be saying that, in the second situation, your consciousness would "pick up" where it left off before being shot, but in the first situation, it wouldn't know to "jump" from you to your copy upon your being shot. I find this a very idiosyncratic way of looking at life and death, since we can be pretty sure that hunting rifle blasts to the cranium usually end in death. My view is that you are dead in both cases, which explains why you (presumably) would not want to be executed at gunpoint in the first scenario. Your account appears unable to explain this unwillingness.

Continuity of the original you is irrelevant as to whether we ought to talk about the copy as being *you*. If the copy is exact, then there is no difference either for us or it. It is you. The original 'you' can be dead while the copy can, at the same time, be 'you'. I don't see any inconsistency there; just people talking past each other in regards to what they mean with 'you'.
 
You seem to be saying that, in the second situation, your consciousness would "pick up" where it left off before being shot, but in the first situation, it wouldn't know to "jump" from you to your copy upon your being shot. I find this a very idiosyncratic way of looking at life and death, since we can be pretty sure that hunting rifle blasts to the cranium usually end in death. My view is that you are dead in both cases, which explains why you (presumably) would not want to be executed at gunpoint in the first scenario. Your account appears unable to explain this unwillingness.

What do you mean by "jump"? I don't get how the consciousness needs to jump into the new copy in a non-dualist situation. That's like saying that the light needs to know how to jump into a flashlight when you turn it on. The mechanism to create the consciousness/light is part of the body/flashlight.

Well, first off, could you answer the question I posed? Specifically, if I made an exact copy of you, would you have any objection to me killing you? If you don't, I imagine that you think that from your perspective, being shot would be a brief blink followed by experiencing whatever your copy was experiencing (hence the "jump" from your body to his). Is that accurate?
 
What do you mean by "jump"? I don't get how the consciousness needs to jump into the new copy in a non-dualist situation. That's like saying that the light needs to know how to jump into a flashlight when you turn it on. The mechanism to create the consciousness/light is part of the body/flashlight.

Well, first off, could you answer the question I posed? Specifically, if I made an exact copy of you, would you have any objection to me killing you? If you don't, I imagine that you think that from your perspective, being shot would be a brief blink followed by experiencing whatever your copy was experiencing (hence the "jump" from your body to his). Is that accurate?

Well, if I had an exact copy and I was back alive a few seconds later, I wouldn't really be put out any. Killing someone in real life wouldn't be any different than killing someone in a video game, because they'd just respawn. If you kept doing it while I was trying to get some work done or the like, I might start to get a little annoyed about you slowing me down, but other than that, I don't see why I'd particularly care.

Also, there's no jumping. The new body recreates my consciousness from scratch.
 
Well, first off, could you answer the question I posed? Specifically, if I made an exact copy of you, would you have any objection to me killing you? If you don't, I imagine that you think that from your perspective, being shot would be a brief blink followed by experiencing whatever your copy was experiencing (hence the "jump" from your body to his). Is that accurate?

Well, if I had an exact copy and I was back alive a few seconds later, I wouldn't really be put out any. Killing someone in real life wouldn't be any different than killing someone in a video game, because they'd just respawn. If you kept doing it while I was trying to get some work done or the like, I might start to get a little annoyed about you slowing me down, but other than that, I don't see why I'd particularly care.

Why do you say "a few seconds later"? In the example I gave, your copy was already in existence before I asked you if you'd mind being fragged. Since it makes no difference either way, suppose your copy was in a cafe in Paris. I made him at 3PM, and sent an e-mail to my gunman at 3:01PM. What do you expect to experience upon being shot dead? Remember, your copy would not have the memory of being shot, or even of being approached by the gunman. He's just sipping coffee in Paris.
 
Well, if I had an exact copy and I was back alive a few seconds later, I wouldn't really be put out any. Killing someone in real life wouldn't be any different than killing someone in a video game, because they'd just respawn. If you kept doing it while I was trying to get some work done or the like, I might start to get a little annoyed about you slowing me down, but other than that, I don't see why I'd particularly care.

Why do you say "a few seconds later"? In the example I gave, your copy was already in existence before I asked you if you'd mind being fragged. Since it makes no difference either way, suppose your copy was in a cafe in Paris. I made him at 3PM, and sent an e-mail to my gunman at 3:01PM. What do you expect to experience upon being shot dead? Remember, your copy would not have the memory of being shot, or even of being approached by the gunman. He's just sipping coffee in Paris.

I don't see how any of those situations would have factors of different relevance. If we have this technology, death isn't much of a thing. Some people are dicks who cut in front of me in checkout lines. Other people are dicks who shoot me in the head with a shotgun. Both are slightly annoying, but neither really puts me out.
 
You seem to be saying that, in the second situation, your consciousness would "pick up" where it left off before being shot, but in the first situation, it wouldn't know to "jump" from you to your copy upon your being shot. I find this a very idiosyncratic way of looking at life and death, since we can be pretty sure that hunting rifle blasts to the cranium usually end in death. My view is that you are dead in both cases, which explains why you (presumably) would not want to be executed at gunpoint in the first scenario. Your account appears unable to explain this unwillingness.

Continuity of the original you is irrelevant as to whether we ought to talk about the copy as being *you*. If the copy is exact, then there is no difference either for us or it. It is you. The original 'you' can be dead while the copy can, at the same time, be 'you'. I don't see any inconsistency there; just people talking past each other in regards to what they mean with 'you'.

There remains the problem I posed to Tom. I am not confident enough about the nature of consciousness to agree to undergo instantaneous oblivion, even if I could be 100% certain that there was an exact copy of myself in the next room (or a cafe in Paris) that is identical to me circa 1 minute ago. Are you? It seems obvious to me that, as is true in every other example but this one, I would cease to exist as a result of instantaneous oblivion, and that's a fact about me and my circumstances; I can't conceive of how what happens elsewhere with regards to copying technology would impinge on that fact. For me to be killed and then suddenly forget it ever happened and go on living as my copy seems like magic to me. The more accurate description would be that my copy goes on living, with no memory of being obliterated (nor of being created a minute earlier), and I die. The part I can't wrap my head around is why so many people have no trouble asserting that I can be subjected to something that undoubtedly would be lethal if no copy existed, but is not lethal if a copy exists.


Why do you say "a few seconds later"? In the example I gave, your copy was already in existence before I asked you if you'd mind being fragged. Since it makes no difference either way, suppose your copy was in a cafe in Paris. I made him at 3PM, and sent an e-mail to my gunman at 3:01PM. What do you expect to experience upon being shot dead? Remember, your copy would not have the memory of being shot, or even of being approached by the gunman. He's just sipping coffee in Paris.

I don't see how any of those situations would have factors of different relevance. If we have this technology, death isn't much of a thing. Some people are dicks who cut in front of me in checkout lines. Other people are dicks who shoot me in the head with a shotgun. Both are slightly annoying, but neither really puts me out.

I'm just trying to get you to clarify why you think there would be a time gap of a few seconds, or any seconds at all. If two people who each legitimately can claim to be you exist, and you are one of those people, what is the status of your consciousness (not his) if I blow your brains out?
 
Information can be stored in lots of ways, and info storage is not consciousness. If you wrote down everything you ever sensed, felt, or thought, would the paper have consciousness? The computer would have to perfectly replicate the exact pattern of biochemistry in your brain that gives rise to consciousness in general and your consciousness in particular. Also, sensation is arguably a defining aspect of consciousness, meaning that the system would have to simulate our sensory inputs, which are what give us a sense of being and self, including a feeling of where you end and the "outside" world begins.
 
I'm just trying to get you to clarify why you think there would be a time gap of a few seconds, or any seconds at all. If two people who each legitimately can claim to be you exist, and you are one of those people, what is the status of your consciousness (not his) if I blow your brains out?

The time gap was irrelevant verbiage to pad out the sentence. It's not an important aspect.

If there are two of me living different lives, then they become different people the moment they start doing different things, but neither of them is more the original than the other. Honestly, I'd be kind of happy that there isn't someone else who knows the PIN for my ATM card anymore. I'd probably make another one so we can both work only a half day, though.
 
Well, I can't force you to take it seriously. I think the rational way to look at it is that two people are two people, regardless of how much they resemble each other or how far apart in time they began to exist. I don't even care which one is the original, just that they are two. I can no more expect to wake up in my duplicate's body than yours.
 
Well, I can't force you to take it seriously. I think the rational way to look at it is that two people are two people, regardless of how much they resemble each other or how far apart in time they began to exist. I don't even care which one is the original, just that they are two. I can no more expect to wake up in my duplicate's body than yours.

You wouldn't wake up in your duplicate's body. You'd wake up in your body.

I really don't get how you're not arguing for a dualist position if you're insisting that the duplicate is somehow someone different than you.
 
But would uploading the contents of your brain to a computer actually transfer your consciousness?

I find it hard to believe that it would.

Other people may think it's you, but I think "you" would still be dead.

I feel the same.

But this is a philosophical question, not a scientific one. It all comes down to how you define concepts like "you". So why post this in the science forum?
 
Well, I can't force you to take it seriously. I think the rational way to look at it is that two people are two people, regardless of how much they resemble each other or how far apart in time they began to exist. I don't even care which one is the original, just that they are two. I can no more expect to wake up in my duplicate's body than yours.

You wouldn't wake up in your duplicate's body. You'd wake up in your body.

I really don't get how you're not arguing for a dualist position if you're insisting that the duplicate is somehow someone different than you.

Because two instances of the same type of thing are still two instances of it. There is a difference between two identical beings and one single being: not some esoteric dualistic quality, but the fact that there are twice as many instances when you have two than there are when you just have the one. It's not dualism vs. physicalism, it's counting.

ETA: I thought of a clearer way to put it: if someone told me I was going to die in 15 minutes, I would not be consoled by the information that someone else who has my memories and personality will be created in 14 minutes, even if I were totally convinced that were true. I don't think any dualistic assumptions are necessary to explain that intuition, because two beings are two beings even if they are perfect copies of one another.
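To put the "counting" point in the computer terms used earlier in the thread: programming languages already distinguish between two objects being equal in content and being one and the same object. Here is a minimal Python sketch of that distinction (the Person class, its fields, and deepcopy standing in for the duplicator are purely illustrative assumptions, not anything from the thought experiments above):

```python
import copy

class Person:
    """Toy stand-in for a person: nothing but a bundle of stored state."""
    def __init__(self, memories, traits):
        self.memories = memories
        self.traits = traits

    def __eq__(self, other):
        # "Exact copy" here just means every piece of stored state matches.
        return self.memories == other.memories and self.traits == other.traits

original = Person(memories=["learned to ride a bike", "first day of work"],
                  traits={"height_cm": 180})
duplicate = copy.deepcopy(original)   # state-for-state identical copy

print(original == duplicate)   # True  -- indistinguishable by content
print(original is duplicate)   # False -- still two distinct instances

# Destroying one instance leaves the other untouched.
del original
print(duplicate.memories)      # the copy carries on, unchanged
```

Whether that distinction settles anything about first-person experience is exactly what is in dispute here; the sketch only illustrates that "identical to" and "one and the same as" are different claims.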
 
There remains the problem I posed to Tom. I am not confident enough about the nature of consciousness to agree to undergo instantaneous oblivion, even if I could be 100% certain that there was an exact copy of myself in the next room (or a cafe in Paris) that is identical to me circa 1 minute ago. Are you?

That question is a red herring though. It's like asking a little kid: "Hey, since you're just going to grow new ones, it's totally okay if I punch out your teeth, right?"

Of course I don't want my current me to die; but knowing that an exact copy (apart perhaps from the memory of dying) of myself would take my place would certainly be something of a reassurance. And more importantly, that copy wouldn't be lying awake at night worrying about being a copy, because he'd know he is "me", and that whether he's a copy or an original isn't relevant.


The part I can't wrap my head around is why so many people have no trouble asserting that I can be subjected to something that undoubtedly would be lethal if no copy existed, but is not lethal if a copy exists.

It *is* lethal. It's just that death would no longer be absolute. Think of it another way: not a single cell in your body was there when you were born. You've replaced them many times. Every part of you that was born into this world, has already died. Yet you are still alive. There is continuity, even though you have no physical connection to the you that was born. With the copy scenario, the 'original' you that gets shot in the head is more akin to those dead cells from your infancy than anything else; since an exact copy of you will wake up somewhere else. That copy will be *you*. Of course, this really only applies if the copy is activated only after your death so that there's no divergence of experience.
 
So why post this in the science forum?

I wasn't sure which one to put it in, since it is a philosophical question but it comes out of something that science may come up with in a few decades.

If the mods want to move it that's cool.
 
Philosophical it is...

Would it be you, or a computer that thinks it is you?
So... how would you bring yourself to pull the plug?

It's back to the old AI/personhood problem.
 