
> If we make a copy of you and I kill you, am I extending your life?

If you make a copy of me, for a brief time there will be two of "me". After living for some time, those two will diverge increasingly. That's what living is, changing and developing.

So next, after copying me, you want to kill me, and you want to know whether you're thereby extending my life? That seems like a non sequitur. What do you mean by killing "me", anyway? Killing both copies? Killing one copy? How does killing extend the life of anything?

> But there's a biological system with a nearly identical behavior that's living. That copy of you will have a similar conscious experience, but it won't be you.

Okay, now we're moving closer. The first mistake you make is assuming there's an "original" and a "copy". If they're functionally identical, they're both me in equal measure. Killing one is still murder, and it becomes more tragic the more time the two minds have had to develop on their own.

You are basing this example on the hidden assumption that a mind is trapped inside a body and forever bound to it, that a copy of my mind won't be "me". I recognize this sounds intuitively correct to a lot of people, but it doesn't actually reflect physical reality. The atoms making up my mind are not specially tagged. The information they represent can be copied in principle, and the process that is "me" can be executed on other platforms. Given this, the assumption of uniqueness simply doesn't apply.

> That's the point of the parent. The parent's not talking about a soul.

I realize that "soul" is a loaded term. But in the end, you're operating on those same loaded assumptions that seem obvious to yourself, but aren't really compatible with a purely physical model. It's an easy mistake to make without realizing it. Our everyday intuition is not always a good indicator of physical reality.

Everyday intuition suggests that there can only be one "me". It suggests that consciousness is one unbroken process, and that my mind is a fundamentally static kernel. It suggests that I'm a black box with unknowable inner workings. None of these things are actually true.




>If you make a copy of me, for a brief time there will be two of "me". After living for some time, those two will diverge increasingly. That's what living is, changing and developing.

ok.

>What do you mean by killing "me", anyway? Killing both copies? Killing one copy? How does killing extend the life of anything?

I label you with 1. I label your copy as 2. I kill #1.

>The first mistake you make is assuming there's an "original" and a "copy".

That's not a mistake at all. I'm talking about biological systems. There are two distinct biological systems here in the sense that #1 and #2 occupy different portions of space. If there are two objects in different places in memory on a computer that represent the same thing, they are conceptually equal, but they are not equal if you compare their addresses.
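To make the memory analogy concrete, here is a minimal Python sketch (my illustration only, with made-up names like Mind, nothing from the thread): two objects can compare equal by content while still being distinct objects at different addresses.

    import copy

    class Mind:
        """Toy stand-in for a copyable state (purely illustrative)."""
        def __init__(self, memories):
            self.memories = list(memories)

        def __eq__(self, other):
            # "Conceptually equal": same content, regardless of where it lives.
            return isinstance(other, Mind) and self.memories == other.memories

    original = Mind(["first day of school", "learned to ride a bike"])
    duplicate = copy.deepcopy(original)

    print(original == duplicate)           # True:  identical content
    print(original is duplicate)           # False: two distinct objects
    print(id(original) == id(duplicate))   # False: different "addresses"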

>You are basing this example on the hidden assumption that a mind is trapped inside a body and forever bound to it, that a copy of my mind won't be "me". I recognize this sounds intuitively correct to a lot of people, but it doesn't actually reflect physical reality. The atoms making up my mind are not specially tagged. The information they represent can be copied in principle, and the process that is "me" can be executed on other platforms. Given this, the assumption of uniqueness simply doesn't apply.

This is entirely a misinterpretation of what I've said.

>I realize that "soul" is a loaded term. But in the end, you're operating on those same loaded assumptions that seem obvious to yourself, but aren't really compatible with a purely physical model. It's an easy mistake to make without realizing it. Our everyday intuition is not always a good indicator of physical reality. Everyday intuition suggests that there can only be one "me". It suggests that consciousness is one unbroken process, and that my mind is a fundamentally static kernel. It suggests that I'm a black box with unknowable inner workings. None of these things are actually true.

Again, a misinterpretation.


> I label you with 1. I label your copy as 2. I kill #1.

Which one is "me" again?

> If there are two objects in different places in memory on a computer that represent the same thing, they are conceptually equal, but they are not equal if you compare their addresses.

Exactly my point. And once both minds exist separately, they're both based on me, but they are rapidly diverging forks of me at a certain point in time. The same thing happens to me without the fork; it's just that there is one "copy" in existence, not two.
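A similarly hedged Python sketch of that "diverging forks" idea (names and data are purely illustrative, my own): both copies are equal at the moment of forking, then each accumulates its own experiences.

    import copy

    # State at time X: one set of memories (illustrative data).
    me_at_x = ["first day of school", "learned to ride a bike"]

    # Fork: two instances that are identical at the moment of copying.
    fork_a = copy.deepcopy(me_at_x)
    fork_b = copy.deepcopy(me_at_x)
    print(fork_a == fork_b)   # True: same content at time X

    # Each instance keeps living, i.e. accumulating different experiences.
    fork_a.append("moved to another city")
    fork_b.append("changed careers")
    print(fork_a == fork_b)   # False: diverging forks of the same past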

> This is entirely a misinterpretation of what I've said.

I didn't mean to do that.


>Which one is "me" again?

There is a machine with a chair. The machine is linked to a chamber with a door, where the "copy" exits. We put you in the chair. I hope you can identify yourself. We turn on the machine. A person comes out of the chamber. We write a #2 on the forehead of the person that comes out of the chamber. I call the person without the 2 "the original".


Sure, we could call either of those two bodies "the original", but what's the point? They're both based on a single me at a time X, and from that point on they're two separate beings. Both of them are going to have somewhat different lives and become different people. The same thing would have happened to a single me.

So do you believe there is something immaterial in my "original" body that could not be copied over? It sure sounds like you do, or am I missing something?


All I've been doing in this branch is trying to clarify the parent post. The parent's point was that 1!=2, therefore "you're not helping my situation". That's it.

>So do you believe there is something immaterial in my "original" body that could not be copied over?

No.


> I don't want a copy of me to live forever, I want to live forever myself. Copying my brain won't copy my conscious self.

... is a notion that I think is based on faulty assumptions. In something like 10 posts all over this thread, against pretty much anyone who cared to comment and without a single voice of support, I've been trying to explain why I believe this is only intuitively right but physically wrong.

I recognize I'm alone in this. I get this means I'm likely wrong about this, and you're right in some way to assume that I just don't understand (or to quote an email: don't have the mental capacity nor the education required to understand).

I don't understand how, once you introduce the concept of making an adequate copy, there can still be an "original" and a "copy" which somehow isn't another instance of the original. I don't understand why the concept of making a copy is fine when we're talking about, say, a piece of text, but immediately ceases to be valid when we're talking about minds.

Collectively, you guys tried to make me understand by invoking several things. For example, the argument was raised that because you obviously can't have several instances of your consciousness at the same time, this can never work. There was the argument that any would-be left-over physical body somehow remains the home of the mind, and that there would obviously be an impostor around pretending to be me but ultimately not being me.

At two points, it was suggested that I (specifically, I, Udo) be killed to prove that I care about my existence and the existence of forks based on me, which I do, and which has no meaningful connection to the question of whether copying is feasible. I also said that the "problem" of the left-over body most likely won't arise, due to technical limitations of the process. Still, that didn't count, and people still think that uploading minds is like building pyramids.

Finally, the argument was made that both instances are by necessity different, if only because they occupy different positions in space. I tried to explain that, yes, they're different, and they'll continue to become more different over time, and yet that has no real bearing on the possibility or impossibility of the whole concept.

This is where I apologize and withdraw from the discussion. You certainly tried harder than most people here to explain it, but in the end that didn't work out. I'm old enough to have the strong feeling that at some point in the future, a populist Kurzweil 2.0 will come along and everybody will just come around to this conclusion as if it had just been invented.

And that's fine with me. I don't need to be right, but I do want to have the option of being uploaded. Revulsion at the very idea is deeply ingrained in our society. Attempts to prolong life are virtually unknown outside horror literature, where they're always portrayed as monstrous. Even "progressive" sites such as io9 draw the line here. It goes against the foundation of most religions, and it contradicts the intuition of most people.

Of all things, a Buffy quote comes to mind here; allow me to paraphrase it: "You'll live again, but it won't be you. A demon sets up in your body, having your memories and thinking your thoughts, but you'll be gone." I think this sums up the opinion of most people when it comes to transhumanism. And once again, you're right: I struggle to understand this at every level.



