> because you find a notion inconvenient to deal with doesn't mean it's not factually possible.

That is a completely unjust accusation.

Even from a materialist perspective, I think the guy was raising one of the most important questions.




I didn't mean it as an accusation. The argument was literally that this can't be possible because that would allow multiple copies of a person!

> Even from a materialist perspective, I think the guy was raising one of the most important questions.

I fail to see how that's even a debate at this point. Either you see that the mind is a process being executed on a certain kind of substrate, or you believe that something from outside the physical universe is making up the mind and the brain is just an actuator. Pretty much everything else derives from one of these points of view.

If you think there is "special matter" and "ordinary matter", or that there is something special going on in the brain beyond its information-theoretic content, that's not really a scientific/materialist position.

If there's any defensible position in between, I'm genuinely interested in hearing it. Again, I don't mean to attack or upset anyone, please forgive me if I inadvertently gave the wrong signals.


But would a duplicate of me feel like I live forever? Won't I just feel it's my duplicate that lives on when my biological self dies? Maybe a little more satisfying than having a pyramid "live on" but still.


As software people, we have a unique advantage over other humans when it comes to understanding the process of copying information. I invite you to look at it from that perspective.
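To make that perspective concrete, here is a toy Python sketch (the dict is just a made-up stand-in for a mind's information content):

    import copy

    # A stand-in for the information that makes up a mind.
    mind = {"memories": ["first day of school", "learned to program"],
            "traits": {"curiosity": 0.9}}

    duplicate = copy.deepcopy(mind)

    print(duplicate == mind)   # True: the same information, bit for bit
    print(duplicate is mind)   # False: nothing in the data marks one as "original"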

> But would a duplicate of me feel like I live forever?

Your duplicate is you; it doesn't even matter how many of you there are at the end. Unless the initial brain upload is a destructive scan, this applies to your biological "original" as well. How things feel in that situation is pretty much down to your state of mind, so no difference there from how things are now.

Also, this doesn't mean you'll live forever. Nothing is forever. It's just another form of existence, though there is reason to hope it will be longer and individually better than a biological one.

> Won't I just feel it's my duplicate that lives on when my biological self dies?

Again, that depends on the tech level at the time. Most likely, the first forays into this territory will be made by dead people who have their brains scanned. There will be no biological self left in that case. Once technology moves further, it may become possible to do a non-destructive scan of the brain. In that case, people will most likely opt for this technology as a backup plan (though some, including myself, may not, choosing instead to literally fork themselves).

> Maybe a little more satisfying than having a pyramid "live on" but still.

The difference, as I said, is that it is you who lives on.

Even if you could somehow confer upon a pyramid a portion of yourself, it's still just a static monument. Transferring your mind onto a virtual machine is not just a matter of preserving it for posterity; the point is to continue living, and thereby changing. The concept of you is not a static thing.


>As software people, we have a unique advantage over other humans when it comes to understanding the process of copying information. I invite you to look at it from that perspective.

Still small relief compared to the existential angst of dying while your copy survives.

>The difference, as I said, is that it is you who lives on.

So, if I have managed to offload a complete and working copy of your brain and memories, you'd be OK with me killing you, right?


> So, if I have managed to offload a complete and working copy of your brain and memories, you'd be OK with me killing you, right?

I already talked about that. You're the second commenter who wants to kill me, by the way. It's not just a matter of "offloading", it's about continuing my existence on another substrate. Just sitting on a hard drive somewhere doesn't mean anything.

Anyway, actually killing me would entail erasing both instances of me, so I'm against that. And like I said, from the moment of creation every instance will continue to live (if you let it) and make its own way through life, becoming an entity in its own right. So, again, no killing please.

I want you to let go of the notion of a "copy", this seems to be the major psychological hurdle here. Think of it as a fork, or rapidly diverging instances.
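In code terms, something like this toy sketch (Python; the dict is a made-up stand-in for a mind's state):

    import copy

    me = {"memories": ["childhood", "scan day"]}
    fork = copy.deepcopy(me)   # the upload: same state, new instance

    # From this moment on, the instances diverge:
    me["memories"].append("kept living biologically")
    fork["memories"].append("woke up on a new substrate")

    print(me == fork)   # False: not an "original" and a "copy",
                        # but two diverging continuations of one person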


>I want you to let go of the notion of a "copy", this seems to be the major psychological hurdle here. Think of it as a fork, or rapidly diverging instances.

I think the main psychological hurdle here is your belief that the notion of self is a mere personal issue to be overcome by changing one's ideas about "copies".

In essence, you claim victory not by solving a very real issue (the fact that we identify with the notion of our self and its continuity) but by declaring it a "non-problem".

It's not about "letting go"; can you solve the ACTUAL problem of our identification with our continuity of self?

A gradual process of replacement of our body/brain ("ship of Theseus" style) could be a valid answer to this.

Your "just accept that copies of you are just as much you as you are" is not. For one, I could not give a flying fuck about my copies -- I'd only care for them as a means to save my memories externally, e.g. like a more evolved hard disk.


Maybe the problem here is consciousness?

If I get "duplicated" then there are 2 consciousnesses of me.

They may be an illusion, one that gets created at every moment in time, but it still feels like they exist.


I'm very sure you're right. But that's what I meant when I said that physical reality does not bend to our everyday intuitions. Physics doesn't care whether people think something should be prohibited from working. ;)

> If I get "duplicated" then there are 2 consciousnesses of me.

They're both based on you. There is nothing in principle preventing multiple instances of yourself from existing at the same time. Of course, as already said, those instances diverge over time, in the same way that the you who wakes up tomorrow isn't quite the same person as the you of today.

> They may be an illusion, one that gets created at every moment in time, but it still feels like they exist.

Consciousness doesn't have to be an illusion. It's just the sensation of existing, of being alive, of processing information. It is in a very real sense the feeling of being yourself. There is no contradiction here with the ability to fork it.


I have an inkling that, when we finally have the means to live forever, we will realize that we actually are just a program running on a brain, and it won't matter if we die.


Again, there is no "forever" in this universe. But this technology certainly holds the promise of a drastically prolonged existence, which may be unappealing to some.

> we will realize that we actually are just a program running on a brain

I think a lot of people realized this quite some time ago already, and they're fine with it :)

> it won't matter if we die.

That's a personal decision to make. It depends on your definition of importance, and whether you think that a mind can have value in and of itself.


The Tad Williams Otherland series deals with these ideas. In the book, the biological original is killed the instant the copy goes online. Thus there is no me and my twin; there is only me.

At the end of the day, it's my memories and knowledge that I treasure, not the meat it's running on. The real question is: can we make a synthetic analog of brain goo that can run the consciousness program?


I think you completely missed the point. What you are calling his argument, is not his argument.

The point is this:

> I don't see how transferring my brain to a computer does anything for me. I don't want a copy of me to live forever, I want to live forever myself.

Copying someone's brain is, unambiguously, a copy of that person. The original still dies. I'm the original. I don't want a copy of me to live forever. That doesn't do anything for me. I want to live forever.


I don't think I'm missing the point as such; it's rather that my position is sufficiently aggravating that, from your perspective, it looks as if I didn't see your core problem.

I'm painfully aware of the point you are trying to make, and no, I still don't agree with the premise behind it. Death is two things: your mind stops being executed, and the information making up your mind is destroyed. Neither of these happens to an uploaded person if the upload is sufficiently recent. If you make a copy of yourself, they're both you. If you make a working copy of a dead person, that means there is still a you after the whole thing.

People wonder about strange things like continuity of consciousness, which does not exist to begin with but is always invoked nevertheless. How will it feel to be duplicated? The hidden assumption is that describing the proposed weirdness of the feeling will somehow contradict and disprove the feasibility of the whole concept.

What it feels like to be duplicated will very much depend on the external circumstances at the time. If you get run over by a bus and wake up in a piece of silicon, that's how it will be. If you step into a matter replicator or a non-destructive scanner, you'll suddenly look at another instance of yourself. Twice. None of these things are spacetime-shattering paradoxes.

The notion of an "original" and a "copy" is not even applicable if the copy isn't degraded somehow.

-

In closing, I think I made a serious mistake during this whole discussion. Instead of challenging people to justify why they believe "a copy of you isn't you", I made the error of trying to explain why I think this is wrong - as opposed to you having to explain why you believe it is right. I just realized that nothing I wrote here today was accessible or useful to anyone, and I'm turning into losethos, producing content that makes no sense to anyone. ;)


No, your argument is not what is aggravating, and you shouldn't assume that someone only disagrees with you because they find your argument aggravating.

What is aggravating is that you claim to be getting the point, but you're not.

> Death is two things: your mind stops being executed, and the information making up your mind is being destroyed. Neither of these things happens to an uploaded person if the upload is sufficiently recent.

This is an improper reduction.

If you want to extend the computer analogy: If I die, a particular instance of my mind's execution goes away forever.

You may argue that that doesn't matter for some reason (perhaps by analogy to going to sleep and then waking up, as I do every day), but you first have to accept that it is a fact.

Your account pretends that it isn't a fact.

For example:

> The notion of an "original" and a "copy" is not even applicable if the copy isn't degraded somehow.

This ignores the fact that a copy is a different instance.


A copy of you IS you. Replace all the atoms in your brain and you would still be the same person. Likewise if you built another you out of different atoms.


This is obviously false. In certain respects it is true, but not in all respects, and it is the respects in which it is not true that concern me.


What is false about it? How is a copy of you not obviously you? The physical material your brain is made out of doesn't matter. I could swap it all with different material and you wouldn't even notice. It could be happening every second and you wouldn't have any idea. The universe would be exactly the same as far as you are concerned. Hell, we happen to live in a universe where it doesn't even make a difference: atoms don't have little tags saying "atom 1265", and your physical makeup does change all the time.

It's not the physical material that is important. It is the information in it, the pattern or algorithm that represents your consciousness.


My life and my consciousness are a continuous, ongoing, self-sustaining process. If you make a copy of me, that is a different instance of that process.

An analogy can be made between two instances of a program running on a computer, or two instances of a particular class in OO programming.
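A toy sketch of that analogy (Python; Mind is a made-up class):

    class Mind:
        def __init__(self, memories):
            self.memories = list(memories)

    a = Mind(["my whole life so far"])   # me
    b = Mind(a.memories)                 # a perfect copy at upload time

    print(a.memories == b.memories)   # True: identical state
    print(a is b)                     # False: two distinct instances

    del a   # destroying this instance is not undone by `b` still running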

Ultimately, all of my values reduce to physical pain and pleasure and the anticipation of my continued and increasing ability to experience more pleasure than pain. Sure, I have lots of higher-level, "spiritual" values, but the underlying biological process is what makes those possible. I have no biological or higher-level (rational) reason to care about a different instance of my body and consciousness. I only care about this one.

When I go to sleep, I am putting my conscious mental life on hold, but not my physical life. And I know that my consciousness will resume in a few hours when I wake up. So making a copy of yourself and killing the original is not obviously analogous to sleeping.


>I only care about this one.

That is the whole misconception I am trying to get rid of. There is no "this one". It would be as if you were a program running on a computer, and you moved the entire program from one spot in RAM to another. You could also save it to disk and put it back later. It makes no difference to the user, or the program, or anything else. The concept of "this" doesn't even make sense for information like a computer program. A copy of some information is the same information.

You could also move your head around in space, and yet you are still the same person, are you not? How would it make any difference if you deleted all the atoms in your head and recreated them a slight amount to the left? Or deleted all the bits in RAM and rewrote them in a different place?
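A toy sketch of that last analogy (Python, with id() standing in for a memory address; the string is obviously a made-up stand-in):

    data = bytes(b"the pattern that is 'you'")
    moved = bytes(bytearray(data))   # rewrite the same bits at a new location

    print(data == moved)          # True: the information is unchanged
    print(id(data) != id(moved))  # True: it now lives at a different "address"

    del data                      # the old location is gone; nothing was lost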



