Forty years is incredibly optimistic. But beyond that issue, I don't see how transferring my brain to a computer does anything for me. I don't want a copy of me to live forever, I want to live forever myself. Copying my brain won't copy my conscious self. (It can't; after all, if you can make one copy then you can make ten copies.)
The Ship of Theseus argument applies here - done gradually enough there's no perceptual difference between a brain running in a computer and on a biological substrate. You 'die' in that sense every time you lose consciousness - a different you wakes up.
I think we have a much better chance of extending human life biologically in the next 40 years than we do running the brain in software. 100TB for a brain map is extremely optimistic - that's assuming a byte per connection, and I bet it's a lot higher than that.
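To sanity-check that figure, here is a back-of-the-envelope sketch assuming the commonly cited estimate of roughly 10^14 synapses in an adult brain; the real per-connection state requirement is unknown, so the numbers are purely illustrative:

```python
# Rough arithmetic behind the "100 TB at one byte per connection" figure.
# Assumes ~1e14 synapses, a commonly cited (and uncertain) estimate.
synapses = 1e14

one_byte_per_synapse = synapses * 1      # ~1e14 bytes, i.e. ~100 TB
richer_per_synapse   = synapses * 64     # weight, type, timing, plasticity state, ...

print(f"{one_byte_per_synapse / 1e12:,.0f} TB at 1 byte per connection")
print(f"{richer_per_synapse / 1e12:,.0f} TB at 64 bytes per connection")
```

So 100TB is exactly the one-byte-per-synapse case; anything richer per connection pushes the total into the petabyte range.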
Like effective fusion power, it's always 50 years away, unfortunately.
And what about the following scenario: your brain and DNA get transferred to a computer, and after millennia you get a young body and your brain transferred back. In that case we only need to fund the brain-to-computer transfer technology.
Excellent point. But do we ever lose consciousness, though? Arguably, a continuum of consciousness is what defines us as individuals. If there was an absolute loss of consciousness, you would indeed die every time you fell asleep, but this is never the case. The impression of loss of consciousness is related to the fact that memories cannot (usually) be transferred between different states of consciousness.
Moreover, when you talk about the substrate of consciousness, that raises the question of what consciousness actually is, and how it can be transferred. Creating an exact copy of the substrate (or an exact copy of the data pattern, but on a different kind of substrate), and then destroying the original substrate and crossing your fingers - this is a product of magical thinking on the part of scientists.
Another instance of me is not me; everyone will think it's me (I will think so too, however I see a nice psychological Philip K Dickian mess coming up with you thinking the rest of your life that you are not you) but it's not. I wouldn't really care at all for that kind of 'immortality'.
I don't remember which sci-fi book (or story) it was, but I read it a long time ago: the brain itself was, in a flash (no loss of consciousness), turned into a kind of 'plastic' - literally every part, so you are still you - and after that nanomachines would create new cells, destroy others, and create connections from that plastic, mimicking the exact way the brain works. Give me one of those.
But as someone else already said, it doesn't solve the problem of dumping the brain and the feeling of self. Copy the brain onto a different substrate (I have no problem seeing that as a real possibility), but after you do that, the two 'you's (one human, one something else) look at each other and only one is you. There is no magic 'soul' or whatever flying over during the transfer, taking the self with it. The clone won't believe it's you, as he/she is aware of the procedure. And the original you will die, as every human always has before.
Maybe someone has some good articles about this, but I have a feeling they mean to slowly (for whatever definition of slow applies here) replace tissue in the brain, so you gradually turn 'silicon' (or whatever it may be), and 'hope' that the replacement mimics every cell closely enough, and slowly enough, that you remain you after the transition. That way there are never two 'you's in the process, though it would be possible to clone or back up your brain afterwards. I'm concerned about the 'hope' in all of this, and it seems very optimistic to think we can reach that kind of understanding of the brain within 40 years, imho.
>but after you did that the two 'you's (one human, one something else) look at each other and only 1 is you.
Why? A copy of me IS me.
>There is no magic 'soul' or whatever flying over during the transfer taking the self with it.
The information in your brain IS your soul, basically. What else defines "you" if not that?
>Another instance of me is not me; everyone will think it's me (I will think so too, however I see a nice psychological Philip K Dickian mess coming up with you thinking the rest of your life that you are not you) but it's not. I wouldn't really care at all for that kind of 'immortality'.
I cannot understand this preference; it just makes no sense to me. As long as one copy of me lives on, that is all that matters. It doesn't matter which copy. The specific atoms that make me up don't matter. Why would it matter whether I live on made up of one set of atoms or another?
I don't think I agree with you; you are performing a magical movement of 'self' here - even a sharing of 'self' in your first example. If there are two copies of you, then one of you is you and the other is not. The thought experiment is very simple: sit yourself in a chamber with a glass mirror; your side says 'original', the other side says 'copy'. You are aware you will be copied in one second. The other you appears on the 'copy' side of the mirror. The original (you) now knows the copy is the copy, as he saw the process happen. And the copy knows he's the copy, as he is aware of the process the same way you are.
So you are talking either in a way where the 'self' is shared (both think they are the original, even though that contradicts their own knowledge) or you don't care if you die as long as a clone of you lives on (but then you will die; for you as the original 'self' it's lights-out). The latter is not immortality to me, at least no more than having kids (which is actually a big reason many people have kids), and the former makes no logical sense.
Your thought experiment doesn't really prove anything except that you can have two identical beings in the same room and they can be aware they are made of different atoms. But they are still the same person.
Perhaps you are hung up on the fact that they diverge by a few seconds of experience? I would trade losing some memories for immortality. People take drugs or suffer from brain damage with similar effects, and I don't consider them "dead". In any case, making a copy of the person is likely to be a destructive process, so that isn't even an issue (there's no reason to keep your physical brain after you upload it to a computer).
Making a copy of the person isn't any different than if I just took the person and replaced all the atoms in their brain instantly with different ones. Hell this could be happening to you every second and you would have no idea. (In our universe, at least, it isn't even possible, since atoms are all identical and don't have little tags saying "atom 1250". I'm just trying to give an idea why basing your identity on a physical object makes no sense.)
If you don't consider your experiences and brain state to be you, and you don't consider the physical object of your brain to be you, then I have no idea how you define personal identity.
Not a biologist here. However, haven't all the cells I was born with died a long time ago? Am I not a whole new set of cells, just with some stored electrical impulses in my brain?
I'm skeptical that it will happen in the planned timescale - but the current estimates for "effective fusion power" are actually 20 years away. This is based on ITER (currently being built) going well and the DEMO plant - which will actually generate power - being built after that.
"there's no perceptual difference between a brain running in a computer and on a biological substrate"
Who "perceives" a brain running? What do you mean? Is there some higher order process that can do a "perceptual" comparison between a brain running in the world and an identical brain running in a computer? Is the brain doing this comparison?
It may seem like I'm "nitpicking", but I'm not sure if your statement is even well-defined.
> It can't; after all, if you can make one copy then you can make ten copies.
Just because you find a notion inconvenient to deal with doesn't mean it's not factually possible. It basically comes down to your view on the supernatural, and by extension the role of your brain. If you're a materialist, we can talk. Otherwise it's just about religion (which is fine if that works for you personally) and in that case a conversation isn't really possible.
Much like space flight and killer drones, this will one day mature from science fiction to reality and we'll have to deal with it in some form. At this point, you have a choice to make: either hang on to centuries worth of convoluted ideas about the soul made by people who didn't know anything about anything, or move on to a new substrate that is much better aligned with the way our minds actually work and better suited for moving us forward as a civilization.
And it's most likely not going to be a do-or-be-damned choice. Barring unforeseen social dynamics, nothing is going to prevent you from choosing to stick with a biological existence.
You didn't actually understand the parent post. If we make a copy of you and I kill you, am I extending your life? You're dead (you are not capable of having conscious experience). But there's a biological system with a nearly identical behavior that's living. That copy of you will have a similar conscious experience, but it won't be you. That's the point of the parent. The parent's not talking about a soul.
> If we make a copy of you and I kill you, am I extending your life?
If you make a copy of me, for a brief time there will be two of "me". After living for some time, those two will diverge increasingly. That's what living is, changing and developing.
So next after copying me, you want to kill me and you want to know if you're thereby extending my life? That seems like a non sequitur statement. What do you mean by killing "me", anyway? Killing both copies? Killing one copy? How does killing extend the life of anything?
> But there's a biological system with a nearly identical behavior that's living. That copy of you will have a similar conscious experience, but it won't be you.
Okay, now we're moving closer. The first mistake you make is assuming there's an "original" and a "copy". If they're functionally identical, they're both me in equal measures. Killing one is still murder, becoming more tragic the more time the two minds had to develop on their own.
You are basing this example on the hidden assumption that a mind is trapped inside a body and forever bound to it, and that a copy of my mind won't be "me". I recognize this sounds intuitively correct to a lot of people, but it's not actually reflecting physical reality. The atoms making up my mind are not specially tagged. The information they represent can be copied in principle, and the process that is "me" can be executed on other platforms. In this, the assumption of uniqueness is just not applicable.
> That's the point of the parent. The parent's not talking about a soul.
I realize that "soul" is a loaded term. But in the end, you're operating on those same loaded assumptions that seem obvious to yourself, but aren't really compatible with a purely physical model. It's an easy mistake to make without realizing it. Our everyday intuition is not always a good indicator of physical reality.
Everyday intuition suggests that there can only be one "me". It suggests that consciousness is one unbroken process, and that my mind is a fundamentally static kernel. It suggests that I'm a black box with unknowable inner workings. None of these things are actually true.
>If you make a copy of me, for a brief time there will be two of "me". After living for some time, those two will diverge increasingly. That's what living is, changing and developing.
ok.
>What do you mean by killing "me", anyway? Killing both copies? Killing one copy? How does killing extend the life of anything?
I label you with 1. I label your copy as 2. I kill #1.
>The first mistake you make is assuming there's an "original" and a "copy".
That's not a mistake at all. I'm talking about biological systems. There are two distinct biological systems here in the sense that #1 and #2 occupy different portions of space. If there are two objects in different places in memory on a computer that represent the same thing, they are conceptually equal, but they are not equal if you compare their addresses.
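To make that concrete with a minimal Python sketch (purely illustrative; the contents are made up): two objects can be equal by value while still being distinct instances at different addresses.

```python
# Two objects with identical contents: equal by value, distinct by identity.
original = {"memories": ["first day of school", "learned to ride a bike"]}
copy = dict(original)            # a separate object holding the same data

print(original == copy)          # True  - conceptually equal
print(original is copy)          # False - different instances
print(id(original), id(copy))    # different "addresses" in memory
```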
>You are basing this example on the hidden assumption that a mind is trapped inside a body and forever bound to it, and that a copy of my mind won't be "me". I recognize this sounds intuitively correct to a lot of people, but it's not actually reflecting physical reality. The atoms making up my mind are not specially tagged. The information they represent can be copied in principle, and the process that is "me" can be executed on other platforms. In this, the assumption of uniqueness is just not applicable.
This is entirely a misinterpretation of what I've said.
>I realize that "soul" is a loaded term. But in the end, you're operating on those same loaded assumptions that seem obvious to yourself, but aren't really compatible with a purely physical model. It's an easy mistake to make without realizing it. Our everyday intuition is not always a good indicator of physical reality.
>Everyday intuition suggests that there can only be one "me". It suggests that consciousness is one unbroken process, and that my mind is a fundamentally static kernel. It suggests that I'm a black box with unknowable inner workings. None of these things are actually true.
> I label you with 1. I label your copy as 2. I kill #1.
Which one is "me" again?
> If there are two objects in different places in memory on a computer that represent the same thing, they are conceptually equal, but they are not equal if you compare their addresses.
Exactly my point. And once both minds exist separately, they're both based on me, but they are rapidly diverging forks of me from a certain point in time. The same thing happens to me without the fork; it's just that there is one "copy" in existence, not two.
> This is entirely a misinterpretation of what I've said.
There is a machine with a chair. The machine is linked to a chamber with a door, where the "copy" exits. We put you in the chair. I hope you can identify yourself. We turn on the machine. A person comes out of the chamber. We write a #2 on the forehead of the person that comes out of the chamber. I call the person without the 2 "the original".
Sure, we could call either of those two bodies "the original", but what's the point? They're both based on a single me at a time X, and from that point on they're two separate beings. Both of them are going to have somewhat different lives and become different people. The same thing would have happened to a single me.
So do you believe there is something immaterial in my "original" body that could not be copied over? It sure sounds like you do, or am I missing something?
All I've been doing in this branch is trying to clarify the parent post. The parent's point was that 1!=2, therefore "you're not helping my situation". That's it.
>So do you believe there is something immaterial in my "original" body that could not be copied over?
> I don't want a copy of me to live forever, I want to live forever myself. Copying my brain won't copy my conscious self.
... is a notion that I think is based on faulty assumptions. I've been trying to address why I believe this is only intuitively right but physically wrong in like 10 posts all over here against pretty much anyone who cared to comment, without a single voice of support.
I recognize I'm alone in this. I get this means I'm likely wrong about this, and you're right in some way to assume that I just don't understand (or to quote an email: don't have the mental capacity nor the education required to understand).
I don't understand how, once you introduce the concept of making an adequate copy, there can still be an "original" and a "copy" where the copy somehow isn't just another instance of the original. I don't understand why the concept of making a copy is fine when we're talking about, say, a piece of text, but then immediately ceases to be valid when we're talking about minds.
Collectively, you guys tried to make me understand by invoking several things. For example, the argument was raised that because you obviously can't have several instances of your consciousness at the same time, this can never work. There was the argument that any would-be left-over physical body somehow remains the home of the mind, and there would obviously be an impostor around pretending to be me but ultimately not being me. At two points, it was suggested that I (specifically, I, Udo) be killed to prove that I care about my existence and the existence of forks based on me, which I do and which does not really have any meaningful connection to the question of the feasibility of copying. I also said that the "problem" of the left-over body most likely won't arise due to technical limitations of the process. Still, that didn't count and people still think that uploading minds is like building pyramids. Finally, the argument was made that both instances are by necessity different, if only because they occupy different positions in space, to which I tried to explain that, yes, they're different and they'll continue to become more different over time, and yet that doesn't have any real bearing on the proposed possibility or impossibility of the whole concept.
This is where I apologize and withdraw from the discussion. You certainly tried to explain it harder than most people here, but in the end that didn't work out. I'm old enough to have the strong feeling that at some point in the future, a populist Kurzweil 2.0 will come along and everybody will just come around to this conclusion as if it had just been invented.
And that's fine with me. I don't need to be right, but I do want to have the option of being uploaded. Repulsion to the very idea is deeply ingrained within our society. Attempts to prolong life are virtually unknown outside horror literature where it's always portrayed as monstrous. Even "progressive" sites such as io9 draw the line here. It goes against the foundation of most religions, and it contradicts the intuition of most people.
Of all things, a Buffy quote comes to mind here, allow me to paraphrase it: "You'll live again, but it won't be you. A demon sets up in your body, having your memories and thinking your thoughts, but you'll be gone." - I think this sums up the opinion of most people when it comes to transhumanism. And once again, you're right, I struggle to understand this at every level.
I didn't mean it as an accusation. The argument was literally that this can't be possible because that would allow multiple copies of a person!
> Even from a materialist perspective, I think the guy was raising one of the most important questions.
I fail to see how that's even a debate at this point. Either you see that the mind is a process being executed on a certain kind of substrate, or you believe that something from outside the physical universe is making up the mind and the brain is just an actuator. Pretty much everything else derives from one of these points of view.
If you think there is "special matter" and "ordinary matter", or that there is something special going on in the brain beyond its information theoretical content, that's not really a scientific/materialist position.
If there's any defensible position in between, I'm genuinely interested in hearing it. Again, I don't mean to attack or upset anyone, please forgive me if I inadvertently gave the wrong signals.
But would a duplicate of me feel like I live forever? Won't I just feel it's my duplicate that lives on when my biological self dies? Maybe a little more satisfying than having a pyramid "live on" but still.
As software people, we have a unique advantage over other humans when it comes to understanding the process of copying information. I invite you to look at it from that perspective.
> But would a duplicate of me feel like I live forever?
Your duplicate is you; it doesn't even matter how many of you there are at the end. Unless the initial brain upload is a destructive scan, this pertains to your biological "original" as well. How things feel in that situation would be pretty much down to your state of mind, so no difference there to how things are now.
Also, this doesn't mean you'll live forever. Nothing is forever. It's just another form of existence, albeit there is reason to be hopeful it will be longer and individually better than a biological existence.
> Won't I just feel it's my duplicate that lives on when my biological self dies?
Again, that depends on the tech level at the time. Most likely, the first forays into this territory will be made by dead people who have their brains scanned. There will be no biological self left in that case. Once technology moves further, it may become possible to do a non-destructive scan of the brain. In that case, people will most likely opt for this technology as a backup plan (though some - including myself - may not and choose to literally fork themselves).
> Maybe a little more satisfying than having a pyramid "live on" but still.
The difference, as I said, is that it is you who lives on.
Even if you could somehow confer upon a pyramid a portion of yourself, it's still just a static monument. Transferring your mind onto a virtual machine is not just a matter of preserving it for posterity, the point is to continue living and thereby changing. The concept of you is not a static thing.
>As software people, we have a unique advantage over other humans when it comes to understanding the process of copying information. I invite you to look at it from that perspective.
Still, small relief compared to the existential angst of dying while seeing your copy survive.
>The difference, as I said, is that it is you who lives on.
So, if I have managed to offload a complete and working copy of your brain and memories, you'd be OK with me killing you, right?
> So, if I have managed to offload a complete and working copy of your brain and memories, you'd be OK with me killing you, right?
I already talked about that. You're the second commenter who wants to kill me, by the way. It's not just a matter of "offloading", it's about continuing my existence on another substrate. Just sitting on a hard drive somewhere doesn't mean anything.
Anyway, actually killing me would entail erasing both instances of me, so I'm against that. And like I said, from the moment of creation every instance will continue to live (if you let them) and make their own way through life, becoming an entity in its own right. So, again, no killing please.
I want you to let go of the notion of a "copy", this seems to be the major psychological hurdle here. Think of it as a fork, or rapidly diverging instances.
>I want you to let go of the notion of a "copy", this seems to be the major psychological hurdle here. Think of it as a fork, or rapidly diverging instances.
I think the main psychological hurdle here is your belief that the notion of self is a mere personal issue to be overcome by changing one's ideas about "copies".
In essence, you claim victory not by solving a very real issue (the fact that we identify with the notion of our self and its continuity) but by declaring it a "non-problem".
It's not about "letting go"; can you solve the ACTUAL problem of our identification with our continuity of self?
A gradual process of replacement of our body/brain ("ship of Theseus" style) could be a valid answer to this.
Your "just accept that copies of you are just as you as you are" is not. For one, I could not give a flying fuck for my copies -- I'd only care for them as a means to save my memories externally, e.g like a more evolved hard disk.
I'm very sure you're right. But that's what I meant when I said that physical reality does not bend to our everyday intuitions. Physics doesn't care whether people think something should be prohibited from working. ;)
> If I get "duplicated" then there are 2 consciousnesses of me.
They're both based on you. There is nothing in principle preventing multiple instances of yourself from existing at the same time. Of course, as already said, those instances diverge over time in the same way that you're not the same person as today when you wake up tomorrow.
>They may be an illusion, which gets created every moment in time, but it still feels like they exist.
Consciousness doesn't have to be an illusion. It's just the sensation of existing, of being alive, of processing information. It is in a very real sense the feeling of being yourself. There is no contradiction here with the ability to fork it.
I have an inkling that, when we finally have the means to live forever, we will realize that we actually are just a program running on a brain, and it won't matter if we die.
Again, there is no "forever" in this universe. But this technology certainly holds the promise of a drastically prolonged existence, which may be unappealing to some.
> we will realize that we actually are just a program running on a brain
I think a lot of people realized this quite some time ago already, and they're fine with it :)
> it won't matter if we die.
That's a personal decision to make. It depends on your definition of importance, and whether you think that a mind can have value in and of itself.
The Tad Williams Otherland series deals with these ideas. In the book, the biological original is killed the instant the copy goes online. Thus there is no me and my twin; there is only me.
At the end of the day it's my memories and knowledge that I treasure, not the meat it's all running on. The real question is: can we make a synthetic analog of brain goo that can run the consciousness program?
I think you completely missed the point. What you are calling his argument, is not his argument.
The point is this:
> I don't see how transferring my brain to a computer does anything for me. I don't want a copy of me to live forever, I want to live forever myself.
Copying someone's brain is, unambiguously, a copy of that person. The original still dies. I'm the original. I don't want a copy of me to live forever. That doesn't do anything for me. I want to live forever.
I don't think I'm missing the point as such, it's rather that my position is just sufficiently aggravating it looks from your perspective as if I didn't see your core problem.
I'm painfully aware of the point you are trying to make, and no, I still don't agree with the premise behind it. Death is two things: your mind stops being executed, and the information making up your mind is being destroyed. Neither of these things happens to an uploaded person if the upload is sufficiently recent. If you make a copy of yourself, they're both you. If you make a working copy of a dead person, that means there is still a you after the whole thing.
People wonder about strange things like continuity of consciousness, which does not exist to begin with, but it's always invoked nevertheless. How will it feel to be duplicated? And the hidden assumption is that by describing the proposed weirdness of the feeling it will somehow contradict and disprove the feasibility of the whole concept.
What it feels like to be duplicated will very much depend on the external circumstances at that time. If you get run over by a bus and wake up in a piece of silicon, that's how it will be. If you step into a matter replicator or a non-destructive scanner, you'll suddenly look at another instance of yourself. Twice. None of these things are spacetime-shattering paradoxes.
The notion of an "original" and a "copy" is not even applicable if the copy isn't degraded somehow.
-
In closing I think I made a serious mistake during this whole discussion. Instead of challenging people to justify why they believe "a copy of you isn't you", I made the error of trying to explain why I think this is wrong - as opposed to you guys having to explain why you believe this is right. I just realized that nothing I wrote here today was actually accessible or useful to anyone and I'm actually turning into losethos as I realize that I'm producing content that makes no sense to anyone. ;)
No, your argument is not what is aggravating. Don't assume someone disagrees with you just because your argument is aggravating.
What is aggravating is that you claim to be getting the point, but you're not.
> Death is two things: your mind stops being executed, and the information making up your mind is being destroyed. Neither of these things happens to an uploaded person if the upload is sufficiently recent.
This is an improper reduction.
If you want to extend the computer analogy: If I die, a particular instance of my mind's execution goes away forever.
You may argue that that doesn't matter for some reason (perhaps by analogy to going to sleep and then waking up, as I do every day), but you have to first accept that it is a fact.
Your account pretends that it isn't a fact.
For example:
> The notion of an "original" and a "copy" is not even applicable if the copy isn't degraded somehow.
This ignores the fact that a copy is a different instance.
A copy of you IS you. Replacing all that atoms in your brain and you would still be the same person. Likewise if you built another out of different atoms.
What is false about it? How is a copy of you not obviously you? The physical material your brain is made out of doesn't matter. I could swap it all with different material and you wouldn't even notice. It could be happening every 1 second and you wouldn't have any idea. The universe would be exactly the same as far as you are concerned. Hell we happen to live in a universe where it doesn't even make a difference. Atoms don't have little tags saying "atom 1265". And your physical make up does change all the time.
It's not the physical material that is important. It is the information in it, the pattern or algorithm that represents your consciousness.
My life and my consciousness is a continuous, ongoing, self-sustaining process. If you make a copy of me, that is a different instance of that process.
An analogy can be made between two instances of a program running on a computer, or two instances of a particular class in OO programming.
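Here is a minimal Python sketch of that analogy (the class and events are made up for illustration): both instances start from identical state, and each one's subsequent experience updates only itself.

```python
import copy

class Mind:
    """Toy stand-in for a mind: nothing but accumulated state."""
    def __init__(self, memories):
        self.memories = list(memories)

    def experience(self, event):
        self.memories.append(event)

original = Mind(["childhood", "first job"])
fork = copy.deepcopy(original)        # identical state at the moment of copying

original.experience("watched the copy walk out of the scanner")
fork.experience("woke up inside the scanner")

print(original.memories == fork.memories)   # False - the instances have diverged
```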
Ultimately, all of my values reduce to physical pain and pleasure and the anticipation of my continued and increasing ability to experience more pleasure than pain. Sure, I have lots of higher-level, "spiritual" values, but the underlying biological process is what makes those possible. I have no biological or higher-level (rational) reason to care about a different instance of my body and consciousness. I only care about this one.
When I go to sleep, I am putting my conscious mental life on hold, but not my physical life. And I know that my consciousness will resume in a few hours when I wake up. So making a copy of yourself and killing the original is not obviously analogous to sleeping.
That is the whole misconception I am trying to get rid of. There is no "this one". It would be like if you are a program running on a computer, and you move the entire program from one spot of the memory in RAM, to another. You could also save it to disk and then put it back later. It makes no difference to the user or the program or anything. The concept of "this" doesn't even make sense in terms of information like a computer program. A copy of some information is the same information.
You could also move your head around in space. And yet you are still the same person, are you not? How would it make any difference if you deleted all the atoms in your head and moved them a slight amount to the left. Or deleted all the bits in RAM and moved them to a different place.
Forty years isn't optimistic at all if you allow for the possibility of smarter than human intelligences speeding up technological progress in that time.
A copy of you IS you. If you change the atoms that make up your brain you are still the same person. It's not like the atoms themselves are sacred. Likewise if you replace the computational substrate your brain runs on it's still you. Why do you care if it's neurons or transistors? Consciousness is a pattern, an algorithm maybe, not a physical thing.
Define "proven" as that discovery would contradict pretty much everything we know about the universe. You might as well imply that magic hasn't been proven to not exist.
I don't have to believe in magic or even in god to have my doubts. There is so much we still can't explain. Especially about the brain.
But let's suppose we are able to copy a human atom by atom. Even then, a copy of me is not me. Maybe it is identical to me on an atomic level for a tiny fraction of time. But then it would be a different person with different impressions and different thoughts, just because it looks in a different direction.
There is stuff that hasn't been explained, but that doesn't mean you can therefore assume your hypothesis (which is overly complicated and completely contradicts everything we know about physics, evolution, neuroscience, etc.) has a non-negligible probability of being true.
By your logic, the person who will call themselves "you" two seconds from now isn't "you" either. I'd still rather die with a copy of me with one second's worth of different experiences than with no copies at all.
Which hypothesis do you mean? The existence of something like a soul? I don't have a hypothesis on that. I'm just not totally convinced that a concept like a soul does not exist on some meta-level.
Yes, I think you can say that the person I was yesterday is a different person than the one I am today or ten seconds ago.
I don't want to die but a copy of myself does not help I think. When my existence ends I don't experience the world anymore. It would be nice to know that someone very similar to me will be there but it does not really prevent me from dying. Transferring me to another (younger) body would be better. But what do we need to transfer and how?
I don't know what you mean by "soul" then, but I can confidently say that the brain is responsible for all human behavior.
If I instantly replaced all the atoms in your brain with different atoms, would you die? Would it still be you? Likewise if I made an exact copy of your brain.
That made me think, what if technology was developed that allowed a portion of the brain's processing functions to be offloaded to an external system.
So, you would copy most motor functions to a "brain emulator" first.
And after you have acclimated, you would continue the copying process (I have very little doubt that, if properly done, the brain would be able to adjust to this, just as with a prosthetic limb or "seeing" through stimulation of nerves on the tongue).
And this process is gradually performed until the entire "consciousness" is running on the external artificial computer.
I wonder how the perception of self would be affected by such a method.
If the computer can actually model the brain itself, copying your brain may well copy your conscious self as well. If true, this would raise quite a few metaphysical questions, to say the least.
What's wrong with dying? I mean, obviously people want to live longer for some (mostly selfish) reasons, but isn't dying part of the 'living' deal?
Dying is similar to being born - your whole universe changed when you transitioned from womb to this world and the same happens when you die.
In other words, they have cookies and chocolate there :).
The whole deal about spirituality is that one discovers that we are in fact immortal and that this life is one step in an infinity of steps, before this life and after.
The other thing one realizes is that each individual is actually a facet of an infinite whole (some call this God, but it is actually You), or in other words - We are all One.
That's what all the religions and mystics have been telling us for millennia.
This is somewhat equivalent to saying "isn't dying from infectious diseases part of the 'living' deal?" It is, but antibiotics are really nice, and now that we have them, it's hard to imagine a world where people would happily give them up because getting infected is just part of living.
Anti-death technology sounds really similar. If we were to have this technology now, it would be a pretty big stretch of imagination to picture a world where people would happily give it away because death is part of living.
> The whole deal about spirituality is that one discovers that we are in fact immortal
We're immortal once you break down duality, but that's not what we're talking about here. People want to have the option to live forever in the conventional reality, not the ultimate reality.
It's such a terrible waste. All that knowledge accumulated over years of life... gone. We don't know what the species is capable of if we could double or triple human life span. What if we could indefinitely extend the most productive years? Where might we be if we could have given Einstein the chance to live hundreds of years with his mind and body in prime condition?
> "same happens when you die. In other words, they have cookies and chocolate there"
wouldn't it be great if that were the case? Unfortunately there is exactly zero proof of any kind that this is so.
> "each individual is actually a facet of an infinite whole, in other words - We are all One"
This sentence is completely free of information.
> "That's what all the religions and mystics have been telling us for millenia."
Religions and mystics have been exploiting the normal human fear of death for millennia. Science might one day give us the keys to finally dodge that inevitable final bullet.
>Unfortunately there is exactly zero proof of any kind that this is so
Oh but there is for those who seek it!
One kind of proof is called dimethyltryptamine. It lasts for 10 minutes, but when you come back, you might have a radically different view on these matters.
Other psychedelics, like LSD, also offer a different perspective on life and death and there are also different kinds of spiritual work one can do - meditation, holotropic breathing or living life in nature. They all help you realise just how fragile and transitory life is in nature and that that is a good thing.
>Religions and mystics have been exploiting the normal human fear of death
Isn't science (or rather scientists) driven by the same fear? Isn't the knowledge that we're going to die a factor in making us go out and do stuff to survive?
Wouldn't immortality lead to a state where we'd have to invent a simulation of mortal life in order to actually feel alive?
How can one tell that this life isn't such a simulation?
What's wrong with dying? I don't want to die. I don't want to see the people I love die. It's not "part of life" or some other crazy rationalization to make us feel better about it. It's just ceasing to exist. There is no other life.
We used to think there was some magical soul thing, but now we can look and see how the very neurons in your brain work. And when they stop working, that's it.
I don't accept death. It isn't part of any greater purpose. The universe isn't optimized for human values. But we can change that.
One day future humans will look back on ancient Earth and wonder how terrible it must have been to live for mere decades. If you were to try to convince them that dying was a good thing and they should give up their immortality, they would laugh at you.
> That's what all the religions and mystics have been telling us for millenia.
If they have been telling it for millennia, they are probably right.
>What's wrong with dying ?
This is actually a subject I speak about with my parents once in a while. They never expected to live longer than a century, so they accept death as a 'natural' part of life, whereas I see death (even of old age) as a tragedy, because to me it is supposed to be (or at least to become) optional.
I became convinced, before I even came of age, that death will be optional for me. It is a belief, since I have no proof of what will be: a belief in mankind's capacities.
Even as a baseless belief, it has literally changed the way I live and the way I plan my life. I have literally all the time in the world (and hopefully quite a bit more), so no rushed decisions.
It has also changed the way I see knowledge acquisition, so even though I am currently employed as a software developer, if someone has an interesting job in genetics, biology or unconventional computing for a software developer, I'm ready to jump ship.
I also came here to say what greeneggs already stated about the digital brain - I want to live forever, not have a copy of my brilliance survive for future generations (though I'm sure they'd be grateful).
The human brain is designed to live inside an innervated body. It receives constant updates on how the body is doing, and unconscious processes, some of them chemical, control every aspect of our internal life, from homeostasis, to the perception of time, to breathing, mood and muscle tone. Upset one part of that balance and the brain goes into blind panic. Although the brain also has electrical activity, it is not just a sophisticated computer program running on a general purpose computer. For one thing, the hardware is unique in every individual! Secondly, what makes you you? And what is consciousness? We have no idea. Perhaps it is not possible to reconstruct the real you, rather than an emulation of you, without copying the exact quantum states of the atoms in your body. But this may require complete destruction of the old you and construction of an identical quantum copy elsewhere (teleportation), which is hardly an electronic download of "data" into a computer. You might be able to extract your memories and store them in a Von Neumann architecture, but the rest is a pipe dream.
A computer with input and output, yes. Some representation of relevant neurotransmitters and sensory inputs and such seems necessary. Think whole brain emulation, rather than extracting the "mind information" while ignoring all the physical information.
And if I make 100 copies of this computer program and run them on 100 different computers, which one will I be aware is me? Or will I just be aware of being 100 different beings at once? I think it should be obvious that a simulation of reality is not the same thing as reality itself. For one thing, perhaps we need to know a lot more about how the universe works at the quantum level to even emulate reality faithfully.
They are all you, until they start reacting differently to their different inputs and anything stochastic within their simulation; at which point they stop being "the same" you is basically up to you. The difference between them after a minute or two would probably be nothing compared to the difference between you and you a year ago; if you call the latter "you", the former should probably count too. You wouldn't be aware of being 100 beings, but 100 yous would be aware of existing.
Calculating down to the quantum level is hopefully not necessary (that seems impossible to get efficiently out of any computing substrate physics will let us have). The few people suggesting that physics at that scale impacts consciousness aren't taken very seriously by neuroscientists in general.
Might not do. There have been experiments with people with mental issues, or just general experiments abusing or cutting nerve endings in different ways, and a lot of the more radical ones (like this would be, at least for the first thousands of subjects) end in insanity. And you would rather be dead, I think.
"Once the brain is understood well enough to be modeled digitally, it stands to reason that the complete set of data representing an individual person (at a particular state of time) could be copied or transferred just like saving a file."
What exactly are you going to save to that file? Positions of molecules? States of atoms?
Oh, so we're assuming we can abstract away from the physical system? Either we abstract or we save everything. Well, what's everything? Or can the abstraction be "good enough"?
It's scientifically certain that we can abstract away from the physical system. In fact, we're already doing that today. We run simulations of molecular interactions all the time. We can even "do" a cortical column in incredible detail.
The problem is not that we don't know whether it's possible. There are two issues concerning this development: we need to know what the right detail/abstraction level is for simulating a mind, and once we do we need to actually get that information. Doing the simulation itself is comparatively easy, though certainly not trivial.
> What exactly are you going to save to that file? Positions of molecules? States of atoms?
To be useful, it can't be just a file, it needs to be a database that can be executed. Positions of molecules seems to be the right detail level according to our current understanding. That includes the spatial configuration of proteins and their exact position in relation to another. But there might not be that many unique molecules to store, just a few thousand maybe.
An interesting question then becomes how can we abstract these scans even further and still be able to have a high fidelity mind simulation?
We can abstract however we like whatever we like. The point is: will the abstraction accurately represent the original physical system? And how do we know? We run simulations of molecular interactions. So what?
I'm not sure where this hostility is coming from, this is the second time you implicitly accuse me of stupidity in this thread.
Edit: actually you're in good company if you think that. I have two emails from HNers (pertaining to different threads on this subject) attesting that I lack the education and probably the intellect to carry that kind of conversation forward. :)
> The point is: will the abstraction accurately represent the original physical system?
I already talked about that in the very comment you're replying to.
Why wouldn't it? It wouldn't be exactly the same, but the important information is there. Does it matter if it's slightly off?
Consider that your real brain is influenced by tons of random factors and meaningless information. If a protein drifts to a slightly different location, maybe it could slightly affect the output of that neuron. But if we just assume that it is where it is supposed to be for the purposes of the simulation, it should be fine.
It's not like something like that encodes any important information or has anything to do with the algorithm that your brain is running.
1. We're limited by our own models of physics. We can only record what we know about. That's a pretty significant issue when you're trying to simulate physics.
2. In physics, we can run an experiment over and over again so as to refine our model. We isolate systems and try to study simple interactions. When you study gases, for example, I think you will find that things are modeled statistically. We record details to the extent that they help with our model.
What I'm wondering is how we would do this with biological systems. Can you run the "experiment" over and over to try and understand the "right way" this biological system was supposed to behave? Is there a "right" biological system? How do you know what the relevant biological (sub)systems are? How do you identify what's important? How do you abstract? What's a "good enough" replica?
What I'm trying to get at is that I don't think there is a good layer of abstraction for biological systems. I think the physics is the only good layer of abstractions. But...
3. If we had to record states of molecules, for example, then making a digital copy of everyone seems really not feasible. It doesn't seem feasible because of memory or computation.
I think you could abstract to the individual neurons or maybe the different parts of the neuron. It doesn't matter that you don't have an exact physical simulation of every atom.
Some abstraction would probably be involved; the idea is to recreate all the relevant activity that impacts consciousness while avoiding going any deeper, and most neuroscientists would (I think) speculate that individual molecules aren't necessarily the finest level of detail needed. Neurotransmitters, maybe, but I imagine a neuron doesn't have to be simulated as its constituent atoms to exhibit identical enough behavior.
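As a concrete (and purely illustrative) example of what neuron-level abstraction looks like, here is a toy leaky integrate-and-fire neuron in Python; the parameters are made up, and nobody knows yet whether this level of detail would be sufficient for a faithful emulation.

```python
# Toy leaky integrate-and-fire neuron: models whole-neuron behavior directly,
# with no reference to the molecules or atoms underneath.
# Parameter values are illustrative, not biologically calibrated.
def simulate_lif(input_current, dt=1e-3, tau=0.02, v_rest=-0.070,
                 v_thresh=-0.054, v_reset=-0.080, resistance=1e7):
    v = v_rest
    spike_times = []
    for step, i_in in enumerate(input_current):
        dv = (-(v - v_rest) + resistance * i_in) / tau   # leak toward rest + input drive
        v += dv * dt
        if v >= v_thresh:                                # threshold crossed: emit a spike
            spike_times.append(step * dt)
            v = v_reset
    return spike_times

# One second of constant 2 nA input produces a regular spike train.
print(simulate_lif([2e-9] * 1000))
```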
I find the simple-minded dogmatism that the mind = brain hugely amusing. The fact is that right now science has absolutely no idea how consciousness works. There is no proof that reductionist materialism can explain how a mind arises from bunches of neurons. And yet some believe without proof.
Yes, seriously! Look into the work of the philosopher David Chalmers. His position is called non-reductive functionalism. Instead of saying "the brain causes consciousness", he would say "it is not ONLY the brain that causes consciousness".[1]
In other words "consciousness is a fundamental property ontologically autonomous of any known (or even possible) physical properties" of the physical universe. [2]
I am amused by discussions like these where simplistic functionalist arguments [3] are taken as proven. The whole area of consciousness is still wide open, none of our current theories are adequate, and it is simply wrong to assume that consciousness can be explained away.
I am very confident that the human brain is responsible for all human behavior. That's the simplest explanation by far.
You'd have to explain a completely new and different set of physics that no one has observed before, and explain how and why evolution found it and took advantage of it (and yet no one has ever observed it before), and how it interacts with the body, etc.
It would be extremely boring to live endlessly.
Farewell services would have to be introduced, exciting 'way to go' events. :)
Then you wake up as yourself, from a copy.
Would you be more afraid to lose it?
Right now you will die anyway; we accept that (some more, some less). Some build fear-based communities around death.
Wouldn't everyone regress into preserving 'life' instead of being bold and exploring where no man would go? So what if I get deadly radiation poisoning, I might discover something. So what if I drown in my new innovative diving gear, this shit is revolutionary.
I'd rather do nothing of that sort if I can live forever; I'd be as safe as possible, taking no risks.
But there is a saying about crossing that bridge when one gets there.
This reminds me of the 1900-1950 prognoses that by 2000 we should all have flying cars and apples growing on Mars. Those prognoses were inspired by the rapid advancement of the technologies of that time. Instead, electronics happened, which was not imagined, or at least was underestimated.
Nowadays prognoses are inspired by current advancements but let's just spell out the one thing we can be pretty sure about - we have no idea what the next big thing is.
Kurzweil is 65, so he probably has 20-25 years left to live in the current system; figures he would say 20-25 years. The blog owner probably thinks he has 40-50 more years to live, so he is gunning for 40 years. I remember Larry Ellison mentioning 20 years for such a revolution in a speech some years ago. People are less likely to say 'in 100 years' if they won't make it that far themselves. 1000 years is easier to talk about than 'just' after you're dead yourself, but what would be the fun in saying 'in 1000 years'; no one would listen.
Kurzweil is just a popularizer of this idea, which is an important part of the road ahead. If nothing else, this article illustrates that more and more people are beginning to see the possibilities on the road ahead.
However, neither Kurzweil nor anybody else has actually enough information to make a solid prediction as to the time frame.
I'm a little skeptical of this whole mind uploading business. How do we really know consciousness isn't in the brain, anyway? I prefer to achieve immortality by not dying, like Woody Allen.