Hacker News
A Son’s Race to Give His Dying Father Artificial Immortality (2017) (wired.com)
175 points by EndXA 31 days ago | 115 comments



This topic reminds me of an Ebert review of Solaris (2002):

The genius of Lem's underlying idea is that the duplicates, or replicants, or whatever we choose to call them, are self-conscious and seem to carry on with free will from the moment they are evoked by the planet. Rheya, for example, says, "I'm not the person I remember. I don't remember experiencing these things." And later, "I'm suicidal because that's how you remember me." In other words, Kelvin gets back not his dead wife, but a being who incorporates all he knows about his dead wife, and nothing else, and starts over from there. She has no secrets because he did not know her secrets. If she is suicidal, it is because he thought she was.

The deep irony here is that all of our relationships in the real world are exactly like that, even without the benefit of Solaris. We do not know the actual other person. What we know is the sum of everything we think we know about them. Even empathy is perhaps of no use; we think it helps us understand how other people feel, but maybe it only tells us how we would feel, if we were them.

https://www.rogerebert.com/reviews/solaris-2002

I think it is a deep mistake to equate one's exterior appearance and behavior with selfhood. At best, such a recreation is a shallow surface imitation of the actual person. At worst, it creates a nightmarish, Uncanny Valley-esque simulacrum, not unlike Harlow's wire monkey mother surrogates. It would be better to learn how to accept and deal with loss - a reality that everyone inevitably faces.


There's a similar plot in the Black Mirror episode "Be Right Back" (from an earlier season when it was still UK only).


    "And yet, if only from my dreams when I was asleep, I might have
    learned that my grief for my grandmother's death was diminishing, for
    she appeared in them less crushed by the idea that I had formed of her
    non-existence. I saw her an invalid still, but on the road to
    recovery, I found her in better health. And if she made any allusion
    to what she had suffered, I stopped her mouth with my kisses and
    assured her that she was now permanently cured. I should have liked to
    call the sceptics to witness that death is indeed a malady from which
    one recovers. Only, I no longer found in my grandmother the rich
    spontaneity of old times. Her words were no more than a feeble, docile
    response, almost a mere echo of mine; she was nothing more than the
    reflexion of my own thoughts."
    —Marcel Proust, Cities of the Plain


"And yet, if only from my dreams when I was asleep, I might have learned that my grief for my grandmother's death was diminishing, for she appeared in them less crushed by the idea that I had formed of her non-existence. I saw her an invalid still, but on the road to recovery, I found her in better health. And if she made any allusion to what she had suffered, I stopped her mouth with my kisses and assured her that she was now permanently cured. I should have liked to call the sceptics to witness that death is indeed a malady from which one recovers. Only, I no longer found in my grandmother the rich spontaneity of old times. Her words were no more than a feeble, docile response, almost a mere echo of mine; she was nothing more than the reflexion of my own thoughts." —Marcel Proust, Cities of the Plain


At the end of Solaris (at least the American version), it seems the protagonist chooses the illusion.

Right or wrong, perhaps the illusion is enough, or all we really know.


The movie's finale (both versions) is not what happens in the book.

In the book, Kelvin accepts that he is standing before a godlike creature that the human mind will never understand; he surrenders and lands on the planet's surface, hoping that the "time of cruel miracles" is not past.

The acceptance is not of the illusion itself, but of the fact that we will never understand the whys. Rational thinking and scientific positivism have become the real illusion; all is lost, Solaris has won, and there is only the hope that Solaris will keep making "cruel miracles".


That sure sounds like he is OK with such miracles as far as living his life from there on out.

He doesn't understand them any more than he understood the people who were created, or the real people.

I always thought that was the point: to ask, if his mind could be used to create the other person, then did he really know them? Can anyone?


I'm really, really torn on this. On one hand, this is an amazing way to immortalize memories, experience, whole lifetimes of accumulated knowledge and wisdom. On the other, it feels like a severe form of escapism and refusal to move on.


Try watching the movie About Time, from 2013. In short, the main character can travel through time to visit his dying father... up to a point. Sounds creepy, but actually isn't. A very wholesome movie.


Interesting movie but they couldn't make up their minds about what the rules were. Every stated rule was broken when convenient.


I've seen the movie and enjoyed it immensely, but I don't think the ideas from it wholly apply here.


> On the other, it feels like a severe form of escapism and refusal to move on.

There is nothing wrong with it. People deal with it differently, I guess. I, for one, would have loved to have something similar from my parents. Not a whole chatbot, but... written-down and audio memories. The tech is there; just the opportunity wasn't.


Yeah... this doesn't appeal to me at all (speaking as someone who lost one parent a few years ago, and sees the other growing frailer), but if he wants to use this form to capture and share memories, I guess that's fine.

My main gripe is with the headline phrase "Artificial Immortality". It's so far removed from that as to be laughable.

(The subtitle about "re-create his dad as an AI" seems severely over-hyped, too, although I suppose that's true of most uses of the term "AI".)


> My main gripe is with the headline phrase "Artificial Immortality". It's so far removed from that as to be laughable.

Yes, it reminds me of Woody Allen's remark: "I don't want to achieve immortality through my work, I want to achieve it through not dying."


> There is nothing wrong with it

Actually there is.

People deal differently with the loss, but eventually they all get over it, because it is a natural process.

A severe refusal to move on is not what one can call a good thing.

Think about the people around you and the consequences it could have.


>Actually there is.

Statements of fact like that, with regard to fairly unexplored territory, are worse than useless; they're damaging.

By what criteria do we know that this is a 'severe refusal to move on'?

Should we accuse people who peruse old photographs from their history of the same 'severe refusal'?

Why can't this AI idea be just yet another tool at our disposal for memory recall and recollection?

Why paint this as some mental ineptitude, rather than just an archival opportunity?

Most stores of data have possible consequences by their mere existence. What's different here, exactly?

Would the idea be more palatable if, rather than the clickbait ideas of immortality, it was framed as a trained chatbot?


"At first, many Native Americans were wary of having their photographs taken and often refused. They believed that the process could steal a person's soul and disrespected the spiritual world.

Over time, however, some Native Americans came to cherish photographs as links to ancestors and even integrated them into important ceremonies."


I zinged straight to (what is for me) the ultimate question in these matters: When will I be able to upload my mind and have it function fully in machina?

Alas, still (for practical purposes) never.

Why you’ll never be able to upload your brain to the cloud - https://theconversation.com/why-youll-never-be-able-to-uploa...


What will it matter? You'll still be dead. Will your machine likeness be anything like you, except in simulation? Or will it simply be an empty Chinese room, giving your loved ones a doll-shell to hold onto, like in a cruel Harry Harlow experiment?


It would matter if true consciousness transfer turns out to actually be theoretically possible. First we have to figure out how to create a near-perfect replica (give that a few centuries, at least, probably) and what substrates it could be implanted into, and then whether it's possible to really retain subjective self-continuity. Non-transferred replicas will also each feel like they have continuity, so this is a difficult problem, but there's still a lot that needs to be understood before we can definitively rule the possibility in or out.

If the former is possible but not the latter, then I agree it isn't really very interesting. It's just a new way to spawn intelligent lifeforms, but we'll surely have many ways to do that by then. But I feel like if we can actually crack that, then we can eventually crack the second problem, too. The most popular hypothesis seems to be a Ship of Theseus-style gradual replacement / transition for some level of granularity (perhaps individual atoms, molecules, or neurons). We're fundamentally in the dark in so many ways at the moment that it seems unclear if these are plausible, obvious, or absurd.

Possible in the lifetimes of anyone reading this today? No (even if longevity breakthroughs accelerate like crazy). But if human or human-derived intelligence is still around in 1000, 100,000, 100,000,000 years from now, it's impossible to speculate what we may be capable of. If colonizing millions of galaxies becomes a boring walk in the park (via self-propagating autonomous constructs, for instance), then moving your consciousness to different tenancies while fully preserving your original self doesn't seem that insane. Maybe future "humans" will even skip the "middleman" (literally) and just be "born" directly into a non-biological housing from the start so that the transfer step won't be necessary, and maybe the "A" in AGI will become a non-sequitur.


Kaibeezy linked to Greg Egan, who's one of my favorites in this area.

To my mind the latter is logically impossible. Subjective self-continuity is not intrinsic to the experience of consciousness; it is a conclusion we draw upon introspection, based on circumstantial evidence. Subjectively, all we ever have is a sequence of fleeting moments. We have only the "word" of our memory to trust that the Now Me is a descendant of Past Me. As soon as perfect replication is possible, the replicant and the original have exactly equivalent claims to identity in every sense that matters; one will just have a better claim to physical continuity. But physical continuity is not the most important aspect of identity; it's just that our intuitions favor it so heavily because it has historically coincided perfectly with conscious identity.


Right, and we don't even have subjective continuity ourselves. Every night we fall asleep, and our consciousness dies. Every morning it is reborn. We don't worry about it.


Maybe you are different than me, but I really don't feel as "reborn" when I wake up - I mean I immediately feel all the time that has passed (and how much, usually to 1 hour precision), and even during the night, when I'm sleeping, I'm still there, just in a different state and not making many memories, but I am definitely not gone.


A better example is going under anesthesia. I had a procedure done, and after I drifted to sleep, 30 minutes passed instantly and they were rolling me to the other room.

The point is that we can lose our continuity of consciousness, yet we don't consider this "dying" in normal life. If we duplicate our exact physical brain and processes and immediately destroy the original while this happens, there is no loss of continuity, but this feels like dying still.


Luckily I never had that done to me.

To me, there is clear loss of continuity when you duplicate a brain. The process inside has been broken/does not continue. The structure is not the key, the key is the running state of the brain. Under anesthesia, the brain still functions.


I think you have to disambiguate two types of "breakage". One is suspending the thread, so to speak, and the other is functional destruction. I think perfectly non-destructive suspension of the process should not influence identity at all. Does an algorithm change its semantics if we pause the execution for a millisecond?

This also assumes that duplication requires either type of breakage--for thought experiment purposes you may assume that cloning is possible with 0 delay and with 0 destructive change.

If you care about which physical substrate the process is running on specifically, this is the bias towards physical continuity I noted.
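To make that concrete, here's a toy sketch (hypothetical C++, assuming for the sake of argument that a "mind" reduces entirely to serializable state): a computation that is suspended, snapshotted, wiped, and resumed on a fresh object ends in exactly the same state as one that ran uninterrupted.

    #include <cassert>

    // Toy "mind": a computation reduced entirely to its state.
    struct State { long step = 0; long acc = 0; };

    // Advance the computation by one step.
    State tick(State s) { return {s.step + 1, s.acc + s.step}; }

    int main() {
        State original;
        for (int i = 0; i < 500; ++i) original = tick(original);

        State snapshot = original;  // "suspend": capture the full state
        original = State{};         // the old substrate is wiped

        State resumed = snapshot;   // "resume" on a fresh object
        for (int i = 500; i < 1000; ++i) resumed = tick(resumed);

        State uninterrupted;        // the same run, never paused
        for (int i = 0; i < 1000; ++i) uninterrupted = tick(uninterrupted);

        // The suspension changed nothing observable about the result.
        assert(resumed.step == uninterrupted.step);
        assert(resumed.acc == uninterrupted.acc);
    }

Whether a brain admits a "full state snapshot" at all is, of course, exactly the open question.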


If I duplicate the state of a paused algorithm and then immediately unpause both of them, even if I did it with no time in between, I can't claim that the second one is the first one. If the algorithm makes up some "mind", a new one is created - and there is a new person.

My point is basically that I will not magically look through someone else's eyes, even if the someone is my copy.


Each day the you that gets up is slightly different to the one that went to sleep. You could make the same argument - what does it matter - the yesterday you is gone, why bother with the new one. I think the reason to bother is that life is kind of cool whether traditional or perhaps AI enhanced.


Your argument is a metaphorical one; yes, we do change and evolve as people. Whereas the GP's point was a literal one: if you upload your consciousness, the "physical you" will still quite literally die.

The uploaded version is really more of an interactive Egyptian statue designed to immortalise yourself from the perspective of others rather than you literally living past death.

This is why I don't see the point of arguments such as that. I would still die. So I might as well preserve my memory in more meaningful ways, like building a family. If I'm forgotten about 100 years after my death, then that's just the natural order of things, and it's not like I'll be around to care anyway.


The "physical you" changes too. Your cells are constantly being replaced resulting in a "whole new physical you" roughly every year. [0]

"You" must be a state. If carried to a complex enough substrate my state may continue while the old substrate dies. What concern to me is the nature of this substrate so long as it adequately exists to run my state machine?

It is true that the substrate holding my state may have its operations sabotaged to update my state in strange, non-standard ways, but this appears also to be the situation now, in the physical realm. There are untold conditions I can apply to my body and brain that would have a marked effect on the unfolding state of my mind.

The concept of multiple mes is confusing, as would be the potential to merge mes and access memories of my own "death(s)". Techniques for confirming consciousness in others are largely non-existent. These are deep and still open questions that cannot be so easily dismissed with a claim that all that's artificial must be artifice.

[0] https://www.quora.com/How-long-does-it-take-for-most-of-the-...


That's pretty much what the GP said as well. Platitudes about how we are constantly changing are fine, but change isn't the same as copy-and-destroy, which is what the upload process would be.

That distinction is critically important. You can change an original and it will still be the original. But taking a copy and making that your primary copy means destroying the original. Which is fine if you’re talking purely about dumb digital bits like we normally do. However when those bits become self-aware programs executing, destroying the original is a literal death sentence.

I guess there could be a prolonged "merger" with the internet, where the artificial medium is used, at first, as an external processing unit, and then you gradually switch over. That might solve the problem of "switching off" the original. However, that's not how the process is typically described.


So you agree with Ellen May Ngwethu (Cassini Division) that uploads are flatlines even if they exhibit free will and have all the memories of their original?

I recommend that anyone interested in this subject read The Cassini Division by Ken MacLeod and Accelerando by Charles Stross.


>The uploaded version is really more of an interactive Egyptian statue

Well, that remains to be seen. It could be more than that. If you consider how Go-playing programs have gone from rubbish to beating all humans, it seems likely a future upload of a human could have abilities beyond us lot.


I was talking more in terms of it being a memorial to your living self rather than talking about its capabilities. I fully expect we'll eventually hit the technological singularity.


That's why I'll never get in the Star Trek transporter. It takes you apart into information, then reconstructs you somewhere else. But it reconstructs a clone of you - you are dead. The clone believes the transporter works perfectly, and the clone was transported.

But the original you is no more.


For once, rather than take apart the naive philosophy of the teleporter clone nonsense, I'm going to get pedantic about the rules of Star Trek lore.

The transporter does not create a clone. This is explicitly stated to not be the way transporters work in Second Chances (TNG). In that episode, it is considered a very unusual phenomenon that there is a second Riker, specifically because transporters don't work that way. All LaForge can offer is supposition.

Of course, Star Trek has always played fast and loose with its own technology's mechanics, so this may be contradicted by episodes I'm not currently recalling.


From my memories of reading the Trek Technical Manual:

1. The transporter blows you apart at the atomic level.

2. It then encodes all your atomic information into the transporter buffer. The buffer is a massive storage array (even for trek tech) but only has a limited lifespan (see: Relics, TNG).

3. The data in the buffer is then streamed over an 'energy beam' carrier to the destination location.

4. At the destination the atomic information is rebuilt into a new 'you'. Right down to what your synapses were doing at the time of step #1.

...

In the way that a jigsaw is still the same jigsaw no matter how many times you take it apart and rebuild it, the theory is that it's not a clone of you, as the original never exists at the same time the 'clone' does.

The above steps totally ignore the implausibility of being able to rebuild the subject at the other end, without any infrastructure. Transporter-pad to transporter-pad transfers make logical sense, but transporter-pad to anywhere-on-a-planet... not so much.


Derek Parfit argued that there is no defensible criterion to differentiate between the duplicate and the original.

That led him to the conclusion that the feeling of being a particular person is itself an illusion, just a side-effect of sequential brain events, so to speak. This is obviously quite a counter-intuitive proposition, but also quite hard to argue against without resorting to mere beliefs.

I guess Parfit would have considered this AI project an (albeit minuscule) achievement of immortality.


> But it reconstructs a clone of you

This is definitely wrong; there's a no-cloning theorem in quantum mechanics [0].

But getting into possible methods of transportation is complicated. QM does allow for teleportation. So you can teleport the exact structure, but you can't end up with a copy at each end.

The teleporter could also reassemble you in an inexact way and you then would be a clone (at least conventionally speaking).

But the big question is: If you reconstruct everything in the exact same way is the being that comes out on the other end a different being that went in? This may be completely unverifiable because the being coming out the other end would have all the memories of the being sent in. But because of no cloning, you might be the same being.

Who knows?

[0] https://en.wikipedia.org/wiki/No-cloning_theorem
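For the curious, the standard argument is only a few lines. A sketch, assuming an idealized unitary cloner U (this is the textbook proof, nothing specific to transporters):

    U(\lvert\psi\rangle \otimes \lvert 0\rangle) = \lvert\psi\rangle \otimes \lvert\psi\rangle
        \quad \text{for all } \lvert\psi\rangle

    \langle\psi\vert\phi\rangle
        = (\langle\psi\rvert \otimes \langle 0\rvert)\, U^{\dagger}U \,(\lvert\phi\rangle \otimes \lvert 0\rangle)
        = \langle\psi\vert\phi\rangle^{2}

The first equality holds because U is unitary (U†U = I); the second applies U to each half, using the cloning property. So the overlap of any two states equals its own square, forcing it to be 0 or 1: a machine can copy a fixed set of mutually orthogonal states, but never an arbitrary unknown one.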


As far as I am concerned, the most crucial aspect is my mind. I think, but wouldn't know how to prove, that the information needed to satisfactorily copy a human mind is finite. You may not be able to replicate quantum states, but you can definitely replicate bits.


Meh. That's what we do every day as our body sheds old cells and grows new ones. It's the Ship of Theseus already anyway. The me of yesterday is dead. The me of today only believes I am the same.


Not an expert on the subject at all, but to my knowledge, not quite. Yes, this applies for most cells in the body, but neurons and glial cells stay for decades. So in a real, practical sense, the you of yesteryear is still the you of today, right down to the cellular level in the brain, where it matters most.


Not only that, there's a difference between repairing a ship over time and swapping out all its parts at once. I'm pretty sure most people would consider the second scenario a new ship.


With cars, there's a distinction between a "restoration" and a "survivor". A survivor hasn't had things done to it other than maintenance.

There's debate about where the line is between a "restoration" and a "replica".


On that distinction I'm not sure I agree. If in both cases all parts of the ship have been replaced, the difference between doing it all at once and doing it gradually is a semantic one, with rejection of the "new" ship as the same thing as the old ship being more of an emotional reaction. In either case, it's somewhat separate from what I mentioned about the cells in the brain and how they stay the same over decades even as others in the body change, just like the parts of the ship. Like I said though, that needs a bit of further confirmation, because the science seems a bit ambiguous on the details.


I don't think the difference is as trivial as you think. When you replace things slowly there are shared experiences for those parts. They have commonality among them. When you replace something instantly there are no shared experiences or commonality.


This is an interesting point that I hadn't quite considered. Not sure how it would apply on the scale of inanimate objects but I could see it being relevant to a macroscopic organic system with consciousness, like our brains and bodies.


Well the analogy was directed at macroscopic organic systems with consciousness.

But with Theseus's ship we can see a similarity. If we had a ship, tore it down, then built a new ship without using any parts from the old ship, every sane person would call that a different ship. Frankly, the tearing it down step isn't required (an important factor). Conversely, if we repair a ship over time we do consider it the same ship. Maybe this is just personifying objects, but these conclusions about new ship/same ship would be pretty standard. It is why we distinguish replica and restoration.


If it were possible to swap out a ship with an “identical” replica instantaneously (I’m not sure what the benefit would be) I think most people on board would consider it to be the same ship. And, for that matter, it’s relatively common for ship owners to name their ships after the prior ship that held the same function, as if it was the same ship, even when it’s a totally different make/model!


But instead of swap make an identical copy. You now have two ships.


This reminds me of relics in the Catholic Church. An object held by a saint gains divinity due to their commonality, and being in the presence of that divine object passes along the divinity once again.


And the whole point of the thought experiment is to show that such a distinction is meaningless.


I'm not quite sure. There's multiple scenarios here. If we upload your consciousness into a computer and you're still alive, then there's multiple entities. If we replace your organs with machinery over time, then there is only a single entity that is changing. The latter is like the ship, but the former is not.


This is a really interesting point. From an economic perspective, I’ll say that if there are multiple entities, it dilutes the rarity of the original, and thus it is no longer the same. Destroying the original maintains the entity’s relationship to all external entities, preserving not only the original, but the original’s place in the system.


So by that reasoning, if we upload your consciousness into a computer and make sure we destroy you before we activate it, same difference right? We've just made the "slow replacement" process significantly more efficient.


I'm not sure if you're trolling, willfully missing the point to be contrarian (trolling), or just missing the point. Multiple entities makes a distinction in this thought experiment.


I didn't think brain cells were replaced.


That used to be the dominant theory - there is now evidence that new cells continue to grow - just the replacement rate slows down as we age. However we still do not really understand the mechanisms of neurogenesis (growth of new brain cells), the interplay with neural stem cells, cellular development dependencies or the pathological mechanisms associated with various diseases such as Alzheimer's. There continue to be new theories, emergent lab work etc but we still have a long long way to go before we have a clear insightful model of the mechanisms involved, never mind a predictive understanding for systemic outcomes.


To be fair there's a few other more primary reasons you'll never get in the Star Trek transporter :)


You need to read Clifford Simak's "Way Station".


I keep meaning to read that one :-)


I think if you could upload your mind and interact, a la Speaker for the Dead, it would be more than just your doll shell getting interacted with.

I'm Buddhist; I work on my acceptance of death. Sometimes it feels deceptively easy, and sometimes I look at my son and pre-ache for what won't be. Uploading my thoughts for him to review isn't tempting to me, just like cloning my dog wouldn't give me my dog. But I get the temptation to wish we were close to a time when it felt plausible, though I don't expect it to be.


This is a very narrow and naive view of self. If I removed part of your brain would you still be you? It happens to people all the time, personalities are sometimes altered, yet we don't consider these people to be a different person. You yourself have undergone tremendous change since you were a child, yet we do not consider the child-you to be dead and the present-you to be a different being.

Why then, should a being even more like you than a previous version of yourself not be considered the same person?

Self is illusion. We are a collection of energy swirling in the great sea of energy we call the world, forming for a brief time a pattern that believes itself to be separate from that which comprises it, and of which it in turn is a part.


I don't think the parent's idea of self is naive at all. Let's imagine you can upload your consciousness into the cloud or whatever. There's two of "you" now. Clearly the one on the cloud is a copy. And as time progresses, that being is no longer "you". It has new experiences that "you" do not have. They are not shared experiences, and so you are different entities. This point might seem moot if you kill the body immediately, but it brings up a good question of what is "you". I think one thing is clear though, the being that you currently associate with "you" is not the same being that is in the computer.


> It has new experiences that "you" do not have.

Then this is equally true of you now and the you who first read that comment, isn't it?

> the being that you currently associate with "you" is not the same being that is in the computer.

This is not clear, at least not from the perspective of the being in the computer. As far as it is concerned, it stepped into the uploader device (or whatever) and woke up in a computer.

Hypothetical: let's say curious aliens abducted you one night and made an identically functioning copy of the right hemisphere of your brain with circuitry, then replaced your meaty original. You would wake up none the wiser and believe you are the same person, right? And so would everyone else, since you behave identically. OK, so they come back the next night and do the same thing to the left hemisphere. Still you? Now they take your original organic halves and put them back together in a robot body.


> Then this is equally true of you now and the you who first read that comment, isn't it?

The difference here is that there are not two entities, one that read the comment and one that didn't. All beings that associate with me have that shared experience, frankly because as far as we can tell there's only a single being here (and no evidence to suggest otherwise). (In the scenario I set out above, there is a being in the computer and a being outside.)

That's the big difference. That's where we can pose the question "Which one is 'you'". We need multiple entities for this to become a valid question. For an instant, I think they both would be. But an instant more, they are different.


I don't think we're disagreeing per se. It's merely a matter of perspective. Continuity and singularity are what you're leaning on for your definition, and I'm just rejecting those criteria as illusory, because they are self-imposed and arbitrary. Certainly such distinctions are useful for legal or societal purposes, but those themselves are just constructs of the human mind, not the intrinsic nature of the universe.


We're talking about the same thing in two threads. Multiple entities matter. And I'll refer you to my other statement https://news.ycombinator.com/item?id=21044119. Your username isn't giving me confidence that you aren't trolling.


I'm not sure why you think I'm trolling. I'm certainly disagreeing that "multiple entities" is somehow a firm metric for determining "self". That doesn't even stand up in non-hypotheticals. Why do people sacrifice themselves for their children, families, communities, or nations?

However, trying to stick more closely to the thread of conversation so far: I'm saying conceptually there isn't any difference between having multiple entities that both exist in the immediate present and having multiple entities separated temporally, as you and your past self are. The concept of self is a matter of arbitrary line drawing when you get right down to it, though some lines are more advantageous in certain circumstances than others.

If you read my comment history you'll see that I'm not a troll (though I am sometimes an ass). My account is named what it is as a reminder of what we all are in places like these and why we shouldn't take them so seriously.


Multiple entities does matter, a lot. Let me try to explain better.

Before we do the upload there is a clear distinction of who is "you". It is the person I am talking to right now. The person with all your past experiences and the person that moves forward through time having those unique experiences. I think we can agree on that.

Now let's upload your brain to the computer. You still exist and there's another exact copy of you on the computer. In that first instance you both have the exact same history and shared experiences. The problem is moving forward.

You sitting at the computer will identify your experiences as those that "you" experience, not those of the computer you. Conversely, s/physical you/digital you. Both entities are you (had the same past experiences) and only one is (has the current experiences), at the same time. Only one is, because "you" is composed of your experiences; if you can't experience it, it is not really part of your identity. One you can look at the other you on the computer monitor. We have distinct and unique experiences now. It is also clear that nothing has happened to the physical you. The physical "you" will clearly not identify the digital "you" as "self", because it is outside yourself (your experiences). It may be hard to define what "you" is, but it is clear that "you" has to have the experiences that are being processed.

Key point: The two entities have different experiences and identify the other entity as a consciousness that they do not share current experiences with.

We can take this to the single-entity case, but that becomes convoluted in that there's a distinction between transferring your consciousness and copying it. We may talk past one another in this case because it isn't clear if I'm imagining transfer and you're imagining copying, or vice versa. The two-entity problem makes clear the distinction between the "you" before the upload and the you after the upload. I think it is clear that you can't have two "you"s (because they are different entities!).


> You sitting at the computer will identify your experiences as those that "you" experience, not those of the computer you. Conversely s/physical you/digital you. Both entities are you (had the same past experiences) and only one is (has the current experiences), at the same time. The one is because "you" is composed out of your experiences. If you can't experience it, well it is not really part of your identity.

Ah, but that only really works if our memory is perfect and continuous, which we know it isn't. If I get black-out drunk and do something, I might have no memory of it whatsoever. It is not part of my current experience even in memory form. That doesn't mean it was someone else who did that thing though, does it? This is not all that dissimilar from the computer copy example.

What I'm trying to show is that the concept of self cannot be made concrete. Any definition you use will be an approximation based on a gut feeling of how things are, as opposed to logical deduction. Right now, if I understand your reasoning, you're saying that two entities existing simultaneously (in so far as such a word is applicable in a relativistic spacetime) with completely identical memories are separate beings, but you do not extend that definition to encompass beings who do not share temporal locality. But why not? A copy of me existing at the exact same moment as me is very much more objectively identical to me than my 10-year-old self, but the latter case is considered to be the same person and the former is not. I haven't seen any explanation for why this should be the case, merely a description of how we feel it is the case. Which is the point really: self is not an objectively definable property, but rather a fluid construct of the mind.


> Ah, but that only really works if our memory is perfect and continuous,

It is statements like this that make me think you're trolling. This statement doesn't matter: physical you and digital you aren't blackout drunk the entire time; at some point they should be forming memories. You're bringing in a contrarian point of view to prove your own. And no one is arguing about whether self is hard to define; that's not the conversation going on here.

> self is not an objectively definable property, but rather a fluid construct of the mind.

Self can be both definable and a fluid construct. Here's an easy way to think of self. You recognize that you are not the chair you are sitting on; the chair is different from you. My cat recognizes this kind of self: such things as that I am not her, and that another person is neither me nor her (this is a different level of self-recognition than the classic mirror test tests; this is as basic as it gets). The self is the instantaneous thing having the experiences. You do not see through your clone's eyes. You DID see through the 10-year-old's eyes, though. It does not matter that recall isn't perfect; you still had the exact same experiences as that 10-year-old, but you don't have the exact same experiences as the clone (you're not looking through your clone's eyes and experiencing from their body).


> It is statements like this that make me think you troll.

I honestly cannot fathom why you can't see what I'm trying to say. If it sounds like I'm repeating myself it is only because I keep trying to find a way to phrase it such that my reasoning can be understood.

> Self can be defined and a fluid construct. Here's an easy way to think of self. You recognize that you are not the chair you are sitting on. That the chair is different from you. My cat recognizes this kind of self.

Yes. But that is because this is how the human mind conceptualizes 'self' in this instance. It draws other lines when necessary to fulfill the purpose it was evolved for. My children are 'self' in many circumstances, or my community, or my nation. Sometimes, even ideals. If that isn't true, then why would a rational mind be willing to die for these things? These lines are arbitrary, and drawn by the mind to aid in fulfilling a purpose. As far as the universe itself is concerned, there is no difference between me and the chair (and in fact even humans would struggle to draw a perfect molecular outline where the chair ends and I begin).

> You DID see through the 10 year old's eyes though. It does not matter that recall isn't perfect, but you don't have the exact same experiences as the clone

Even if I don't remember the event at all? That doesn't make sense. In what sense did I experience an event I have no recollection of? It may as well have happened to a different person. Not to mention that the child is not sharing large swaths of my experience. The clone actually shares significantly more experience with me than the child, yet one is considered the same being and the other isn't. I keep saying this to point out the inherent inconsistency of how this line is drawn, not because it is not useful, but to show that it is not objectively real.

Here's a question for you: Should the digital copy be punished for crimes the physical one committed before the copy took place? After all, it does have the experience of committing that crime. It was, by your own reasoning, at the very instant of the copy, the same person that committed that crime. If it is now a different person, is it responsible for the actions of the physical being in the past?

Originally this whole discussion started because someone claimed that the upload (not copy, in this case) might not be the same as the person it perfectly mimics. That is, even though it is indistinguishable from the original, it is somehow not the same. This is the kind of strange conclusion that is drawn from having a naive conception that self is not a fluid construct but rather a rigid objective reality. This is the flawed thinking I'm seeking to remedy.


> Originally this whole discussion started because someone claimed that the upload (not copy, in this case) might not be the same as the person it perfectly mimics. That is, even though it is indistinguishable from the original, it is somehow not the same. This is the kind of strange conclusion that is drawn from having a naive conception that self is not a fluid construct but rather a rigid objective reality. This is the flawed thinking I'm seeking to remedy.

Why do you think I'm talking about physical you vs digital you? That's why I'm so focused on shared experiences and things. I'm not sure what self is, but this is clearly part of it.

The problem with this discussion is that you already have an answer and are trying to teach me. That doesn't make for a very good discussion. I am also confident that your answer is not correct, though it has merits to it. I'm sure there are flaws to mine as well. But at this point we're talking past one another because you already have an answer.


A moment one second after death is like a moment one second ago. It's gone. Except this time there is no next time.


Ideally, uploaded to Konishi polis.

http://www.gregegan.net/DIASPORA/01/Orphanogenesis.html


I don't know if it came from a dream or what, but I've had this scene in my mind where you're looking at the screen on a machine, and in the reflection of that screen you can see doctors milling around a subject on a table. In unison they all turn their heads to the screen and you see words and shit flicker on it, faster and faster until it's evident that a consciousness is booting up in the machine. A few seconds later you hear a voice coming from the machine like someone waking up from being knocked out. It's the person on the table and the docs are talking to him. The guy slowly gets his bearings and starts celebrating that it worked and he's saying how amazing it is and he feels like his mind is infinite.

Then as the scene starts to go dark you hear a meek voice from the table rise over the din and say 'hey Doc, i don't think it worked'.


Check out the video game "Soma". It's a horror-themed game but you can play in "Safe" mode if that's not your thing. I'd say more but that would spoil things for you.


Nice, will do!


If you're like me and don't particularly have time (or decent enough hardware to run it), I highly recommend this video by Joseph Anderson: https://youtu.be/J4tbbcWqDyY

Spoilers, obviously, but this is how I generally consume games nowadays. Vicarious experience and critical analysis are almost as good as actually playing a game, IMO.


The article seems a bit pessimistic to me. OK, we neither know how the brain stores information nor are currently able to make a human-level AI, both of which would have to happen before uploading was feasible, but research is progressing rapidly on both fronts. Personally I'd expect it to be possible before the century is out.

There's already a YC startup working on the stuff https://nectome.com/ who seem to have had some ups and downs but it's early days. Here's the Daily Mail on Sam Altman signing up https://www.dailymail.co.uk/news/article-5503045/Tech-billio...


Anyone interested in fiction, Stephenson's new book "Fall; or, Dodge in Hell" explores this concept and computational power needed. https://en.wikipedia.org/wiki/Fall;_or,_Dodge_in_Hell


Some thoughts I've had on these details and a few points I think are worth mentioning:

First: we have to be careful about comparing mental/physical states like sleep or even deep comatose unconsciousness with notions of mind/body replacement and revival of consciousness. I say this because it's possible that extremely subtle processes might be at work in the former two states that do indeed give us a continuity of perception of the same self, which wouldn't be the case if you perfectly copied someone and then destroyed them at the instant of reanimating the new perfect copy.

Second: for the above, I'm not even really taking into consideration the possibility of consciousness being a partly quantum phenomenon that's impossible to emulate, clone, or copy in any way that really recreates the original person's self in a new substrate.

Third: assuming we can actually perform whole-mind emulation, cloning of a new body, and transfer of a self into a new physical or digital substrate (such that the brain is capable of being perfectly recreated as if it truly were the original, in terms of all functional, practical measurements and subjective perception), we still have to keep in mind the danger of what I call uncopied tail-end consciousness in the original. In other words, if a person's original self, even for a split second or two, perceives consciousness beyond the moment of emulation, then truly they will die, and with the knowledge of the copy being just a copy, separate from them. But if, on the other hand, the transition of consciousness can be made totally seamless as far as perception is concerned, or graduated so that the two gently merge into a new copy, then it might really be possible to create a continuous, clean perception of self.

Fourth, as a tangent to the third: one idea I've always had is of conscious perception having a granular quality, in that we can perceive increments of self-awareness/time down to some fraction of a second. Any emulation process would have to create a "backup" for emulation to a new copy at a rate just slightly faster than that natural granularity of perception, in order to give us seamless continuity of self-perception in case the original has to be suddenly destroyed and the new copy of your consciousness reanimated.


How do you plan to pay the rent? (i.e. whoever owns the computational power needed to run your mind)


Might be interesting to combine with the latest advancements in voice synthesis[0,1,2]. Actually, surprised they didn't delve into those possibilities.

0: https://news.ycombinator.com/item?id=14182262

1: https://news.ycombinator.com/item?id=20819672

2: https://news.ycombinator.com/item?id=17858246


Black Mirror covered something a few steps beyond this in the episode called “San Junipero” [1]

[1] https://en.m.wikipedia.org/wiki/San_Junipero


I find the "Be Right Back" episode to be more directly fitting here. https://en.wikipedia.org/wiki/Be_Right_Back


It's definitely more applicable. The simulacrum literally starts off as a chat bot.


For those interested in stories about stuff like this: Malagash by Joey Comeau.

Sunday’s father is dying of cancer. They’ve come home to Malagash, on the north shore of Nova Scotia, so he can die where he grew up. Her mother and her brother are both devastated. But devastated isn’t good enough. Devastated doesn’t fix anything. Sunday has a plan.

She’s started recording everything her father says. His boring stories. His stupid jokes. Everything. She’s recording every single “I love you” right alongside every “Could we turn the heat up in here?” It’s all important.

Because Sunday is writing a computer virus. A computer virus that will live secretly on the hard drives of millions of people all over the world. A computer virus that will think her father’s thoughts and say her father’s words. She has thousands of lines of code to write. Cryptography to understand. Exploits to test. She doesn’t have time to be sad. Her father is going to live forever.


Should I pass my mind/memory by value, or by reference? Which one is more expensive?


By value, and definitely onto better storage.


Probably by value, unless you want to risk that it ends up dangling...
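To stretch the joke into a hypothetical C++ sketch (nothing here is from the article, and the Mind type is made up):

    #include <iostream>
    #include <string>

    struct Mind { std::string memories; };

    int main() {
        Mind* original = new Mind{"everything I ever was"};

        Mind& by_ref = *original;  // "upload" by reference: just an alias
        Mind  by_val = *original;  // "upload" by value: an independent copy

        delete original;           // the original substrate dies

        (void)by_ref;  // by_ref now dangles; reading it is undefined behavior
        std::cout << by_val.memories << '\n';  // the copy lives on, separately
    }

Of course, the by-value copy starts diverging from the original the moment it is made, which is the whole thread above in one line.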


Might want to run advanced algorithms in my brain maybe...


The difference between his recordings of 91,970 words, the bound books of those words, and the Dadbot is merely one of format.

Sure he added (and subtracted) some stuff programming the bot, but there is no AI here.


Ray Kurzweil of “A.I. Singularity” fame has collected his deceased father’s writings and possessions for a similar attempt at mechanical reanimation:

https://abcnews.go.com/Technology/futurist-ray-kurzweil-brin...


Does Pullstring save (or export) all of this metadata into something that can work on a different platform?

I wonder about the chances of all this work being lost.

Edit: Story was from 2017, Apple acquired them in Feb 2019, and Pullstring's site is now gone, though a few pages remain in the Google cache. Sounds like there is some risk.


I think death isn't bad when you think about infinity. The universe can repeat the occurrences that made our life, with the exact same conditions arising again to produce the actions and thoughts we already lived. "Eternal recurrence" is the name. Theoretical physics even makes it more plausible if the universe continuously expands and contracts, over and over. Thus I would rather just die while in tremendous pain from an illness, since it would appear instantaneous for my consciousness to appear again once the forces of the universe repeat the needed conditions.


Ah yes. Assuming the universe does operate in a cyclical manner, and that consciousness itself is an actual fundamental feature of the universe (and not just emergent), it certainly does make sense from a logical perspective that eventually there will be a beginning that matches the configuration of our current universe and evolves over time such that an Earth will exist again, and the consciousness we inhabit will be reborn in the same manner it was, leading to our ability to experience it this go-round :)


Somehow, this title gave me goosebumps the way the end of Inception did.

Great idea for a movie


For a comedic version of this concept, check out the fake infomercial "Live Forever as You Are Now" (1)

Great example of the promise of virtual immortality done badly, with bad software

(1) https://youtu.be/xg29TuWo0Yo


Alan Resnick and Wham City Comedy make some absolutely brilliant content! Thank you for sharing this; I'd forgotten all about it.


This is also a sub-plot of the television series Caprica, where a researcher uses her daughter as the model for the first true AI in a robot cylon.


Caprica is actually a very good series, often forgotten in the shadow of Battlestar Galactica. But then it doesn't really feature cool spaceships. ;)


There's a Black Mirror episode based on this.



A similar idea, taken a bit further, is the theme of the film Transcendence.


You should totally watch Source Code, it's a much deeper dive on this subject. Great movie.


Will do ;)


If this catches on then at a certain moment there will be more bots than real people.


Hosted in a data center, living lives in San Junipero no less.


Solving captchas to keep their rank old bits flipping...


One could take lots of videos of a person, get a large corpus of their phrases, sentences, behaviors, use deepfakes to generate videos, profit


I think that's the plot of that Black Mirror episode, Be Right Back.


Rudy Rucker discusses a project like this in The Lifebox, the Seashell, and the Soul. Remarkable that someone has actually tried to do it.


I wonder if it occurred to the son how much more of his father's consciousness lives on in himself?



