The genius of Lem's underlying idea is that the duplicates, or replicants, or whatever we choose to call them, are self-conscious and seem to carry on with free will from the moment they are evoked by the planet. Rheya, for example, says, "I'm not the person I remember. I don't remember experiencing these things." And later, "I'm suicidal because that's how you remember me." In other words, Kelvin gets back not his dead wife, but a being who incorporates all he knows about his dead wife, and nothing else, and starts over from there. She has no secrets because he did not know her secrets. If she is suicidal, it is because he thought she was.
The deep irony here is that all of our relationships in the real world are exactly like that, even without the benefit of Solaris. We do not know the actual other person. What we know is the sum of everything we think we know about them. Even empathy is perhaps of no use; we think it helps us understand how other people feel, but maybe it only tells us how we would feel, if we were them.
I think it is a deep mistake to correlate one's exterior appearance and behavior with a sense of selfhood. At best, it's a shallow surface imitation of the actual person. At worst, it creates a nightmarish, Uncanny Valley-esque simulacrum, not unlike Harlow's wire monkey mother surrogates. It would be better to learn how to accept and deal with loss - a reality that everyone inevitably faces.
"And yet, if only from my dreams when I was asleep, I might have
learned that my grief for my grandmother's death was diminishing, for
she appeared in them less crushed by the idea that I had formed of her
non-existence. I saw her an invalid still, but on the road to
recovery, I found her in better health. And if she made any allusion
to what she had suffered, I stopped her mouth with my kisses and
assured her that she was now permanently cured. I should have liked to
call the sceptics to witness that death is indeed a malady from which
one recovers. Only, I no longer found in my grandmother the rich
spontaneity of old times. Her words were no more than a feeble, docile
response, almost a mere echo of mine; she was nothing more than the
reflexion of my own thoughts."
—Marcel Proust, Cities of the Plain
Right or wrong perhaps the illusion is enough / or all we really know.
In the book, Kelvin accepts that he's standing in front of a godlike creature that the human mind will never understand. He surrenders and lands on the planet's surface, hoping that the "time of cruel miracles was not past".
The acceptance is not of the illusion itself, but of the fact that we will never understand the whys. Rational thinking and scientific positivism have become the real illusion; all is lost, Solaris won, and there is only the hope that Solaris will keep making "cruel miracles".
He doesn't understand them any more than he understood the people who were created, or the real people.
I always thought that was the point: if his mind could be used to create the other person, did he really know them? Can anyone?
There is nothing wrong with it. People deal differently, I guess. I, for myself, would have loved to have something similar from my parents. Not a whole chatbot, but... written-down and audio memories. The tech is there; the opportunity just wasn't.
My main gripe is with the headline phrase "Artificial Immortality". It's so far removed from that as to be laughable.
(The subtitle about "re-create his dad as an AI" seems severely over-hyped, too, although I suppose that's true of most uses of the term "AI".)
Yes, it reminds me of Woody Allen's remark: "I don't want to achieve immortality through my work, I want to achieve it through not dying."
Actually there is.
People deal with loss differently, but eventually they all get over it, because it is a natural process.
A severe refusal to move on is not what one can call a good thing.
Think about the people around you and the consequences it could have.
Statements of fact like that, with regard to fairly unexplored territory, are worse than useless; they're damaging.
By what criteria do we know that this is a 'severe refusal to move on'?
Should we accuse people who peruse old photographs from their history of the same 'severe refusal'?
Why can't this AI idea be just yet another tool at our disposal for memory recall and recollection?
Why paint this as some mental ineptitude, rather than just an archival opportunity?
Most stores of data have possible consequence by their mere existence. What's different here exactly?
Would the idea be more palatable if, rather than the clickbait ideas of immortality, it was framed as a trained chatbot?
"Over time, however, some Native Americans came to cherish photographs as links to ancestors and even integrated them into important ceremonies."
Alas, still (for practical purposes) never.
Why you’ll never be able to upload your brain to the cloud - https://theconversation.com/why-youll-never-be-able-to-uploa...
If the former is possible but not the latter, then I agree it isn't really very interesting. It's just a new way to spawn intelligent lifeforms, but we'll surely have many ways to do that by then. But I feel like if we can actually crack that, then we can eventually crack the second problem, too. The most popular hypothesis seems to be a Ship of Theseus-style gradual replacement / transition for some level of granularity (perhaps individual atoms, molecules, or neurons). We're fundamentally in the dark in so many ways at the moment that it seems unclear if these are plausible, obvious, or absurd.
Possible in the lifetimes of anyone reading this today? No (even if longevity breakthroughs accelerate like crazy). But if human or human-derived intelligence is still around in 1000, 100,000, 100,000,000 years from now, it's impossible to speculate what we may be capable of. If colonizing millions of galaxies becomes a boring walk in the park (via self-propagating autonomous constructs, for instance), then moving your consciousness to different tenancies while fully preserving your original self doesn't seem that insane. Maybe future "humans" will even skip the "middleman" (literally) and just be "born" directly into a non-biological housing from the start so that the transfer step won't be necessary, and maybe the "A" in AGI will become a non-sequitur.
To my mind the latter is logically impossible. Subjective self-continuity is not intrinsic to the experience of consciousness; it is a conclusion we draw upon introspection, based on circumstantial evidence. Subjectively, all we ever have is a sequence of fleeting moments. We only have the "word" of our memory to trust that the Now Me is a descendant of Past Me. As soon as perfect replication is possible, the replicant and the original have exactly equivalent claims to identity in every sense that matters; one will merely have a better claim to physical continuity. But physical continuity is not the most important aspect of identity; it's just that our intuitions favor it so heavily because it has historically coincided perfectly with conscious identity.
The point is that we can lose our continuity of consciousness, yet we don't consider this "dying" in normal life. If we duplicate our exact physical brain and its processes, and immediately destroy the original while this happens, there is no loss of continuity, but it still feels like dying.
To me, there is a clear loss of continuity when you duplicate a brain: the process inside has been broken and does not continue. The structure is not the key; the key is the running state of the brain. Under anesthesia, the brain still functions.
This also assumes that duplication requires either type of breakage; for thought-experiment purposes you may assume that cloning is possible with zero delay and zero destructive change.
If you care about which physical substrate the process is running on specifically, this is the bias towards physical continuity I noted.
My point is basically that I will not magically look through someone else's eyes, even if the someone is my copy.
The uploaded version is really more of an interactive Egyptian statue designed to immortalise yourself from the perspective of others rather than you literally living past death.
This is why I don’t see the point of arguments such as that. I would still die. So I might as well survive my memory in more meaningful ways like building a family. If I’m forgotten about 100 years after my death then that’s just the natural order of things and it’s not like I’ll be around to care anyway.
"You" must be a state. If carried to a complex enough substrate, my state may continue while the old substrate dies. What concern is the nature of this substrate to me, so long as it adequately exists to run my state machine?
It is true that the substrate holding my state may have its operations sabotaged to update my state in strange, non-standard ways, but this appears also to be the situation now, in the physical realm. There are untold conditions I can apply to my body and brain that would have a marked effect on the unfolding state of my mind.
The concept of multiple mes is confusing, as would be the potential to merge mes and access memories of my own "death(s)". Techniques for confirming consciousness in others are largely non-existent. These are deep and still open questions that cannot be so easily dismissed with a claim that all that's artificial must be artifice.
That distinction is critically important. You can change an original and it will still be the original. But taking a copy and making that your primary copy means destroying the original. Which is fine if you’re talking purely about dumb digital bits like we normally do. However when those bits become self-aware programs executing, destroying the original is a literal death sentence.
I guess a prolonged “merger” with the internet where the artificial mediums were used, at first, as an external processing unit. Then gradually you switch over. I guess that might solve the problem of “switching off” the original. However that’s not how the process is typically described.
I recommend anyone interested in this subject to read The Cassini Division by Ken McLeod and Accelerando by Charles Stross.
Well, that remains to be seen. It could be more than that. If you consider how Go-playing programs have gone from rubbish to beating all humans, it seems likely a future upload of a human could have abilities beyond us lot.
But the original you is no more.
The transporter does not create a clone. This is explicitly stated not to be the way transporters work in Second Chances (TNG). In that episode, it is considered a very unusual phenomenon that there is a second Riker, specifically because transporters don't work that way. All LaForge can offer is supposition.
Of course, Star Trek has always played fast and loose with its own technology's mechanics, so this may be contradicted by episodes I'm not currently recalling.
1. The transporter blows you apart at the atomic level.
2. It then encodes all your atomic information into the transporter buffer. The buffer is a massive storage array (even for trek tech) but only has a limited lifespan (see: Relics, TNG).
3. The data in the buffer is then streamed over an 'energy beam' carrier to the destination location.
4. At the destination the atomic information is rebuilt into a new 'you'. Right down to what your synapses were doing at the time of step #1.
In the way that a jigsaw is still the same jigsaw no matter how many times you take it apart and rebuild it, the theory is that it's not a clone of you, as the original never exists at the same time the 'clone' does.
The above steps totally ignore the implausibility of being able to rebuild the subject at the other end, without any infrastructure. Transporter-pad to transporter-pad transfers make logical sense, but transporter-pad to anywhere-on-a-planet... not so much.
That led him to the conclusion that the feeling of being a particular person is itself an illusion, just a side effect of sequential brain events, so to say. This is obviously quite a counter-intuitive proposition, but also quite hard to argue against without resorting to mere beliefs.
I guess Parfit would have considered this AI project an (albeit minuscule) achievement of immortality.
This is definitely wrong; there's a no-cloning theorem in quantum mechanics.
But getting into possible methods of transportation is complicated. QM does allow for teleportation. So you can teleport the exact structure, but not have two copies at each end.
The teleporter could also reassemble you in an inexact way and you then would be a clone (at least conventionally speaking).
But the big question is: if you reconstruct everything in the exact same way, is the being that comes out on the other end a different being from the one that went in? This may be completely unverifiable, because the being coming out the other end would have all the memories of the being sent in. But because of no cloning, you might be the same being.
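For context, the argument behind the no-cloning theorem is a short, standard linearity/unitarity calculation (a textbook sketch, not tied to any particular teleportation scheme):

```latex
% Suppose a single unitary U could copy an arbitrary unknown state
% onto a blank register:
%   U(|\psi\rangle \otimes |0\rangle) = |\psi\rangle \otimes |\psi\rangle .
% Apply U to two arbitrary states; unitarity preserves inner products, so
\langle\phi|\psi\rangle
  = \bigl(\langle\phi|\otimes\langle 0|\bigr)\,
    U^{\dagger}U\,
    \bigl(|\psi\rangle\otimes|0\rangle\bigr)
  = \langle\phi|\psi\rangle^{2} .
% Hence \langle\phi|\psi\rangle must be 0 or 1: a device can only "clone"
% states that are identical or orthogonal, never an arbitrary unknown state.
```

This is also why the quantum teleportation protocol necessarily destroys the sender's copy: the state is moved, never duplicated.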
There's debate about where the line is between a "restoration" and a "replica".
But with Theseus's ship we can see a similarity. If we had a ship, tore it down, then built a new ship without using any parts from the old ship, every sane person would call that a different ship. Notably, the tearing-down step isn't even required (an important factor). Conversely, if we repair a ship over time, we do consider it the same ship. Maybe this is just personifying objects, but these conclusions about new ship/same ship would be pretty standard. It is why we distinguish replica and restoration.
I’m Buddhist; I work on my acceptance of death. Sometimes it feels deceptively easy, and sometimes I look at my son and ache in advance for what won’t be. Uploading my thoughts for him to review isn’t tempting to me, just as cloning my dog won’t give me my dog. I get the temptation to wish we were close to a time where it felt plausible, but I don’t expect it to be.
Why then, should a being even more like you than a previous version of yourself not be considered the same person?
Self is illusion. We are a collection of energy swirling in the great sea of energy we call the world, forming for a brief time a pattern that believes itself to be separate from that which comprises it, and of which it in turn is a part.
Then this is equally true of you now and the you who first read that comment, isn't it?
> the being that you currently associate with "you" is not the same being that is in the computer.
This is not clear, at least not from the perspective of the being in the computer. As far as it is concerned, it stepped into the uploader device (or whatever) and woke up in a computer.
Hypothetical: let's say curious aliens abducted you one night and made an identically functioning copy of the right hemisphere of your brain out of circuitry, then replaced your meaty original. You would wake up none the wiser and believe you are the same person, right? And so would everyone else, since you behave identically. OK, so they come back the next night and do the same thing to the left hemisphere. Still you? Now they take your original organic halves and put them back together in a robot body.
The difference here is that there are not two entities, one that read the comment and one that didn't. All beings that associate with me have that shared experience, frankly because as far as we can tell there's only a single being here (and no evidence to suggest otherwise). (In the scenario I set out above, there is a being in the computer and a being outside.)
That's the big difference. That's where we can pose the question "which one is 'you'?" We need multiple entities for this to become a valid question. For an instant, I think they both would be. But an instant later, they are different.
However, trying to stick more closely to the thread of conversation so far: I'm saying there isn't any conceptual difference between having multiple entities that both exist in the immediate present and having multiple entities separated temporally, as you and your past self are. The concept of self is a matter of arbitrary line-drawing when you get right down to it, though some lines are more advantageous in certain circumstances than others.
If you read my comment history you'll see that I'm not a troll (though I am sometimes an ass). My account is named what it is as a reminder of what we all are in places like these and why we shouldn't take them so seriously.
Before we do the upload there is a clear distinction of who is "you". It is the person I am talking to right now. The person with all your past experiences and the person that moves forward through time having those unique experiences. I think we can agree on that.
Now let's upload your brain to the computer. You still exist and there's another exact copy of you on the computer. In that first instance you both have the exact same history and shared experiences. The problem is moving forward.
You, sitting at the computer, will identify your experiences as those that "you" experience, not those of the computer you; conversely, s/physical you/digital you. Both entities are you (they had the same past experiences), and yet only one is you (has the current experiences), at the same time. The one is, because "you" is composed out of your experiences: if you can't experience it, it is not really part of your identity. One you can look at the other you on the computer monitor. We have distinct and unique experiences now. It is also clear that nothing has happened to the physical you. The physical "you" will clearly not identify the digital "you" as "self", because it is outside yourself (your experiences). It may be hard to define what "you" is, but it is clear that "you" has to have the experiences that are being processed.
Key point: The two entities have different experiences and identify the other entity as a consciousness that they do not share current experiences with.
We can take this to the single-entity case, but that becomes convoluted in that there's a distinction between transferring your consciousness and copying it. We may talk past one another in this case because it isn't clear if I'm imagining transfer and you're imagining copying, or vice versa. The two-entity problem makes clear the distinction between the "you" before the upload and the you after the upload. I think it is clear that you can't have two "you"s (because they are different entities!).
Ah, but that only really works if our memory is perfect and continuous, which we know it isn't. If I get black-out drunk and do something, I might have no memory of it whatsoever. It is not part of my current experience even in memory form. That doesn't mean it was someone else who did that thing though, does it? This is not all that dissimilar from the computer copy example.
What I'm trying to show is that the concept of self cannot be made concrete. Any definition you use will be an approximation based on a gut feeling of how things are, as opposed to logical deduction. Right now, if I understand your reasoning, you're saying that two entities existing simultaneously (insofar as such a word is applicable in relativistic spacetime) with completely identical memories are separate beings, but you do not extend that verdict to beings who do not share temporal locality. But why not? A copy of me existing at the exact same moment as me is much more objectively identical to me than my 10-year-old self, yet the latter is considered to be the same person and the former is not. I haven't seen any explanation for why this should be the case, merely a description of how we feel it is the case. Which is the point, really: self is not an objectively definable property, but rather a fluid construct of the mind.
It is statements like this that make me think you're trolling. This statement doesn't matter: physical you and digital you aren't blackout drunk the entire time; at some point they should be forming memories. You're bringing in a contrarian point of view to prove your own. And no one disputes that self is hard to define; that's not the conversation going on here.
> self is not an objectively definable property, but rather a fluid construct of the mind.
Self can be defined and be a fluid construct. Here's an easy way to think of self. You recognize that you are not the chair you are sitting on; the chair is different from you. My cat recognizes this kind of self: things like "I am not her" and "another person is neither me nor her" (this is a different level of self-recognition than what the classic mirror test tests; this is as basic as it gets). The self is the instantaneous thing having the experiences. You do not see through your clone's eyes. You DID see through the 10-year-old's eyes, though. It does not matter that recall isn't perfect; you still had the exact same experiences as that 10-year-old, but you don't have the exact same experiences as the clone (you're not looking through your clone's eyes and experiencing from their body).
I honestly cannot fathom why you can't see what I'm trying to say. If it sounds like I'm repeating myself it is only because I keep trying to find a way to phrase it such that my reasoning can be understood.
> Self can be defined and a fluid construct. Here's an easy way to think of self. You recognize that you are not the chair you are sitting on. That the chair is different from you. My cat recognizes this kind of self.
Yes. But that is because this is how the human mind conceptualizes 'self' in this instance. It draws other lines when necessary to fulfill the purpose it was evolved for. My children are 'self' in many circumstances, or my community, or my nation. Sometimes, even ideals. If that isn't true, then why would a rational mind be willing to die for these things? These lines are arbitrary, and drawn by the mind to aid in fulfilling a purpose. As far as the universe itself is concerned, there is no difference between me and the chair (and in fact even humans would struggle to draw a perfect molecular outline where the chair ends and I begin).
> You DID see through the 10 year old's eyes though. It does not matter that recall isn't perfect, but you don't have the exact same experiences as the clone
Even if I don't remember the event at all? That doesn't make sense. In what sense did I experience an event I have no recollection of? It may as well have happened to a different person. Not to mention that the child does not share large swaths of my experience. The clone actually shares significantly more experience with me than the child, yet one is considered the same being and the other isn't. I keep saying this to point out the inherent inconsistency of how this line is drawn, not because it is not useful, but to show that it is not objectively real.
Here's a question for you: Should the digital copy be punished for crimes the physical one committed before the copy took place? After all, it does have the experience of committing that crime. It was, by your own reasoning, at the very instant of the copy, the same person that committed that crime. If it is now a different person, is it responsible for the actions of the physical being in the past?
Originally this whole discussion started because someone claimed that the upload (not copy, in this case) might not be the same as the person it perfectly mimics. That is, even though it is indistinguishable from the original, it is somehow not the same. This is the kind of strange conclusion that is drawn from having a naive conception that self is not a fluid construct but rather a rigid objective reality. This is the flawed thinking I'm seeking to remedy.
Why do you think I'm talking about physical you vs digital you? That's why I'm so focused on shared experiences and things. I'm not sure what self is, but this is clearly part of it.
The problem with this discussion is that you already have an answer and are trying to teach me. That doesn't make for a very good discussion. I am also confident that your answer is not correct, though it has merits to it. I'm sure there are flaws to mine as well. But at this point we're talking past one another because you already have an answer.
Then, as the scene starts to go dark, you hear a meek voice from the table rise over the din and say, "Hey Doc, I don't think it worked."
Spoilers, obviously, but this is how I generally consume games nowadays. Vicarious experience and critical analysis are almost as good as actually playing a game, IMO.
There's already a YC startup working on this stuff, https://nectome.com/ , who seem to have had some ups and downs, but it's early days. Here's the Daily Mail on Sam Altman signing up: https://www.dailymail.co.uk/news/article-5503045/Tech-billio...
First: we have to be careful about comparing mental/physical states like sleep, or even deep comatose unconsciousness, with notions of mind/body replacement and revival of consciousness. I say this because it's possible that extremely subtle processes are at work in the former two states that do indeed give us a continuity of perception of the same self, which wouldn't be the case if you perfectly copied someone and then destroyed them in the instant of reanimating the new perfect copy.
Second: for the above, I'm not even really taking into consideration the possibility of consciousness being a partly quantum phenomenon that's impossible to emulate, clone, or copy in any way that really recreates the original person's self in a new substrate.
Third: assuming we can actually perform whole-mind emulation, cloning of a new body, and transfer of a self into a new physical or digital substrate (such that the brain is perfectly recreated, as if it truly were the original, in terms of all functional, practical measurements and subjective perception), we still have to keep in mind the danger of what I call uncopied tail-end consciousness in the original. In other words, if a person's original self, even for a split second or two, perceives consciousness beyond the moment of emulation, then truly they will die, and with the knowledge that the copy is just a copy, separate from them. But if, on the other hand, the transition of consciousness can be made totally seamless as far as perception is concerned, or graduated so that the two gently merge into a new copy, then it might really be possible to create a continuous, clean perception of self.
Fourth, as a tangent of Third. One idea I've always had is of conscious perception having a granular quality, in that we can perceive increments of self awareness/time down to some fraction of a second, and any emulation process would have to create a "backup" for emulation to a new copy at a rate just slightly faster than that natural granularity of perception in order to give us seamless continuity of self-perception in case the original has to suddenly be destroyed and the new copy of your consciousness reanimated.
Sunday’s father is dying of cancer. They’ve come home to Malagash, on the north shore of Nova Scotia, so he can die where he grew up. Her mother and her brother are both devastated. But devastated isn’t good enough. Devastated doesn’t fix anything. Sunday has a plan.
She’s started recording everything her father says. His boring stories. His stupid jokes. Everything. She’s recording every single “I love you” right alongside every “Could we turn the heat up in here?” It’s all important.
Because Sunday is writing a computer virus. A computer virus that will live secretly on the hard drives of millions of people all over the world. A computer virus that will think her father’s thoughts and say her father’s words. She has thousands of lines of code to write. Cryptography to understand. Exploits to test. She doesn’t have time to be sad. Her father is going to live forever.
Sure, he added (and subtracted) some stuff when programming the bot, but there is no AI here.
I wonder about the chances of all this work being lost.
Edit: Story was from 2017, Apple acquired them in Feb 2019, and Pullstring's site is now gone, though a few pages remain in the Google cache. Sounds like there is some risk.
Great idea for a movie
Great example of the promise of virtual immortality done badly with bad software.