
The Cloned-Consciousness-as-Continuous-Consciousness Fallacy - exolymph
https://exolymph.news/2016/06/21/cloning-artificial-intelligence-death/
======
throwawaysocks
The author's "fallacy" isn't so much a fallacy as it is one particular
(and rather unsurprising) opinion on how to answer the questions that Aaronson
and Hanson have asked.

In this sense, the author fundamentally misses the point of what Aaronson and
Hanson have written. Aaronson and Hanson seem to be primarily interested in
identifying _questions_ and exploring the _problems_ that arise when trying to
give simple answers to those questions.

The fact that the author is able to quickly and cleanly resolve his personal
belief about how these questions should be answered doesn't indicate that the
questions are easy to answer or even that there is an obviously correct
answer. At least, no more than a Christian's belief in God cleanly settles the
question of whether God exists.

~~~
exolymph
I wish you'd explain why you disagree with my answer.

~~~
glifchits
I think what throwawaysocks is saying is that your refutation doesn't really
add value to the discussion. I agree with you, that my clone's consciousness
is not my own consciousness, but what are the deeper implications?

Particularly when we start to consider AI, and it's not your own human
consciousness but rather an artificial one, does our notion of "murder"
change? Then there's the question: is true "cloning" even possible? (Aaronson
uses quantum uncertainty to say 'no', but Hanson claims 'yes' given total
understanding of the brain.)

Thanks for your article, I'm personally on your side but this whole debate has
really made me think.

------
aappleby
Imagine that I visit a cloning facility that can flawlessly duplicate a human,
memories and all. The techs there put me in a medical gown and sedate me, and
a while later I wake up in an empty room.

Without any additional information, I have no way to determine if I am the
clone or not. I know only that there is another me in the building somewhere,
having the exact same thoughts as I am having right now.

In fact, I have no way to know if the cloning was even successful or not.
There is no way for me to distinguish from this room if I am the clone, if the
other copy was disintegrated, or if no copy was made at all. The room I'm in
could be on Mars for all I know.

Does it make sense for me to worry about the fate of the other potential copy
if I know absolutely that nothing of my essence has been lost either way?

~~~
EGreg
This is all basic stuff. Think deeper.

You go to sleep every night. When you wake up the next day, how do you know
you're the same person as yesterday? What does that mean?

Now consider if you didn't have memories of the past. What would make you
"you"? The body would be the same but your amnesia would change things, eh?

Now what if you knew next week you would lose your memories for sure, but this
week you can go on an amazing vacation. Does it matter where you go? It
matters to your current self but not to your future self.

Now think about sacrificing pleasure now so you can live a longer life. Or
feel good about your life at 80.

Now think about the afterlife. What if you are reincarnated with no memories
of the past? We already kind of established that "reincarnated you" wouldn't
care about current you. But will current you care about reincarnated you?

Consider this: if a "future you" doesn't remember being "current you", how are
they really different from a random person with a similar body who also has
amnesia?

In short, long term, is it all about the memories?

What is the self preservation instinct about?

Finally the big one... if there were no conscious observers at all, in what
sense would anything exist? That is, if I described to you a universe that you
can never detect or deduce must exist from any observations in this one, how
would saying "that universe EXISTS" be different from saying it "DOESN'T
EXIST"? It seems the word itself requires conscious observers. Thus, the
concepts of consciousness and existence may be two sides of the same amazing
conundrum.

~~~
chongli
You know you're the same person as yesterday because without you, there is no
_knowing_. A clone is no more _you_ than an identical twin is you. What makes
you _you_ is the ability to ask the question in the first place. If you were
dead, there would be no question.

~~~
TheCoelacanth
I don't follow.

You know based on your ability to ask the question that you currently are you
and not a clone of yourself, but you don't know that the person yesterday was
you and not some other person that you were cloned from.

~~~
chongli
_but you don't know that the person yesterday was you and not some other
person that you were cloned from._

The parent did not say anything about this and neither did I. Of course you
cannot know whether you are a clone or not. My point is that if a clone is
made of you and you are killed, you will be dead, you won't be the clone.

~~~
TheCoelacanth
You both did.

> When you wake up the next day, how do you know you're the same person as
> yesterday?

> You know you're the same person as yesterday

------
jaxomlotus
Another similar thought experiment is what happens if we cut ourselves
laterally exactly in half and then fill back in the missing half on both
parts. In the case of both halves, we are continuing an existing
consciousness.

Would you be OK killing off one of the halves?

~~~
qbrass
I've been through this a million times before. If they refuse to act with me
as a single entity, they're dead to me.

~~~
jaxomlotus
Which one is "me" though? Whichever side I, in my current state, end up being?

~~~
joeyo
We have some data on this from split-brain patients. These are people whose
corpus callosum---a bundle of fibers linking their left and right cerebral
hemispheres---is cut. Such people start to act as if they are two individuals.
Each hemisphere only has access to half of the visual field, and due to a
quirk of how language is lateralized (typically left hemisphere only) they can
make conflicting verbal and nonverbal reports of ambiguous stimuli: saying one
thing yet writing another.

So given this, it's not really fair to say that there is a unitary state to
split. Each half would get a different state (perhaps "synchronized" at the
time of the split), and only one half would be able to talk about the
experience.

------
scotchmi_st
We're chasing our tails here. The definition of the copy function requires
both an original and a copy. There is _always_ an original and a copy. Thus,
exact clones and copies made in teleport machines are always copies, separate
from the original. There can be no copy function without an original and a
copy. You can think up as many elaborate cloning thought experiments as you
like; if they can be reduced to a copy function, then the answer is simple.

I know I'm not the first person in this thread to declare the answer simple,
but the author appears to be reaching the obvious if somewhat occluded correct
conclusion.

~~~
snom380
The particles that make up your body have a probability of being in a certain
place according to some rules. When you move, the particles "disappear" from
one place and "appear" at another place. Do you know they are the same
particles? Does it matter?

Does it matter more that there are no duplicate particles (i.e. no chance of a
copy being made)?

In that case, you could devise a teleport machine that makes sure to read the
state "bit by bit" (particle by particle), then destroys the original particle
before creating a new particle in another location. How different would that
be compared to what's normally happening in your body?
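In programming terms, this is roughly the difference between copying and moving. A hypothetical sketch of such a machine (the `teleport` function and the list-of-strings "body" are stand-ins, not anyone's actual proposal): each piece of the original is destroyed the instant it is recreated at the destination, so a complete original and a complete copy never coexist.

```python
def teleport(original: list) -> list:
    """Move state piece by piece: each element is removed from the
    source at the instant it appears at the destination, so no two
    complete copies ever exist at once."""
    destination = []
    while original:
        # destroy at the source, recreate at the target
        destination.append(original.pop(0))
    return destination

body = ["particle-1", "particle-2", "particle-3"]
moved = teleport(body)
print(moved)  # ['particle-1', 'particle-2', 'particle-3']
print(body)   # [] -- nothing of the original remains
```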

~~~
scotchmi_st
> Does it matter more that there are no duplicate particles (i.e. no chance of
> a copy being made)?

Yes, it does matter. No copy being introduced means it's just a translation, a
displacement. That's an entirely different operation.

Quantum tunnelling is fun to think about, but at the scale of atoms it starts
to lose any relevance, and at the scale of neurons it has no meaning at all.

------
mickeypi
Consider another version: An exactly similar human being to you is discovered
to exist on another planet. They have your memories, your aspirations, your
habits, etc. They were not created as a copy of you. This person was born to
parents, just like you, and lived the same life that you did. They are reading
this post on HN right now.

Surely in this scenario the answer to "is it OK to kill one of you now?" is
"no".

So why is this different to some people than a synthetically-created clone?

~~~
dTal
>Surely in this scenario the answer to "is it OK to kill one of you now?" is
"no".

I don't know about that.

I'm not sure about this intuition pump. If they are truly identical to me,
then the world they live in must also be identical to mine, or we would have
diverged. So if we discovered Earth-2, then Earth-2 would simultaneously
discover us (or Earth-3, etc.). No actions can cause divergence, so it's not
possible for anyone on any Earth to kill only one of you. It would be like
trying to kill your reflection.

If we're talking about some kind of god who is outside the mirror/chain Earth
system, and has the ability to break symmetry as they see fit, it's still not
clear-cut. In effect, from such a perspective, there is only one Earth (albeit
with copies) until changes are made. It's not even clear that deleting one of
the several identical copies of the whole Earth is wrong. Moreover, deciding
to kill someone on one of several heretofore identical Earths and seeing how
they diverge is _informationally_ the same as spawning a new Earth - an act of
_creation_ rather than destruction. Nothing is lost - that person still lives
their life on an Earth somewhere - but now new lives will be led, new
experiences, new sadness but also somewhere new joy.

I think it's pretty clear though that our morality of "no killing" is based
pretty strongly upon the assumption that there's only one of someone, they can
only be killed once, and it's forever. Start violating these assumptions and
it breaks down quickly.

~~~
norea-armozel
Only if you assume the finality of death makes anything less good. If you
assume every conscious being has inherent value, then the entire argument has
to go another way, which isn't much better, but it can be characterized as a
viable alternative if perfect copies of any one person could be created down
to the Planck scale.

------
vectorpush
One's position in space-time is an inextricable component of "the self". Even
in theory, a perfectly replicated individual has already fundamentally
diverged from the original just by virtue of its existence as a duplicate
(since duplication implies a distinct position in space-time). Ultimately,
what I'm saying is that the idea of a "perfect duplicate" is something of a
contradiction, and any theory of mind that attempts to define one's
"homunculus" in terms of a physical brain state is in some ways moot because
although our human senses might indicate that we are dealing with two
identical individuals we are actually dealing with two fundamentally distinct
beings, despite the apparent similarities.

~~~
SilasX
Do you say the same thing about copies of a program (and their corresponding
hardware)?

~~~
vectorpush
I would, yes, though I'd be careful about drawing comparisons in this regard
between computer programs (as we know them today) and the phenomenon that we
call consciousness, since the framework of computation deliberately extricates
the essence of a program from the physical phenomena that underpins its
existence.

~~~
SilasX
Okay, but then you're just moving the debate over a few feet rather than
addressing the core point: can a _human_ likewise be abstracted away to some
substrate-independent program such that we lose nothing by deleting some
running copies? Can we build a box that gives us the same answers Bill Gates
would, and causes us to deem it "another Bill Gates" without confusion, modulo
having a body and physical access to the world?

~~~
vectorpush
> _can a human likewise be abstracted away to some substrate-independent
> program such that we lose nothing by deleting some running copies_

Oh. Well, assuming our current model of computation, I think a "substrate-
independent" human program is essentially non-human by definition (that is to
say, lacking the particular qualities of 'consciousness' that we commonly
ascribe to humans), so I'd conclude that "nothing is lost" insomuch as
"nothing existed" in the first place. In theory, it seems plausible that we
could create a program that could consistently pass a "Bill Gates" Turing
test, but it is easy for a programmer to imagine how such a program could
exist without actually being conscious.

------
sajid
"This seems really simple and obvious to me. So what am I missing?"

Personal identity is a deep philosophical problem - see Derek Parfit's
'Reasons and Persons'.

------
pjungwir
Just to throw out a few references, since they seem unmentioned so far:

The Ship of Theseus is an ancient recognition of a similar problem.

Greg Egan has written two stories that deal with copy-vs-continuity of
consciousness: Diaspora and Permutation City. Both start with the premise
"what if we could digitize our mind?" but go in different directions.

------
AgentME
It's an interesting problem, but I don't feel like this page accomplished much
besides calling a position he disagreed with a fallacy and using unexplained
appeals to emotion.

>To refute this, let’s conduct a thought experiment. Pretend that you can copy
a human brain. There are ten copies of me. They are all individually conscious
— perfect replicas that only diverge after the point when replication
happened. Is it okay to kill five of these copies? No, of course not! Each one
is a self-aware, intelligent mind, human in everything but body. The
identicalness doesn’t change that.

The things that scare me about death are the ideas that my specific goals
would be left unpursued, that my responsibilities would be abandoned, my
friends would lose my friendship, and that my memories and experiences would
lose their significance upon the world.

If an alien came to Earth with a perfect matter-duplicator (that I have reason
to fully trust), duplicated me such that I'm not even sure which is the
original, then remembers that he's not allowed to use it on humans, and
explains he needs to painlessly euthanize and disassemble one of me, I
wouldn't be particularly shaken up by it. Sure, I'd feel I was teased with a
cool toy and possibility, but the only injury that has happened is in a way
equivalent to a few seconds of memory loss. I've done worse to myself by
drinking. There were no big life experiences lost or goals left unpursued.

---

Going a little further... if I were an AI or could somehow copy myself, I
imagine I would frequently make short-lived copies to work together with to
accomplish my goals. As one of my copies, I would be told that I'm going to be
removed in a few days; I think I could handle knowing that the surviving me is
going to be amnesic to my several days of (probably barely) unique experiences
in a world much closer to our goals.

In a world where everyone or all AIs could do the above, I imagine the minds
who did the above would become much more successful in nearly all endeavors.
Choosing not to do so would be choosing to fail, unless the strategy was made
illegal. I struggle to fault the strategy morally given that it would be
something done by a single consenting individual to themselves, though it does
still sound a little weird to my instincts about how survival is supposed to
work. If we encountered otherwise similar aliens which already worked that
way, would we judge them, or only consider it an abomination when humans did
it to themselves?

I'm a bit disappointed that I haven't seen much sci-fi explore concepts like
this.

~~~
michael_h
If you haven't seen Primer, I recommend it.

MILD SPOILER: If you could make short-lived copies of yourself, you should
probably keep a close eye on them. After all, you're clever, they're clever.
You're motivated to keep existing, so are they.

~~~
AgentME
>You're motivated to keep existing, so are they.

This is kind of the point I'm arguing against. I'm sure many people would have
struggles with their copies, but I think someone disciplined and with the
right (well, useful) definitions of self and the meanings of survival
internalized could avoid the issue entirely, and accept their short-livedness
if they know another instance of themselves is going to continue.

------
scythe
The no-cloning theorem together with chaos theory suggests that cloning is
fundamentally impossible. Most people will respond to this by saying "the
brain isn't a quantum computer", which is almost certainly true, but what is
actually required for mind uploading is "the brain's behavior is not
influenced by quantum phenomena at all", which is even more certainly false.

Yet we're still obsessed with mind uploading. Why?

I guess it's because, while we accept that life seems to have some "atman", a
sense of identity, and quantum mechanics allows this in a very weak sense, it
makes no sense that our perceptions would be at all influenced by quantum
mechanics. Using the no-cloning theorem to resolve the problem of mind
uploading seems like cheating, and we're all pretty sure that souls don't
exist, so we tend to think that anyone who looks for something like a soul in
fundamental physics must be pushing an agenda. But this latter reasoning is a
fallacy; the well is poisoned, so that anyone who suggests quantum mechanics
might derail computational neuroscience is assumed to believe in some weird
magical revisionism of ESP or hypercomputation occurring in the brain or
whatever.

But there's an easier culprit hiding in plain sight: the tendency of people to
talk more about things they think they can understand, than things where they
run a serious risk of being proven flat wrong. I _could_ be flat wrong (but
this is not my first rodeo); many posters in this thread as well as the
blogger cannot. Loss aversion strikes again.

------
js8
I came to the conclusion that torturing Sims is indeed morally wrong. Granted,
it's not that wrong, because they are not very self-aware, but if they were
more self-aware, it would be very wrong. So yeah, killing clones or even
virtual replicas willy-nilly is wrong.

On the other hand, I would have no problem with teleportation-via-cloning. I
don't have a problem with being unconscious every night, after all (kudos to
[http://existentialcomics.com/comic/1](http://existentialcomics.com/comic/1)
for pointing that out to me). Also, I have no problem, every day when I go to
work, sitting for half an hour in a huge metal structure which is propelled at
enormous speed through a tunnel. What if it crashed and I were smashed to
death? It's pretty much the same question the article asks: what if some
incident caused the clone not to be recreated successfully? I simply trust the
people who built it and maintain it, and I know many people use it with no
issues, so why not use it?

~~~
chr1
If torturing sims is morally wrong, is it also morally wrong to stop the
simulation, destroying their universe?

~~~
js8
Yes and no. It's not wrong to just stop the simulation forever. Torturing or
repeatedly killing a virtual being for fun is wrong, but turning off
("killing") a virtual being for the purpose of it being turned off is
acceptable.

However, it can be wrong to erase their universe, if that potentially destroys
something interesting the virtual beings have created.

In other words, the intent matters. Killing a mouse for fun is immoral,
killing it to research cancer is not.

------
enoch_r
Imagine waking up in a strange facility. There's a tray in front of you with a
pill and a label that says "eat me."

In World A, the drug causes amnesia--after 12 hours, you'll fall asleep, and
completely forget the preceding 12 hours.

In World B, the drug is a suicide pill--after 12 hours, you will fall asleep
and die painlessly. However, just before you woke up, a "backup" was uploaded.
And upon your death, this backup will be restored.

It seems to me that the subjective experience(s) in both cases would be the
same. But we consider the latter "death" and the former "too much to drink."

([http://www.overcomingbias.com/2016/04/is-forgotten-party-death.html](http://www.overcomingbias.com/2016/04/is-forgotten-party-death.html)
got me thinking in these terms--it's by Robin Hanson, who is one of the
examples from the article.)

------
SilasX
This doesn't deserve the label "fallacy"; the only thing I got is that the
author disagrees about the implications of restarting a mind somewhere else.
Well, great, but you need to explain substantively _why_ you think that is,
not simply disagree with it.

But even _then_, you should only label the _flawed premise_ -- the one
you're refuting to make this point -- as the fallacy, not the conclusion.

What's worse, the author's disagreement has to contend with the (almost)
established "consciousness branching" that happens all the time. If you accept
the universality of the Schrödinger equation, then your body is branching into
different parts of the wavefunction, yet (each of) you obviously internally
perceives it as the same consciousness.

------
billyjobob
The author seems to believe continuous consciousness is necessary for
identity. I hope he never falls asleep. If he does that means he is dead, and
tomorrow morning an usurper with his memories will wake up in his bed.

------
CapTVK
I'm reminded of this cartoon about an inventor who demonstrates a
teleportation device. It shows you don't have to go all technical or write an
economic treatise about emulated humans to raise important philosophical
questions. It can be fun as well.

"To be" (by John Weldon)
[https://www.youtube.com/watch?v=pdxucpPq6Lc](https://www.youtube.com/watch?v=pdxucpPq6Lc)

For a more serious answer to these questions (or at least some of them) you
can look up the work "I Am You" by Daniel Kolak.

------
joe_the_user
One thing that should be clear in these arguments is that many people have an
intuitive sense of the nature of consciousness.

It seems absolutely obvious that there's something unique in a human brain-
body that can't be encompassed by just information. It's often denoted by
"living, breathing, feeling".

There's no more concrete argument here. A person who's awakened from general
anesthesia still "feels like they are the same". A person who, hypothetically,
had all the cells in their body replace themselves through cell division (say,
some youth-regaining process more complete than ordinary aging, which already
replaces quite a few) wouldn't face the "am I really me" question in the same
way as that exact duplicate on Mars.

I think these "identity issues" have to do with our internal processes of
psychological adaptation. Each human is constantly changing and has a
multitude of impulses - yet we each have to pursue our interests as a single
being who deals with other single beings. So our brains have a reflexive
tendency to form a sense of unified consciousness.

One might say that these are "just psychological issues", but I think it's
reasonable to say that this reflex towards unified self-conception is very
important for the thing humans do: act in the world as a unified whole rather
than as a collection of, say, feedback loops.

------
trav4225
This whole discussion sheds a bit of light on the inevitable difficulties one
encounters when attempting to remain unflinchingly dedicated to a purely
materialistic philosophical worldview.

(IMHO)

~~~
rictic
If you've got a philosophy that doesn't involve biting any counterintuitive
bullets, please share.

~~~
trav4225
My point is essentially to encourage open-mindedness about that which we don't
(and possibly never can) fully understand from a materialist (as opposed to a
philosophical) perspective. :)

------
gerbilly
I think that if you want to copy a conscious being, you'd have to copy the
body as well.[1]

There are a lot of 'memories' stored in the body, and the brain and body (and
endocrine system) are inseparably intertwined in my opinion.

[1]
[https://en.wikipedia.org/wiki/Embodied_cognition](https://en.wikipedia.org/wiki/Embodied_cognition)

------
sharemywin
Since there isn't a computer that seems conscious at this time, the idea of
machine consciousness is supported by thought experiments. Here's one old
chestnut: "What if you replaced your neurons one by one with neuron-sized and
shaped substitutes made of silicon chips that perfectly mimicked the chemical
and electric functions of the originals? If you just replaced one single
neuron, surely you'd feel the same. As you proceed, as more and more neurons
are replaced, you'd stay conscious. Why wouldn't you still be conscious at the
end of the process, when you'd reside in a brain-shaped glob of silicon? And
why couldn't the resulting replacement brain have been manufactured by some
other means?"

[http://www.jaronlanier.com/aichapter.html](http://www.jaronlanier.com/aichapter.html)

------
noonespecial
The author of the article is asking a bit of a different question than the
authors of the articles he cites. The cited pieces are more about "what is
consciousness?" in the context of whether a computer can have one. The author
is asking "what am I?" That's a little harder, isn't it?

He's asking: if he got copied instantly and sent to Mars, which one of the
copies would still be _me_? Well, they both would, wouldn't they? Each has a
continuous experience leading up to the point they were copied and stepped out
of the machine.

The fun part is that, barring quantum hooey that the articles bring up, there
seems to be no scientific reason to doubt the physical possibility of this
eventually occurring. If you can get copied, and both copies are you, what
happens to the underpinnings of most world religions? Delightfully subversive
thought right there.

~~~
didgeoridoo
"And what of the immortal soul in such transactions? Can this machine transmit
and reattach it as well? Or is it lost forever, leaving a soulless body to
wander the world in despair?"

-- Sister Miriam Godwinson, "We Must Dissent"

~~~
noonespecial
Look at any photograph or work of art. If you could duplicate exactly the
first tiny dot of color, and then the next and the next, you would end with a
perfect copy of the whole, indistinguishable from the original in every way,
including the so-called "moral value" of the art itself. Nothing can transcend
its smallest elements.

-- CEO Nwabudike Morgan, "The Ethics of Greed"

------
dustinls
I think "fork" is a more appropriate description than "clone". From the moment
the new consciousness exists, it immediately begins creating its own
experiences and ceases to be a mere copy.
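A minimal sketch of this framing, using a plain Python dict as an illustrative stand-in for a mind-state (the `memories` structure is invented for the example): the fork is indistinguishable at the instant of duplication and diverges with its first independent experience.

```python
import copy

# Stand-in for a mind-state at the moment of duplication.
mind = {"memories": ["childhood", "yesterday"]}
fork = copy.deepcopy(mind)

assert fork == mind  # indistinguishable at the instant of forking

# Each instance immediately accumulates its own experiences...
mind["memories"].append("walked out the front door")
fork["memories"].append("woke up in an empty room")

assert fork != mind  # ...and the two have diverged for good
```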

I don't think it makes a difference whether a consciousness is destroyed after
it's been forked or not. Either way, I wouldn't be volunteering for that kind
of experiment any time soon.

Quantum Immortality comes to mind though. I _think_ it may have been Tesla who
also mentioned that our consciousness is a receiver that tunes in to the
closest match. If this were the case, then I might volunteer after all.

I imagine the only way to know for sure would be to invent the technology to
clone/fork consciousness, separate them, show them different cards, destroy
one, and ask the other to describe both cards.

------
pessimizer
It's less a fallacy than a fantasy. It's a bizarre universalist solipsism: the
perception that all other people are is what it's important for you to
consider about them, carried to the most absurd conclusion: that all you are
is what is important for others to consider.

Therefore, if you duplicate a process, delete the old one, and put the new one
in its place, it's the same process.
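In Python terms, this is the gap between equality and identity: a perfect copy compares equal but is a distinct instance, and deleting the original doesn't change that. A minimal sketch (the `state` dict is just an illustrative placeholder):

```python
import copy

original = {"state": [1, 2, 3]}
replacement = copy.deepcopy(original)

print(replacement == original)  # True: the contents are indistinguishable
print(replacement is original)  # False: it is a distinct instance

del original  # removing the old one doesn't make the copy *become* it
```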

This is totally acceptable to me if there is a proof somewhere that there is
no such thing as subjective experience, but then it makes no sense to talk
about "consciousnesses" as separable anyway, and the entire discussion is
pointless.

edit: If Peapod suddenly goes out of business and Amazon seamlessly picks up
my grocery deliveries, is Amazon Peapod?

------
milkey_mouse
This sort of reminds me of the movie Sync by Corridor Digital:
[https://youtu.be/vhjimhX9d5U?t=38s](https://youtu.be/vhjimhX9d5U?t=38s)

------
Artlav
I wonder, what does quantum physics have to do with human experience and brain
operation?

It's a very persistent idea floating around, despite contradicting everything
we know about how a brain operates.

More on topic, what he misses is that "consciousness" is a null word. It does
not connect to any real phenomenon, and is merely a result of a human being
unable to imagine its own non-existence, thus producing the long list of
things like anima, spirit, soul, consciousness and so on.

------
goldenkey
There is no fallacy. Everyone here who is intrigued by this question but
disappointed with the rather simple and unresolving reply in this blog post
should watch Moon, directed by Duncan Jones (David Bowie's son, who recently
directed Warcraft). Questions like these don't have simple answers. I'm rather
disappointed with the hand-wave of emotion the post author subscribes to. But
it does bring the topic up, so I am happy for that.

------
jondubois
I don't understand why the author of the article believes that it's impossible
to clone consciousness.

If you made an exact copy of yourself, then you would be able to control both
bodies simultaneously and you would also receive sensory feedback from both
bodies at the same time.

The main caveat is that these sensory experiences will be partitioned across
the two bodies - You (your consciousness) will receive sensory information
from both bodies simultaneously but you won't be able to synchronize that
information across the partition (unless you made both of your bodies talk to
each other).

Just imagine that all creatures on earth were born with only half a body split
down the middle; one eye, one ear, one nostril, one arm, etc... As (half) a
human, you would probably think that two half-bodies sharing a single mind is
impossible... Maybe, as (full) humans, when we think about two bodies sharing
a single mind, we also fall into the same psychological trap.

It's true that comparing two half-bodies vs two full-bodies isn't a perfect
comparison because in real life, the right and left halves of our brains are
connected and do exchange some information... But studies show that some
people have been able to live with only half a brain (or half of the brain
severely damaged) - Both hemispheres do experience a certain level of
isolation (partition) from each other; so a lot of the information is not
actually shared across both hemispheres and yet you still experience it. So
this does lend some credence to the notion that a single consciousness (or at
least a part of that single consciousness) CAN exist across a physical
partition.

So the two-body scenario is just a slightly more extreme version of the two-
halves scenario which we all subjectively experience on a daily basis.

------
ringofgyges
Some further insight here: [https://www.inverse.com/article/15728-can-cryonics-save-the-brain-and-the-self-a-doctor-holds-out-hope](https://www.inverse.com/article/15728-can-cryonics-save-the-brain-and-the-self-a-doctor-holds-out-hope)

------
charlesism
"Reverting to a previous copy is not the same as continuing to live. This
seems really simple and obvious to me. So what am I missing"

What the author is missing is that something can be "simple and obvious" and
also wrong.

------
robohamburger
The more interesting question to me is whether a good enough copy can be made
that is functionally the same, and what that means.

Reminds me of the Ship of Theseus but with the human body/brain.

