
Your mind will not be uploaded (2014) - walterbell
http://www.softmachines.org/wordpress/?p=1558
======
ncmncm
And if it were, it wouldn't be _you_.

But it might be enough _like_ you that it could convince your bank to give it
your money.

~~~
adrianmonk
There's a Star Trek novel
([https://en.wikipedia.org/wiki/Spock_Must_Die!](https://en.wikipedia.org/wiki/Spock_Must_Die!))
where IIRC they explore how the transporter is sort of destroying the original
you and recreating a new one somewhere else. It's a legit philosophical
question, but practically speaking, the people who use the transporter don't
feel like they're no longer themselves. They just feel like they've moved.

That's fiction, of course, but what the characters experience is what I'd
expect to happen in real life if a transporter did exist. If a structure
(human) is destroyed and recreated identically at some other location, I don't
see why that should be any different than moving the atoms over while
preserving the structure.

And I don't see why things would be different if you created the same
structure (or an isomorphic one) inside a computer.

Of course, I'm assuming that what makes you _you_ is simply the structure and
how it operates as a system. That's definitely a philosophical question, but I
think if its answer is what I expect, then a perfect recreation of you _is_
you.

And yes, this raises the question of whether there can be more than one of
you. (Which I think the Star Trek novel also covers.) I don't think this
cloning presents a philosophical barrier, though. I think it's similar to
calling fork(). The only observable differences come from the kernel
intervening to change the pid and fork()'s return value. There's nothing
fundamentally unsound about the idea of splitting into two and both being
"you". Although there are definitely things that are unsettling about it.
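
For what it's worth, the fork() analogy can be made concrete in a few lines. A minimal sketch (using Python's os.fork, so it assumes a Unix-like system): both copies start with an identical memory image, and the only immediately observable divergence is the pid and fork()'s return value.

```python
import os

shared_memory = "everything up to the fork"  # both copies start with this state

pid = os.fork()
if pid == 0:
    # Child: same memory image as the parent; the only visible difference
    # is that fork() returned 0 here and os.getpid() gives a new pid.
    assert shared_memory == "everything up to the fork"
    os._exit(0)
else:
    # Parent: fork() returned the child's pid -- the kernel-made divergence.
    _, status = os.waitpid(pid, 0)
    child_ok = os.WEXITSTATUS(status) == 0
```

From that point on, the two processes accumulate different state, just as two "yous" would accumulate different experiences.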

~~~
pmoriarty
_"the people who use the transporter don't feel like they're no longer
themselves. They just feel like they've moved."_

Well, the people going in the transporter would be killed, so they wouldn't be
feeling anything after that. The people coming out of the transporter might
feel like they just appeared in a new place, but that is immaterial to the
people who were killed by the transporter.

Similarly, a mother could have grounds for rejecting a teleported child, for
the one who came out of the teleporter was not physically the one she gave
birth to.

Incidentally, the philosophical concept that novel explores goes back long,
long before Star Trek, to the _Ship of Theseus_ [1] thought experiment from
Ancient Greece (and I wouldn't be surprised if this was considered even before
then in ancient India or China).

The _Ship of Theseus_ problem asks: if a ship's parts are progressively
replaced with new parts, until all the parts have been replaced, is it still
the same ship?

The cells in our bodies are being progressively replaced. A materialist,
naturalist, or physicalist would have trouble explaining how someone could
retain their original identity if nothing of them was left after this process.

The brain upload, teleporter, and transporter questions are variations on
these.

[1] -
[https://en.wikipedia.org/wiki/Ship_of_Theseus](https://en.wikipedia.org/wiki/Ship_of_Theseus)

~~~
comex
> The cells in our bodies are being progressively replaced. A materialist,
> naturalist, or physicalist would have trouble explaining how someone could
> retain their original identity if nothing of them was left after this
> process.

Suppose I transfer a virtual machine from one physical server to another using
some live migration tool. Of course it migrates the state of all running
programs. But beyond that, for all practical purposes, the new VM assumes the
identity of the old VM: its IP address, its pending requests, its role in a
cluster, et cetera.

Someone might argue that the new VM is not really the same entity as the old
VM; after all, it's stored on a different physical storage medium and
processed by a different physical CPU.

I'd respond: who cares about the physical storage medium? That's the wrong
layer of abstraction to be looking at. A VM does not exist on the plane of
matter but on the plane of information. It is not made of hard disks and CPUs,
but bits. The hardware is just a way to instantiate it in the physical world;
it's not part of its identity.

So it is with people, even if we currently lack a way to transfer the bits
representing them to different storage media.
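
The migration argument can be sketched with a toy example. This is purely an illustration of the analogy, not a real migration tool: a hypothetical "VM" whose identity is just its serialized state, which survives a byte-for-byte copy to different "hardware" unchanged.

```python
import hashlib
import pickle

# Toy "VM": its identity is its state (the bits), not the host it runs on.
vm_state = {"ip": "10.0.0.7", "pending_requests": [41, 42], "role": "primary"}

def fingerprint(state):
    # Hash a canonical serialization of the state.
    return hashlib.sha256(pickle.dumps(state)).hexdigest()

before = fingerprint(vm_state)

# "Live migration": copy the bytes to a different object in memory,
# standing in for a different storage medium / host.
migrated_state = pickle.loads(pickle.dumps(vm_state))

after = fingerprint(migrated_state)
```

The fingerprints match even though the migrated copy lives at a different address: at the information layer, nothing about its identity changed.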

~~~
pmoriarty
Except that a VM hasn't lived through any events in its life (since it's not
alive and doesn't experience anything anyway), and no one would care if it had
or hadn't. Also, there are no ethical issues with destroying a VM.

The same is not the case with people.

It would matter to a lot of people if their loved ones were the ones they knew
and had intimate contact with, not some clone of theirs that simply shares
their appearance and memories.

The person to whom it would arguably matter most is the person getting
destroyed or killed. What comfort is it to them that some clone of them lives
on?

~~~
comex
> Except that a VM hasn't lived through any events in its life (since it's not
> alive and doesn't experience anything anyway),

It has certainly processed events during its lifetime and accumulated data
about them (~= experience that a person might get from living through events).
There wouldn't be ethical issues with destroying it, but there would be the
downside of losing that data; however, migrating it doesn't have that problem.

> The person to whom it would arguably matter most is the person getting
> destroyed or killed. What comfort is it to them that some clone of them
> lives on?

Personally I would have no problem with myself or anybody I know going through
such a process. Partly because I would consider the result the same person,
but partly because I think more broadly that the difference between "I live
on" and "a clone of me lives on" is unimportant.

For example, suppose I went through a teleporter but due to some mishap the
body at the sending end were not destroyed, so there were two copies of me. If
this happened and were detected immediately, both copies of me would freely
volunteer to kill themselves in favor of keeping the other one. From my
perspective (as either of the copies), the only thing lost would be the
memories I made since teleporting, but losing a few minutes' worth of memories
is not a big deal and something that happens all the time. Beyond that, the
other copy is still "me", so there's nothing to offend my desire for self-
preservation. If on the other hand the situation lasted much longer before
being detected, so I gained more unique memories, or if it was only a few
minutes but I just happened to have some kind of eureka moment in that time, I
might be more ambivalent.

Ideally there would be some way to merge the memories of the two copies into
one person. In that case, I would be interested in intentionally making short-
term forks of myself on a regular basis, to be merged later (say, after a few
hours), so that I could effectively do multiple things at the same time.

~~~
jodrellblank
_Partly because I would consider the result the same person_

Do you consider identical twins the same person? Do you consider Abby and
Brittany Hensel[1] the same person - they share the same body so they must
have shared the same life experiences?

> _I think more broadly that the difference between "I live on" and "a clone
> of me lives on" is unimportant._

Do you think the difference between "I live on" and "my child lives on" is
unimportant? What about the difference between "I live on" and "someone else
lives on"? The clone is a different body in a different place, it's the same
situation as "I live on" vs "any other human lives on", except for what you
believe about the other person.

> _If this happened and were detected immediately, both copies of me would
> freely volunteer to kill themselves in favor of keeping the other one._

Then why don't you suicide right now, if continuity of experience is that
casually irrelevant to you and all you need is a belief that some other body
in the universe shares the same ideas you do? There are a lot of people on the
planet right now, are you not convinced that someone, somewhere, shares enough
of the important ideas with you already? Why not? Is your memory of a
dentist's visit in third grade, or your particular grandmother's generic apple
pie really _that_ special?

[1]
[https://www.youtube.com/watch?v=VKrvtq5vDmk](https://www.youtube.com/watch?v=VKrvtq5vDmk)

~~~
comex
I should have been more clear about the role I believe experiences play. What
matters to me is not just a person's concrete memories of experiences, but
their unique personality that has been built up by a combination of those
experiences, genetics, and a hefty dose of randomness.

In the "accidental clone" scenario, I wouldn't _just_ be losing memories of a
few minutes of experiences; I'd also be losing whatever delta to my
personality the discarded clone went through in those minutes. However, in
most cases I wouldn't care because that delta would be minuscule. As I
mentioned, a _potential_ problematic case is if I suddenly had a eureka moment
in those few minutes that single-handedly changed my personality, but that's
pretty unlikely.

With Abby and Brittany Hensel, on the other hand, they've had three decades to
build up unique personalities, so despite sharing many of the same experiences
they're definitely different people. (Plus, there are various sources of
divergence between their experiences. For example, each of them will remember
a different subset of their experiences, and in any argument between the two
of them, each of them would experience being on a different side of the
argument.)

Similarly, "I live on" and "my child lives on" are very different things; a
parent and child don't even share most of the same experiences, let alone
personality.

> Then why don't you suicide right now, if continuity of experience is that
> casually irrelevant to you and all you need is a belief that some other body
> in the universe shares the same ideas you do? There are a lot of people on
> the planet right now, are you not convinced that someone, somewhere, shares
> enough of the important ideas with you already? Why not? Is your memory of a
> dentist's visit in third grade, or your particular grandmother's generic
> apple pie really _that_ special?

I'm a special snowflake… just like everybody else. [ed:] I'm not important,
but I'm unique; nobody is exactly like me.

Do I make the world much better with my existence? Not really. Hopefully I
make it a little better; hopefully I don't make it worse; but I'm only one
person and not a particularly famous one.

But I still want to live, and I want other people who are alive to keep
living. Not as a means to some end, but as an ingrained human desire.

Yet I choose to interpret "I" as "my special-snowflakeness", as opposed to
tying it to some physical object [ed: or continuity or linearity of
experience, which I think are largely violated already when I fall asleep or
lose memories]. An exact copy of me is the same special snowflake, at least at
first, progressively diverging as the two copies start to have different
experiences.

~~~
jodrellblank
Once you say that you would die if someone else carries on, and that the
other person doesn't have to be exactly you but can differ by, for example, a
few minutes' delta, then we get into the discussion of just what grounds your
belief that they are "similar enough" that you can happily suicide.

> _"I'm not important, but I'm unique; nobody is exactly like me."_; _"I
> still want to live"_

I submit that if you were in the accidental clone scenario, you _wouldn't_ be
willing to die as you claim, but would instead feel "I still want to live"
very strongly, and your belief that the clone is "exactly like you" would give
way to "they just came out of a nano-assembler and they know it, they're only
an approximation of me who knows how many things are subtly different inside,
I deserve to live more strongly" or on the other side "I just stepped out of a
nano-assembler, I'm new and fresh and they've been ageing for decades and they
intended to die in that scanner, I deserve to live more strongly", or possibly
"I'm willing to share everything I have and compromise, there's no need for
any death here".

> _[ed: or continuity or linearity of experience, which I think are largely
> violated already when I fall asleep or lose memories]_

I'm not happy with the casual idea that continuity of experience is violated
by sleep. If that were true, waking up somewhere "unexpected" would be
impossible; instead it's very disorienting and scary, and a trope of horror
stories. We expect to wake up where we went to sleep, and we expect any
wakings in the night to fall at the right places in the sleep timeline. We
expect to feel some amount of refreshed/relaxed/over-tired/too hot/aching/with
a numb arm from lying on it/etc. depending on how long and how well we slept
(something we are aware of and can report on), and we can wake up with answers
or decisions about things we were thinking about the night before. Sleep is
not 8 hours of non-event.

------
hinkley
If you have not watched the animated short "World of Tomorrow", you really
should.

It has a very pointed reference to the potential horrors of having your mind
uploaded by someone who is trying to control costs.

~~~
LukaCEnzo
Did you mean this:
[https://youtu.be/IFe9wiDfb0E](https://youtu.be/IFe9wiDfb0E)

~~~
hinkley
No,
[https://en.wikipedia.org/wiki/World_of_Tomorrow_(film)](https://en.wikipedia.org/wiki/World_of_Tomorrow_\(film\))

The child's doodles make it seem like a lighthearted thing. It is not.

A grownup is attempting to explain some fairly dark things to a very young
child. Mercifully, the child cannot absorb what it is being told. You may not
be so lucky. But there is some biting social commentary and the absurdity of
the whole thing makes it work.

------
marcusestes
Unless it already has been.

~~~
athorax
We are all already uploaded to the simulation!

------
29athrowaway
If your mind is uploaded, the entity that transcends is a copy of you rather
than yourself.

------
zaarn
Finally a friend I can trust to complete my prep work for DnD sessions. Tbh I
don't need the philosophy: the upload is a copy, but it's also a person. I
might treat it as me, depending on how good the upload is. If the upload is
destructive I would consider it to be myself; functionally I continue to
exist beyond death.

Making it more complicated for no good reason is just draining the fun out of
it.

------
benibela
You could freeze your brain and keep it frozen for a few hundred years, till
they have nanobots to repair the freezing damage and scan the brain.

~~~
jodrellblank
You could _imagine_ doing that. It's much much less certain that you could
_do_ that.

------
sonofgod
datapoint, to update to 2020: we can now map a third of a fly brain.

[https://www.theverge.com/2020/1/22/21076806/google-
janelia-f...](https://www.theverge.com/2020/1/22/21076806/google-janelia-
flyem-fruit-fly-brain-map-hemibrain-connectome)

~~~
aeternum
This is probably the most convincing evidence that your mind likely can be
uploaded in the near future.

~~~
bognition
Depends on how you define the near future. There's a bunch of reasons to be
skeptical that we'll fully digitize a human connectome anytime soon.

First, the fruit fly brain is very small: you can image the entire thing at
the microscopic level with a single image. The human brain is massive by
comparison. Getting a coherent image that traces an axon from the tip of the
frontal lobe to the back of the occipital lobe is going to be a huge
challenge.

Second, the fruit-fly brain has 25,000 neurons while the human brain has more
than 10,000,000,000. There's nearly 6 orders of magnitude difference there.

Third, it's highly likely that glia (non-neurons) in the brain play a major
role in neural computation so we'll have to image those too. Humans have way
more glia than most other animals.

Lastly, the connectivity of neurons in the human brain is very high. Getting
those little connections right is key in all this, as we aren't just going
for the neurons but the connections between them.

~~~
trhway
>Second the fruit-fly brain has 25,000 neurons while the human brain has more
than 10,000,000,000. There's 6 orders of magnitude difference there.

>Lastly the connectivity of neurons in the human brain is very high.

yep, the human brain has ~100B neurons with ~10^4 connections per neuron,
while the fly brain has ~10^3 connections per neuron. So we'd need to emulate
~10^15 connections of the human brain. GPT-3 has 175B weights.
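
Taking these back-of-the-envelope figures at face value (they're rough estimates quoted in this thread, not measured values), the arithmetic works out like this:

```python
human_neurons = 100e9            # ~1e11 neurons (rough estimate)
conns_per_human_neuron = 1e4     # ~10^4 synapses per neuron
fly_neurons = 25_000             # figure quoted upthread
conns_per_fly_neuron = 1e3       # ~10^3 connections per fly neuron

human_connections = human_neurons * conns_per_human_neuron  # 1e15
fly_connections = fly_neurons * conns_per_fly_neuron        # 2.5e7

gpt3_weights = 175e9
gap = human_connections / gpt3_weights  # how far GPT-3 is from 1e15
```

By that estimate, GPT-3's weight count is still roughly 5,700x short of the human connection count, and nearly eight orders of magnitude above the fly's.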

~~~
aeternum
Right, but GPT-2 had 1.5 billion parameters in 2019, so GPT-3 was a ~100x
increase. 1e15 may not be that far off, especially since these operations are
relatively straightforward to run in parallel.

------
scipute68
The valuable part of you, in the ML/AI and state-sponsored sense, is your
memories, not how you experienced them. Replicating individual consciousness
as 'you' would be cruelty unless it were embodied somehow. I'm not saying it
couldn't be done; I just don't see the point.

------
bigodbiel
With GPT-3 I think we are rushing towards the Black Mirror episode "Be Right
Back", where an AI avatar, and later a full-bodied android, could be created
from someone's recorded presence.

~~~
hesdeadjim
I can only begin to imagine the uncanny valley that is an AI version of me
outputting GPT-3 text.

------
lostmsu
TL;DR: the claim is that some quantum process is fundamental to brain
functioning, and that makes upload impossible.

Well, really, there's no proof of that hypothesis, and I find the argument
weak. The complex processes necessary for cells to live might have nothing,
or only a tiny bit, to do with our consciousness.

------
vertbhrtn
The HN crowd tends to confuse the human mind with human identity. We think
that the final destination of human development is some sort of "super
intelligence" - a mind with insane processing power and an unlimited ability
to memorize and recall things. That's understandable, because our life today
revolves around knowledge, but one day the ability to think will yield to the
ability to see ideas directly. If we could see a theorem's statement and its
proof visibly connected like a tree's roots and its leaves, we wouldn't need
to "think".

