
Welcome To Life - ryannielsen
http://www.tomscott.com/life/
======
sodiumphosphate
While it does make for some entertaining fiction, and may provide some
benefits to the living, I do not believe that 'uploading' is a desirable
experience.

In fact, I don't believe that one can experience it at all. Imagine the
procedure (for a conscious person): your brain is connected to a computer
interface and a copy of your mind is taken. Great. Now there is a digital copy
of your mind. So what? You still get to die.

Consciousness is mortally bound to the physical body, and will die along with
it regardless of how many mental copies are made.

I much prefer the idea of human metamorphosis. I want to inject nanorobots
into my body which will _transform it gradually_ , cell by cell into an
improved synthetic one. In this way, immortality and superhumanity can be
achieved without loss of continuity. I imagine the experience would be one of
waking up a little better (stronger, smarter, etc.) every morning until the
cells are all upgraded.

I don't read enough these days because of chronic eyestrain, but if you are
aware of anything dealing with this concept of human metamorphosis, fiction or
otherwise, I would appreciate a link. It's something I would like to read
about (if it's even been dealt with), someday when I have time (and don't have
a headache).

~~~
oskarth
I agree that human metamorphosis is extremely interesting, but this line I
really don't get:

> Consciousness is mortally bound to the physical body, and will die along
> with it regardless of how many mental copies are made.

I don't understand how you can put out a statement like that, like it's taken
for granted or something. It's far from granted, and I've never heard of
something remotely approaching a proof for this type of assertion. To my mind
it's the same kind of reasoning which proved we would never fly, nor go to
space.

It's an open problem. Please keep it that way until we know more.

We don't have the proper tools to understand _consciousness_ in a scientific
way yet (consider the extreme limitations of EEG and MRI). For those
interested in this _philosophical_ question, there's an area called Philosophy
of Mind [0] which deals with this and related questions.

0: <http://en.wikipedia.org/wiki/Philosophy_of_mind>

~~~
sodiumphosphate
Granted that consciousness is not fully understood, can we not agree that it
is entirely manifest in the brain? I mean, you don't believe that human
consciousness somehow survives the body after brain death, do you?

Even if the cloned memories could be streamed from your brain into a running
simulation, allowing a memory of _crossing the threshold_ from the physical
world to the digital one to be created in the simulation, the 'patient' will
not have this experience. The patient will still have to experience death.

Going further, let's imagine that we want it _really, really badly_. Let's
destroy each neuron immediately after taking its state and pushing it into
the stream, so that the whole process of uploading and death are completely
synchronized and nothing of the patient's consciousness remains afterward.

I'm sorry, but I still do not think that the patient will experience being
uploaded into a computer. It will only experience having its brain fried
until death. For the clone, the memory might just be glorious, like some
Hollywood special effect sequence, but personally I find this wholly
unsatisfying.

I'm not saying it shouldn't be done, either. I'm only saying that I suspect
some people may have the wrong perspective about it. If you sign up for this
procedure, _you are not signing up for immortality_ , but merely to _donate a
copy of your memories_ to whomever is running the simulation.

You would be like an organ donor, but instead of saving a life, you'd be
feeding your mind to the new Zombie Second Life.

It would not benefit you, but it might benefit someone else.

As for me, I'm a selfish, arrogant bastard with a desire for physical
immortality and superhuman capabilities.

~~~
feral
I'm not sure. Your body changes all the time. Your mind is always running on
different hardware than it was just a few moments before. The changes are just
small at each step.

> I'm sorry, but I still do not think that the patient will experience being
> uploaded into a computer.

If the copy/delete process was (perceptually, hence effectively)
instantaneous, what is the difference?

Let's say your biological body was going to die, in a day. But you have the
option of being perfectly copied into a new biological body now, with the
current body being instantly destroyed. Would you take the option? I think I
would.

What's the problem with that? How is it any different from what happens to your
body on a day-to-day basis, now, anyway?

I don't see the problem, when we put it in these terms.

~~~
batista
> _What's the problem with that? How is it any different from what happens to
> your body on a day-to-day basis, now, anyway?_

Or how is it different from going to sleep and waking up?

~~~
ingenium
Going to sleep does not cause the brain to actually power off. It's still
running, your consciousness is just different. You still wake up from loud
noises and light, and external stimuli can still influence your dreams.

~~~
batista
_> Going to sleep does not cause the brain to actually power off. It's still
running, your consciousness is just different. You still wake up from loud
noises and light, and external stimuli can still influence your dreams._

Anesthesia, then, or fainting. You cannot wake up from loud noises, and you
have no consciousness.

One can also imagine a transfer process that is just like what naturally happens
with regular body cells changing every 7 years or so.

E.g your brain is slowly, over a period of years, replaced cell by cell with
new compatible neurons. One by one, your old neurons are replaced with the
new, designed ones.

Those can read the state of the neuron they replace AND exchange signals with
your regular neurons, so before and after one is replaced, your brain is still
functioning normally. No "power off" phase.

How about that?

~~~
Lockyy
That is what the person above describes when they say they want to be injected
with nano-bots to be converted. And that would, I believe, give a seamless
transition with no break in the stream of consciousness.

However this is completely different from the concept of mind uploading where
your brain is scanned and simulated separately.

------
nova
There is a lot of the classical discussion on personal identity here.

The argument about a copy being "identical to me but it's not me" is a very
natural and intuitive reaction, but it's probably wrong. A very illuminating
explanation is <http://lesswrong.com/lw/pm/identity_isnt_in_specific_atoms>
and related posts.

I'll try a personal approach: imagine a country with no computers, no
photocopiers, no presses, and a very strict law that forbids making copies of
books which is indoctrinated from birth. There is only one copy of each
literary creation, guarded in the National Library.

We can imagine that the people of that country have a very hard time
distinguishing between "The Lord of the Rings" in an abstract sense and material
object #1434 stored in the second floor, section Fantasy. For them LOTR is
just _that_ set of paper and ink. Even if someone secretly made a copy, letter
by letter, it would be "just a copy", not the "true" LOTR. That's quite obvious
to them.

And in a sense that's a bit our current situation with people (not exactly,
because books are static and people aren't). We can't scan people-as-
information out of people-as-protein-body and store them as people-as-silicon-
body, so we mix the concepts. But it's just confusion due to technological
limitations.

So in a sense we DO have souls, but it's a mathematical or information-
theoretical soul, not a metaphysical one. It's always _bound_ to a material
instantiation, but that concrete physical manifestation is not the same as the
dynamic, interactive information process a person is.

So if technology advances enough we'll be able to copy a person, and it'll
make no sense to ask which of them is really "you". Both are.

~~~
jterce
Yes, of course both are. But still, you're only ever going to be conscious
from the perspective contained within the original you. The copy may be
mathematically identical, but it does not make sense that there would be any
kind of a magical transition of consciousness from one to the other. The other
"you" would have its own consciousness entirely separate from yours. I.e., if
you copy yourself and kill the original, you will lose consciousness. You will
not wake up.

~~~
nova
Words fail me.

Imagine I have a magical device with which I can "stop time", or more
precisely, stop movement of all particles except mine. In the beginning I am
at your right. I stop time, move leisurely to your left and resume time.

What you see is me teleporting from your right to your left. You don't feel
"trapped in a body who can't move", because your neurons are also magically
frozen. Your consciousness _feels_ continuous, even if you have been
"suspended" for a while, because your memories are properly structured.
Consciousness is not "something else", it is a property of your memories. (The
blog post on lesswrong.com about timeless physics is important to see this
point of view, I think, even if you don't buy the theory).

So if my copy is mathematically identical then there is nothing else left.
There is no extra consciousness stuff that the "original" has and the "copy"
does not.

Sorry, I really can't explain myself better. I'd suggest reading the link
provided.

~~~
koide
I will read what you suggest, but perhaps you can illustrate how the sensation
of identity would transfer from the physical body to the digital body? Because
your example does not explain it.

~~~
notJim
(I am not the OP).

Let's suppose we believe that everything about me can be contained in the
physical processes that happen in my body.

Now suppose that we have the ability to digitally simulate those processes
perfectly. Since my existence is purely physical, the "sensation of identity"
you describe is perfectly captured by this digital simulation, _by
definition_.

~~~
koide
But then it would be a whole new sensation of identity _of the copy_ , not the
existing sensation of identity.

------
arethuza
If you find this interesting then I can recommend Greg Egan's novel
Permutation City:

<http://en.wikipedia.org/wiki/Permutation_City>

~~~
lordlicorice
The "rate limited" option in the video reminds me of slow clubs from that
book.

For people who haven't read it, in the novel there are virtual parties called
slow clubs where poor consciousnesses go to meet. Everyone agrees to limit
themselves to the processing speed of the slowest (poorest) person there, so
that people who can only afford the occasional spare clock cycle can have
meaningful human contact.

~~~
arethuza
One idea that I found quite haunting from Permutation City is that you could
set up a higher-level controller for your running Copy mind that would
periodically choose a new obsessive interest and then provide you with a
rich environment to support your obsession.

As someone who is prone to rather obsessive interests (as I suspect many are
on HN) I'm increasingly conscious of the fact that I can't directly control
what I find interesting.

~~~
lordlicorice
I know!

People get so attached to the idea of free will. What free will? You do what
you want, but you have no control over what you want.

------
SudarshanP
<http://www.smbc-comics.com/index.php?db=comics&id=2603>

"Here lies humanity... Do not resuscitate"

------
fabricode
The true nightmare begins when it says...

    
    
        Please log in with your Facebook id and password

------
Spoom
Under current laws, wouldn't the way the Terms of Service are presented be
considered a contract signed under duress? I mean, they are saying that if you
don't sign the contract, they'll kill you, which is kind of the textbook
definition of duress.

~~~
rbanffy
They could counter you are already dead and they are offering you the
possibility of "post-mortem health care".

~~~
aeturnum
That doesn't remove the duress. :)

However, before we get to a point where people are being uploaded into a
system, we have a lot of legal precedent to set. The legal rights of the bits
that are copied from your consciousness, for one.

------
sikhnerd
It's tough not to immediately draw the many parallels to what we see happening
in society right now and the scary part is that this is so easily conceivable
as a realistic future. Well done!

~~~
sigkill
Facebook has almost made everyone have a presence online. Google Glass will
make sure we're always online.

------
delinka
"Your personal brand preferences may be altered to align with those of our
sponsors." What marketer wouldn't want the ability to do exactly this?

------
oskarth
Let's just hope the use of "traffic accident" as a euphemism doesn't increase
in order to end a trial prematurely. I would prefer to use my trial version
for as long as possible (or would I really? Look at all those nice plugins!).

Some (rather lame) lingo for the coming pay model of life:

 _Cheapskate, you're such a trialler_

 _You haven't really lived until you lived with Life TM._

 _He hasn't upgraded in a long, long time_

 _Do this for me and I'll make you live forever_

 _You gotta commit suicide at least once, the rush beats base jumping hands
down, and the white tunnel effect is pure bliss - and it's legal_

 _Well you know, I could sponsor your life subscription if that would help the
situation_

------
sakai
Wow, and I thought this was going to be uplifting. Cool concept nonetheless.

~~~
itmag
_Wow, and I thought this was going to be uplifting._

Pun intended? :)

<http://en.wikipedia.org/wiki/Uplift_(science_fiction)>

------
TamDenholm
Great video, I find it amusing but also quite depressing at the same time. I
would be utterly unsurprised if, were the original premise possible, this is
what would happen.

------
vsviridov
Kinda reminded me of Pleix's E-Baby, circa 2003 -
<http://pleix.net/filter/film/E-Baby>

------
ilaksh
Very entertaining and interesting concept.

BUT.. are people really so pessimistic about technology and the future that
the main thing that comes to mind when they think of digital immortality is..
this?

Your virtual body can be anything you want. You can't die as long as your
digital code is backed up somewhere.

If you can't see the bright side of that stuff, there is some kind of
psychological issue.

~~~
jerf
It has both effectively infinite possibility for goodness and infinite
possibility for badness. Your virtual body can be anything you want... _as
long as you control the execution parameters_. This is not guaranteed! What if
$YOUR_LEAST_FAVORITE_COMPANY is the one in control? Or worse? Even the option
of suicide can be removed from you. If Hell does not exist, there is a non-
zero chance Man will make it for himself.

Indeed, if things are just left alone I would not even say the default state
is that you'll be in control of your own code. It _will_ be a company that
develops this technology first, since the alternative is inconceivable, and
they will have their own agenda. There's a lot more possibilities that could
emerge than just a happy utopia.

~~~
satori99
Iain M. Banks's most recent Culture novel, Surface Detail, dealt with the idea
of Digital Heavens, and of course Digital Hells. They are run by religious,
high-technology civilizations.

A virtual hell is a nasty place where sinners are copied after death, for
eternity. Sometimes young sinners are given day trips there to straighten them
out, before death.

He talks a little about it in this Wired interview:
<http://www.wired.com/underwire/2010/10/iain-banks/all/1>

------
ericb
I'm not excited about this future. Whatever is uploaded isn't me, it is merely
_like_ me--a copy. Anything that is theoretically capable of running while I'm
still alive is just a copy--it isn't _me_.

~~~
dmbass
The "copy" becomes the "real" you because it has extended life. Both you and
the "copy" consider yourselves to be the "real one" (or maybe you accept the
coexistence), but you are going to die and the copies will live forever.

You think: "That copy isn't me, ericb!" * dies *

Copy thinks: "Great! I, ericb, just got a new lease on life!"

~~~
ericb
That is great for the copy, but not helpful for me. Death of my consciousness
can't _inflict_ a new identity on the copy. The copy is a copy--he's awesome,
just like me, but not me because _I_ am dead.

~~~
sofal
What if you wake up as the copy? Are you going to be depressed that you're not
you?

~~~
ericb
It wouldn't be me that woke up as a copy. It would be a copy that feels just
like me. However, since he would have my intellect, he would reason that he is
not the original, and if it had died, he would feel sad. On the other hand, he
might be excited if he were in a machine body as that fate might not befall
him for a very, very long time.

~~~
sofal
_It wouldn't be me that woke up as a copy._

Here is where you're wrong. You presuppose the "he" and "I" when they are
perfectly interchangeable right up until the cloning process. You can't
presume that because you're the original now that you will be after the
cloning process, because that has a 50% chance of being wrong. Right now, you
and the clone are one and the same. Any reasoning you do about "me versus him"
before the cloning is shortsighted. You have to reason like this: "If I wake
up as the clone, then X. If not, then Y." It's like calling fork() in your
code and then writing the code after that to presuppose that you're the parent
process without even checking the return code. You don't know that you're the
original after the cloning unless you see evidence that you are.
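The fork() analogy can be made literal. Here is a minimal Python sketch (assuming a Unix system with `os.fork`; the function name is mine): immediately after the call, the two processes are copies of each other, and each one learns its role only by checking the return value, never by presupposing it.

```python
import os

def clone_and_report():
    """Fork this process and report which side of the fork we are.

    Neither side can presuppose it is the 'original': the only
    evidence is fork()'s return value, checked explicitly below.
    """
    pid = os.fork()
    if pid == 0:
        # Child ("the copy"): execution feels just as continuous
        # here. In this sketch the copy simply exits.
        os._exit(0)
    # Parent ("the original"): reap the child, then report.
    os.waitpid(pid, 0)
    return "original"
```

Writing code after the fork that assumes it is the parent, without checking `pid`, is exactly the bug the comment describes.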

Think of it this way: let's say instead of a brain copy it is a full body
copy. You walk into a dark room where you can't see anything. They knock you
unconscious, and then you wake up lying next to what appears to be you. How do
you know if you're the original? Because you were the original before? Sorry,
but no.

~~~
ericb
Each running instance of me is a separate, alive, person. Forget about the
memories for the moment. If you kill that person, they are dead. It doesn't
matter who is the copy. I could be a clone for all I care. Doesn't matter. If
you kill this body--the one I'm IN then my stream of thoughts ends. Other
similar streams of thought offer me no comfort.

------
beernutz
I like the idea I read somewhere about "spinning off" versions of "you" to go
do/learn something, then re-integrating that version of you back into
yourself! I still agree with sodiumphosphate, however, in that I prefer
immortality and dodging the "end" of this mind. I MUCH prefer the idea that
nanites can make us better, rather than "uploading" ourselves totally.

This is also why I don't like the idea of a Star Trek style transporter. At
what point do you stop being you? I'd much rather stay alive and in one piece
thank you. 8)

------
jakeonthemove
Hmm, if this was possible, I'd rather rig up an autonomous backup machine that
I would periodically backup to and which would start automatically if I miss a
scheduled backup (which would mean I'm dead). It would be like using hosted vs
self-hosted, buying CTO or pre-built computers.
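The dead-man's-switch logic described above is easy to sketch (a hypothetical helper; the interval and grace-period values are my own assumptions):

```python
import time

BACKUP_INTERVAL = 24 * 3600  # agreed seconds between scheduled backups
GRACE_PERIOD = 3600          # allowance for a backup arriving late

def should_boot_copy(last_backup_ts, now=None):
    """Return True once a scheduled backup has been missed, i.e. the
    watcher machine should assume the owner is dead and start the
    stored copy automatically."""
    now = time.time() if now is None else now
    return now - last_backup_ts > BACKUP_INTERVAL + GRACE_PERIOD
```

A real watcher would poll this on a timer and, on True, launch the most recent backup image.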

------
sandieman
What happens if you don't agree to TOS?

~~~
drostie
Then you get to go to heaven. Which might sound better at first, but you might
have to agree to an even stricter TOS to stay there.

~~~
arethuza
Or a copy of your mind could be put into a virtual hell - a rather disturbing
concept from Iain M. Banks's novel Surface Detail:

<http://en.wikipedia.org/wiki/Surface_Detail>

~~~
lordlicorice
Great book, but I don't think the treatment of virtual hells was very
convincing. First of all, the characters in hell would have gone insane with
agony, but they just kind of wince it off and keep chatting. It reminded me of
Niven and Pournelle's _awful_ 1976 adaptation of Dante's Inferno, in which,
for example, the macho protagonist grits his teeth and swims across a boiling
lake.

Anyway, secondly I don't understand where all the devil-with-pitchfork stuff
comes from in Banks's hells. Why not just have everyone floating in blackness
and experiencing the maximum possible amount of pain at all times? To prevent
them from becoming catatonic you just keep resetting their brain state to the
moment of their death, so they're always feeling the first shock of agony. And
it scales fantastically; just copy that small simulation a billion times and
run them concurrently.

Of course, there will be some philosophical problems (are two identical
concurrent brain simulations really two distinct subjective consciousnesses?
If so, what are the implications of error-correcting server memory that stores
everything twice? And does resetting someone's memory so that the same
simulation plays out repeatedly actually cause the person to experience things
multiple times, or is it immaterial how many times you replay it?) but that
will be of no comfort to you when you wake up in the pain box hell.

~~~
arethuza
"I don't understand where all the devil-with-pitchfork stuff comes from"

I assumed that it was all cultural (note the lowercase c there) - societies
would construct, using technology, the Heavens and Hells that their religions
had told people already existed.

Other writers have covered the idea that higher powers might use some of their
resources tormenting virtual copies of lesser beings - I'm sure this is
mentioned as something that some Transcendent Powers do in A Fire Upon the
Deep. Not to mention Charlie Stross's excellent A Colder War:

<http://en.wikipedia.org/wiki/A_Colder_War>

------
VMG
This is a little silly.

Simulated life will obey very different economics than normal life.

People should check out Robin Hanson's blog for more info:
<http://www.overcomingbias.com/2012/04/em-econ-101-talk.html>

------
keyle
I think you just have a script for a TV show or even a movie. Right there.

------
dennisgorelik
The human mind keeps only a few key details about the "copyrighted works" we
observe. Not enough detail is stored to violate copyright.

~~~
sukuriant
That's why some people can recite whole poems and other people can play songs
from memory while singing them too. That's why you know just when to pause
when you're singing along to a song on the radio and the singer stops, and you know
when the pitch changes, and what that particular beat is in the background
that you follow.

There's more than enough details to be considered a violation of copyright.

~~~
dennisgorelik
Only a few people remember whole poems, and only a minority of people remember
whole songs. In any case, the number of fully copied items would almost always
stay below one hundred (not hundreds of thousands). And even those "copies"
would not really be considered copies by a contemporary judge (circa 2012).
These are "backups" at best.

------
qrlawified
I wonder how different the product would look depending on who was making
this. I.e. Apple vs Google...

------
amalag
As the embodied soul continuously passes, in this body, from boyhood to youth
to old age, the soul similarly passes into another body at death. A sober
person is not bewildered by such a change.

The change of body is like changing your clothes. Does anyone really think
consciousness is limited by matter?

------
nazgulnarsil
how can we preclude 4chan from the utility calculations the AI makes?

------
gren
Brilliant!

------
its_so_on
This is a side point, but to me, it was amazing how well the video captured
the usability frustration of our time.

Every time we're jarred out of the 'future' while watching the video - an
unexpected Rejected, being offered options but only able to choose one with
the others greyed out, etc. - it feels so much like the awful experience we
all have using just about anything today.

I mean, even down to the error BEEP jarring you out of the faux-pleasant
music. Maybe the video creator didn't have this in mind at all, but just
wanted to make it seem 'realistic' - but they got the usability nightmare we
live in spot-on!

------
billpatrianakos
Very cool. There were a lot of parallels to a handful of different things. I
loved it but if you're looking for some feedback I'd suggest maybe a bit more
focus on any one subject. If I didn't know better I'd guess this was inspired
by Vanilla Sky. But back to focus, I saw references to Vanilla Sky, tiered
subscription services, and copyright. While the message of each gets across
fine they all don't seem to fit really well. It was kind of awkward.

If this is an after-death experience then it really wouldn't matter if you
couldn't remember copyrighted works. But beyond that I don't think copyright
holders would care if you remembered their works because you're no longer able
to be a customer. Same with the ads. If the service works like it did in
Vanilla Sky then advertising is also pointless. Dead people don't buy things
again. They pay for the after death experience and that's it.

But otherwise it's cool and interesting. Maybe I'm reading into it too much.
Maybe there was no message and it was just a fun joke. Or maybe it wasn't
based on Vanilla Sky. I don't know, I feel bad offering that criticism but
while I was watching it that's what popped in my head.

~~~
drostie
The copyright issue that they've raised is actually one of the parts of
copyright that I consider mind-blowing. The idea that memory is copying
changes a great deal of the discussion -- for example, when the movie industry
says "oh man, we're losing profits because all these people are getting free
copies from the Internet," there is a direct parallel to them saying, "oh man,
we're losing profits because all these people are remembering." Imagine, they
spend all those millions of dollars on new films when they _could_ just spend
that as a one-time cost and zap us all with a Forgetting Ray as we leave the
theater. Did you feel entitled to keep those memories? Ugh, kids these days,
feeling entitled to copy our movie and share it with their friends.

What is not really addressed here, and you're hinting at it, is whether your
after-death experience requires working for a living. The premium service
mentions a 'subscription' so I assume it does. It's not strictly true that
"Dead people don't buy things again." I mean, you might not buy _physical
clothes_ again, but you might pay the Life Store for trendy clothing-data that
someone has coded for them.

I especially liked the "you can either agree to our Terms of Use or die, your
choice."

------
tomrod
This is cool. And terrifying.

