

Cringely: Surviving Immortality - ph0rque
http://www.pbs.org/cringely/pulpit/2007/pulpit_20070817_002727.html

======
motoko
1) Singularity is interesting because it changes what it means to be human.
The idea of "machines enslaving us" suggests that people are thinking like
movie plots: machines become "human-like" amid confused humans doomed as
common everymen. What's more likely is that the modern human will become
obsolete in the same way that the hunter-gatherer human has and the agrarian
human is. Machines don't "enslave" us, they _become_ us.

2) How long until irresponsible birth is as egregious as irresponsible death?

~~~
mpc
Maybe it won't even be a problem. Birth rates decrease as life expectancy
increases...

~~~
rms
Cory Doctorow wrote a good short story about this called Truncat.
<http://dir.salon.com/story/tech/feature/2003/08/26/truncat/index.html>

It describes the coming of age of a member of the last generation of humanity
in a post-singularity society where a reputational economy has drastically
decreased the birthrate.

------
rms
Most people focus on the technological singularity, the development of strong
AI.

I think the more likely singularity is biological singularity, when individual
human consciousnesses become immortal. A single human thinking about a problem
for 10 million years should be able to solve pretty much anything. It gets
easier if we can copy and accelerate ourselves. Hopefully infinite time is enough
time to figure out the fourth dimension or black holes. Then we get to go to
the next level and figure out our next problem.

~~~
jey
" _I think the more likely singularity is biological singularity, when
individual human consciousnesses become immortal._ "

OK, why? Seems like we'd at least be able to get Strong AI the crappy brute-
force way, by reverse engineering a human brain, before that. Your crappily
reverse-engineered human brain could then be used to design a proper Strong
AI, and we're back to Strong AI being the key ingredient for a Singularity.

Why is it that you think developing a Strong AI is less likely than us being
able to copy, accelerate, and immortalize a human brain?

" _A single human thinking about a problem for 10 million years should be able
so solve pretty much anything._ "

What's the justification for this claim? Last I checked, human brains have
finite capacity and cannot recursively self-improve (in the sense of modifying
their own architecture and structure).

~~~
rms
Alright, I can't defend what I wrote because I wrote it while somewhat
inebriated and these things are never as coherent the next morning... maybe I
should avoid trolling on news.yc in such states, it's gotten me into trouble
before. :)

I have absolutely no justification for why Strong AI is less likely than
immortality. I think it's wishful thinking on my part because I have a
paranoid fear that a strong AI would freeze humanity at its current level of
evolution. In other words, I think a far future society with infinitely
intelligent humans is much more desirable than a far future society with dumb
humans governed by a God-like AI.

The most likely path to immortality that I see is allowing consciousness
uploads. I think reverse engineering a brain would give us the knowledge to
accomplish such a task. Perhaps there will be some way to merge instead of
upload so that we can maintain our ego/self, but either way, it's a kind of
immortality. From a digitized human consciousness, Strong AI should come
pretty quickly, as the digitized human or outsiders can start modifying the
architecture and structure of copies of uploaded consciousnesses. In this
scenario, the technological and biological singularities happen at the same
time.

------
jey
Singularity Summit 2007 in San Francisco: <http://www.singinst.org/summit2007>

