

The Singularity Summit 2007: AI and the Future of Humanity - Sept 8 & 9 - jey
http://www.singinst.org/summit2007/

======
jey
[http://flickr.com/photo_zoom.gne?id=146209889&size=l&...](http://flickr.com/photo_zoom.gne?id=146209889&size=l&context=set-72057594134491541)

------
ivankirigin
I went last year. Lots of fun. Cory Doctorow was probably the best speaker. I
hadn't read BoingBoing much before that, and now I'm a rabid fan.

I think the discussion with Luddites was fairly useless. It shows a fair
amount of due diligence, where dissenting voices are given a podium far larger
than they "deserve" given the popularity of their opinions. I did enjoy
that the Luddite gave a talk through teleconference.

The format last year was pretty bad at times. The Open-Mic questions wasted
lots of time. Written questions were submitted but not used. There were no
break-off sessions.

Also, the moderator, Peter Thiel, is as boring as watching paint dry, and
didn't actually do any reasonable moderation of the discussion.

------
rms
Prediction: Strong AI will not exist until human consciousnesses can be copied
to computers.

[http://www.bbsonline.org/Preprints/OldArchive/bbs.searle2.ht...](http://www.bbsonline.org/Preprints/OldArchive/bbs.searle2.html)

~~~
jey
Searle is old hat. He's just playing word games. He assumes that "really
understanding" something means something deep that he has left unspecified.
The beautiful irony of his argument is that Searle's brain itself is a
freakin' Chinese Room, and this is basically a Chinese Room claiming that it
itself cannot be sentient! :-)

What's so special about human consciousness, anyway? Or at least please
provide some sort of rationale for your claim.

Here's a kinda long talk that explains why Strong AI is feasible, skip over
all the introductions at the beginning:
<http://ia310111.us.archive.org/2/items/FutureSalon_02_2006/> Same thing in
Google Video format (lower quality):
<http://video.google.com/videoplay?docid=-821191370462819511>

~~~
yters
I know I'm probably just bringing up an issue that has been beaten to death
somewhere in a hideously long thread, but consciousness makes me think strong
AI is impossible. This is my line of thought:

1\. Consciousness is essential to human intelligence, because our great mental
advances come from our awareness, and objective description, of our own mind
and thought processes.

2\. A consciousness is by definition irreplicable, since two different
consciousnesses are essentially different.

3\. If strong AI is possible (and computational), and there is no essential
difference between copies of the same code, then multiple identical copies of
the same AI mind can be made.

4\. If 1 and 2 hold, then 3 is false. Therefore, strong AI is impossible.

If any of these assumptions need elaboration, let me know. And please, by all
means, direct me to that hideously long thread instead of answering my
argument in detail. I don't like to waste people's time. Much obliged.

~~~
rms
Regarding point 2: why can't you replicate consciousness? Of course we can't
today, but I don't understand why it is inherently impossible.

Just because my consciousness is different from everyone else on the earth
doesn't mean future technology couldn't copy my mind to a computer fifty times
over. It will be very controversial when technology progresses far enough to
allow consciousness copies.

~~~
yters
At least in terms of thought experiments, it makes no sense to me. Think about
it like this. If I can copy my consciousness and place it in a different
environment, then I should be aware of that other environment at the same
time I'm aware of this one. But for this to happen, information has to be
transmitted between both locations instantaneously. While pretty awesome
if true, the laws of physics don't allow for this, as far as I know.

~~~
rms
A copy of your consciousness is just that, a copy. Imagine that the entire
body is copied while we're at it.

So then there are two separate copies of you, interacting with the world.
There could be 100 copies. They have the same set of initial conditions they
are operating under but they won't all see or do the same thing, the different
consciousnesses are still products of their environment.

~~~
yters
So if I said I was going to give you immortality by copying your
consciousness, but you wouldn't be aware of it, wouldn't you feel cheated?

