

Video of Kurzweil’s Latest Talk at Google - kkleiner
http://singularityhub.com/2009/09/10/video-of-kurzweils-latest-talk-at-google/

======
SirWart
The singularity stuff has always raised my bullshit detector enough that I
never paid any attention to it, but after watching the video I'll admit Ray
definitely has some really intriguing ideas about the exponential development
of technology that are worth paying attention to.

~~~
Devilboy
People seem to forget that he's been around for decades making predictions -
and he's been pretty accurate so far.

~~~
joe_the_user
Well, even more, the pattern he points to is not complex. It is simple
exponential growth. And the mechanism is not magic, it is the power of
miniaturization.

The main thing I'd question is whether the results will be ... uh, good. There
are some serious downsides that one might imagine - the intelligent,
autonomous killing machines the US army is working on - I know it's cheesy to
say but _what could possibly go wrong_? Seriously, technological shift makes
_nearly_ everyone _nearly_ all-powerful. There might be toes that would wind
up being stepped on.

~~~
kiba
Armed robots only make the US military more powerful.

As for everyone else, it is time to upgrade to cyborg technologies, or
counteract with killer bots of their own.

------
joe_the_user
I think Kurzweil's books and pronouncements have been a powerful mover of even
critics of AI.

Has anyone written an intelligent, _thorough_ critique of his ideas? (Not that
I'm necessarily against them but it would be good to hear both sides).

~~~
queensnake
Douglas Hofstadter (more directly) and Peter Norvig (less so) have. They were
invited speakers (not sure which year(s)) at the Singularity conference.
Hofstadter's was almost embarrassing to watch; he ripped on everything
(whether you find his argument credible or not), root and branch, IIRC.

~~~
tocomment
Can I see?

~~~
fnid
Here's the video of his speech there on Google Video:
<http://video.google.com/videoplay?docid=8832143373632003914>

~~~
joe_the_user
Now, the interesting thing about the Hofstadter video is that he says more or
less what I say in my post above ... _"I'd like to see serious scientists
taking these ideas seriously"_.

"The Singularity" is an incredibly powerful, scary, and difficult-to-understand
scenario. There may be a number of people really gung-ho about the possibility,
but since it's a possibility that promises to end the human race _as we know
it_, there should be many more people thinking about it. But there aren't.
The Singularity may not even be possible, but it is past the level of being
debunked like a flying saucer - there are very _plausible_ arguments for it.
Not necessarily unavoidable, but plausible. The most important thing to
remember is that not every amazing technology has to continue exponential
growth for this to happen. The technologies that continue growing can make up
for the ones that don't. Even if biology or artificial intelligence as
such somehow didn't progress for ten years, nanotechnology and information
technology would be such incredible tools that they would force further
biological and AI advances unimaginable today.

So, this is important...

~~~
fnid
I agree, I think it is very important. Unfortunately, I think it's just too
far beyond people's understanding of technology.

The interesting thing about it for me is that this advancement, this creation
of humans, is feared. It tells us a lot about ourselves. Why do we fear what
we create? Why do we fear advancement? When we watch movies about the future,
the Terminators and such, they're all bleak.

And because of this fear, we disregard the future. We deny it or abandon
thoughts of it. In doing so, we leave serious contemplation of these
technologies to those who are using them in ways that _should_ be feared.
We're building killing robots and unmanned drones -- exactly the stuff of our
nightmare scenarios.

Instead, we should be reducing our use of advanced technologies to cause harm
to humanity. Without serious discussion of the ramifications, we are leaving
decisions to the unconscious, to the short term thinkers who will use the
technology in the quest for world domination.

