
Vernor Vinge predicts singularity by 2030 - rms
http://hplusmagazine.com/articles/ai/singularity-101-vernor-vinge
======
10ren
_questions like “what is the meaning of life?” will be practical engineering
questions._

uh... I've heard that from another science fiction writer somewhere... now
what happened with that?

I don't think we've been getting better at "outsourcing our cognition"
(though it's an inspiring goal). I've tried to do this as much as possible in
my own life, and I've found I'm much more effective when I don't do it _at
all_ (the most effective outsourcing is to try to articulate one's idea to
someone else - but they don't need to actually do anything). This lack of
gradual improvement means there's nothing to extrapolate into further
increase, let alone an exponentially accelerating run-up to a singularity -
not, at least, for _me_ :-(.

Mechanizing conceptual understanding remains difficult. Although we human
beings are good at modularizing knowledge (it's what we constantly do), and it
looks like it should be straightforward to automate, we haven't managed it,
_at all_. So many times in history, people have thought they were on the verge
of it...

Did you know that Mr Boole originally called his algebra the "Laws of
Thought"? What he did was very cool, and you can see what he meant, but...
it's such an awfully long way from "thought".

I think it's possible to create human-level intelligence (and probably faster
and broader - but maybe not any deeper); we just have no idea how. And
machines that are 16,000 times faster (21 Moore years) aren't going to help
with that, in themselves.
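
A quick sanity check on that figure - it's just compound doubling, assuming
the usual 18-month doubling period:

```python
# "21 Moore years" at one doubling every 18 months:
def speedup(years, doubling_months=18):
    doublings = years * 12 / doubling_months
    return 2 ** doublings

print(speedup(21))  # 2^14 = 16384, i.e. roughly 16,000x
```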

But it would be awesome.

(My master's was on statistical inference of patterns, without predetermined
patterns, and taking into account the complexity of the pattern sought. I
chose this as the closest approach to automating thought. We are a _long_ way
away, IMHO.)
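
(A toy illustration of that "complexity of the pattern sought" trade-off, in
the minimum-description-length spirit - this is my sketch, not the actual
thesis; compressed size stands in for description length:)

```python
import os
import zlib

# MDL intuition: a pattern is only "real" if describing it (plus the
# residual) is shorter than describing the raw data. Compressed size
# is a crude stand-in for description length.
def description_length(data: bytes) -> int:
    return len(zlib.compress(data, 9))

regular = b"ab" * 500     # strong pattern: a very short description
noisy = os.urandom(1000)  # no pattern: compression can't help

print(description_length(regular))  # a few dozen bytes at most
print(description_length(noisy))    # roughly 1000 bytes or more
```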

------
fauigerzigerk
I wonder whether superhuman intelligence is a superset of human intelligence
or something entirely different. I find it rather plausible that machine
intelligence will very soon be capable of doing things we humans cannot do. In
fact, that's already the case to some degree (and I don't mean just doing
things faster).

However, I believe the day when machines will be able to do everything we can
do, only better, is very far out. 2030 is in about 20 years. Looking at things
like anaphora resolution and how far we have progressed in the last 20 years,
I'm not optimistic we'll make the deadline.

And then of course, the singularity would not be a program that does anaphora
resolution, but a program that writes programs to solve any new problem and
determines worthy goals in the first place, because these are all things we
humans can do.

~~~
bsaunder
"Looking at things like anaphora resolution and how far we have progressed in
the last 20 years, I'm not optimistic we'll make the deadline."

It may just take a new perspective on the problem. Just look at the changes in
the internet over the last decade. IMHO, there's been an obvious increase in
the rate of dissemination and cross-pollination of ideas. Blogs and sites like
this only decrease the generation time even further.

It's the ever increasing second derivative that makes the idea of the
singularity plausible (with all due respect to the S-curve believers (I think
they are right, but what's the point if the top of the S is so significantly
higher than where we are now)).

~~~
fauigerzigerk
Sure, breakthroughs are always possible, and more pieces of information are
linked to each other and so on. I'm not saying it's not possible that someone
has a good idea tomorrow that changes everything.

But no, ideas don't cross-pollinate. It's individuals who cross-pollinate
ideas in their heads, and there's a limit to how much information a person can
possibly juggle in her head.

It's that creative process, which we don't really understand yet, that is the
bottleneck. No amount of linked information will be able to somehow
supercharge our cognitive capacity. Sometimes just having time to think
without interruption is more important than more information.

I think the likelihood of breakthrough ideas is largely a function of the
number of people working on hard problems. I'm not sure how fast that rate
rises. Not very fast, I would think. And the second derivative may very well
be zero.

------
jeroen
More facts, less handwaving:

[http://www.ted.com/index.php/talks/juan_enriquez_shares_mind...](http://www.ted.com/index.php/talks/juan_enriquez_shares_mindboggling_new_science.html)

~~~
Hexstream
Absolutely stunning talk. The real fun begins 8:00 in.

------
rms
Kurzweil also predicts the Singularity to occur by 2030. My personal thoughts
are that even if the AI doesn't emerge, we're already close enough to
Singularity that the exponential progress of the next 20 years will still be
pretty damn mind boggling. The internet alone is a Singularity of sorts.

~~~
mixmax
All exponential growth comes to an end eventually.

In the real world there are no "hockey-stick" curves, only S-shaped curves
because the real world doesn't deal with infinity.
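
The standard S-curve here is the logistic function, which is indistinguishable
from an exponential early on and then flattens; a minimal sketch (the
parameters are purely illustrative):

```python
import math

# Logistic (S-shaped) growth: near-exponential at first, then it
# saturates at the carrying capacity K.
def logistic(t, K=1.0, r=1.0, t0=0.0):
    return K / (1 + math.exp(-r * (t - t0)))

early = logistic(1) - logistic(0)  # growth near the middle of the S
late = logistic(9) - logistic(8)   # growth near the top of the S
print(early > late)  # True: the same step in t buys less and less
```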

I wrote a better explanation here:
[http://www.maximise.dk/blog/2008/11/why-singularity-may-never-arrive.html](http://www.maximise.dk/blog/2008/11/why-singularity-may-never-arrive.html)

~~~
troystribling
Kurzweil suggested in "The Singularity is Near" that the speed of light
imposes a barrier limiting the available resources to those in the solar
system, thus capping the exponential growth of technology and leading to the
flattening of the growth curve suggested in your article. He also gave a date
of around the middle of the next century for when this would occur.

~~~
dandrews
I didn't like Kurzweil's expansionist idea so much. Instead I imagine that the
Singularity will hole up and become uninterested in the outside universe.
There might be lots of real estate - read computing resources - there under
all those turtles.

------
100k
The best part about the singularity is that it's always 20 years away!

------
csbartus
I'd sooner believe in cybernetic immortality
(<http://pcp.vub.ac.be/CYBIMM.html>); Vinge's singularity is too vague, some
kind of pop science.

In every metasystem transition (which is what the singularity aspires to be),
a new form of control arises over the existing reigning control mechanism.

To put it in those terms, the next metasystem transition will be the control
of culture (humans are cultural beings).

And usually this is not about some higher intelligence, as Vinge states; it is
about the emergence of new senses, which opens up a totally new universe /
dimension.

------
theblackbox
I'm not sure if I've voiced my particular distaste for the term "singularity"
before, but it's something that really does get on my nerves. I'm reminded of
Pasternak denouncing the phrase "We the people's" in Dr. Zhivago. It's a word
that the community that uses it most often is utterly stripping bare and
bastardising. As far as I'm concerned there are only two (with a possible
third) definitive singularities - those of the "black holes and baby
universes".

I really can't see the useful application of the metaphor here. It's not a
singularity... a wave, certainly. This gives us the idea of the "seventh
wave", which concentrates the potency of the waveform. Or how about a
renaissance? It will get to the point where software development, especially
in AI, reaches a functional plateau, branching out and diversifying into many
stylistic forms. Or (my personal favourite) a "Babel Event", where we
construct a universal medium, shattering the old converging paradigms and
multiplying our complexities by an order of magnitude.

Surely these are far more "useful metaphors" than the idea of a singularity -
the point at which the rules change, or the classical conventions are
destroyed? I think this is insidiously vague (and might be why the
"singularity meme" has spread so far/fast?)

~~~
dejb
Also, it is quite an inconveniently long word. If you strongly agree, you are
a 'singularitarian', which is a bit of a tongue twister.

~~~
sep332
It may be long, but this song makes up for it:
<http://www.transhumanism.org/index.php/WTA/more/kamsingsong/>

------
vorador
I've got trouble with the notion of singularity: why is it supposed to be an
event that happens suddenly instead of gradually?

~~~
sown
I think you are thinking of hard takeoff vs. soft takeoff.

Come back and ask me in 25 years and I will tell you which happened. :)

------
BerislavLopac
I don't know if singularity is ever going to occur, but it's definitely not
going to happen that soon. Our computer technology -- AI in particular -- is
currently at the goldfish level.

~~~
ivankirigin
Compare the Apple I to today's PCs. Compare dialup to mainframes to scaling
out thousands of servers automatically in the cloud.

These aren't linear changes. If we're at goldfish level today, and we were at
bacterium level yesterday, tomorrow could be a monkey, and the next day more
intelligence than the whole human race combined.

I'm not saying it's going to happen, just that you can't apply linear growth
analysis to areas growing exponentially.

~~~
JabavuAdams
Our rate of development is limited by how quickly we can make good decisions.

Any decision support or "cognitive outsourcing" that is invasive will have to
go through the usual approval processes before becoming widespread. It may
eventually transform those same processes, but that will lag far behind the
initial adoption.

I can have a super-mondo-uber tool, but I still need to know what to build. If
I need more than one person for my project, then I also need to be able to
scale high-bandwidth human to human collaboration.

None of these things seem impossible, it's just that closing the circle and
applying these tools to ourselves slows things down tremendously.

~~~
sown
Fortunately, though, there are a lot of people making decisions about what to
do. :)

------
bhiggins
How convenient, since Mr. Vinge, at 85 years old, will hopefully still be
alive!

BTW Vernor Vinge is a great author. I heard he's working on a third book in
the A Fire Upon the Deep & A Deepness in the Sky series. Looking forward to
it!

