

"The singularity is a religious rather than a scientific vision" - gruseom
http://www.spectrum.ieee.org/jun08/6280/

======
marvin
The author should have submitted the last two paragraphs, and left it at that.
The singularity does indeed seem to be more of a religious than a scientific
vision - rapture for nerds. This is the only worthwhile sentence I find in
this article.

John Horgan sounds like so many other critics declaiming that dramatic
technological progress is impossible. History has, in many cases, proved
such critics spectacularly wrong. Although he doesn't say it directly,
Horgan implies that "thinking" machines will not be created for at least 50
years - which to me sounds just as ridiculous as declaring heavier-than-air
flight or space flight impossible.

The article doesn't really contain any decent points at all - the author cites
neuroscience researchers east and west claiming that any dream of smarter-
than-human machines is impossible, which to me falls far short of refuting any
central argument. 'We don't understand the brain, so intelligence and
consciousness can never be understood in a decent time-frame' is as close to a
key point as I am able to extract from this.

      "The implication of his thought experiment is that our
      psyches will never be totally reducible, computable,
      predictable, and explainable."

_Never_ explainable? Just like the stars, motion, nuclear physics and the rest
of the human body?

Cochlear implants are also cited as having "poor sound quality" - how does
this prove that implants won't get better?

There is really nothing to see here - perhaps there would be if the author
made his points a bit clearer.

------
GavinB
We need to stop measuring the intelligence of computers by how many
calculations per second they do. Right now we can put together computing
systems that can do the amount of calculation under discussion, but fairly
slowly. So what if it takes 10x as long to get a thought out? It's all in the
program, and no one has the slightest clue how to program thought for now.

I'm also skeptical that the singularity will really move all that fast -- how
do we know how much time it will take for each 1% of improvement? Just because
a computer is capable of self-improvement doesn't mean that it will be fast
self-improvement. I'd imagine that while the computer is better at each step,
each step is also harder.

------
radu_floricica
Good read. But if there is a conclusion to be drawn here, it's that we have a
severe lack of common-sense POVs about human enhancement. On one end we have
Ray Kurzweil, and on the other we have articles like this. The real useful
path is Provigil-like incremental steps, which are utterly marginalized and
looked upon with suspicion. And small steps of this kind can be seen quite
often in the past few years... from a flash game that increases IQ to
electrodes that greatly enhance memory formation to 100 times more cognitive
science articles than 20 years ago.

You don't have to upload to live to 120, you just have to get over Alzheimer's
and cancer and strokes. You don't have to be able to completely understand the
brain in order to learn faster or be more productive. There are already many,
many things we can do, but as far as I can tell we have a problem of
perception: making medicine to cure dandruff is commendable, but any kind of
enhancement is frowned upon. This also is religion at work.

------
m0nty
So someone offers to upload your brain into a computer and you go along with
it?

"In other words," said Benji, steering his curious little vehicle right over
to Arthur, "there's a good chance that the structure of the question is
encoded in the structure of your brain — so we want to buy it off you."

"What, the question?" said Arthur.

"Yes," said Ford and Trillian.

"For lots of money," said Zaphod.

"No, no," said Frankie, "it's the brain we want to buy."

"What!"

"I thought you said you could just read his brain electronically," protested
Ford.

"Oh yes," said Frankie, "but we'd have to get it out first. It's got to be
prepared."

"Treated," said Benji.

"Diced."

"Thank you," shouted Arthur, tipping up his chair and backing away from the
table in horror.

"It could always be replaced," said Benji reasonably, "if you think it's
important."

"Yes, an electronic brain," said Frankie, "a simple one would suffice."

"A simple one!" wailed Arthur.

"Yeah," said Zaphod with a sudden evil grin, "you'd just have to program it to
say What? and I don't understand and Where's the tea? — who'd know the
difference?"

"What?" cried Arthur, backing away still further.

"See what I mean?" said Zaphod and howled with pain because of something that
Trillian did at that moment.

"I'd notice the difference," said Arthur.

"No you wouldn't," said Frankie mouse, "you'd be programmed not to."

Ford made for the door.

"Look, I'm sorry, mice old lads," he said. "I don't think we've got a deal."

~~~
rplevy
For one thing, this would make the free/open versus proprietary software issue
a lot more crucial than ever before. I can imagine accepting software updates
(for software I have opted into) from a well-run open source project. Of
course the idea of mind=software is itself only a very loose metaphor that
fails to capture the nuances of development of an embodied mind. But I could
imagine some computer extension to the brain that would add new cognitive
potentialities to the existing repertoire of brain structures. Who knows,
maybe it's not that far off? There are already direct brain interfaces for
very simple kinds of control of machines.

If a programmable brain extension acts like other high-level brain modules
then it should create a new range of abilities rather than dictate a rigid
behavior. The former is something that could gain wide acceptance, and the
latter would be an unethical way of enslaving people. The unethical rigid
control of behavior also seems easier to implement with current knowledge
since much more is known about low-level brain and spinal control networks
than is known about high-level cortical abstraction.

~~~
m0nty
"Who knows, maybe it's not that far off?"

I have a more-or-less open mind, but I can see significant problems. I mean,
how many times have we been told that some big new thing is just "five years
away" and yet it never arrives? I suspect a rapture-like singularity is not
going to happen, but augmentation (which to some extent is already happening)
will accelerate in a very interesting way.

Anyway, it certainly highlights the ethical side of software engineering, as
you point out.

------
Flemlord
Religion is based on faith--believing in the impossible, contrary to all
empirical evidence. Belief in the Singularity is simply a prediction based on
current technology vectors. If you understand the theories behind artificial
intelligence, and believe Moore's law will continue for 10-20 years, the
Singularity is a logical conclusion. To call it a religion reveals a deep
misunderstanding of the theory, and the motivations of the people who think it
is a possibility.

~~~
gruseom
It's not a religion in any complete sense, but it's kind of obvious that a
gigantic leap of faith is involved (starting with the use of the definite
article in the phrase "The Singularity", as was shrewdly pointed out by a
recent commentator here), and I do think the leap can fairly be called
religious. Why?

First, despite what you and others say about "empiric evidence", Moore's Law
doesn't at all imply a concomitant leap in programming power (is _our_
capacity growing exponentially?), and extrapolation from actual achievements
to date points in a decidedly more mundane direction.

Second, the "singularity" predictions are the latest in a tradition of
extravagant millenarian claims that people have been making about AI for
decades. These claims have consistently turned out to be overblown, if not
vacuous, in retrospect. Extrapolate from _that_ historical trend and you
arrive at quite a different (and rather more plausible) picture.

Third and most interestingly, the emotions and language surrounding the
"singularity" are distinctly religious. This really stands out in what I've
seen of the discourse. Since many advocates of "singularity" are of an
intellectual cast that objects to the more popular religions, it seems to me
as if some are directing their religious impulses into technology. As someone
brilliantly says in the OP, it's "The Rapture For Nerds". (Notice that
definite article again.)

By the way, I don't mean "religious" as a criticism per se. Personally, I
think religious impulses are part of being human. What fascinates me about The
Singularity discourse is the incongruence between its technological subject
matter and its vividly religious emotional language.

------
mhartl
What to think about the singularity? I'm not sure this article tells us much
either way. I'd guess interviews with top mechanical engineers in 1890 about
the prospects of landing men on the moon within 80 years would have sounded
much the same.

~~~
greendestiny
Only if those top engineers were convinced that landing on the moon would be
the turning point for mankind becoming space travellers, and felt the need to
prepare humanity for its coming. Actually that sort of thing did happen, but
the point is just how far those absurd notions are from reality. No human has
been further than the moon 40 years later.

I'm not sure a single bit of the wild speculation that occurred in 1890 would
be useful if we ever do begin large-scale space travel.

------
dejb
"XXXXX is a religious rather than a scientific vision"

where XXXXX = Evolution, Gravity.....

Just because something deals with issues that religion addresses does not mean
it is not scientific.

~~~
elai
You don't notice the rapture of the nerds themes in 'the singularity'? Or the
'god like powers' themes of the singularity? That is totally a religious
pattern of thought.

~~~
dejb
Yep, I see them there. But does that really have any bearing on the likelihood
of it being true? Religion encompasses our greatest desires, so if technology
becomes sufficiently powerful, wouldn't the fulfillment of those desires be a
likely result?

Is there a danger that some people's thinking could be influenced by their
desire for it to happen? Of course. But I think that just as many people are
likely to dismiss it for irrational reasons as well. That is why it is
important for people to keep their eye on the real science.

------
earthboundkid
Here's a good article on why it always seems like the Singularity is just
around the corner:

[http://www.kk.org/thetechnium/archives/2006/02/the_singulari...](http://www.kk.org/thetechnium/archives/2006/02/the_singularity.php)

------
jey
All this (awful) article shows is that there's a big need for
'singularitarians' to do a better job of communicating to the wider community.

~~~
gruseom
This is an emotional reaction that comes up frequently. Arguments are answered
with "but that's not at all what we said, you just don't understand". (I don't
mean "emotional" as a criticism, though I realize it probably sounds that
way.) The problem is that it's a moving target. For example, the objection
that "exponential advances in hardware are not matched by exponential advances
in software, so extrapolation from the former doesn't apply to the latter," is
met with something like, "You are uninformed if you think The Singularity has
to do with extrapolation from Moore's Law. That's a common misunderstanding."
One wonders how the misunderstanding got so common if that's not what the
advocates said! Anyway, this moving-targetness is to me an indicator of
religious attachment to a concept.

~~~
jey
I stopped arguing for this stuff in online forums because it's not the most
effective way to communicate it to a lot of people. Instead I'm working on
writing up stuff that can be useful to a wider audience (instead of the few
dozen people who would read this comment.) I'll submit it here when I
eventually finish writing it up.

Secondly, there's a lot of fractured mutually incompatible thought that goes
under the banner of "Singularity". There's a lot of people who _do_ think it
has to do with some sort of intersecting exponential curves (this is the
Kurzweil school of thought).

See this link for an overview of the three mutually exclusive camps, as told
by Eliezer Yudkowsky: [http://www.singinst.org/blog/2007/09/30/three-major-singular...](http://www.singinst.org/blog/2007/09/30/three-major-singularity-schools/)

" _One wonders how the misunderstanding got so common if that's not what the
advocates said!_ "

There is no "the advocates", because "Singularitarianism" isn't a single
movement. The fact that it's fractured only indicates that there are three (or
more) different camps -- these are _different_ people saying _different_
things, not the same people acting as a 'moving target'. I don't see how that
makes it a religion. And you're right that there's a decent chunk of people
who are dreaming of utopia and bliss and do take a Sci-Fi sort of attitude
toward it -- but there are good scientific reasons to work on these issues,
and good reasons to be worried about the implications.

For what it's worth, my claim basically comes down to two things: 1) AI is
feasible. 2) There will be profound changes when AI appears on the scene. And
we need to worry about the big issues relating to this.

I _hate_ the term 'singularity'. It's too associated with Kurzweil's take on
things, and it evokes connotations of a 'hard takeoff'. I prefer to just talk
about AI's importance, feasibility, implications, and dangers.

------
nazgulnarsil
Self-limiting factors may make our technological approach to the singularity
asymptotic.

~~~
nsrivast
Such as the finite speed at which humans integrate and interpret data and
mentally design experiments and technologies.

------
sabat
Sorry, is there that much of a difference? Really?

