
Be a Visiting Fellow at the Singularity Institute - rms
http://lesswrong.com/lw/29c/be_a_visiting_fellow_at_the_singularity_institute/
======
thunk
_> SIAI is tackling the world’s most important task -- the task of shaping the
Singularity. The task of averting human extinction. We aren’t the only people
tackling this, but the total set is frighteningly small._

Forgive me, but this seems a little on the "self-important quackery" side.
There are any number of threats to our species far more urgent and reality-
based than xenocide by a weakly godlike post-singularity AI. I suppose it
can't be _completely_ ruled out any more than alien invasion or the Nemesis
hypothesis (though I find it pretty far fetched given the state of AGI). But
"the world's most important task"? C'mon.

~~~
apsec112
"There are any number of threats to our species far more urgent and reality-
based than xenocide by a weakly godlike post-singularity AI."

We work on preventing those too. Some of the risks we work to prevent:

- Deliberate misuse of nanotechnology

- Nuclear holocaust

- We're living in a simulation and it gets shut down

- Genetically engineered biological agent

- Accidental misuse of nanotechnology ("gray goo")

- Physics disasters

- Misguided world government or another static social equilibrium stops
technological progress

- "Dysgenic" pressures

- Take-over by a transcending upload

- Repressive totalitarian global regime

- Our potential or even our core values are eroded by evolutionary
development

For more information, see <http://www.nickbostrom.com/existential/risks.html>.

~~~
westbywest
Industrial food production is quite fragile, and the world basically has yet
to experience that production in any sort of "failure mode," e.g. global
blight due to excessive monocultures.

The probability of such a global failure occurring within the next 100 years
strikes me as near 1.0, and thus possibly more pressing than any of the events
listed above.

~~~
jessriedel
This isn't an existential risk.

~~~
indrax
If there is a significant die-off in industrial nations it could lead to a
major delay in the singularity, perhaps making it impossible altogether.

<http://www.nickbostrom.com/existential/risks.html>

In these terms this could be a 'crunch'.

~~~
jessriedel
Importance-wise, delays in the singularity are trivial compared to a risk of
it not happening at all.

So suppose we are worried that a die-off in industrial nations will actually
make it impossible. Isn't this risk (that there is a major catastrophic
disaster for industrialized nations _and_ this disaster prevents a singularity
from _ever_ taking place) much smaller than the risk that the singularity goes
badly when it happens? If so, then this supports the spirit of my original
comment: worrying about the singularity going well is more pressing than
worrying about major shocks to civilization which do not completely wipe out
humanity.

------
rms
I spent three weeks at the SIAI house in April/early May. Feel free to Ask Me
Anything about the Visiting Fellows program. It will be several hours before I
check this thread again.

~~~
icey
Apologies for such an open ended question, but could you generally describe
the experience?

~~~
rms
Sure. I may not have been the typical visitor, but I'm not sure that there is
a Typical Visiting Fellow.

I went there with low expectations of actually accomplishing anything. I'm not
yet at the point where I can write the sort of hardcore sciencey analytic
philosophy papers that are the main output of the Singularity Institute, but I
think with 6 months of learning I will be able to meaningfully contribute to
that goal.

I tried to share my lessons of entrepreneurship with the housemates who
wanted to learn. I sat for an afternoon with a Brazilian entrepreneur living
at the SIAI house and we went through all of the logistical details of opening
a Delaware corporation as a foreign national. I did a short workshop on SEO. I
went to see Thom Yorke twice in Oakland, saw Orbital in San Francisco, and
Blue Scholars in Berkeley.

I graduated from school with an engineering degree in December, and because of
the success of my kratom business, I didn't need to get a real job, so I had a
lot
of time to surf the internet. I started reading Less Wrong while trying to
actively comment as much as possible. In 4 months of posting on Less Wrong, I
learned more than I did in two years of college. In 3 weeks of living at the
SIAI house, I learned much more rapidly than I did while just posting on Less
Wrong.

So, to answer your question, what I mostly did was sit around and talk with
all of the wonderful people that live at the SIAI house or occasionally pass
through or hang out. My favorite types of conversations are those where you
push the barriers of the kinds of thoughts that humans are capable of having,
the kinds of conversations that often stop making sense because there are too
many different infinities or recursive loops. At the SIAI house, I could spend
hours a day talking about those kinds of things. It sometimes felt like I was
living at the center of the universe.

Also, the food is good. Lots of delicious healthy things from Trader Joe's and
Costco.

------
jluxenberg
@rms: What concrete things did you work on? Were you writing papers, or coding
AI, or...? The Singularity seems like such a broad area of study, how do you
gauge progress or even decide what to work on?

~~~
apsec112
See <http://singinst.org/accomplishments2009> for some of the stuff that we
work on.

------
apsec112
I did this during the summer of 2009, and it was utterly awesome. It was like
doing a startup, but less stressful.

~~~
saikat
"It was like doing a startup, but less stressful."

How was it like a startup?

