
The Future is Now - ph0rque
http://www.d-n-i.net/dni/2009/05/19/on-war-305-the-future-is-now/
======
J_McQuade
This article, and many others like it, seems to be so anti-science that I'm
surprised it didn't end with "And that's if the LHC doesn't vaporise us all
first!"

I just get massively incensed by the tone of pieces like this. Their strong
implication that the pursuit of knowledge is inherently A Bad Thing, and the
readiness with which they are prepared to be scared, are so pervasive that
sometimes I honestly get the impression that, to the average person, I must
seem like an absolute lunatic for my intellectual pursuits.

For example, I've recently been playing around a lot with the Arduino. An
acquaintance of mine, upon seeing my desk strewn with various wires and
components, immediately asked "Wow, are you making a bomb or something?" - how
crazy is that? Now, this person may not be particularly well versed in science
or technology, but the fact that the first thing he thought of when exposed to
a bit of it was "BOMB!" could well tell you a little about just how dangerous
articles like this one can be once they've become the norm.

Maybe it's just human nature to be scared of things we don't understand, but
something seems to be eroding the average person's very desire _to_ understand
in the first place. I can't help but think that scaremongering such as this is
not helping in the slightest.

(And I apologise for jumping on this article as an excuse to have a bit of a
rant, but it just makes me so very, very angry.)

~~~
jodrellblank
_Wow, are you making a bomb or something?_

"if you've done nothing wrong, you've nothing to hide" isn't going to work, is
it?

------
splat
This article strikes me as far too alarmist. Sure, it would be terrible if
some guy in his garage concocted Black Plague 2.0 and accidentally (or
intentionally) spilled it all over a large city. But the author doesn't seem to
assess anywhere the odds of this actually happening. I'm not a biologist, but
I'd bet that engineering a disease in your closet potent enough to wipe out a
significant fraction of the human race before anyone can come up with a
vaccine is nontrivial even if one is deliberately striving toward that goal.

These sorts of diseases walk a fine line -- too lethal and they kill their
hosts before spreading, too benign and they're ineffective; too many mutations
and they quickly mutate into a benign strain, too few mutations and the immune
system can build resistance (and a vaccine can be found). Personally, I'm
going to put the likelihood of this threat in the same basket as "terrorists
developing a functional nuclear bomb."

~~~
swombat
Playing devil's advocate here, I think the danger is accident rather than
intention. If there are just a few "evil scientists" tinkering around with this
worldwide, that's fine... their chances, as you argue, are low... but what if
there's a hundred thousand amateurs blundering around in their spare evenings?
That surely massively multiplies the likelihood that they come up with
something devastating, by accident.

And even though the likelihood may still be less than that of a plane crash,
it's worth bearing in mind that we're _all_ riding on _this_ plane...
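The scaling argument above can be sketched numerically. The per-experimenter probability `p` here is a made-up illustrative number, not an estimate of the actual risk:

```python
# Probability that at least one of n independent experimenters
# stumbles onto something devastating, given a tiny per-person
# probability p. The numbers below are purely illustrative.
def at_least_one(p, n):
    return 1 - (1 - p) ** n

p = 1e-7  # hypothetical per-amateur chance of a disaster

few = at_least_one(p, 10)           # a handful of "evil scientists"
many = at_least_one(p, 100_000)     # a hundred thousand amateurs

print(few)   # negligible
print(many)  # roughly one percent -- thousands of times larger
```

The point is simply that a tiny per-person risk, multiplied across a large enough population of tinkerers, stops being tiny.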

I'm not suggesting that these people somehow need to be rounded up or
something... I don't think that would work, for a start. But perhaps the
subject does deserve some thought. Perhaps biohackers can be provided with
free access to secure facilities, expertise and oversight that will enable
them to conduct their experiments safely.

Sure, that might cost a bit, to build all the infrastructure... but it's less
expensive than a Black Plague 2.0, as you call it.

~~~
splat
One of the problems with present-day technology is that wiping out the human
species is now a possibility, even if a vanishingly small one. It's
hard to refute the argument that "The value associated with the destruction of
the human race is minus infinity, so the expected value of doing anything that
has a nonzero probability of destroying the human race is also minus
infinity."
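The expected-value argument being quoted can be stated mechanically. In floating-point terms, once an outcome is assigned a payoff of minus infinity, any nonzero probability of it swamps every finite benefit (the probabilities and payoffs below are arbitrary illustrative values):

```python
# Expected value over a list of (probability, payoff) outcomes.
# If any outcome with nonzero probability has payoff -inf, the
# whole expectation is -inf, no matter how large the finite upside.
def expected_value(outcomes):
    return sum(p * v for p, v in outcomes)

doom = float("-inf")
ev = expected_value([(1e-12, doom), (1 - 1e-12, 1_000_000.0)])
print(ev)  # -inf: the tiny doom probability dominates
```

This is exactly why the argument is "hard to refute" on its own terms: the only ways out are to deny that the payoff is truly infinite or that the probability is truly nonzero.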

But we seem to be comfortable with this existential threat on some level given
that we switched on the LHC and we have thousands of nukes ready to destroy
every major city on the face of the earth within hours.

It all comes down to costs and benefits. If we try to minimize this
existential threat by building some sort of infrastructure so that biohackers
can conduct their experiments with oversight we're going to lose something. In
this case, we'd be losing research that qualified scientists could be doing
instead of babysitting people who, on average, probably have no idea what
they're doing.

Plus, institutionalizing the process seems to me to defeat the purpose of
biohacking--using unconventional techniques to solve problems. In the
facilities you describe, I'm not sure the biohackers would have the freedom to
perform the experiments they want. True, it would significantly reduce the
odds of a catastrophic virus leaking to the general population, but it would
also reduce the odds that someone discovers some "miracle cure." (Of course,
the odds of this happening either way are pretty small, but I'd reckon that
the odds of something good coming out of these garage experiments are at least
an order of magnitude greater than the odds of anything bad happening.)

------
asciilifeform
What the Luddites don't understand is that by attempting to restrict amateur
(and increasingly, official) research, they hasten the arrival of the very
same synthetic plague to-kill-us-all by further alienating smart, resourceful
people.

I must say that my feelings on bioengineered plagues are somewhat mixed. On
one hand, I'd rather not be killed, esp. in a grotesque and uncommonly painful
way. On the other hand, I think perhaps there ought to be a death penalty for
civilizations which alienate and oppress their intelligent, creative and free-
spirited minority.

------
wlievens
While reading this I thought the author didn't seem informed on the subject at
all, and rather more interested in creating panic.

 _Director for the Center for Cultural Conservatism_

Okay. Right.

~~~
jerf
Are you _sure_ he's wrong?

If so, _how_ are you sure?

For one thing, recall that these are amateur hackers at the _beginning_ of the
hacking curve; today playing with E. coli, five years from now in possession
of full-on sequencers, producing and inserting arbitrary DNA, ten years who
knows?

He may in fact be right. As much as we'd all like to dismiss the possibility,
I don't actually see a rational basis to declare that he's absolutely wrong.
We don't know. Nobody knows. There are arguments you can make about why he's
wrong -- say, that evolution _probably_ would already have discovered
something of that lethality if it were feasible -- but all of them are only
probabilistic.

~~~
smhinsey
He can be right (i.e., there is a risk from bioterror) and still be
misleading. His complaint boils down to: dangerous bioagents can be easy to
grow. We know this.

~~~
biohacker42
Dangerous bioagents are being grown at a phenomenal rate all around the world,
every day. Swine flu, HIV, Ebola, MRSA, and everything else -- they all arise
NATURALLY.

~~~
smhinsey
i am not sure if you think i was saying something else, but i agree 100%. this
is extreme foolishness.

~~~
biohacker42
I might have been replying to someone else.

------
chaosmachine
It should be noted that the "ring around the rosie" story is an urban legend:

<http://www.snopes.com/language/literary/rosie.asp>

~~~
wallflower
Anyone remember that classic Star Trek episode with "Ring around the Rosie"?
Scary in a Twilight Zone fashion.

<http://memory-alpha.org/en/wiki/Ring_Around_the_Rosie>

------
CodeMage
The biologically engineered apocalypse has always been a popular bogeyman.
Last year, the big scare was biological terrorism:

<http://www.stratfor.com/weekly/busting_anthrax_myth>

A lot of the same logic applies to Mr. Lind's article too.

------
biohacker42
I am saddened by how many votes this is getting.

~~~
ph0rque
I think at least some people are upvoting it for the same reason I submitted
it: to get a good discussion going on Hacker News, with people (like
yourself?) who actually are involved with the pursuit in question.

~~~
randallsquared
That was my upvote, yes.

------
donaldc
This article is looking at things from a static perspective. _If_ biohackers
ever get to the point of releasing killer plagues, something along these lines
would happen:

(1) The earliest killer plague created and released by a biohacker kills
thousands, but isn't even close to being fine-tuned enough to be an
existential threat to civilization.

(2) The powers that be, and society, now that their attention has been
focused, react to this threat by restructuring themselves slightly to better
handle more virulent threats of this sort in the future.

(3) Go to (1), only with a somewhat more virulent outbreak. Then go to (2),
only with a somewhat larger restructuring. By the time biohackers have gotten
far enough up the learning curve to release truly effective plagues, society
will have gotten far enough along the restructuring curve to handle them
without being wiped out or even seriously set back.

------
noonespecial
There certainly is a killer plague coming. It's memetic, infects the brain,
and is spread by articles like this.

------
jodrellblank
_She’s got a DNA "thermocycler" bought on eBay for $59_

Putting that in quotes signifies both that you don't know what it is and that
you're afraid of it. For $59 off eBay? I'd guess it's something evil and scary
like a "warming plate".

 _Well, my dear, the fact is_

Yes, why don't you tell her what she was doing, hmm? You clearly have a
good understanding of what it would take to engineer E. coli into the Black
Death 2.0 if you can judge that she was "one mistake" away from doing so.

 _A calm, measured, thoughtful response to biohacking would be to run around
madly in one’s underwear screaming “The sky is falling! The sky is falling!”
It is impossible to overstate this threat.

What can we do about it? Probably nothing_

...

