
H5N1 - olivercameron
http://blog.samaltman.com/h5n1
======
JunkDNA
>We now have the tools to create viruses in labs. What happens when someone
creates a virus that spreads extremely easily, has greater than 50% mortality,
and has an incubation period of several weeks? Something like this, released
by a bad guy and without the world having time to prepare, could wipe out more
than half the population in a matter of months. Misguided biotech could
effectively end the world as we know it

Sam is a smart guy, so I really don't want to come off as sounding like a jerk
here, but this grossly overestimates the technical feasibility of creating
such a virus. Computer folks routinely overestimate how much biologists
actually know about the systems we study. We know jack about how the vast
majority of biology works. We have the most fleeting glimpses of understanding
that are regularly crushed by the complexity of dynamic systems with nested
feedback loops and multiple semi-overlapping redundancies. I won't say it's
impossible, but we don't even know enough to know whether the three things -
high mortality, long incubation, and ease of transmission - are even possible
together.
While we can imagine it, there might be biological and epidemiological factors
that prevent such a thing from existing.

This also commits the logical fallacy of ascribing superpowers to the bad guys
cooking up viruses while assuming the good guys are sitting on their duffs
letting bad things happen. H5N1 was a pretty good example of international
collaboration. There were academic competitors and industrial labs working
around the clock collaboratively on it in the early days before much was
known. Whole vaccine divisions at pharmas were all over it. If we're instead
talking about a mythical time in the future when we _do_ understand enough
biology to engineer something like this, one would have to assume the good
guys possess the knowledge to develop countermeasures.

I'm not arguing that pandemics aren't something we should worry about.
Europeans were almost wiped out by the plague and in modern times Africa has
been decimated by HIV. These are _real_ problems that the human race has faced
and will likely face again, irrespective of lab-created stuff. Biotechnology
is the primary mechanism by which we're going to be able to survive when the
next one comes, wherever it comes from.

EDIT: Fixed wrong word usage in 2nd sentence.

~~~
qq66
The scenario might seem far-fetched today, but what if biotech made the same
kind of progress over the next 50 years as computer technology has over the
last 50? A human-engineered super virus might seem as unlikely to a virologist
today as an iPhone would seem to Alan Turing.

~~~
JunkDNA
I address this in my second paragraph. You can't assume the advances all
happen on the negative side (the ability to perfectly engineer a deadly virus)
without corresponding advances on the positive side (enough understanding of
biology to combat new viruses).

~~~
gknoy
I think you have a mistaken assumption, though, namely that advances in
CREATING dangerous things will be paralleled by advances in the ability to
prevent bad things.

Nuclear weapons have been around for over 50 years. We do not yet have __in
place__ any ability (other than treaties and fear) to prevent a nuclear
holocaust. Missile shields, "Star Wars" -- all of those are of questionable
capability, and none of them are deployed.

Given that our only way of preventing nuclear winter is to agree not to launch
(and go to war to prevent Bad Guys from getting them?), an option which is not
available when dealing with a disease, I'm not optimistic about our future
ability to prevent a superbug from wiping out humans.

~~~
snowwrestler
The ballistic technology that can deliver a warhead to a target 12,000 miles
away can also deliver a constellation of remote sensing satellites into orbit.

Remote sensing is what held off nuclear holocaust during the Cold War. The
ability to reliably and quickly detect and respond to nuclear first strike
creates the "mutually assured destruction" strategic framework aka deterrence.

~~~
PavlovsCat
That still boils down to "let's agree not to launch". It doesn't _force_
anyone not to launch.

------
chimeracoder
> But another possibility is that we engineer the perfect happiness drug, with
> no bad side effects, and no one wants to do anything but lay in bed and take
> this drug all day, sapping all ambition from the human race.

Preface: What we're talking about is probably biochemically impossible (truly
no bad side effects, no tolerance, etc.). So, everything that follows is a fun
thought experiment, and should be taken as nothing more.

Let's say someone produces a true wonder drug that is relatively easy to
produce and produces extreme happiness 100% of the time, with no side-effects,
and no diminishing returns due to drug tolerance. This drug produces _more_
happiness than any other activity that we could be pursuing with our time. As
a result, all anybody wants to do is take this drug all day.

The author presumes that this is a bad thing, but let's question that
assumption.

If _everyone_ is completely happy 100% of the time, and - more importantly -
_happier_ than they would be if they were doing whatever it is they would be
doing with that extra ambition, why should we assume that this is a bad thing?

Of course, somebody would need to maintain production of the drug. This means
that people either would take it only part of the time, to maintain enough
ambition, etc. to produce the drug on their own, or (more likely) we would
have some lucky people who take it all the time and are always happy, and a
few people who are tasked with producing all the joy for the rest of the
world.

This exact premise (the second version) has already been explored, in short
story form.
[http://en.wikipedia.org/wiki/The_Ones_Who_Walk_Away_from_Ome...](http://en.wikipedia.org/wiki/The_Ones_Who_Walk_Away_from_Omelas)

(I agree that this situation sounds bad - most people would have a negative
emotional reaction to it, but it's fun to explore _why_ we have an aversion to
the thought of pure, unmoderated happiness.)

~~~
chc
I always hear the situation put this way, but it rings false to me. If you
gave me a perfect happiness drug, I wouldn't want to sit in bed all day and
take it. I would want to take it and go about my normal day only without the
burden of misery — debug Java without wanting to claw my eyes out, help out
people I meet because I don't feel stressed over my own schedule, etc. Being
happy naturally has never driven me to sit in bed and has only ever made me a
better person; I don't know why we assume drug-induced happiness would do the
opposite. I suspect this is just because many of our current drugs do this as
a side effect of inducing euphoria, and we're imagining that instead of a real
happiness drug.

~~~
aaronem
Heroin is as close to "a perfect happiness drug" as anything human ingenuity
has ever produced. My experience of its habitués tends to suggest that, while
it's not impossible you'd become a better person under its effects, it is
quite unlikely.

~~~
puller
If you're not asleep, you might be a better person. The problem would be when
you ran out.

~~~
aaronem
It can be tricky from the outside to distinguish someone who's nodding from
someone who's sleeping, but from the inside the two states aren't all that
similar.

------
gizmo
Tail risk decisions are never easy, because by definition we lack sufficient
data.

Should we focus on preventing terrorism? Well, if 9/11 was the worst case
scenario then no. If on the other hand a terror attack could bring down the
entire country it's certainly worth being paranoid about. Suppose terrorists
poison our food and water supplies to the extent that we get country-wide food
riots. A civilization is only 9 meals away from anarchy after all.

So the essential question is this:

\- Is our civilization essentially fragile or fundamentally robust?

If our civilization is fundamentally robust we can simply focus on growth and
deal with setbacks (global warming, terrorism, imperialism, wars) as they
come. In the long term prosperity will go up and up. Not always as fast as
we'd like and not always in ways we deem fair but if we keep making progress
we'll get there eventually. This is the Whiggish view.

The opposite view is that civilization is fragile. Kingdoms come and go and
foolish decisions can and have led to centuries of regression. The upward
trend we've seen in the past couple of centuries does not mean our species has
grown up in the slightest. Every new weapon of doom we discover we play with
and we're no better than our imperialist and bloodthirsty forefathers. Our
civilization is determined to self-destruct by either nuclear war,
environmental disaster, political insanity or runaway capitalism. A
civilization that is not capable of planning ahead will eventually walk like
a lemming off a cliff. The best thing we can do is put tons of safeguards and
regulations in place to improve our odds of surviving at all.

Those are the two main views. And the kicker is we don't have enough data to
know for certain which view is correct.

~~~
fleitz
Civilization is not fragile, authoritarian governments are.

'We' (our respective nation states) are our imperialist bloodthirsty fathers.
What country do you live in?

------
oskarth
This is exactly why people like Nassim Taleb [1] (Fooled by Randomness, Black
Swan, Antifragile) are against things like GMO. We can't predict which tail
risks will hit us and how hard - the only thing we can do is to make ourselves
robust against the negative ones.

This is also why, in the face of globalism, we should work to make life multi
planetary [2].

And for people who think this is just silly, it might be a good idea to have a
look at some recent history [3] and consider how close we were to being in a
very, very different place. This is not science fiction.

Good piece.

1:
[https://en.wikipedia.org/wiki/Nassim_Nicholas_Taleb](https://en.wikipedia.org/wiki/Nassim_Nicholas_Taleb)

2:
[https://en.wikipedia.org/wiki/Elon_Musk](https://en.wikipedia.org/wiki/Elon_Musk)

3:
[https://en.wikipedia.org/wiki/Cuban_Missile_Crisis](https://en.wikipedia.org/wiki/Cuban_Missile_Crisis)

~~~
jseliger
Yes. I just searched this comment thread for "Taleb" and am distressed to find
the first reference this far down. _The Black Swan_ especially is amazing; I
just read it ([http://jseliger.wordpress.com/2013/11/26/life-the-readers-
ed...](http://jseliger.wordpress.com/2013/11/26/life-the-readers-edition-and-
nassim-talebs-the-black-swan/)) and now can't stop recommending it. Though it
seems like the sort of book whose central idea can be understood through
reviews, it is full of subtle and unexpected comments. Altman wrote this:

 _But maybe there are some tail risks we should really worry about._

and indeed that's what much of Taleb's body of work is about. I am partway
through _Antifragile_ and do not find it as compelling as _The Black Swan_ ,
however.

~~~
oskarth
> I just searched this comment thread for "Taleb"

Ha, I did the same thing.

> I am partway through Antifragile and do not find it as compelling as The
> Black Swan, however.

Personally, I think Antifragile is much better, in that it's a more complete
work that contains his previous books and irons out what the consequences are.
It's possible you need to have a similar view on history, culture and
modernity in place before you appreciate it - I was into stoicism, empiricism
and classics in general before I read Taleb - as he's largely using that as
"the other side of the barbell" in his explanations, with the first side being
mathematical (see his technical online textbook). I've read The Black Swan
once or maybe twice, whereas I am already on my third reading of Antifragile
(I can't help myself, it's too relevant).

------
GigabyteCoin
I came across an interesting tidbit of information the other day.

It turns out that "mud daubers" (wasps that make houses from mud) are
responsible for at least 2 major airline crashes in the last 33 years,
killing at least 223 people.
[http://en.wikipedia.org/wiki/Mud_dauber#Involvement_in_Flori...](http://en.wikipedia.org/wiki/Mud_dauber#Involvement_in_Florida_Commuter_Airlines_accident)

One in 1980, and one in 1996... that we know of.

Apparently these mud daubers love living in long cylinders. If they find one,
say on a plane's uncovered instruments, they'll set up shop.

If you look at it under the right light, mud daubers are approximately 1/20th
as powerful and threatening as the world's terrorists of the last 33 years.

This URL has some numbers of terrorist caused deaths over a similar timeframe:
[http://en.wikipedia.org/wiki/Patterns_of_Global_Terrorism](http://en.wikipedia.org/wiki/Patterns_of_Global_Terrorism)

~~~
mimiflynn
Regarding mud daubers, that article you linked to said:

> This species also brought down another plane in Washington during 1982.

Which, in addition to the crashes of the Florida Commuter Airlines flight
(1980) and Birgenair flight (1996) makes it 3 planes mud daubers have brought
down.

------
slg
The question I always think of when people raise fears like this about bio-
weapons is: what motivation is there for "bad guys" to release an
indiscriminate killer like a bio-engineered virus? It seems like the
principles of MAD still apply here. Why launch an initial attack that has the
potential to "destroy the world" that either you or your leaders would still
hope to inhabit? It would require someone illogical and/or desperate who
nonetheless had the technological prowess to create the weapon in the first
place. It would basically need to be a Bond villain.

~~~
replicatorblog
Or a well-funded terrorist group that believes death will bring with it an
eternity of pleasure. Plenty of people strap on suicide vests, and it wouldn't
take that many more true believers to develop a suicide virus.

~~~
slg
I don't know if this is either a cynical or optimistic idea, but I don't know
if those groups truly believe that throughout the ranks. That is why you often
hear about payments to the family of suicide bombers. They aren't simply doing
it for their religious beliefs, but also to help out their family. The same
family that would have a 50% chance of death if the particular example virus
from the article was released.

Or another way to put it, why is the pope-mobile bulletproof?

~~~
chroma
Almost all of those committing these atrocities do so for religious reasons. The
Chinese government has treated Tibetan Buddhists horribly for decades, but you
don't see any Tibetan Buddhist suicide bombers. There have been some sects of
Buddhism in the past that have done horrible things, but in general it's
harder to become a Buddhist suicide bomber than it is to become a Muslim
suicide bomber. Some religions condemn violence of any kind. For example, I
doubt anyone could be a Jain suicide bomber.

If you don't believe me, hear it from the horse's mouth. Here is a video from
a Muslim peace conference in Norway:
[http://www.youtube.com/watch?v=bV710c1dgpU#t=45s](http://www.youtube.com/watch?v=bV710c1dgpU#t=45s)
It will take five minutes of your time, but I think it shows best how sincere
these people are.

------
timsally
Terrorist attacks cost more than lives. The direct costs of 9/11 were between
$40 billion and $100 billion
([http://en.wikipedia.org/wiki/Economic_effects_arising_from_t...](http://en.wikipedia.org/wiki/Economic_effects_arising_from_the_September_11_attacks)).
The direct costs of another terrorist attack targeting something like nuclear
power could cost $700 billion or more
([http://money.cnn.com/2011/03/25/news/economy/nuclear_acciden...](http://money.cnn.com/2011/03/25/news/economy/nuclear_accident_costs/)).
None of these estimates take into account indirect costs, which are
potentially even larger. Preventing terrorist attacks is about saving lives,
but it's also about stopping events that could wipe out a quarter of our
revenue for the year. Massive economic damage can cause a lot of pain and
suffering.

When analyzing risk, it's important to estimate costs as accurately as you
can. Unfortunately Sam missed the boat on this one.

~~~
edanm
But a certain (large?) amount of the economic cost is exactly because of the
overreaction to terrorism, as opposed to worrying about other threats.

Sam's saying "let's spend X amount of that $100 billion on preventing
biological disasters, rather than just terrorism".

~~~
timsally
The overreaction to terrorism isn't a direct cost, and it's not part of the
numbers I cited. Direct costs are insurance losses, medical care for victims,
relief to widows and widowers, lost wages, etc. Those costs make up the numbers
in my previous comment, which alone justify spending money on preventing
terrorism. Indirect costs would be what you're talking about and include things
like unnecessary wars, wasteful spending on defense, currency/stock market
devaluation, etc. Indirect costs are in the _trillions_ but are harder to
prove and reason about, which is why I omitted them.

I think Sam's getting at something important but it would be best to first
start with estimating the costs as accurately as possible. As it stands, Sam's
estimates of the cost of terrorism are off by several orders of magnitude, so
you can understand why I pointed that out.

------
timr
Others have already (rightly) made the point that we're not yet able to
synthesize new viruses from scratch. And the "amateur bio-hacking" thing is
completely overblown -- the most advanced amateur work I've seen is stuff like
putting GFP into bacteria, which is just trivial. It requires little more than
some commercially available kits and a warm water bath. Synthesizing new
organisms is many orders of magnitude harder.

That's not to say that things won't change, but if I had to pick a serious
biological threat that exists _right now_ , it would be antibiotic resistance.
Thanks to air travel and long incubation times, we're not that far away from a
global pandemic of multi-drug-resistant TB, yet almost nobody is talking about
it.

~~~
skosuri
We have been able to make viruses from scratch for about a dozen years:
synthetic poliovirus, 1918 flu, and SARS-like coronaviruses, among a dozen or
so other demonstrations.

~~~
timr
We can re-create existing ones from known (small) genomes and make some
modifications to ones that we understand well. Even that work is well beyond
what a hobbyist can achieve. Creating a novel virus would be a major research
project.

~~~
skosuri
There is a lot more work on synthetic viruses that you might not be aware of
that goes beyond just simple instantiation (like viral attenuation for
vaccines or hypotheses on origins of the viral outbreaks). Creating a novel
virus these days is not that difficult (a grad student project). Anyways, your
first post made it seem like it hadn't been done.

~~~
timr
I'm not aware of every paper in the field, but I know the high points. We're
still a long way off from the day when niche groups can generate novel viruses
with specific infective properties. We're still basically just tinkering with
the existing viruses in labs to figure out what the parts do.

Depending on how you define "novel", it could indeed be a "grad student
project", or it could be a paper that deserves a Nobel. But it still falls
firmly in the "improbable as a weapon" category of threats. I'm still far more
scared of XDR TB than hypothetical synthetic viruses.

------
gboudrias
That's a pretty terrible title. What about H5N1? Why should I click? Are you
just trying to be scary?

~~~
tln
It's the article title and a key topic in the article.

"Also in 2011, some researchers figured out how to reengineer H5N1—avian
influenza virus—to make it much scarier by causing five mutations at the same
time that all together made the virus both easy to spread and quite lethal"

~~~
gboudrias
I'm not blaming the HN submitter, I'm blaming the author of the blog post.
It's not just a key topic, it's the subject. You can't title a programming
article "Programming", it has to be about something. Unless you're going for
clickbait, which seems to work.

------
drzaiusapelord
>Unlike an atomic bomb, which has grave local consequences

This is a little dismissive. Nuclear war under any plausible scenario wouldn't
be an isolated 1945-type event. It would be a global event that drew in other
players and more than likely would conclude in a mass launch by one of the
world powers. We're not nuking Paris, Moscow, or DC and walking away. There
will be retaliation.

Unlike biotech, these things are here, ready, and primed to hit targets. If
there's a tail risk to worry about, it's human extinction via nuclear arms.

------
ve55
Although biotechnology is definitely a significant risk, there are some other
things such as nanotechnology, flawed super-intelligences, and transhumanism-
related issues that should rank very highly as well.

See
[http://www.nickbostrom.com/existential/risks.html](http://www.nickbostrom.com/existential/risks.html)
for a nice summary of potential existential risks to humanity.

------
happywolf
The main issue is the breakdown of _trust_. Twenty years ago when I took
planes, I could bring anything that was reasonable: water, a shaver, shampoo,
etc., and the security check was minimal. Now look at what is happening: we
are treated like criminals. We are patted down, everything is stripped to go
through bomb detectors and metal detectors, and a lot of the time we need to
take off our shoes and belts to go through the x-ray machines. Why?

Before we start accusing others of being 'terrorists', first step back and
think: why would someone sacrifice their life to attack us? They are crazy
people, some may say. But why do we suddenly see so many 'crazy' people in
recent years? What caused them to be so desperate and so angry, to the extent
that they want to waste their lives doing damage?

There are countless ways to cause mass damage in modern societies, and
unless we can understand the _root cause_ of the attackers' motivation, trying
to seal off every potential attack vector is no more effective than trying to
remove weeds from the ground without pulling out the roots.

------
larrys
"Trying to keep things secret is not the answer. "

Disagree. Of course it's certainly part of the answer.

If not then why not publish all the details of when you are home and how your
house is protected for anyone to see? And exploit if the appropriate "nut"
decides to? Some walls are helpful as a barrier.

Security (by obscurity?) does provide some protection. Locks do keep some
people out. Going in the other direction and making things easy and readily
available for an attacker is not a solution to making things safer.

------
prirun
Bacteria are constantly mutating and looking for new ways to kill us. And
since antibiotics are over-prescribed and being added to livestock feed as a
preventive measure rather than to treat animals that are actually sick,
currently available antibiotics are losing their effectiveness.

Drug companies are no longer doing research on new antibiotics, because
antibiotics actually cure things and hence are not profitable. Today's drug
companies are only interested in treatments that last a lifetime. Pfizer was
the last drug company doing research into new antibiotics, and they closed
that division because it wasn't (and wasn't going to be) profitable.

I think the fact that drug companies and medicine in general are not focused
on important problems, but only on profitable problems, is a major issue we
should be addressing. See the recent Frontline documentary "Hunting the
Nightmare Bacteria"

[http://www.pbs.org/wgbh/pages/frontline/hunting-the-
nightmar...](http://www.pbs.org/wgbh/pages/frontline/hunting-the-nightmare-
bacteria/)

------
argumentum
Very stimulating article.

Minor quibble: I've never thought (most) people actually "fear" terrorism,
rather they have (justified) anger and perhaps an excess feeling of "something
should be done" since terrorists are actual human beings who can be brought to
account.

Perhaps a better comparison would be fearing airplanes over cars, but there I
think much of the fear is in the novelty of the flying experience.

The comparison with nuclear weapons is really interesting, particularly as the
response to fear of nuclear annihilation on the part of most people was
overblown (digging bomb shelters under houses etc). On the other hand, the
actions of the relevant governments (US/USSR) were mostly rational in a game-
theoretic sense (acknowledging the prisoner's dilemma at hand).

Biotech may be different, as Sam mentioned, since only nation-states have the
means to build nukes. On the other hand, computers/the internet still work
despite Y2K and rtm's worm :)

------
molsongolden
Are there reasonable precautions that can be taken at an individual level to
prepare for a pandemic type event?

~~~
Udo
Reasonable? Probably not. You should have about a week of food and supplies in
the house, in case of any kind of disaster (stocking up more is most likely
useless). Depending on how the disease spreads, you could get some surgical
masks and latex gloves to mitigate your chance of infection. In the end
though, none of it is going to be really useful. If/when a pandemic sweeps the
planet, by definition a large number of people are going to get ill, so
chances are, for all the preparation, we're all still going to be sick at some
point.

------
javajosh
I'm not sure if this is the first science fiction story to tell of a lone
biologist creating a plague, but Frank Herbert's "The White Plague"[1] was
written in 1982 and presages a time (in 1996 no less) when a single actor,
motivated by personal tragedy, could create and release such a thing.

In a sense though, this notion that technology can give individuals
unimaginable power has also been a theme with Robert Heinlein (I think), where
he wrote about an easy-to-manufacture superweapon which basically gave anyone
the ability to destroy the world. It was also a theme in Kurt Vonnegut's
"Cat's Cradle", although that's a little bit more of a stretch.

It's true that our destructive capability scales with the amount of energy
that we can harness. And it's also true that biological threats are under-
perceived. But it seems to me that biology is inherently messy enough to
avoid being too susceptible to annihilation. That is, yes, it may be possible
to create a
plague that wipes out 50% of humans - but in the scheme of things, that's not
the end of the world. Certainly not the end of humans (not even close!). And
it seems that the likelihood of creating such a pathogen is exceedingly small.
Indeed, I'd estimate that you couldn't kill more than 10% of people with a
single plague.

But since we're talking about catastrophe, it's interesting to wonder about
whether a single human could end the world, and if so how. The most likely way
(and rather dramatic way) would be to maneuver a large asteroid to impact the
Earth. Energetically, it's entirely possible to do. As for nuclear war, I'm
not entirely convinced that would really be the end for humans - although it's
certainly possible. Another way to end the world _might_ be to release mega-
tons of CFCs into the upper atmosphere, intentionally destroying the ozone
layer. I can't think of any other scenarios!

[1]
[https://en.wikipedia.org/wiki/The_White_Plague](https://en.wikipedia.org/wiki/The_White_Plague)

------
abalone
He doesn't offer much support for his prescriptions.

Why is it bad to "try to keep things secret" but good to "spend a lot on
proactive defense"?

How would that apply to hydrogen bombs? Should we open source the specific
details on how to engineer a maximally efficient hydrogen bomb from the most
accessible materials, and just spend a lot on hydrogen bomb defense? (Which is
what exactly?)

When you omit support for conclusions, it implies you think the reasons are
obvious. But it is not obvious that we should do away with efforts at secrecy
around hydrogen bomb tech. Nor is it obvious we could defend ourselves from
widely available thermonuclear bombs by being "proactive". It's a hand-wavy
answer that appeals to the "information wants to be free" sentiment, but it is
not actually well supported.

~~~
htns
In the case of nukes, the key technology, gas centrifuges, was essentially
invented by civilians and then made very public by being sold all over the
world. The US tried to keep it a secret on its end but that achieved nothing
but the stagnation of that particular technology in the US. "Proactive"
international control is literally the only thing keeping every nation from
getting nuclear arms.

------
rayiner
> So everyone smart says that we worry about terrorism way too much, and so
> far, they’ve been right.

And the people who are even smarter realize that people will worry about what
they will worry about, and respond to threats proportionally to how much
people worry about them rather than chiding them about how much they should
worry about things.

Human beings aren't rational when it comes to fear, but the products of that
fear are very real. We live in a world where people freak out if an adult
talks to a child, but happily drive their kids around in the death traps that
are motor vehicles. Not only that, but we've gone to great lengths to
structure our society to treat the former as abnormal and the latter as totally
normal. Telling people to be rational isn't going to make them that way.

------
Fomite
On the virus he's actually writing about and the "gain of function studies"
that caused so much controversy, part of the risk is not terrorists, or bad
guys, or any ill intent whatsoever.

This kind of research is conducted in BSL-3 labs, and there's a not
insignificant number of laboratory accidents, accidental exposures, etc. in
those labs, by well-intentioned, well-trained people.

I saw a presentation recently that estimated, using fairly conservative
numbers, that 10 labs working on those viruses for 10 years had ~1600 deaths
in expectation. Now that distribution isn't normal - lots of zeros and then
some rare but catastrophic outcomes, but like many things, it doesn't require
anyone to do anything actively malign. Just screw up.
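
The shape of that estimate is easy to reproduce as a back-of-the-envelope
expected value. A minimal sketch (every parameter below is invented for
illustration; the cited presentation's actual model and numbers are not given
in this comment):

```python
# Toy expected-value sketch of lab-accident risk. All parameters are
# made up; they are chosen only so the product lands near the ~1600
# figure quoted above.

def expected_deaths(labs, years, p_release, p_pandemic, deaths):
    """Exposure (lab-years) x P(release per lab-year)
    x P(pandemic | release) x death toll of that pandemic."""
    return labs * years * p_release * p_pandemic * deaths

ev = expected_deaths(labs=10, years=10, p_release=0.002,
                     p_pandemic=0.01, deaths=800_000)
print(ev)  # roughly 1600
```

The point about the distribution not being normal shows up here too: almost
every lab-year contributes zero, and the expectation is driven entirely by
the rare, catastrophic branch.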

------
source99
I'm confused.

We shouldn't worry about terrorism because the likelihood of dying from it is
very low, but we should worry about genetically engineered viruses because the
likelihood of dying from them is very high?

Last time I checked, NO ONE has ever died from a genetically engineered virus.

------
Dirlewanger
This is preaching to the choir, especially on this site of self-proclaimed
technologists. Yeah, we know more oversight/regulation needs to happen in
certain nascent industries. We also know how awful things could get; you just
told us. Blog posts like these are just wasting breath. Getting out and _doing
something_ and getting involved in the political process now is what will be
helpful...so when the last baby boomer in a position of political power
finally shuffles off the mortal realm (and we can have a weeklong celebration)
we will have well-educated people on the issues that matter ready to ascend to
power.

Then again, who am I kidding. We're talking about politics.

~~~
hotpockets
The main point of the article is not a point which is widely discussed, known,
or accepted. It has nothing to do with oversight/regulation. The point is that
offensive biotechnology has progressed way ahead of defensive biotechnology.
In other words, we know how to engineer viruses but not how to engineer the
immune system. The human immune system is vastly more complicated than
viruses. Therefore, the author appears to be calling for vastly more funding
into defensive biotechnology. In other words, though research into DNA
modification technologies (such as viral engineering) is already heavily
funded, research into therapeutic methods and immune response should be much
greater than most are even considering.

My own personal opinion is that that funding should be increased orders of
magnitude; a few trillion over the next decade seems judicious.

------
antirez
The real effort required to create such a virus is unclear, but I always
wonder why governments have never tried to make humanity more resilient to
attacks of this kind (natural or artificial) through education. The incredible
thing about viruses is that if you have a disciplined population that stays
home as much as possible, avoids contact during necessary interactions, and so
forth during an epidemic, you can do _wonders_ at containing the event. But
for some reason we are not prepared at all to respond rationally to such an
event.

------
lynchdt
"But another possibility is that we engineer the perfect happiness drug, with
no bad side effects, and no one wants to do anything but lay in bed and take
this drug all day, sapping all ambition from the human race."

That would be a pretty bad side effect in itself. There are some pretty nice
drugs that approximate what you say, yet as a population we mostly get on with
our lives.

It's not in our nature to be satisfied with any persistent state - positive,
negative or neutral. So as a tail risk worth worrying about I'd put this in
your terrorism category.

------
revelation
Is this satire? Why would you start a post with a description of why the
remainder of said post is pointless fearmongering?

There's plenty of dangerous technology out there right now; we need not
conjure superviruses. The only thing saving us is, as usual, the incompetence
and scarcity of those who actually want to cause harm on a big scale. Don't
think for a second that it's the security theatre that keeps the numbers down,
or the lack of weapons of mass destruction.

------
api
I'm personally kind of amazed it hasn't already happened, either through
intentional (mis-)tinkering or natural mutation. The reason I'm surprised is
air travel. People fly everywhere, and any easily transmissible pathogen that
appears ought to spread like wildfire.

It means one of several things:

(1) Humans are more resilient against plagues than we think.

(2) It's harder to produce a super-disease than we think, so it's a very rare
event.

(3) We've just been lucky as hell.

------
erkkie
Would it be possible to defend against these kinds of attacks by creating (and
spreading) less lethal forms of the killer virus (assuming a general
capability to design viruses but not the immune system) to bring overall
lethality down (similar to how MRSA spreads less when natural competitors are
present)? This would solve the offensive biotech / defensive biotech dilemma
to a degree.

------
lucb1e
> Based on current data, you are about 35,000 times more likely to die from
> heart disease than from a terrorist attack.

With that kind of logic we can get anywhere. For example, you are _way_ more
likely to die after breathing air than after 'breathing' water. It just takes,
say, 75 years on average.

Comparing this with other non-natural death causes, such as murder, would be
much more fair.

~~~
PeterisP
If we're talking about decisions of the type "should I do something to protect
myself against X", "should my government invest $$$ to protect us all against
X", or "should we sacrifice Y to gain protection from X", then in all these
scenarios it makes perfect sense to compare heart attacks with terrorist
attacks.

You can protect and save orders of magnitude more lives by focusing on the
real threats and ignoring terrorists.
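The base-rate comparison both comments are circling can be sketched with round numbers. The figures below are assumptions of roughly the right order of magnitude (a few hundred thousand US heart-disease deaths per year against a double-digit count of terrorism deaths), not the article's actual source data:

```python
# Rough base-rate comparison; both inputs are assumed round figures,
# not the article's source data.
heart_disease_deaths_per_year = 610_000  # assumed US annual figure
terrorism_deaths_per_year = 17           # assumed US annual figure

ratio = heart_disease_deaths_per_year / terrorism_deaths_per_year
print(round(ratio))  # 35882, the same ballpark as the article's 35,000x
```

The point of the exercise is that the conclusion is insensitive to the exact inputs: you could be off by a factor of two on either number and the ratio would still be tens of thousands to one.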

------
FollowSteph3
Unlike, say, the Homebrew Club, similar hackers exploring viruses could
accidentally release something. All it takes is one oops. Looking at the early
computer days, security was never a primary or even secondary concern, if it
was considered at all, and yet amazing things were created. The difference is
that one is local while the other can spread with no control...

------
ivanhoe
Speaking of the irrationality of fears: people are so obsessively frightened
by the idea of "mad scientists" doing weird genetic experiments that go wrong,
and at the same time completely ignore the far more realistic horror stories
that are almost imminent, like bacteria becoming broadly resistant to all
known antibiotics...

------
peteretep

        > But another possibility is that we engineer the perfect
        > happiness drug, with no bad side effects, and no one wants
        > to do anything but lay in bed and take this drug all day,
        > sapping all ambition from the human race.
    

How does this compare to properly administered medical-grade morphine?

------
acidburnNSA
Nathan Myhrvold makes nearly the same point, with lots more detail, in his
Strategic Terrorism paper:
[http://www.lawfareblog.com/wp-content/uploads/2013/07/Strategic-Terrorism-Myhrvold-7-3-2013.pdf](http://www.lawfareblog.com/wp-content/uploads/2013/07/Strategic-Terrorism-Myhrvold-7-3-2013.pdf)

------
streptomycin
If you're interested in more, start with the first time people got concerned
about this problem, nearly 40 years ago:
[http://en.wikipedia.org/wiki/Asilomar_Conference_on_Recombin...](http://en.wikipedia.org/wiki/Asilomar_Conference_on_Recombinant_DNA)

------
aero142
This area is just one that Bill Joy covers in
[http://www.wired.com/wired/archive/8.04/joy_pr.html](http://www.wired.com/wired/archive/8.04/joy_pr.html)
_Why the Future Doesn't Need Us_, which is my favorite essay on lots of these
ideas.

------
jpatokal
Technical impossibility aside, I take issue with the whole assumption that
there exists a Hollywood-movie "bad guy" who would take it upon himself to
create a killer virus.

Yes, there are terrorists who do not shirk from mass murder _to achieve their
goals_, e.g. flying planes into the heart of the enemy's military and
financial centers. But creating a virus that respects no religion or national
boundary and will kill _everybody_ it touches serves no rational goal.

The only reason for somebody to do this would be if the extermination of the
human race were the actual goal, and that's so far out that even certified
nutcases like Aum Shinrikyo, the Japanese subway sarin attackers, would
blanch. The ideology of these groups is invariably that, while the rest of the
human race may be doomed, _they_ are the Chosen Ones who will survive; but
viruses don't play dat. It would thus only make a smidgen of sense to do this
if you had an absolutely solid antidote/vaccine to ensure your group could
survive the onslaught... and if that exists, the rest of humanity can develop
one as well.

------
jcfrei
I absolutely agree. And I believe governments nowadays are doing too little to
prepare us for a potential epidemic of a lethal virus. In my opinion, a
reasonable measure would be an emergency plan that (in a matter of a few days)
could provide all households with enough food for a month-long curfew.

------
mrcactu5

      hacking our bodies will likely be more powerful than hacking bits
    

on some level we've been doing this for thousands of years. it is a matter of
time before we take it to the next level. and there is lots of interest there.

------
debugunit
This is unrelated to the topic at hand, but Sam Altman's blog has, at some
point in the past week, been blocked by the corporate firewall I'm behind
(large international bank). How long till they block HN, I wonder.

------
wensing
If you're interested in building risk-based applications, come talk to me
about Pulse OS and Riskpulse -
[http://riskpulse.com/offerings/](http://riskpulse.com/offerings/)

------
badjujubees
"Based on current data, you are about 35,000 times more likely to die from
heart disease than from a terrorist attack."

Could we please get a link / source to this current data?

------
fleitz
Given the history of weaponization, it would be illogical to worry about 'bad
guys'; instead, we should be worried about the US Government.

~~~
Udo
Nuclear annihilation didn't happen because a few countries showed some
restraint. In the coming decades, the capability of manufacturing deadly
agents will be shifted to individual people. In fact, one could argue this is
already happening. So our future will most likely be shaped by the behavior
of billions of people, most of whom could take a serious stab at causing mass
casualties if they actually wanted to. To top it off, this shift of
capabilities to the individual carries a small but non-negligible chance of
ending our civilization outright. This is absolutely something to worry about.

~~~
fleitz
Probably time to give individuals sovereignty similar to that of nations. What
will likely occur, as with nuclear proliferation, is that individuals will
realize that the only way to achieve nation-like sovereignty is by developing
these sorts of technologies.

Congrats to the repressive nations who are incentivizing the development of
these technologies by being oppressive.

