
Adams Complexity Threshold - alexandros
http://dilbert.com/blog/entry/adams_complexity_threshold/
======
stcredzero
_The Adams Complexity Threshold is the point at which something is so
complicated it no longer works.

The Gulf oil spill is probably a case of complexity reaching the threshold. It
was literally impossible for anyone to know if the oil rig was safe or not.
The engineering was too complex. I'm sure management thought it was safe, or
hoped it was safe, or hallucinated that it was safe. It wasn't possible to
know for sure._

Disagree. If the oil companies had put together a consortium to research such
"worst case" scenarios and develop equipment and procedures to deal with them
quickly, then this would have been preventable. (How about an even worse
scenario: oil rig blows up, sinks, smashes the blowout preventer. Even Michael
Bay can think of that one.) The oil companies could've done this, but chose
instead to gamble on what they thought was a "good enough" lower level of
safety.

No one can be 100% sure all of the machinery and safety mechanisms and
procedures will work, but it is possible to make the likelihood reasonably
small and then _add an additional fallback position_ that you are pretty
certain will work. The oil companies didn't have a fallback beyond the blowout
preventer that they knew would work in a short timeframe.

An unreasonable expectation? Perhaps. Technology companies do this with their
data all the time. Of course, protecting data is 1) much cheaper and 2)
protects an asset rather than preventing an unfortunate externality.

~~~
pigbucket
> _No one can be 100% sure_

The problem is complexity plus scale. Do something potentially disastrous
100k times, and you need to be impossibly close to 100% sure you've
made it safe for every case. 99.9% doesn't cut it. BP's fallback position is:
if it blows, and we can't stop it, we'll drill a relief well. That's the kind
of decision that looks ethically bankrupt, probably is ethically bankrupt, and
is nonetheless going to be made every time unless all the other oil companies
also agree to have, or are forced to have, the fallback already in place.

------
chaosmachine
Pre-drill a relief well. It's not that complex. Actually, some countries
already require it.

~~~
ovi256
Drill a relief well. It's not that big of a deal.

------
pchristensen
_The Checklist Manifesto_ ([http://www.amazon.com/Checklist-Manifesto-How-
Things-Right/d...](http://www.amazon.com/Checklist-Manifesto-How-Things-
Right/dp/0805091742/) ) is exactly about managing excessive complexity.

------
bh42
Adams is a smart and funny guy, but his great talent for simplifying often
leads him into embarrassing oversimplification.

His bold assumption that no one could be sure if the BP rig was safe is
complete BS. Watch the 60 Minutes report, listen to the interview, LOTS of
people knew exactly how unsafe it was.

The same goes for Enron: both Bethany McLean and James Chanos pointed out
that the emperor had no clothes long before Enron imploded.

Just because Scott Adams can't comprehend something, it does not mean that it
is incomprehensible.

It is true that our modern lives are very complicated. But primitive peoples
possess staggering amounts of knowledge, which just happens to be essential
for their survival. We all had knowledge like that a long time ago. I have no idea
exactly who knows more, but I bet we are all in the same ballpark, because we
are all humans.

~~~
Gormo
As they say, hindsight is 20/20.

It might be relatively easy post-hoc to trace a problem that has already
occurred back to a relatively small set of causes, and then ask why these were
overlooked - after all, an ounce of prevention is worth a pound of cure.

But when the system itself is so complex that any of a thousand other factors
might have gone wrong instead, you're really comparing a thousand ounces of
prevention to a pound of cure. You can get really caught up in all of the
plausible contingencies, which grow disproportionately with complexity.

If you were to audit that oil rig inch by inch, you might have found dozens of
things that could have led to a problem of nearly the same impact as the
current leak. And you might have even focused on fixing _all_ of them. But you
can't presume that some other factor that you didn't notice (or couldn't have
noticed given the finite nature of your knowledge) wouldn't have triggered a
slightly different failure, and you can spend an enormous amount of time
fixing things that would not have caused a failure.

Adams makes a good point - the best way to ensure the reliability of a system
is to understand the functioning of the system as a whole, and beyond a
certain scale of complexity, this is extremely difficult and often impossible.

------
macrael
Not sure I buy his point about how engineers and other professionals are the
only ones who manage complexity. I have never taken a poly-sci class before,
but my guess is that the whole point is to work to reduce the enormous
complexity of differing political systems into something you can understand.
And, as a single data point, my English major/aspiring author friend is one of
the deepest thinkers I know. He regularly surprises me by being able to distil
complex situations into something manageable. Engineers definitely don't have
a monopoly on that.

------
jcl
_Complexity is often a natural outgrowth of success. Man-made complexity is
simply a combination of things that we figured out how to do right, one
layered on top of the other, until failure is achieved._

A great line... Reminds me of:

"Debugging is twice as hard as writing the code in the first place. Therefore,
if you write the code as cleverly as possible, you are, by definition, not
smart enough to debug it." -- Brian W. Kernighan

------
arohner
Things like <http://abcnews.go.com/print?id=10763042> leave me unconvinced
that the BP spill was just an accident that could have happened to any of the
oil companies.

