
A fundamental issue here is that people inherently misjudge the competency of their peers, and the state of other projects, components, and departments. Thus when they have a "problem," they assume that their single point of failure won't be catastrophic (because everything else is okay), and that there's no need to sound an alarm over it (and potentially jeopardise their career).

Besides, it will be fixed soon enough. Except, then something else comes up, and the last fault goes unresolved.

If our work culture were less focussed on success and blame and more focussed on communication and effort, fewer catastrophes would happen, I'm sure. Unfortunately, the world doesn't work like that.

Closely related are Celine's Laws, particularly the second:

Accurate communication is possible only in a non-punishing situation.

From Robert Anton Wilson's Illuminatus! trilogy.

From Wikipedia: "communication occurs only between equals." Celine calls this law "a simple statement of the obvious" and refers to the fact that everyone who labors under an authority figure tends to lie to and flatter that authority figure in order to protect themselves either from violence or from deprivation of security (such as losing one's job). In essence, it is usually more in the interests of any worker to tell his boss what he wants to hear, not what is true.


This reveals the main point that emerges from (but is not necessarily stated in) the article: the failure of complex systems is most often the result of poor management practices.

Management, above all, is responsible for an organization that can enable quality through systematic means. There are no other means.

In the case of smaller organizational systems (projects, companies, possibly even states), yes.

At the largest scale, I find the analyses of Diamond and Tainter come into play. The capacity to survive smaller crises and overcome them only increases the magnitude of your final failure, though Diamond suggests a few means by which failure may be averted (Tainter seems to find it inevitable).

Ultimately, the resources required to maintain a system prove insufficient.
