
Normalization of deviance in software: broken practices become standard (2015) - sidcool
https://danluu.com/wat/
======
partycoder
How would you feel if:

- you were a chef. You went to culinary school and know about cooking and
best practices.

- you are asked to forget everything you learned in school and flip burgers
with overly reused oil and unwashed equipment.

- you are told that there's no time to clean the kitchen or wash your hands;
just flip as many burgers as possible to maximize profit.

- the ones who cook the most burgers are rewarded, treated well, and promoted.
As soon as cooks realize this, they start making half-cooked burgers that make
people sick.

- the customers get sick all the time from the lack of hygiene. You talk
about it, and you are told to keep doing what you are doing, or that you don't
understand what being a chef is, or what the business is.

- in the end, the only source of gratification is getting a paycheck, or doing
something in your spare time. You are not cooking real food, just working as
part of a machine that makes money for a self-serving company while ripping
off the customers.

This is what it feels like to be an engineer at many companies.

(disclaimer: repost)

------
LeoJiWoo
I take some issue with the statement "Workers are afraid to speak up".

Workers who don't own some means of production of their own are working for
someone else, so it's reasonable to be afraid when your livelihood is at
stake; people have families to support. The point about cultures of
"meanness" and "niceness" misses the mark. People are afraid of being
out-grouped or becoming a social pariah for reasonable reasons: losing a job,
getting blacklisted, being undermined secretly, and so on.

In the current climate of extreme witchhunts, it is wiser to be quiet rather
than risk the wrath of fickle masses.

EDIT: The speaking up context is in reference to project/organization feedback
(office politics not government politics).

~~~
pjc50
Some say "extreme witchhunts", some say "persistent offenders who have relied
on their privilege for years are finding their Get Out Of Jail Free card
doesn't work any more".

~~~
Sniffnoy
These aren't mutually exclusive.

------
twic
_Knowledge is imperfect and uneven. People don’t automatically know what
should be normal_

This has been a big one in my experience. Because our industry is growing
fast, at any point in time, most people have not had very much experience. I
worked at a company which had hired a load of bright young things straight out of
university and trained them up. They were great people. They also had no idea
that a lot of what they were doing was really bizarre and unproductive. Or, if
you hire people who have had a few jobs, they're often not very varied: for
all values of 'Rails', someone with five years of moving around doing Rails
will know a lot about Rails, but won't be in a position to appreciate just how
screwed up a lot of things in the Rails world are.

It's hard to escape from this trap, because even when you look outside your
office, there's so much duff information available. You read an enthusiastic
article about CORBA/XML/model-driven
architecture/SOA/MongoDB/microservices/lambdas/blockchain/etc; as someone with
a few years' experience, in which time you've never touched any of these
things, how on earth are you supposed to be able to work out that it is in
fact terrible advice?

~~~
barrkel
Less ageism? Less pop culture? Fundamentally, less fun and more seriousness?

Because it's so much fun to create new things in our industry, so much new
stuff gets created. Often it's only marginally better than what came before,
or more frequently, it's only better in one dimension, and significantly worse
in other dimensions. But it gets adopted because it has social currency,
recency, buzz, etc. and all the deficiencies slowly get filled in. And by the
time it's mature, something else new comes along.

As long as we keep hiring cheap kids to build our stuff, they'll keep
reinventing the wheel, and the workforce available to hire will only
understand, and more importantly only want to work with, the latest popular
fads.

~~~
chris_t
A big reason for the bias towards new things is that greenfield development is
easier and more fun, especially in organisations that aren't capable of
producing good software. Learning a new framework every couple of years makes
sense if it means you get to avoid having to work on confusing spaghetti code
from a decade ago.

------
glitcher
"Perfect is the enemy of good" is another common argument I've heard used as
an excuse to avoid implementing process improvements. Or the 'ol catch 22 of,
we just don't have the time to invest in processes that will save us all time!

~~~
durbatuluk
That fucking phrase was the motto of one of our devs. Fast-forward one year
and we still haven't delivered the product to the client.

~~~
Pharylon
That was hung on the door to the office of the DBA at my last job.

One day, we had a catastrophic loss of data due to a junior developer running
a destructive script against the production server instead of the test server.
We went to restore the database, only to discover the hourly snapshots had
been failing for months and he hadn't noticed.

~~~
sitkack
Almost all system failures are due to open-loop processes. To evolve, you need
to rely on feedback.
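The failed-snapshots story above is a textbook open loop: backups ran on a schedule, but nothing ever consumed their output. Closing the loop can be as simple as a scheduled check that verifies the result and alerts on failure. A minimal sketch, assuming (hypothetically) that snapshots land as `*.dump` files in one directory:

```python
from datetime import datetime, timedelta
from pathlib import Path

def check_snapshots(snapshot_dir, max_age=timedelta(hours=2)):
    """Return (ok, message) for the most recent snapshot in snapshot_dir.

    Closing the loop: instead of assuming hourly backups succeed,
    verify that a fresh, non-empty snapshot actually exists.
    """
    snapshots = sorted(Path(snapshot_dir).glob("*.dump"))
    if not snapshots:
        return False, "no snapshots found"
    latest = snapshots[-1]
    if latest.stat().st_size == 0:
        return False, f"latest snapshot {latest.name} is empty"
    age = datetime.now() - datetime.fromtimestamp(latest.stat().st_mtime)
    if age > max_age:
        return False, f"latest snapshot {latest.name} is {age} old"
    return True, f"ok: {latest.name}"
```

The important part is what happens with the `False` case: it has to page someone, not land in a log nobody reads, or the loop is still open. (Periodically restoring a snapshot is the stronger version of the same idea.)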

------
tedunangst
Two interesting quotes from the linked paper:

> It is very important that after the meeting, the offending employee receive
> an official, written summary of the meeting which outlines the next steps in
> the remediation. The employee should be required to acknowledge that the
> summary is accurate. Interveners might have to repeat this last step,
> because the rule offender can find it distressing to acknowledge the
> meeting’s having taken place

> Richard Cook (1998) has pointed out that complex systems such as healthcare
> are intrinsically hazardous in that they invariably contain changing
> mixtures of failures, weaknesses, and expertise, and always run in a
> “degraded” mode.

By always running in degraded mode, it's not meant that degraded is the normal
state, but that even in a degraded mode everything continues to run. You don't
realize the extent of the degradation as long as the gears keep turning.

------
lkrubner
Self-sabotage is remarkably common. Peter Drucker wrote about this in his book
"Innovation And Entrepreneurship". His book is full of anecdotes similar to
this article. It is surprising how many managers do things that clearly hurt
the business they work for.

Drucker told a story from 1956, about an executive at Macy's who was angry
that so many customers were coming in to buy washing and drying machines.
Drucker was confused, and asked if the machines had a lower margin than other
products. No, said the manager, they had a higher margin. Then why, asked
Drucker, was it bad that customers wanted to buy the machines? The executive
explained that Macy's was all about selling clothes, not machines, and the
customers were going to damage Macy's reputation. The executive had
previously worked in the clothing department, for decades, and he felt the
clothing department was the "real Macy's" and everything else was a sideshow.

Drucker has more stories like that. It is very common that managers have some
initial bias that establishes odd habits in a company, which over time become
normalized. I recently wrote about a similar incident, at a startup where
things were a bit odd from the beginning. Despite what economic theory
assumes, managers are rarely rational.

I consult with many clients in New York City, and I cannot begin to tell you
how often this is true:

" _Anyone who comes into one of these companies from Google, Amazon, or
another place with solid ops practices is shocked. Often, they try to fix
things, and then leave when they can’t make a dent._ "

In the last several years, I was at two companies that got rid of all their
previous code and did radical rewrites of everything:

[http://shermanstravel.com/](http://shermanstravel.com/)

[https://openroadmedia.com/](https://openroadmedia.com/)

but I worked at a dozen other companies where people knew that fundamental
systems were broken, and there was no effort to fix things.

About the politics of fixing broken companies, this is exactly true:

" _Google didn’t go from adding z to the end of names to having the world’s
best security because someone gave a rousing speech or wrote a convincing
essay. They did it after getting embarrassed a few times, which gave people
who wanted to do things “right” the leverage to fix fundamental process
issues. It’s the same story at almost every company I know of that has good
practices._ "

~~~
reificator
> In the last several years, I was at two companies that got rid of all their
> previous code, and did radical re-writes of everything:

Over the last few years the best work I've done has been to take what existed
and slowly refactor it into something better.

Well, I say slowly, but the refactors themselves were often quick. I try to
identify the most common pain points - for both users and developers - and
eliminate those as quickly as possible. The overall codebase might not get
better at a crazy rate that way, but the developer experience goes up, meaning
fewer bugs, meaning quicker iteration times and more confidence making changes
that _should_ be easy but so often are not.

And the end result of that, two or three years on, is a significant
improvement in the quality of the code, a drop in the number of bugs -
breaking or otherwise - that users encounter, and oftentimes a manyfold
decrease in the total LOC that needs to be maintained.

I've tried rewriting some things as well, but even when it's a literal
requirement in order to switch underlying platforms, it still causes a lot
more friction than refactoring does. I've got three projects right now where
the deadlines have massively slipped, and I'm now at the point where I'm
looking to take some of the rewrites and backport them to the original
platform, just so that the migration goes a little easier when the rest of
the pieces fall into place.

------
Alex3917
In defense of Flaky, the most common use case is for functional tests, since
the browser can just run out of resources and crash for reasons that have
nothing to do with your code.

If your own tests aren't idempotent then that's obviously a problem, but as
test suites get better that's becoming less of an issue.
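For context, "Flaky" here presumably refers to the test-rerun plugin that marks a test as allowed to fail and be retried. The core idea can be sketched in a few lines (a simplification, not the plugin's actual implementation); the docstring states the idempotence caveat from the comment above:

```python
import functools

def retry_flaky(max_runs=3):
    """Rerun a test that may fail for environmental reasons (e.g. the
    browser running out of resources), passing if any attempt succeeds.

    Only safe for idempotent tests: every run must start from the same
    state, otherwise retrying just hides state corruption.
    """
    def decorator(test_fn):
        @functools.wraps(test_fn)
        def wrapper(*args, **kwargs):
            last_exc = None
            for _ in range(max_runs):
                try:
                    return test_fn(*args, **kwargs)
                except Exception as exc:  # assertion failure, crash, timeout
                    last_exc = exc
            raise last_exc  # all attempts failed: report the last error
        return wrapper
    return decorator
```

This is roughly what decorating a test with the plugin's `@flaky(max_runs=3)` gets you, minus the reporting.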

------
YeGoblynQueenne
>> The data are clear that humans are really bad at taking the time to do
things that are well understood to incontrovertibly reduce the risk of rare
but catastrophic events.

The big problem with the "normalisation of deviance" is that humans are both
very smart and also get bored, or irritated, very easily. The example of the
anesthesiologist who turned off the anoxia warning is indicative.

But if that's how humans _work_ , then that's something that needs to be taken
into account when designing, essentially, security systems. We can't ignore
the propensity of people to look for shortcuts to repetitive or boring tasks
any more than we can ignore the fact that we tend to have our hands at the top
of our body, or that we can't withstand extremes of temperature.

We wouldn't design a machine that had all its controls placed near the ground,
too low to be manipulated easily by the average human; we wouldn't send
unprotected humans to work in environments where temperatures could reach 1000
°C. Why would we give people tasks that would bore them to death or make them
so mad that they would risk harm to themselves or others to avoid performing
them?

The IAEA [1] has a bunch of papers on radiological accidents on its website,
and they are full of situations where people have literally hacked through
security measures meant to stop them from blasting themselves with lethal
doses of radiation, and fried themselves to death as a result.

For instance, see [2], [3] and [4]. These are three cases, one in El Salvador,
one in Israel and one in Belarus, where an experienced operator of an
irradiation facility overrode security mechanisms to enter the irradiation
chamber and unblock a mechanism transporting packages over a radioactive
source [5]. In [3], the chamber was designed like a D&D dungeon, complete with
a concrete maze to absorb radiation, and actual traps: motion and pressure
sensors that, when triggered, would immediately place the source at a safe
position inside a dry pit and even a retractable section of floor near the
maze entrance that created a pit too long to be jumped over. They never found
out exactly how the operator crossed that pit. But he did, and he paid for
his ingenuity with his life.

We are smart little monkeys, we get bored easily and we always find a way to
cause untold carnage. Those are factors that no design can ever afford to
ignore.

__________________

[1] International Atomic Energy Agency.

[2] [http://www-pub.iaea.org/books/IAEABooks/3798/The-Radiological-Accident-in-Soreq](http://www-pub.iaea.org/books/IAEABooks/3798/The-Radiological-Accident-in-Soreq)

[3] [http://www-pub.iaea.org/MTCD/publications/PDF/Pub847_web.pdf](http://www-pub.iaea.org/MTCD/publications/PDF/Pub847_web.pdf)

[4] [http://www-pub.iaea.org/books/IAEABooks/4712/The-Radiological-Accident-at-the-Irradiation-Facility-in-Nesvizh](http://www-pub.iaea.org/books/IAEABooks/4712/The-Radiological-Accident-at-the-Irradiation-Facility-in-Nesvizh)

[5] To sterilise the contents.

------
tedunangst
(2015)

~~~
coldtea
Might as well have written (evergreen)

