
Software has its own Gresham's Law - johndcook
http://www.cs.uni.edu/~wallingf/blog/archives/monthly/2015-06.html#e2015-06-14T09_17_33.htm
======
nostrademons
The same dynamic shows up in a number of other systems. For example:

The Peter Principle. People who are performing well in their current role get
promoted out of that role, until they reach their level of incompetence. At
that point, they get stuck. Eventually the whole organization consists of
nothing but incompetent people.

Gerrymandering. A political incumbent who redraws his district to include more
supporters will last longer than one who doesn't. Eventually, _all_ districts
are gerrymandered, and all incumbents are virtually unassailable.

Vendor lock-in. A company that promotes consumer choice is easy to switch away
from; one that promotes lock-in, by definition, is hard to switch away from.
Eventually everyone will be buying from vendors they're locked into. (Unless
the former company's products are _much_ better - this was the strategy Google
pursued until ~2011. Note, though, that they achieved this through some
measure of _employee_ lock-in.)

Basically the only requirement is that "good" solutions are more liquid than
"bad" ones. Many, many systems exhibit this property.

You could look at most of modern society as a way to generate feedback loops
_on top_ of this dynamic to mitigate it. For example, an organization full of
incompetent people is likely to go out of business and be replaced by a new
one, often one founded by the very people who were driven out of the original.
(See: Disney => Pixar, Apple => NeXT, Shockley => Fairchild => Intel, Netscape
=> Firefox => Chrome.) Similarly, a company full of bad, hard-to-replace code
either embarks on a complete rewrite, or they're vulnerable to a startup
without that baggage.
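
The churn-until-stuck dynamic described above can be sketched as a toy simulation. Everything here is an illustrative assumption (the slot/step model, `p_replace`, `p_sticky`), not something from the comment itself:

```python
import random

def simulate(slots=500, steps=500, p_replace=0.5, p_sticky=0.1, seed=0):
    """Each slot holds a component. A 'liquid' (replaceable) component is
    swapped out with probability p_replace per step, and its replacement
    happens to be sticky (hard to replace) with probability p_sticky.
    Sticky components are never swapped again. Returns the final fraction
    of slots occupied by sticky components."""
    rng = random.Random(seed)
    sticky = [False] * slots
    for _ in range(steps):
        for i in range(slots):
            if not sticky[i] and rng.random() < p_replace:
                sticky[i] = rng.random() < p_sticky
    return sum(sticky) / slots

print(simulate())  # ends very close to 1.0: almost every slot gets locked in
```

Even with only 10% of replacements being sticky, repeated churn means nearly every slot eventually lands on one and stays there.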

~~~
crististm
'Basically the only requirement is that "good" solutions are more liquid than
"bad" ones' \- this could be used as a criterion to identify both good and bad
stuff

------
michaelfeathers
I look at code as biology. It competes in its environment. As it becomes more
complex it can fight off all competitors.

Code grows toward irreplaceability. That's why we are surrounded by code that
is hard/impossible to replace.

We shouldn't be surprised if code feels like Kudzu after it has been around
for 10+ years.

~~~
rumcajz
Interesting that nobody has brought up the topic of computer viruses yet.
People tend to grasp the evolutionary parallel better with parasitic software
than with symbiotic one.

------
stcredzero
Exactly this happened with a lot of Smalltalk projects. The ones which were
structured such that the Smalltalk Refactoring Browser parser could be used to
accelerate porting projects -- usually the better architected and factored
codebases -- could leave Smalltalk for other programming environments. Even if
syntax driven code translation wasn't used, the better factored projects were
still easier to port.

(And to head off the usual criticisms of automated code translation, this
tends to work well, when the project has well adhered-to coding standards and
patterns, so that idiomatic code in language A can be matched and translated
to idiomatic code in language B. In other words, if there is a consistent use
of project-level idioms, it's easy to do good idiomatic translation at the
language level. The other necessary ingredient is a powerful parser+meta-
language which can fully express the capabilities of the source and target
languages.)

------
bitwize
And this is why I hate systemd: its primary design criterion, it seems, is to
be as difficult to replace as practically possible -- in stark contrast with
sysvinit, OpenRC, etc. Once it's suitably entrenched it can simply be declared
"the standard" and then Linux systems without systemd will fall out of
compliance and hence out of support by the greater ecosystem.

------
bcg1
I disagree with Sustrik's assumption that software drifts to become a
collection of non-reusable components. His observation is an interesting
theory but I think it breaks down because it could only really be a "law" if
it is true that reusable (does he mean replaceable?) components are always
switched out for irreplaceable ones.

Most projects I've worked on start out as hairy messes, and if they are cursed
with success, new requirements will eventually justify the cost of replacing
the irreplaceable. Well designed components aren't quickly switched out for
poorly designed ones because developers don't want to let that happen. It's
not safe to assume that poorly written components are better suited to
survival... to the contrary, they are the most likely targets for removal in
the first place.

~~~
rcthompson
It doesn't require that replaceable components are always replaced with non-
replaceable ones. The point is that over time, a replaceable component is
likely to be swapped out multiple times until, probably by accident, it is
replaced with a component that is not easily replaceable, at which point it
becomes stuck (e.g. glue code is written specifically to interface with that
component and would be difficult to rewrite for a replacement).

In mathematical terms, you can imagine it as a Markov chain describing the
component currently slotted in, where non-replaceable components are the
absorbing states.
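
The absorbing-chain picture reduces to a one-slot toy model. The two-state reduction and the value of `q` are illustrative assumptions layered on the comment's framing:

```python
# One slot as a two-state Markov chain: "replaceable" (transient) and
# "irreplaceable" (absorbing). Each swap lands on an irreplaceable
# replacement with probability q, independently of earlier swaps.
q = 0.2  # illustrative: one in five candidate replacements is hard to replace

def p_stuck(n, q=q):
    """Probability the slot has hit the absorbing state within n swaps."""
    return 1 - (1 - q) ** n

expected_swaps = 1 / q  # mean swaps until absorption (geometric distribution)
print(round(p_stuck(10), 3))  # 0.893 -- probably stuck after only ten swaps
print(expected_swaps)         # 5.0
```

However small `q` is, `p_stuck(n)` tends to 1 as `n` grows, which is exactly the absorbing-state argument.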

~~~
geon
If the easily replaceable component is good, there will be no reason to
replace it at all. Components aren't getting replaced willy-nilly.

~~~
rcthompson
Yes, but easily-replaceable components are still going to be replaced more
often than difficult-to-replace ones.

~~~
geon
Zero * n is still zero.

------
omouse
You could see this in action with GCC if I remember correctly; they purposely
made it a monolith so that it was harder for proprietary plugins to be added;
therefore only GPL'd and LGPL'd components would be worked on.

------
gweinberg
The Gresham's law analogy is a crock. Gresham's law happens because the
government compels merchants to accept the bad money as being equivalent to
the good, but it can't effectively compel customers to spend the good money.

~~~
gwern
Yes, without government coercion, it's simply Thiers' law: good money drives
out bad. So if software tends to degrade over time, what's playing the
perverse role of preventing people from switching to better modules/software?

~~~
skybrian
There are of course all sorts of ways that a public API can become an
entrenched standard (x86, Windows, PHP, JavaScript, C++, HTML, Java).
Collectively we call it "lock in".

The thing is, the software wouldn't have been adopted in the first place if it
didn't fulfill some need. Evolutionarily speaking, it's best for the cost of
removal to be high and the benefit of keeping it to also be high. The most
evolutionarily fit would be software that's essential, yet complicated,
obscure, and unsexy, so it doesn't attract the attention of idealists who
would put sufficient resources into rewriting it.

Even better if it attracts passionate advocates who will _fight_ removal.

~~~
kansface
OpenSSL immediately comes to mind, but I don't think JavaScript really fits
the model - it is still evolving into a better language. In a very real sense,
ECMAScript 7 is an entirely different language than ECMAScript 5.

------
dkarapetyan
This makes sense at the level of programmers as well. A programmer who does
their job well is easily replaceable, because the code they write is easy to
maintain, and so they will be replaced by someone who is not as good a
programmer and writes less maintainable code.

~~~
braythwayt
Speaking through my long white beard, I would say this is not _trivially_
true. If we follow what the article says and say this is about survival in a
particular niche, then yes: programmers who are good at a specific job will
eventually be replaced at that job, while programmers who are terrible at that
job but not obviously so will tend to stay in it.

Thus, if you want to maintain the same piece of the same legacy application
indefinitely, be terrible at it. Whereas if you are good, do the job _and
move on_. Whether that means moving upwards, or to new pastures, or embracing
new technologies, being good means constantly renewing yourself.

The caution in all of this is that if you fail to be good, not only will you
be irreplaceable, but you will also be immobile. So, either polish your skills
on a regular basis, or polish your Swingline stapler.

~~~
fauxfauxpas
This made me recall Meilir Page-Jones' book "Practical Project Management"
from way back...

"Second, employees who are truly competent and are eager to make a genuine
contribution to the department soon resign from a mediocracy, leaving behind
them the dross of nonproducers and internecine warriors. I term this effect
the Inverse Gresham’s Law: A mediocracy hoards mediocre people and drives good
people into general circulation."

[http://www.waysys.com/book-excerpts/ppm-
ch15.html](http://www.waysys.com/book-excerpts/ppm-ch15.html)

------
dkbrk
I think this is not so much a "law" as a failure mode. It is up to the
programmers, objectives, time constraints and economic considerations whether
a concerted effort is made to increase code quality or to go for the "quick
fix" at the expense of long-term maintainability.

~~~
michaelfeathers
Anything that changes continually with a bias toward preserving existing
structure ends up looking like biology in some way.

When people add to existing methods rather than creating new methods, or add
to existing classes rather than creating new classes, we end up with that
organic feel. Fruit starts green, grows ripe, and ends rotten.

The fact we need to face is that this is what people do naturally. It's not an
accident. It's more like a behavioral economic incentive.

------
mml
I'm currently tending a 16 year old Java codebase, with very little in the way
of maintenance in the years since it was written.

"Software over time tends towards monsterism." is apt in my mind.

~~~
javajosh
Any constraint not imposed by a compiler is a degree of freedom that will
drive a system toward complexity. It takes extraordinary effort and discipline
to avoid this, especially over the long term. (One way of thinking about
architecture is as defining which degrees of freedom are allowed on a system,
so that if you need to add a field to a form or implement some feature, the
architecture is the thing that maps that change to individual concrete
artifacts.)

------
rwmj
With the amount of code developed in open git repositories, surely it is
possible now to quantify these observations? Bring hard facts to the
discussion.
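
One sketch of a quantifiable proxy, under assumptions of my own (treating a file's lifespan between its first add and its deletion as a stand-in for replaceability; the helper name and the suggested `git log` invocation are just one possible methodology):

```python
import statistics

def median_lifetime_days(events):
    """events: (added_unix_ts, deleted_unix_ts) pairs for files that were
    eventually deleted. In a real study the timestamps might be scraped
    from something like `git log --diff-filter=AD --name-status
    --format=%at`; here the pairs are supplied directly."""
    return statistics.median((d - a) / 86400 for a, d in events)

# toy data: three files that lived 10, 100 and 1000 days before deletion
lifetimes = [(0, 10 * 86400), (0, 100 * 86400), (0, 1000 * 86400)]
print(median_lifetime_days(lifetimes))  # 100.0
```

If the thesis holds, the files still alive in old repositories should skew toward the hard-to-replace ones, while short-lived files were the liquid ones.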

~~~
sebastianconcpt
That's a great point, but the amount of analysis seems huge. Also, what to
quantify exactly? (And the criteria are different per programming paradigm.)

~~~
rumcajz
What about looking at evolutionary biology? Those guys already have some
methodological apparatus in place.

------
brockers
systemd anyone?

~~~
datenwolf
Thank you. You beat me with that one :)

------
digi_owl
Gold is never "good money".

It was a good material for making tokens back when making hard-wearing notes
was a virtual impossibility.

