
Complexity Has to Live Somewhere - mononcqc
https://ferd.ca/complexity-has-to-live-somewhere.html
======
JoeAltmaier
Oh this, I've known for years. I learned it early, as a new OS engineer at
Convergent Technologies. There was a nasty part of the kernel, where
programmable timers lived, that was a large part of the change history of the
OS. Looking at it, I saw folks had been thrashing for some time, to get all
the issues resolved. But it was a '10 lbs of feathers in a 5 lb box' kind of
thing. One change would cause another issue, or regress a bug.

So I took 2 days and made a chart of every path and pattern of events
(restarting a timer _from a timer callback_; having another time interval
expire _while processing the previous_; restarting while processing _and_
another interval expires; a restarted timer expiring _before the previous
expiration finishes processing_; and so on). Then I wrote exhaustive code to
deal with every case, and ran every degenerate case in test code until it
survived for hours.

It never had to be addressed again. But it _did have to be addressed_. So many
folks are unwilling to face the music with complexity.

~~~
stagger87
Regarding being 'unwilling to face the music', is it possible that you were
only able to perform this refactor due to all the 'thrashing' that came before
you? Is it possible that the majority of commits leading up to your refactor
were dealing with unforeseen complexities and addressing real bug fixes that
you were able to conveniently take in and synthesize all at once?

Refactoring too soon, before understanding the entire system (and all the
possible changes/additions/issues that could arise) is probably worse (time-
wise) than thrashing for a bit.

~~~
JoeAltmaier
As I recall (and it was a long time ago) the history gave some insight into
the complexity. But nobody had exhaustively enumerated all the cases. I
remember the chart had more conditions than had ever been addressed.

~~~
hinkley
You'd be surprised how few people look through commit history while bug
fixing.

There's a steady stream of people who don't understand why 'breaking' the
commit history is a problem, and assume you have some sort of untreated OCD if
you bother to even care.

~~~
bJGVygG7MQVF8c
100%. Apparently relatively few have experienced the power of `git bisect
run`. More than once I've had to explain that no, this isn't just some
idiosyncratic aesthetic preference, there's a reason your version control
tooling is designed the way it is.

~~~
XorNot
Higher-level tooling doesn't prioritize this. AFAIK neither GitLab nor
GitHub has an actual option in their CI implementations for "test _every_
commit before allowing merge". And both of those are everywhere.

There's also nothing out there that enforces the practice - i.e. that new
code must be covered by some amount of testing (I'll settle for "it is
executed in some way as part of the build").

These are all things that _could_ be done, but they're not priorities to the
big players evidently, and good luck to me trying to get any organization to
commit to putting that tooling in or running something custom that does it.

~~~
Aeolun
Why do we want to test every commit? Does it matter if you broke something if
you end up fixing it later?

~~~
XorNot
In the context of git bisect, it's vital. You can't go looking for bugs commit
by commit with no requirement that any individual commit is actually buildable
or functional.

In my own projects I'm as guilty of this as anyone, because it's nigh
impossible to get the tools to actually do this (which has a massive effect on
the culture which builds up around them).

------
UweSchmidt
Complexity is often frivolously created during specification. If the true
consequences of complexity (wherever it may live) were understood, we'd
simplify things a lot. But they want that next button, they want ancient
emails to be searchable instead of having them archived, and they have no
idea that their wish leads to new servers installed in a data center, new
bugs introduced, and development time lost for each new release.

There are also ways to design things better (or worse) with regard to how
they handle complexity. Reuse UI, patterns, workflows, and languages. Keep
things consistent; make things discoverable.

Point is, the slogan "Complexity has to live somewhere" could also be used as
an excuse to do sloppy work. Then again, this keeps us all employed.

~~~
majormajor
> Complexity is often frivolously created during specification. If the true
> consequences of complexity (wherever it may live) were understood, we'd
> simplify things a lot. But they want that next button, they want ancient
> emails to be searchable instead of having them archived, and they have no
> idea that their wish leads to new servers installed in a data center, new
> bugs introduced, and development time lost for each new release.

I think features are definitely worth servers being installed in a data
center, and most people coming up with product requirements are fine with the
_monetary_ costs.

Features leading to bugs and wasted development time... that's where the point
of the article lives. There are ways to build software that can more easily
accommodate changes.

However, it's more common that developers try to _hide_ the complexity of
features behind abstractions that hinder more than they help. And _that's_
what results in breakages and lost time.

------
bertmuthalaly
I think the conversation has shifted slightly this decade to avoiding
_incidental_ (accidental) complexity (I didn’t realize Fred Brooks popularized
the term!), which I wish the author would address.

Otherwise this essay is spot on when it comes to essential complexity.

Incidentally, the question of “where complexity lives” is one of the focal
points of “A Philosophy of Software Design,” which comes highly recommended if
you’re trying to come up with your strategy for managing complexity from first
principles.

~~~
mononcqc
I think the most contentious part of the post is that I simply assert that
people are an inherent part of software. You often avoid the incidental
complexity in code by indirectly shifting it to the people working the
software.

Their mental models and their understanding of everything are not fungible,
but they are still real, and often what lets us shift the complexity
outside of the software.

The teaching of disciplines like resilience engineering and of models like
naturalistic decision making is that this tacit knowledge and expertise can
be surfaced, trained, and given the right environment to grow and gain
effectiveness. It expresses itself in the active adaptation of
organizations.

But as long as you look at the software as a system of its own that is
independent from the people who use, write, and maintain it, it looks like the
complexity just vanishes if it's not in the code.

~~~
munificent
_> You often avoid the incidental complexity in code by indirectly shifting it
to the people working the software._

Yes, this is Larry Wall's waterbed theory:
[https://en.wikipedia.org/wiki/Waterbed_theory](https://en.wikipedia.org/wiki/Waterbed_theory)

I do think it's important to distinguish accidental and essential complexity.
Some complexity is inherent and if you think you've eliminated it, all you
have really done is made it someone else's problem.

But there is also a lot of complexity that is simply unnecessary and can be
eliminated entirely with effort. Humans make mistakes and some of those
mistakes end up in code. Software that does something that _no one_ ever
intended can be simplified by having that behavior removed.

------
spacedcowboy
Complexity is like energy: it cannot be created or destroyed; it is a
fundamental property of any process or thing.

However, complexity can be managed. Humans do this using abstraction as a
tool. We divide the complex problem up into a sequence of several simpler
steps, and we find it easier to understand the simpler problems _along
with_ the sequence within which they lie.

Good software uses this same approach to reduce complex issues to a
manageable process. A good tool makes the simple things easy and the
complex things possible; the design of the tool reflects the effort and
work the designer put in to understand the problem they’re trying to solve,
and to produce something that helps guide others along that self-same path
of understanding without them having to put in the same level of effort. It
establishes the golden path through the marshes and bogs of difficulties
that the problem domain throws up.

“Embracing complexity” is a measure of last resort, IMHO. It means the tool
developer could not analyze the problem and come up with a good solution;
it means “here, you figure it out”; it means giving up on one of the
fundamental reasons for the tool’s existence.

_Sometimes_, embracing complexity and the ensuing struggle that this
necessitates is simply what you have to do, but not often. Maybe, _maybe_,
this is one of those times, but I always start off with a critical eye when
someone tells me that a complicated thing is “the only way it can be done”.
Colour me sceptical.

~~~
ninjapenguin54
Complexity can easily be manufactured. Comparing it to something as
fundamental as energy is pure bollocks.

Here's an amusing and simple example of manufactured complexity:
[https://github.com/EnterpriseQualityCoding/FizzBuzzEnterpriseEdition](https://github.com/EnterpriseQualityCoding/FizzBuzzEnterpriseEdition)

~~~
pron
If only there were some scientific discipline that studied complexity and
proved that _problems_ possess some essential, minimal, complexity below which
no implementation can go...

If anyone proved such a result, I'm sure they would be awarded a prize of some
kind.

~~~
renewiltord
Well, there are many complexities. People here are talking about the
Programming Complexity (of which one measure is Cyclomatic Complexity) of a
textual program vs. the Time/Space complexity of the same program. There is
the weakly-related concept of the Kolmogorov complexity of a string.

It's obvious that the time/space complexity of a program can stay fixed
while you arbitrarily raise its programming complexity, for example by
adding branches that are rarely taken, an action that won't affect the
asymptotic time complexity of the program.

And sure, you can talk about the Kolmogorov complexity of a string of
computer code, but the minimal string representation of that code is
unlikely to be one that a programmer would describe as simple. Even
minimizing the string so that it behaves the same as the original program
is usually undesirable.

[https://en.wikipedia.org/wiki/Programming_complexity](https://en.wikipedia.org/wiki/Programming_complexity)

[https://en.wikipedia.org/wiki/Computational_complexity_theory](https://en.wikipedia.org/wiki/Computational_complexity_theory)

[https://en.wikipedia.org/wiki/Kolmogorov_complexity](https://en.wikipedia.org/wiki/Kolmogorov_complexity)

~~~
pron
Pick _any_ of those. Given a problem, there is a minimal complexity to the
programs that solve it. You can always _raise_ the complexity, but not reduce
it.

~~~
renewiltord
I'm confused by your response to ninjapenguin[0]. Were you agreeing with him
or disagreeing or something else? I interpreted your response to be
disagreement. I feel that your original style of phrasing "If only x etc.
etc." is not easy to understand. I'm having a hard time placing your
subsequent comments in context of that original response.

0:
[https://news.ycombinator.com/item?id=23042891](https://news.ycombinator.com/item?id=23042891)

~~~
pron
I was disagreeing with the statement that "Comparing [complexity] to something
as fundamental as energy is pure bollocks." There is, in fact, a discipline
that studies complexity -- computer science, and, in particular the field of
complexity theory. Complexity was, in fact, found to be fundamental and
"irreducible". Two people, Hartmanis and Stearns, did, in fact, discover that
through a comparison to physics [1] in 1965, and for that discovery -- that
led to the creation of complexity theory -- they won the Turing Award in 1993.

[1]:
[https://dl.acm.org/doi/10.1145/1283920.1283949](https://dl.acm.org/doi/10.1145/1283920.1283949)

------
deathanatos
I wonder if the author has read _Out of the Tar Pit_ [1]¹. See, in particular,
section 6.

Essentially, the author is, I think, arguing about what the paper calls
"Essential complexity", complexity inherent to the problem one is trying to
solve. And with that, I agree.

I think the author should acknowledge accidental complexity (or provide some
argument as to why that must live somewhere), and I think a lot of comments
here on HN are pointing out the fact that accidental complexity exists, and
_doesn't_ have to live somewhere. But my guess is that that's not what the
author is saying, and that the author is only arguing about essential
complexity.

[1]: [https://github.com/papers-we-love/papers-we-love/blob/master/design/out-of-the-tar-pit.pdf](https://github.com/papers-we-love/papers-we-love/blob/master/design/out-of-the-tar-pit.pdf)

¹ I personally found this paper somewhat mixed. The definitions of
complexity are what make it worth reading; its conclusion that functional
programming languages will fix all the woes is, I think, not practical.

~~~
emilecantin
Yes, there's definitely a kind of complexity that _doesn't_ stem from the
problem domain, and it can often be eliminated.

Stuff like over-abstracting, insane default values, duplicated state that
needs to be synchronized (e.g. in React components), or just overly
repetitive code, for example.

~~~
elteto
> Yes, there's definitely a kind of complexity that _doesn't_ stem from the
> problem domain, and it can often be eliminated.

Yes, it can be eliminated, but only to some extent. Programming languages add
a baseline of complexity by themselves that _can't_ be removed: programming
languages are not infinitely flexible so there will always be problem
characteristics (invariants, data structures, etc) that will not be easily
expressible. Those are new sources of complexity.

Think of how Java simplifies manual memory management compared with C or
C++, or how the Rust borrow checker provably prevents a whole class of
bugs. But these are all tradeoffs: both Java and Rust are ill-suited to
expressing other problems.

------
x3haloed
“The trap is insidious in software architecture. When we adopt something like
microservices, we try to make it so that each service is individually simple.
But unless this simplicity is so constraining that your actual application
inherits it and is forced into simplicity, it still has to go somewhere. If
it's not in the individual microservices, then where is it?”

The failing of this point is that much of what we call complexity is
disorganization. Certainly, there is a fundamental level of logic in any
desired system that cannot be willed away with cute patterns, but to
consider all complexity of an existing system to be necessary is a fallacy.
Dividing systems into problem domains does not inherently reduce the total
complexity of a system; it probably usually adds to it. But organizing
systems this way can drastically reduce the scope of complexity into
manageable pieces, so that mere mortals can work on it without having to
hold the entire system in their minds at one time.

It’s like saying that you can’t make a garage full of junk any less
complicated, because no matter how you arrange it, it will still contain all
the same junk. In fact, organizing all the junk into manageable storage can
make it much easier to understand, work with, sort through, clean, and
identify items that may be unnecessary.

~~~
pm
Indeed, but we need to differentiate between the inherent complexity of the
problem (which can't be mitigated, but only shifted around as the article
points out), and the incidental complexity added when we create the solution.

The inherent complexity can be taken on by the solution (and it may be
hidden behind a simple interface, or it may be introduced in the interface,
but that's another problem), or it gets removed from the problem domain and
then needs to be dealt with by the user.

There's no correct answer; half the problem is defining the appropriate
problem domain, and even then there are only better or worse solutions. The
incidental complexity of the system comes purely down to the implementation,
which often comes down to how well the problem domain is defined in the first
place.

~~~
x3haloed
Yes. That’s exactly what I’m saying. Although, from the way I read the
article, it doesn’t seem to acknowledge the possibility for significant
overhead in a chosen implementation. Sounds like they’re saying, no matter
what you choose, it’s all the same. Don’t even try.

------
crazygringo
This is a great essay.

Along the same lines, there's a great quote from many years ago that I
unfortunately can't find the exact text of, but it goes like this
(paraphrasing):

"Most Microsoft Word users only use 5% of its features."

"So why don't we get rid of the other 95%, since it's so bloated and complex?"

"Because each user uses a different 5%."

~~~
ebiester
It was Joel Spolsky - [https://www.joelonsoftware.com/2001/03/23/strategy-letter-iv-bloatware-and-the-8020-myth/](https://www.joelonsoftware.com/2001/03/23/strategy-letter-iv-bloatware-and-the-8020-myth/)

It was 80/20, but the sentiment holds.

------
marcosdumay
As every discussion about bullshit jobs, frontend frameworks, or
hardware-abstraction VMs (recently, WASM) will easily show, we are swimming
in accidental complexity. And even when it's not obvious, every general
advance in science or technology is fundamentally the removal of some
complexity that everybody had just adapted to as if it were essential.

So, no, it doesn't have to live somewhere. It can be created and destroyed;
this happens every day. Probably some of it cannot be destroyed, but nobody
knows which part, so any article about it will be useless.

------
slx26
I don't disagree with the message, but I'm kinda ambivalent about putting the
focus on it like that.

> We try to get rid of the complexity, control it, and seek simplicity.

Well, not really enough. Complexity is a beast that takes many years to
understand, and that's when you are really trying. Many devs don't. And
companies even less. So while it's true some complexity is unavoidable, I
think we still have a long way to go in being aware of it first. We write
software once, and that's not a proper strategy to manage complexity. Anyone
who has looked into computers from top to bottom knows we have tons of
problems with complexity that we really haven't solved properly, and that
bleed into day to day development in the ugliest ways.

My particular take when I don't have the massive amount of time required to
properly deal with complexity is write something like this: "hey, this is very
complex. please don't touch it. if you have to, we have this much headroom. if
you need to go beyond that, please rewrite this entirely / find a better way.
if the code doesn't bite you immediately I will".

------
3pt14159
I completely agree, and even commented the same essential idea a couple years
ago here:

[https://news.ycombinator.com/item?id=18774619](https://news.ycombinator.com/item?id=18774619)

One of my highest upvoted comments with 161 upvotes.

But I've come to another idea too. Part of all this complexity is dealing with
change. We could simplify things if some aspects of our software hit a final
point where only security updates were published after that. Imagine a
programming language that was specified and actually _finished_ without the
constant roll of changing patterns and practices. Or a web framework. Or a
database. Intentionally designed to be robust and secure from the first day
then intentionally set to minimal patches for bugs and security fixes.

Part of the issue though is that things keep changing. New characters are
added to unicode, new timezones emerge or existing ones change. It's a hard
problem to crack.

~~~
peterwwillis
If you designed a mainframe in the 70's, it didn't change. And so we are still
running mainframes from the 70's today for critical infrastructure,
government, financial, and educational work.

~~~
jyounker
I don't know if that's really true. While the software that runs on the
mainframe is the same, mainframe hardware has advanced considerably.

------
apta
This is especially evident to those who have used golang for non-trivial
projects. Because the language has very poor support for abstractions and
higher-level constructs, you end up with much more verbose, brittle code
that's harder to modify and read. IMO the language designers, by optimizing
for a language that's relatively easy to pick up*, have pushed the
complexity onto the programmer.

* Any language has its paradigms that need to be learned, and golang is no different. Just because you can learn the syntax in a couple of sittings does not mean you know how to use the language in an effective manner. Something I see many people not bring up.

~~~
parshua
That is not at all my experience from using Go for nearly 10 years. Go has
actually removed significant complexity from writing asynchronous and
concurrent programs by pushing it into its own runtime. At this moment in
time, as a language nerd, there are few languages I know of that compete
with Go in the balance it strikes between performance and complexity.

~~~
apta
"goroutines" are probably the only thing golang has going for it. But
today, with async/await in languages like Rust and C#, C++ getting a
coroutine implementation, and Java getting fibers, golang is no longer
special in any way in this domain, while it still carries the baggage of a
non-expressive, verbose language and performs worse than the
aforementioned.

~~~
parshua
Go is special in that it is _significantly_ simpler than all of these
languages, while having a large overlapping domain of operation with each.
I've yet to see a single real-life money-making program written in any of
the languages you cited that is as easily readable as a Go program.

------
ebiester
This has been something living in my head for a long time, but the largest
problem is that the complexity needs to live in different places for different
sizes of problem. For example, for simple systems, the build tool that forces
a limit to complexity might be the right one, because the exceptions will be
small and well-understood. For a large system, you may have a team that is
dedicated to the build tool itself, and as such the compromises are different.

If you are lucky, you have someone who is able to analyze the entire system,
can identify all of the stakeholders, and can drive consensus when the change
in abstractions is necessary, and has the budget to do it.

The danger is that people pick the solution for a large system first,
because "we will need this someday", rather than waiting for the accidental
complexity to build, knowing that the rework necessary to move to an
intermediate system is less expensive than the cost of delay in getting the
simple solution out first.

My dream is that we can build incremental complexity systems that could
support simple solutions quickly and highly complex solutions eventually. The
problem is that these are hard to build. :)

------
aazaa
The author seems to be starting with the premise that complexity is
_necessary_.

I'm not sure I'd agree. Some complexity exists because the effort to simplify
was too great. Either cost or skill prevented the refactor. As the saying
goes, "sorry this letter is so long, I didn't have time to make it shorter."

I've repeatedly found that the first iteration of a solution tends to be more
complex. One culprit is that sufficient abstractions were either not
recognized or not implemented. Put those abstractions in, and you simplify the
system.

So the refactoring step after an initial solution is created is crucial,
and unfortunately it's often just not done.

~~~
gmfawcett
It depends on the kind of work. Systems integrations are often complex for
uncontrollable reasons. That one legacy system only speaks ASN.1, not JSON;
the three vendor systems have different semantics for a fundamental concept
(vendor, order, shipment); failure modes are under-documented, so defensive
programming is needed; etc.

If we accept that the integration itself is necessary, then the complexity
necessarily comes along with it. Not to mention all the complexity generated
by the client base: I'm paying for this system, it's mine, so make it do X,
even if you think X is too complicated/impossible/etc.

------
JackRabbitSlim
I know, let's move all that annoying complexity to yaml/json/XML config
files!

Now look at this app I can write in 30 lines of code with this framework! Oh
and 500 lines of yaml across 2 dozen files but look the "code" is so sleek and
sexy.

Joking aside, I think he could have at least mentioned the difference
between required complexity and unneeded complexity. Ironically, in a lot
of cases the latter seems to spawn from attempts to reduce the former.

------
ken
The title is catchy and it's not _exactly_ wrong. There's an element of truth
to it. And yet, 99% of the times I hear it, it's as an excuse for crap, not
some fundamental truth.

Dirt has to live somewhere, too. That's a good motto for a garbage company,
but if you hear a chef saying it in the kitchen while preparing your meal, you
might be worried.

~~~
akersten
Yep. Most of the "complexity" that is justified by mantras like this is not
_necessary_ complexity (like the inherent baseline complexity in solving a
certain problem space) but cruft/enterprise/poor coding style that winds up
suffocating a project. That kind of complexity should be torn out with
disregard for the notion that it's somehow necessary.

~~~
Gehinnn
I would differentiate between the intrinsic complexity of a problem and the
artificial complexity of a solution.

Certain knots can be untangled without changing the topology; the knotting
just adds artificial complexity.

Also, some truths have both very simple and very involved proofs. They are
a perfect example of how just the right approach can remove much
complexity. Merely formulating the problem in a different way can already
remove much artificial complexity.

------
arendtio
It is not about simplicity versus complexity. It is about creating simple
abstractions that successfully hide the contained complexity.

When I want to search for a string within some data, I don't want to have to
think about the algorithm it uses (naive, Boyer–Moore [1], etc.). I just want
to know that somebody cared about it and that it will work great.

Bad abstractions, on the other hand, make you think about the underlying
details that you have to be aware of. For one layer that might not be too bad,
but the more you build on top of such things, the more fragile the whole
construct becomes.

[1] [https://en.wikipedia.org/wiki/Boyer%E2%80%93Moore_string-search_algorithm](https://en.wikipedia.org/wiki/Boyer%E2%80%93Moore_string-search_algorithm)

~~~
reubenmorais
This is missing their point. If you use a simple abstraction, you won't be
able to address all of the complexity of the underlying task. You'll be
sacrificing at some point, either by leaving performance on the table or by
ignoring an edge case and having it come back to bite you later. "it will work
great" can only go so far without cooperation from the user, and then the
complexity starts creeping back in.

~~~
arendtio
I just read the text a 2nd time and I disagree with you.

The point of the text is that you have to define where to put the
complexity:

> If you are lucky, it lives in well-defined places.

Otherwise, it will be everywhere (obviously bad):

> With nowhere to go, it has to roam everywhere in your system, both in
> your code and in people's heads.

My point, on the other hand, is that you should focus on good abstractions,
which is just one more step in the direction of defining 'where the
complexity should live'.

When defining the abstraction, you focus on the practical use cases so that
it is as simple as possible, but not simpler. The main task here is to find
the right focus. When in doubt, it might be better to use multiple
different abstractions (e.g. if there are two distinct use cases).

The goal is to find a construct where everything looks simple from the outside
because the complexity is hidden by the abstraction.

~~~
reubenmorais
I agree that abstraction is a useful tool in tackling complexity, but in my
experience it is often the _only_ tool people reach for, which can lead to
a continuous chase of refactoring after refactoring, because somehow the
ugly can never quite be fully hidden.

This is exactly what the text talks about: shuffling the complexity around,
trying to hide it, and the end result can often be that you just don't know
where it ended up. Personally, I think a lot of this is due to the way
software engineering ideas like "don't repeat yourself" or "design
patterns" can attain an almost cult-like following, and people just blindly
apply transformations instead of looking deeply at what the actual problem
is.

As others mentioned in the rest of this thread, complexity must be
addressed, and only from a perspective of understanding what your problem
is, where the traps are, and what all of its complexities are can you come
up with abstractions that will be any good.

------
peterwwillis
It could be that your complexity is necessary, like a small widget that
just has an enormous amount of variety it constantly has to account for. Or
it could be that you designed one giant widget made of 10 widgets, and a
redesign could simplify how you think about and build it.

Simplicity is the art of taking complex things and stripping them down to
their essential nature. Simple things can be complex, but no more complex than
they have to be.

With that said, this piece talks a lot about frameworks and tools and
patterns, and I really hope people don't think that's the right way to think
about architecture. Abstractions do not make things simpler, they make them
abstract.

------
zmmmmm
While I agree with the message about essential complexity, I disagree that
all complexity is essential. In fact, I think it lets us off the hook far
too easily to just claim all complexity is essential. It's _easy_ to invent
complexity; in fact, it's the default. I'd go as far as to say that about
50% of the complexity in software is non-essential. And in terms of what to
address, it's a bit like paying off debt: just as you should always pay
your credit card first, always start by eliminating the non-essential
complexity.

------
Discombulator
Paraphrasing the "Fundamental theorem of software engineering", _Every problem
can be solved with enough levels of indirection._ Every level of indirection
adds complexity, so the question should be: is the additional level worth it?

I am fully on board with the overall sentiment of the article that there is
some irreducible complexity that one cannot avoid. Often it comes straight
from the business domain or entity the code is modelling, and then, sure, you
cannot make it any easier, otherwise you are not solving the problem and users
will be unhappy.

However, then the author goes too far:

> Accidental complexity is just essential complexity that shows its age.

You can absolutely add heaps of unnecessary complexity; in fact, this is
almost surely what you will get by attempting to cover every possible future
use case and evolution scenario, or to cater to every minor aesthetic concern
(e.g., "I need to be able to replace my cloud provider / DB / message queue /
user notification medium / etc by changing just one line!").

It takes humility to admit that getting the balance "right" from the start is
difficult, and often the only way to improve is to accept some badness now and
revisit the decision later with more information.

The quote, however, seems to be saying that we (as software engineers) always
get it right; it's just that things later change and our choice no longer
appears right. This mindset is counterproductive: admitting flaws is the first
step to improving.

------
darkerside
Certainly you can make things more complex than they need to be (for your
given use case). That is self-evident for most engineers.

Doesn't it follow that the inverse must be true? In some cases, you must be
able to remove complexity from the equation without introducing it somewhere
else.

I agree with the spirit of the post, but I think it's eliding some of the
complexity involved in identifying essential complexity.

------
BenoitEssiambre
I find there are some concepts used to qualify complexity and data-model fit
in information theory and machine learning that can also be used to think
about product-market fit or software-domain fit.

I tried to describe them here. I'm not sure I did a good job:

https://medium.com/@b.essiambre/product-market-crossfit-c09b019188f3?source=friends_link&sk=5a57eddd18dd948ebb512afb40a21667

The gist:

Low resolution fit: The product vaguely and simply fits its market or domain.

High resolution fit: The product is complex and is tightly tailored to the
market or domain.

Overfit: The product is so tightly tailored to some users that it only fits a
small part of the market.

Underfit: The product is so generic that it’s missing important features and
corner cases.

And then another qualifier 'crossfit' which is a bit harder to grasp, inspired
by cross-entropy, that has to do with whether the design language, which can
be seen in some ways as 'encoding' the problem, fits the domain or market
well.

------
im3w1l
Kolmogorov complexity is uncomputable. What that means in this case is that we
may never be 100% sure if we have eliminated all unnecessary complexity in our
programs; if we have expressed things in the simplest way. The complexity may
have to live somewhere... or there may be a simple underlying principle we
have just not seen yet.
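
One hedged, hands-on way to see the bound (not a proof of minimality): compressed size is a computable upper bound on Kolmogorov complexity. A sketch using Python's zlib:

```python
import zlib

def complexity_upper_bound(data: bytes) -> int:
    """Compressed size is a computable upper bound on Kolmogorov complexity."""
    return len(zlib.compress(data, level=9))

# A highly regular string compresses far below its raw length,
# but no compressor can certify that NO shorter description exists.
regular = b"ab" * 1000
assert complexity_upper_bound(regular) < len(regular)
```

A better compressor - or a "simple underlying principle we have just not seen yet" - can always lower the bound; we can never be certain we've hit bottom.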

------
l0b0
There is still a huge difference between the amount of complexity in a
solution and the amount of complexity in the computer system implementing that
solution. In extremely simple cases, such as implementing a mathematical
formula, the solution (in the form of words or an equation) and the
implementation (actual code) could be of the same order of magnitude - there
is virtually no accidental
complexity. For business solutions, like a shopping web site, the difference
could be a factor of 10,000 or more. But that's almost all accidental
complexity, because of how organically the web and most programming languages
have grown to handle cultural concepts like language, visual flow,
authentication, encryption, cross-platform compatibility and so on.

------
rhacker
I think that things only become less complex when they become old and reliable.
Even if the internals of the thing are a fucking mess. For example, Linux, in
its entirety is a complex fucking beast for its maintainers.

Yet for other people it's a simple as fuck thing that accomplishes things for
them and lets them move forward with getting shit done.

So really, it's how you look at it.

The human body is so freaking complex, it is entirely possible we'll never
even understand how half of it works. Yet from another point of view we easily
hire a bunch of humans to do tasks (programming, making food, yard work,
etc..). And even though we don't know how the body works, we have a simple
understanding of how that body can accomplish those tasks, which is enough to
move forward.

------
senderista
I think “complexity has to live somewhere” is just the definition of essential
complexity. “Accidental complexity is just essential complexity showing its
age” makes no sense to me: accidental complexity is often the result of
ignoring essential complexity (or using inadequate tools). One often finds
that edge cases can be naturally handled by principled approaches that capture
the essential complexity of a system (see: Paxos). The alternative is a naive,
simplistic approach that ignores essential complexity and quickly accumulates
accidental complexity in the form of workarounds for newly-discovered edge
cases. In other words, most software systems.

------
mcqueenjordan
I think I understand the point the author is trying to make, but I think they
end up accidentally making too strong a claim.

 _Sometimes_ there is /essential complexity/, but far from always. I
think the line of thinking espoused by the author leads to less simplification
and encourages a "well, the complexity has to live somewhere, so why bother?"
type of approach. However, reducing complexity is often possible, and it
usually comes before the implementation stage. Reducing complexity sometimes
means solving a different problem.

------
throwaway55554
Of course it does. It lives in your frameworks and libraries. It lives in your
container classes and threading routines. It lives in that bit of code at the
core of your business rules that people are scared of.

If you write anything, even "hello, world", complexity lives somewhere! Just
the complexity of the print function path all the way to the hardware would
baffle some.

------
hasahmed
Very interesting. I'm happy I read it. For me, seeking simplicity in code
often means finding the simplest and most straightforward way to write it so
that I can understand it. This usually boils down to good
naming, and good encapsulation of tackling specific problems into specific
(ideally well named) places.

------
hzhou321
It is nice that we recognize that there is a certain amount of necessary
complexity and that the skill involved is complexity management. But it is a
pity that we don't recognize that 1. we can create extra complexity that often
wraps around necessary complexity; and 2. there is such a thing as scope.

------
smitty1e
If we think of model, view, controller, much of the complexity seems to arise
when these dimensions leak into each other.

Keeping them orthogonal seems the first step in the process of trading the
problem for a smaller problem.

As TFA notes, there is a lower limit to how much complexity can be squeezed
out of a system.
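
As a minimal sketch (names are hypothetical) of what keeping those dimensions orthogonal can look like in code: the model holds state and knows nothing about rendering, the view renders and changes no state, and the controller only wires them together.

```python
from dataclasses import dataclass, field

@dataclass
class TodoModel:
    """Model: state only - no rendering, no input handling."""
    items: list = field(default_factory=list)

    def add(self, item: str) -> None:
        self.items.append(item)

def render(items: list) -> str:
    """View: presentation only - reads state, never mutates it."""
    return "\n".join(f"- {item}" for item in items)

class TodoController:
    """Controller: translates input into model updates and view output."""
    def __init__(self, model: TodoModel):
        self.model = model

    def handle_add(self, item: str) -> str:
        self.model.add(item)
        return render(self.model.items)
```

When the view starts mutating the model, or the model starts formatting strings, that's the "leak" - and the source of the complexity of untangling them later.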

------
dasyatidprime
The strategy recommended at the end, “embrace it, give it the place it
deserves”, feels like the “eating your shadow” of design. It's also what I've
been trying to approximate most of the time, and I appreciate the
representation of it here.

------
patkai
This is one of the best titles I've seen this year. Reminds me of G. G.
Márquez, who told of how much he sweated over the first sentence. I haven't
read the post
yet, but I feel the discussion has already started.

------
dana321
C++ : Make the language complex, not the code.

Go : Make the language simple, make achieving a simple thing complex.

C : The machine is complex, be careful what you do if you overflow into the
inner workings.

------
grensley
Some complexity cannot be avoided, but that doesn't mean that complexity is a
constant and removing it in one place necessarily means that it has to go
somewhere else.

------
sumnole
The good thing is you can manage inevitable complexity with the proper balance
of abstraction, tooling, and documentation.

------
philipswood
I agree. Often good architecture amounts to finding the prettiest place to put
the ugly.

------
crimsonalucard
The fact that complexity has to exist is obvious.

The real question of design, especially the design and organization of
computer programs, is _where_ that complexity lives, and how do you organize
that complexity so that it doesn't introduce _extra_ complexity.

Usually the way we handle this is through layers, with the initial layers
being simple, the middle layers being complex and the top layers being simple
again. There's no theory about why this is better; it's just usually how it's
done, and it seems to work when we can pull it off.

The bottom layers are your primitives and axioms. Simple building blocks. This
is where designers want to create a most minimal and simple set of tools that
can help facilitate unlimited complexity at the upper layers (think a minimal
set of language primitives that allow for Turing completeness).

The middle layers are your theorems. Different permutations and compositions
of your axioms to implement things like business logic or whatever logic you
want. This is where good designers should try to stuff as much complexity as
possible.

The top and final layer is the interface: something built to directly
access the middle layer while providing an interface that is simple and more
understandable from the perspective of a user. This is probably the least
theoretical aspect of the stack as you have to factor in things like art and
human psychology into how you build an interface.

A good example of this is your phone. The primitives are assembly-language
instructions (RISC). The middle layer is all the code that facilitates
programming and applications and the interface layer is the touch screen.

Note that you don't have to look at it from a birds eye view from CPU
instructions all the way to touch screen. You can zoom in and look at just a
web stack: Java on the backend all the way up to react on the front end.

By zooming in you will see systems that violate the design principle I
describe. Java as a primitive has become incredibly bloated, and so have React
and the JavaScript toolchain. There are several reasons why this happens: lack
of planning, optimization requirements that inevitably violate design
principles, lack of knowledge, lack of central direction, and more.

Either way, always remember that the ideal design that most people should
strive for is complexity sandwiched by simplicity at the primitive layer and
the interface layer. This goes for most things within computer programming and
outside of it, like your car.
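
To make the "simplicity sandwich" concrete, here is a hedged miniature in Python (the pricing rules and numbers are invented for illustration): trivial primitives at the bottom, the messy rules stuffed into the middle, and a one-call interface on top.

```python
# Bottom layer: simple primitives - the "axioms".
def add(a: float, b: float) -> float:
    return a + b

def scale(x: float, factor: float) -> float:
    return x * factor

# Middle layer: the "theorems" - business rules, where the complexity lives.
def _apply_pricing_rules(subtotal: float, is_member: bool) -> float:
    discounted = scale(subtotal, 0.9) if is_member else subtotal
    with_tax = scale(discounted, 1.08)   # hypothetical 8% tax
    return add(with_tax, 2.50)           # hypothetical flat handling fee

# Top layer: the interface - one simple call hiding the middle.
def checkout(prices: list, is_member: bool = False) -> float:
    return round(_apply_pricing_rules(sum(prices), is_member), 2)
```

A caller only ever sees checkout(); all the rule churn stays in the middle layer, which is where this comment argues it belongs.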

~~~
crazygringo
> _The fact that complexity has to exist is obvious._

I can't tell you how many engineers and designers I've worked with for whom it
is _not_ obvious.

Unfortunately, the lesson that complexity has to exist seemingly needs to be
re-taught constantly.

A lot of people really truly believe that a beautiful, minimalist, elegant
product is always the best solution -- and that the users needing it to do
other/extra things are the ones who are wrong.

~~~
crimsonalucard
I'm not really talking about that. I'm saying that it's obvious in the sense
that this English sentence has a certain level of complexity... In fact for
anything to exist it must have some level of complexity... that much is
obvious.

The real question is what to do with complexity. Like you said, some people
think that complexity should be pushed onto users; other people, like you and
I, disagree.

~~~
jyounker
If you're building real systems, it is not the case that the complexity of
the problem is obvious. I've just spent several years in an organization where
inherent complexity was discounted by its important technical decision makers.

The result is a mess since the system was not designed to cope with the
operational complexity involved.

------
carapace
One of the reasons to study category theory is that it holds out the promise
of a mathematically rigorous way to discover and describe the _essential_
complexity of a task or problem.

