
Simple, correct, fast: in that order - ddevault
https://drewdevault.com/2018/07/09/Simple-correct-fast.html
======
DannyB2
When I was in school in the 70's (that's NINETEEN seventies), there was this
book called The Psychology of Computer Programming. This predates the
microcomputer era as we know it. Punched cards were still common when the book
was written.

A computer was to control a new assembly line for a car company. They couldn't
get the software to work. They called in an outside consultant. The outsider
developed a program that worked. (It was more complex.) The book was about the
psychology part: The original programmer asks, "How fast does YOUR program
process a punched card?" Answer: "About one card per second." "Ah!" said the
original programmer, "but MY program processes ten cards per second!"

The outsider said, "Yes, but MY program ACTUALLY WORKS. If the program
doesn't have to work, I could make it read 100 cards per second."

Correctness comes first. Simplicity is highly desirable, adds additional cost,
but always comes after correctness.

~~~
ythn
> Correctness comes first.

Not always. Have you ever used a SNES emulator? There is one emulator that is
more correct than all the others combined: it's called BSNES, and it's the
most true to the original SNES hardware of all the available emulators. Yet it
is horrifically memory/CPU hungry; that correctness comes at a huge cost.

So no, correctness does not always come first, especially if you value other
things like user experience.

~~~
bmurphy1976
Your definition of correctness is wrong in this case. If the purpose is to
emulate the hardware as accurately as possible, BSNES wins. If the purpose is
to make as many games as possible enjoyable for as many people as possible on
the lowest common denominator hardware available today, BSNES loses.

There's no clinical definition of correctness here. Intent matters.

------
SamuelAdams
I would argue that correct is more important than simple.

Consider timezones: it's simpler to pretend there are 24 time zones, one for
each hour. But the correct assertion is that there are 37 time zones (as of
this writing). So the simple solution results in a third of your potential
user base having issues.

Other issues to pick: accessibility, cross-browser compatibility, legacy
device compatibility... the list goes on.
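If you want to sanity-check the claim yourself, here's a quick Python sketch (my own illustration, not from the thread; the exact count depends on your tzdata release and on the instant you sample, since DST moves zones around):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo, available_timezones  # Python 3.9+

# Collect every distinct UTC offset currently in effect across the tz database.
now = datetime.now(timezone.utc)
offsets = {now.astimezone(ZoneInfo(name)).utcoffset()
           for name in available_timezones()}

# Well over 24: offsets such as +05:30 (India), +05:45 (Nepal) and
# +12:45 (Chatham Islands) don't fall on whole hours.
print(len(offsets))
```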

~~~
lopatin
I think it's more in the spirit of the article to say, forget timezones, use
UTC millis everywhere. If the server doesn't speak in timezones, then you've
eliminated all bugs where the server mishandles timezones.

~~~
masklinn
> I think it's more in the spirit of the article to say, forget timezones, use
> UTC millis everywhere. If the server doesn't speak in timezones, then you've
> eliminated all bugs where the server mishandles timezones.

That's a flagrant example of "simple and wholly incorrect". If you don't store
timezones, your future dates will eventually turn out incorrect when timezone
offsets change. E.g. you create a meeting at 9 AM local and store it as UTC;
the country decides not to follow DST that year; bam, your reminder will ping
an hour early or late.

Or a day off when the country decides to jump across the international date
line
([https://en.wikipedia.org/wiki/International_Date_Line#Samoan...](https://en.wikipedia.org/wiki/International_Date_Line#Samoan_Islands_and_Tokelau_\(1892_and_2011\))).
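The failure mode described above can be sketched in a few lines of Python (my own illustration; the zone and dates are arbitrary). Storing only a UTC instant bakes in the offset that was in force at save time, while storing wall time plus zone name lets you recompute the instant under whatever rules apply later:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

berlin = ZoneInfo("Europe/Berlin")

# What the user asked for: a meeting at 9 AM on their wall clock.
meeting_local = datetime(2018, 10, 30, 9, 0, tzinfo=berlin)

# Strategy A (lossy): convert to UTC at save time, keep only the instant.
stored_utc = meeting_local.astimezone(timezone.utc)

# Strategy B (robust): keep the wall-clock time and the zone *name*,
# recomputing the instant from the current rules whenever it's needed.
stored_wall, stored_zone = "2018-10-30T09:00", "Europe/Berlin"
recomputed = datetime.fromisoformat(stored_wall).replace(
    tzinfo=ZoneInfo(stored_zone))

# Under today's rules the two agree; if Berlin's offset rules change
# before the meeting, only Strategy B still fires at 9 AM local time.
assert recomputed.astimezone(timezone.utc) == stored_utc
```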

~~~
y4mi
Most applications don't need to schedule events into the future, though, and
it's a smart strategy if all you need to worry about is the past.

~~~
masklinn
For very specific cases? Sure, but none of the comments talking about UTC
everywhere cares to specify this _rather important_ bit.

Unless restrictions are specified I will assume we're talking about the
general case, and for the general case it's just plain wrong.

~~~
y4mi
From my point of view, it's the other way around.

There are very few applications that need to schedule events into the future,
and that is literally the only situation where you have to worry about the
timezone.

Btw, keeping the timezone is insufficient as well if you're building a
calendar/scheduler. If the user changes their timezone after scheduling the
event... do you keep to the old one and alert them whenever, or do you adjust?
There are a lot of edge cases with schedulers -- yet as I said before, most
applications don't schedule into the future. They're mostly just doing things
right now or within the next few minutes and keeping a log of their actions.

~~~
masklinn
> From my point of view, it's the other way around.

> There are very few applications that need to schedule events into the
> future, and that is literally the only situation where you have to worry
> about the timezone.

My experience is the exact opposite: there are few applications which only
store past dates, and in those, said date is usually indicative/barely even
relevant and could just as well be part of a freeform comment or removed
entirely.

------
tacon
Is the software engineering profession doomed to lose its memory every
generation? The premise of this post is ridiculous:

>The single most important quality in a piece of software is simplicity.

How Panglossian, imagining the best of all possible worlds. Well, the world is
_intrinsically_ complex, as Fred Brooks explained in his No Silver Bullet
essay from 1986[0].

"The complexity of software is an essential property, not an accidental one."

Sure, there is accidental complexity in most software problems, which can be
tackled with skill and experience, and maybe reduced to zero. But then you are
left with the essential complexity of the world. And you are done reducing the
complexity; you can only manage it from then on. The world is very, very
complex and it is a pipe dream to imagine that we can eliminate its complexity
just by some bold engineering.

[0] [http://worrydream.com/refs/Brooks-NoSilverBullet.pdf](http://worrydream.com/refs/Brooks-NoSilverBullet.pdf)

~~~
wellpast
Anyone who has spent any time developing anything but a tiny software system
knows that the biggest impediment to productivity (feature delivery, bug
fixing, etc) is the complexity of the system at hand.

In a sense, this post is simply stating the obvious.

The biggest differentiator of skilled software practitioners is the ability to
construct simple systems.

To call this claim panglossian or meaningless is to hold the philistinic line
that this skill set doesn't matter, that any complex system is effectively the
same as any ol' simple one -- don't worry about cultivating the skill, it
doesn't matter anyway...

But simplicity is the single most important thing in any system that has to
be maintained, other than one-off scripts, hack jobs, etc. -- It's absolute
torture to collaborate on a software project with anyone who rejects this
premise.

~~~
tacon
You are making the common mistake of confounding essential complexity with
accidental complexity. One you are stuck managing and one you can eliminate
with skill. The world isn't getting less complex just because you work harder
on your software.

~~~
wellpast
The world, from software's perspective at least, isn't growing in complexity.
What leads one to that conclusion?

~~~
swsieber
The world is growing more complex from the software's perspective.

And by that I mean, it has to account for more scenarios or do additional
things... unless your software is growing in complexity while you're only
removing features...

Bugs in software come from thinking we're simplifying the world in one way
through the program, while in reality it receives a slightly different picture.

~~~
wellpast
A well-built software system doesn't have to grow in complexity over time.
More features does not equal more complex in the true sense of Simple.

Faulty data models and system designs -- that aren't fixed -- lead to ever-
increasing complexity. But that is the fault of the data model/designer.

I.e., there is a way to build (and _grow_ ) systems w/o linear increase in
complexity -- but it takes a particular rare skill set.

------
ddellacosta
I can't really understand the equivocating tone a lot of folks are taking in
response to this, and more importantly I can't wrap my head around how you
could make such a statement in the first place: without correctness you've got
nothing. Stating authoritatively that correctness comes after...anything is
incomprehensible to me.

It's possible to have a correct solution that is neither simple nor fast, and
it can be worth your while to speed up a correct solution while sacrificing
simplicity. So there are trade-offs involved in the relationship between
simplicity and speed, but correctness is not negotiable. Acknowledging that
all software has bugs is not the same thing as throwing out correctness as
your first and primary objective in implementing an algorithm, and accepting
that your solution may only be partial or fail with certain inputs is fine if
that is acceptably correct for the problem at hand, but ascertaining that
still comes _first_. Preferring simplicity over complexity because it makes
debugging, profiling, etc. easier is not a reason to insist that correctness
can go out the window in service to simplicity--who cares if you've removed
all the bloat from your code if it's _wrong_?

~~~
jerf
"without correctness you've got nothing."

Sure... but define "correctness".

Suppose my manager comes to me with some incredibly complicated problem. It's
going to take six months to solve properly. Suppose in the first three weeks I
implement a program that is 98% correct, and let's say it can detect the other
2% and kick it out for a human to solve. But it clearly does not fully and
correctly eliminate the problem as brought to me by my manager. Have I solved
the problem?

The correct answer is not "no, because your solution is incorrect and there is
no such thing as an 'incorrect solution' because all solutions must be correct
to even be solutions; you have _no professional choice_ but to spend the next
5 months and a week implementing the correct solution". The correct answer is
"the question is underspecified". I need to go to the manager and work with
them on the question of what the benefit of just deploying this is, what the
benefit of doing it "correctly" according to the original specification is
versus the cost, and whether or not there are any other in-between choices.
The business may require the full solution, sure. On the other hand, your
manager may be inclined to thank you profusely for the 98% solution in a
fraction of the time because it was far more than they dreamed possible and is
way more than enough to make the remaining 2% nowhere near the largest problem
we have now.

"Correctness" is only fully defined in a situation where the spec is
completely immutable. Specifications are almost never completely immutable. So
for the most part, everyone in this conversation using the word "correct"
without being very careful about what they mean is not using a well-defined
word.

It's all about costs and benefits, not correctness and incorrectness. For
nearly two decades, Python's sort algorithm was _technically_ incorrect:
[http://envisage-project.eu/proving-android-java-and-python-s...](http://envisage-project.eu/proving-android-java-and-python-sorting-algorithm-is-broken-and-how-to-fix-it/) Does this mean that any
program that used Python sort was worth "nothing", because it was not correct?
Obviously this is absurd (in practice at least), so correctness must be
understood in terms of costs & benefits to make any sense. And such an
understanding must also be grounded in an understanding of the mutability of
requirements as well, to make any sense of the real world.

From this perspective, it honestly isn't even 100% clear to me what
prioritizing "correctness" over everything else would even _mean_. That we are
slaves to the first iteration of the spec that comes out, no matter what?
(Obviously not, but I can't come up with anything better that it might mean.)
Correctness can't be prioritized over everything else because it can only be
understood holistically as part of the whole process. There is no way to
isolate it and hold it up as the top priority over everything else. And there
is no way for the correctness of a bit of software to exceed the scope of the
specification itself, almost by definition, which in the real world tends to
put a pretty tight cap on how correct your software can even be in theory,
honestly.

~~~
bhandziuk
But this is what ddellacosta is saying. I imagine his interpretation of this
argument (and I agree) is that 98% correct IS prioritizing correctness. A very
fast, simple solution that is 2% correct is an unacceptable balance.

~~~
jerf
"A very fast, simple solution that is 2% correct is an unacceptable balance."

That begs the question (in the original sense) of "unacceptable". If I banged
out that 2% solution in an hour, and it lacked other costs that outweighed the
benefits, it may still be something we ship! It is _unlikely_ that we'd stop
there, just because the numbers as you've given them are unlikely to favor it:
something else substantial would have to outweigh the small amount of the
problem we've solved. But to be firmly confident it's "unacceptable" you'd
have to define "acceptable" a lot more carefully.

I understand the deep temptation to turn to discussions of the virtues of
letting bugs through or something, but the costs/benefits framework
_completely_ handles that already. If you ship a buggy piece of "incorrect"
shit, well, you've incurred a ton of costs with no benefits. That's wrong, by
whatever standards you are measuring costs and benefits by. There isn't a
"what if your 98% solution actually has a massive bug in it because you were
unconcerned about 'correctness'?" argument to be made, because if it does have
a massive bug, it's not a 98% solution.

------
asragab
I think those asserting that correctness comes first are somewhat missing the
point. One, a simple solution is still a solution; that is, if your code
doesn't solve the problem, you can't stop. I think the author is suggesting
that truly _correct_ code (code that produces the correct output under all
circumstances) is only attainable iteratively, and if your code is not simple
(and let's also remember here: simple ain't easy!) then reaching correctness
or performance will, in the main, be quite difficult. Not
only will it be increasingly more difficult to reach a state of correctness
again after a bug is found, and it will be found, but even measuring
performance will become increasingly challenging. At least that's the lesson I
take.

~~~
AstralStorm
In many years of practice, I haven't found the iterative approach to produce
either simple or easily maintainable code. It tends to grow rings instead.
Each layer is relatively simple, but altogether the result is neither
performant nor simple.

The simple code of the present was almost always written by someone who
understands the problem domain really well, in one or two tries.

~~~
asragab
Yeah, maybe "iterative" is too facile a concept to contain what is meant
here. Maybe "fractal" or "recursive" is better. Maybe, though, that's the
point: it is hard after "multiple rings" to make code simple anymore, so
better to start off trying to optimize for simple first. Correctness requires
exposure to cases you didn't know you didn't know (unknown unknowns).

This is one of the reasons why I am suspicious about the long-term saliency of
so-called "smart contracts" on the blockchain. The immutability of code, while
super amazing for digital assets, seems like a horror-show of a liability for
dApps.

------
jandrewrogers
Performance in most non-trivial software, and especially infrastructure
software, is architectural. In many cases an architecture that will allow your
software to be performant requires a commitment to a very substantial amount
of software complexity upfront to ensure adding performance is much simpler
(or even possible) later. There are also rarer cases where correctness is not
simple, so there is no trivial path between the simple implementation and a
rigorously correct one. While "simple" is easier for the software engineer,
customers pay for "correct" and "fast".

In my own area of work (database engines), the common mistake is that
inexperienced designers _do_ focus on simplicity first, instead of correctness
and performance, not understanding that it is at best difficult and sometimes
impossible to add correctness and especially performance later. The fast win
of "simple" can turn into nearly insurmountable technical debt when you are
asked to deliver scale and performance. People often grossly underestimate the
minimum amount of initial implementation complexity required for good
architecture.

There are many types of software where "simple, correct, fast" is sound advice
but it is far from universal.

------
agentultra
I think this is lacking a definition of simple. And where in the problem space
do we desire simplicity? Simple in the implementation (and conversely complex
in the interface? ie: C-style libraries?) Or complex in the implementation but
simple in the interface? (ie: Haskell/FP style libraries?)

My definition of simple software is software that I can validate the
correctness of using only _equational reasoning_ and the mathematical tools
used to carry it out without any specialized knowledge or verification
systems.

If I have to learn a new way to reason about a software system in order to
understand it then it is complex.

 _A priori_ any system written in C fails this litmus test: one must
understand and identify the many ways that undefined behavior can enter into
their program and be leveraged by their compiler. One cannot reason about a
local expression in the presence of global effects and unchecked side-effects.
And if it is possible to write a _correct_ C program it takes considerable
effort and the use of very specialized verification tools.

There are many reasons to prefer C however; if we're willing to live within
some tolerance of "correct" and "incorrect" then we can leverage a tool-chain
that can produce highly performant code... but then we're forced to restrain
ourselves from introducing complexity instead of spending that effort on other
things.

~~~
thedirt0115
+1 to acknowledging that when it comes to software, "simplicity" does not have
one exact quantitative meaning we all agree on -- for now, simplicity is still
in the eye of the beholder in my opinion.

------
lmm
Not convinced. Simplifying a correct implementation can be easier than
correcting a simple implementation. Eventually you need to find the correct
model and then everything else will follow easily, but a complex
implementation that does the right thing will tell you a lot more about what
the correct model is than a simple implementation that doesn't do the right
thing.

------
kibwen
Without defining what "simplicity" and "correctness" are supposed to mean,
this article is empty of content. The title appears to be riffing off of the
famous saying "First make it work, then make it right, then make it fast" (
[http://wiki.c2.com/?MakeItWorkMakeItRightMakeItFast](http://wiki.c2.com/?MakeItWorkMakeItRightMakeItFast)
) which is supposed to be a warning against premature optimization (one that
is less often taken out of context than Knuth's famous saying). But by lumping
both "making it work" and "making it right" under "correctness", it makes it
appear that the author values simple software that doesn't do its job over
complex software that does. And the problem is that you can't easily slot
simplicity in by drawing a stark dividing line between "making it work" and
"making it right", because it's a continuum of correctness. At best,
simplicity is more important than performance, much of the time. But at the
end of the day the point of software is to perform a specified task, whether
or not it is achieved in an aesthetically pleasing way underneath.

------
wellpast
This is great, but completely lost on the crowd if what Simple means isn't
understood.

One of the best clarifications of what it means to be Simple, to put it out
there, is [1]; but the key point: Simple != Easy.

Simple means _minimal_ coupling, high-cohesion etc etc.

Yet IME many developers do not understand the distinction and mistakenly
believe that easy is the same as simple, and are willing to couple the hell
out of the world under some false notion of "simplicity"...

[1] [https://www.infoq.com/presentations/Simple-Made-Easy](https://www.infoq.com/presentations/Simple-Made-Easy)

~~~
bluetomcat
In a way, simplicity is the end result of reducing the complex and correct
solution without affecting its correctness.

As in math, you come up with the "simple" solution of 0.5 only after you've
realized that the "complex" solution is, for example, "sin(pi/4) * cos(pi/4)".
There might be no other way to discover the simple solution.
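For the record, the identity holds: sin(π/4) = cos(π/4) = √2/2, so their product is exactly 1/2. A quick numerical check in Python (my own addition, just to ground the example):

```python
import math

# sin(pi/4) == cos(pi/4) == sqrt(2)/2, so the product is 1/2
product = math.sin(math.pi / 4) * math.cos(math.pi / 4)
assert abs(product - 0.5) < 1e-12
```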

------
MikeTaylor
This title is misleading. The post actually says that the reason "simple"
comes first is because without it you can't have "correct" (nor "fast", not
that that matters so much). So he's not saying simple is most _important_,
just that it comes first chronologically, and has the other two as
consequences.

~~~
shoo
e.g. Gall's law

> A complex system that works is invariably found to have evolved from a
> simple system that worked. The inverse proposition also appears to be true:
> A complex system designed from scratch never works and cannot be made to
> work. You have to start over, beginning with a working simple system.

~~~
AstralStorm
Surprisingly to some, this is neither a law nor true.

A complex system with a good workable and testable architecture will work,
starting with passing the tests down to satisfying the user...

Such systems are not designed in detail but in general, and usually start with
a single, simple but powerful overarching idea, which is actually quite
complex to implement, but ends up working evidently well once even halfway
done.

Examples would be message passing architecture, event driven programming, time
tracking, microservices, reactors, literate APIs, contract programming, Model-
View-* and more... Note how half of those deal with reducing coupling by
adding complexity.

------
neogodless
Think about it in terms of each choice you make.

I have a simple solution and a complex solution. Does the simple solution meet
the requirement(s) before me? If so, I prefer it. Let's move on to the next
requirement and consider my options again.

The alternative might be to look at your requirements, but choose a complex
solution (over a simple one) because you think it might meet other
requirements, either ones that have not yet been identified or ones you think
are likely to happen in the future.

Are there times that the more complex solution wins? Probably. Consider you
want to write a blog. You know that you can create an HTML (text) file and
slap it on a web server and your blog has started. But if you've done this
before, you might also know that you can throw WordPress on your server for a
little more up front pain. You know you want comments and word clouds and
date/time stamps and navigation. So you choose the complex solution. (You also
know that you now face potential security implications, upgrades, dealing
with users causing trouble with comments, having the PHP/MySQL
infrastructure/hosting requirements...) Maybe you just wanted to dump your
thoughts to the internet. Maybe the text file approach was better...

It may just be another way to say "avoid gold-plating your software."

~~~
DannyB2
Certainly don't gold plate.

But simply meeting requirements is only doing the minimum possible. Now in a
government job, that's okay.

But in a real job, if you see where the simple solution is OBVIOUSLY wrong for
certain likely cases not considered in the requirement, then THE REQUIREMENTS
ARE WRONG, or incomplete and this should be pointed out!

------
whack
The OP's advice, if applied in the CPU industry, would be disastrous. Modern
desktop/server CPUs are incredibly complex... in order to drive maximum
performance. Pipelining, OOO execution, branch prediction and speculative
execution: these are all features that introduce a tremendous amount of
architectural and design complexity. In many cases, they also harm
correctness, because they can lead to functional and security bugs.

And yet, if you try to compete with Intel with a CPU missing the above
optimizations, you will get absolutely creamed in the marketplace. No one, not
even those touting the importance of simplicity and correctness, will buy what
you're selling.

Today's free market is too complex for these overly simple rules. Choosing
between simplicity, correctness and performance, is a complex tradeoff that
needs to be made on a case-by-case basis. Trying to find shortcuts to avoid
these analyses may feel liberating... but you're ultimately only shooting
yourself in the foot.

~~~
zurn
A counter-anecdote: The features you listed started shipping (from Intel &
MIPS) in microprocessors in 1996, 22 years ago. Intel's out-of-order Pentium
Pro was beaten by the in-order DEC 21164 the same year.

Also, there's the case of Intel losing to in-order ARMs in mobile. First with
XScale, and later on with the in-order Atoms.
([https://appleinsider.com/articles/15/01/19/how-intel-lost-th...](https://appleinsider.com/articles/15/01/19/how-intel-lost-the-mobile-chip-business-to-apples-ax-arm-application-processors))

~~~
whack
Sure, specific optimizations in specific markets may not be worth the cost
they incur. Or they may not be valuable enough to overcome other weaknesses in
the project.

And yet, if someone tried to sell a server CPU today that was not pipelined,
not OOO, and didn't have branch prediction, it would absolutely tank in the
marketplace.

I never said that performance optimizations should always be implemented. Just
that performance optimizations should _sometimes_ take precedence over
simplicity.

~~~
ben509
> And yet, if someone tried to sell a server CPU today that was not pipelined,
> not OOO, and didn't have branch prediction, it would absolutely tank in the
> marketplace.

You could sell it as a niche product for high security applications, since OOO
execution is a nasty side-channel.

~~~
whack
That's an interesting idea. A "so simple it can't have bugs" design would
never win over the mainstream-market, but it might be able to find a niche
among extremely security-conscious users. This might be a great project for
the open-source community to take on.

------
jsdalton
I often repurpose a famous quote from Mark Twain (about letter writing) for
the topic of simplicity:

> I didn't have time to write a simple program, so I wrote a complicated
> program instead.

This is in my experience more than just a clever turn of phrase: the vast
majority of software projects (or features etc.) move from _simplistic_ to
complicated, and rarely from there toward simplicity. The end result is
exactly what this author describes -- a complex mess that's difficult to
reason about and rarely performant.

Few of us (usually myself included) are willing to devote the time and effort
required to achieve true simplicity.

~~~
wellpast
I've found that, unlike Mark Twain's profession of writing and art, in which
each new work is attempting to push the bounds of thought and expression,
constructing simple software is something that you can actually get better at
over time, and with focussed practice. I hate to break this to us, but what we
software developers do over and over again is not as novel and groundbreaking
as, say, what a Mark Twain is doing with each work of art he produces.

This is certainly true for me after 20 years of focussed practice. In fact, I
have to go out of my way now to introduce coupling--it takes time for me to do
the wrong thing. So when I hear someone say "I don't have time to make it
simple / decouple everything / design it correctly", what I really hear is "I
don't have the skill set" and, often, "I don't want to put in the hard work
and patience it takes to acquire the skill set." It's a philistinic cop out,
really.

------
pieterr
I prefer: “the strategy is definitely: first make it work, then make it right,
and, finally, make it fast”.

[http://wiki.c2.com/?MakeItWorkMakeItRightMakeItFast](http://wiki.c2.com/?MakeItWorkMakeItRightMakeItFast)

~~~
collyw
This is better. If you can make it work while keeping it simple the other two
will follow more naturally.

------
abakus
Apparently everybody knows "correctness" comes first. But that is the whole
point of this article: It argues that complexity will reduce correctness, and
thus simple should come first.

Please NOTE that I am not saying that I agree or disagree with this article.

------
jasonkester
My take on this from like 10 years ago:

[http://www.expatsoftware.com/articles/2007/06/getting-your-p...](http://www.expatsoftware.com/articles/2007/06/getting-your-priorities-straight.html)

I came up with Readable as the top priority, followed by Debuggable and
Maintainable. I suppose one could combine that into "Simple" if one liked.

But yeah, Fast was already at the bottom of the list. Even back then.

~~~
collyw
Those three properties are very highly related.

Personally I found the biggest improvement to my own software came from
maintaining the same system that I wrote for 4+ years.

If I came back to a part and didn't understand it more or less immediately,
then it was time to refactor it. I wrote the code; I should understand what it
is doing. No excuses that someone else had written bad code.

~~~
HankB99
"It was hard to write, it should be hard to understand." (I was joking, of
course, when I said this.) It seems to me that that's a reasonable metric for
determining if something was well written or possibly well commented. I had an
opportunity to go back to some of my older code about a year ago. It was scary
to me what portion of it I no longer understood. Worse yet, portions of it
referred to documentation that I could no longer find. The company had
migrated their documentation through several storage organization technologies
and no one seemed to know where the old stuff went. Luckily I ran across a
retired engineer who recalled where a lot of it had been archived.

(Unrelated) When I read the article, the first thing I thought was that all of
the simple programs had already been written.

------
hawkice
The simplest code is an empty file. And yet, we live in a world with complex
software. There are tradeoffs, but this article doesn't help you make them.

~~~
tumdum_
> this article doesn't help you make them

Sure it does. It clearly states that sometimes new features or performance
optimisations have to be sacrificed to keep the software simple.

~~~
hawkice
That's like saying, sometimes apples are so expensive you can't buy them.
Perhaps surprising, if you haven't thought of it, but it is an obviously true
statement about all tradeoffs. Sometimes you shouldn't make them. Still
doesn't help you figure out when you should.

~~~
tumdum_
Maybe it's obvious to you, but at least in my experience lots of devs, when
confronted with a feature request, rarely consider whether said feature should
be implemented. Most of the time they only think about _how_ it could be
implemented.

------
jeremya
I think this article may be guilty of overstatement. It's an ever more common
sin, as overstatement seems to have become a preferred method for communicating
thoughts and ideas (and products, though that's nothing new).

Is there a need to overstate to cut through the noise and get your point
across or your message heard? Maybe, but it seems unfortunate that when people
have some truth or wisdom to share, there is a felt need to amplify and
polarize it.

This article has good things to say about the importance of simplicity in code
and implementation. I'm fine with value judgements as long as they
convincingly define the values they are judging and show evidence that the
facts have been thoroughly weighed. 'Correctness' is an ill-defined villain
here and the article would do better to state the benefits of simplicity and
experiences the author has had with systems designed without simplicity as a
first-order goal.

Then again, perhaps I ask too much. Also, I've never had an article on the
front page of Hacker News, so what do I know.

------
eggbrain
One of the lessons I've learned (which this article echoes) is that you
should _always_ factor in the "long term cost" of adding a feature.

When I first started building TrueJob (job board software), I'd add all these
really cool features to my app -- and at the time, they felt really useful.
But over time, people weren't using them, so I built more features.

But then the old features I had built broke, so I had to fix them. And then
they lagged behind the quality of the features I had written since, so I had
to update them. And after doing this 5-10 times (as the software evolved and I
dramatically increased the complexity of my application), these features that
no one used were really painful to keep coming back to, but by then enough
loud users were using them that I couldn't remove them.

It made me really value the projects where we polished and did just a very few
things, but did them very, very well -- it led to higher customer
satisfaction, and less pain in the long run for us.

~~~
tchaffee
I have encountered this so many times with enthusiastic and well-intentioned
but non-technical founders that I created a quick presentation around it.
Maybe other folks will find it useful.

[https://www.slideshare.net/chaffeet/how-killer-features-will-kill-your-team](https://www.slideshare.net/chaffeet/how-killer-features-will-kill-your-team)

------
niftich
I fully buy the argument that code is a liability, while the underlying
algorithm, or the overlying service is the asset. Following this line of
thinking encourages the writing of simple code.

Far more of a problem than code complexity is the lack of systems thinking
when applied to programs. Various factors (abstraction, delegation, nicer
APIs, solid products, SaaS, "microservice" trends, package managers and
bundlers) have encouraged offloading much of the computation and data flow to
other products, whose strengths and liabilities become your own if you make
use of them. One-liner lambdas might be simple code, but they're often coupled
to a maze of other cloud services, and the complexity there is coming from the
dependency substrate, whose shape can't even be expressed in an imperative or
declarative notation like code.

In truth, code and libraries and services feed into systems, and those systems
must be understandable if correctness and maintainability are a goal.

------
d__k
Some other formulations:

\- Occam's razor
[https://en.wikipedia.org/wiki/Occam%27s_razor](https://en.wikipedia.org/wiki/Occam%27s_razor)

\- "Simplicity is the ultimate sophistication" (Leonardo da Vinci)

\- "Less is more" (Mies Van Der Rohe)

\- "Make everything as simple as possible, but not simpler" (Albert Einstein)

~~~
DannyB2
All four of those are great! But if the code is wrong, it is still wrong no
matter how simple.

Like saying leap years occur every four years. Simple!

~~~
bigjimmyk3
I think I read the OP's assertion in relation to your counterpoint like this:
when thinking through the problem, your initial iteration of work is allowed
to say exactly that: a leap year occurs every 4 years. However, you should do
so via a semi-stubbed function (isLeapYear) as mentioned previously. I am
imagining that the specifics of when a leap year occurs are not critical to
your solution, only that you know when they happen. Thus, you avoid spending
too much time on a detail that's eventually important, but not critical for
proving your first hypothesis. In the "correct" stage you come back and
improve the isLeapYear function to return correct results.
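
The two stages might look something like this sketch (the `isLeapYear` name
comes from the earlier comment; everything else here is illustrative):

```javascript
// First iteration: a deliberately stubbed rule -- "a leap year occurs
// every 4 years". Good enough to prove the rest of the program works.
//
//   function isLeapYear(year) {
//     return year % 4 === 0;
//   }

// "Correct" stage: come back and implement the full Gregorian rule.
function isLeapYear(year) {
  return (year % 4 === 0 && year % 100 !== 0) || year % 400 === 0;
}

console.log(isLeapYear(2000)); // true  (divisible by 400)
console.log(isLeapYear(1900)); // false (divisible by 100 but not 400)
console.log(isLeapYear(2024)); // true
```

The stub and the final version share a signature, so nothing else in the
program has to change when the detail is filled in.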

Part of attaining wizard status is not learning how to hold more of the
program in your head, but instead learning how to hold _less_. This seems like
an excellent step in that direction.

------
sreque
My reaction to the title is similar to that of other commenters: correctness
should come first!

However, the general idea appealed to me, so taking a step back, I tried to
post-rationalize a thought similar to the author's that I could reconcile with
my initial reaction. The thought I then had is that if a simpler solution can
be found that solves a significant subset of the problem, then perhaps it is
worth adjusting the requirements to go for the simpler solution, for the fact
that it lets us ship faster and with less risk.

Oftentimes we come up with requirements that aren't really "required":
showing business stakeholders that dropping a few requirements could enable
you to ship six months faster and with far less risk can be a valuable insight
in and of itself. In essence we are still putting correctness first, but we
are changing our definition of "correct" slightly in order to increase
simplicity.

------
dfxm12
"Correct" should, by definition, imply simple and fast. "Simple" tends to be
the correct and fastest way to code. Code that runs fast might not be
"simple", but simple code will be debugged/updated/ported the fastest. I
think the author is saying this, but with a more controversial headline (and,
intentional or not, controversy creates clicks). After all, this is the
premise:

 _The reason is straightforward: if your solution is not simple, it will not
be correct or fast._

This could be reworded in the following ways and the points made in the
article would still follow:

If your solution is not correct, it won't be simple or fast. If your solution
is not fast, it won't be simple or correct.

------
Floegipoky
I interpreted this as advocating for using a model with the lowest-level
abstraction that you think will work. If you start with the simplest
abstraction possible, you produce a simpler and more maintainable system.
You're also in a better position to incorporate further abstraction later as
your understanding of the problem space evolves.

This seems like a good opportunity to recommend Rich Hickey's talk "Simple
Made Easy": [https://www.infoq.com/presentations/Simple-Made-Easy](https://www.infoq.com/presentations/Simple-Made-Easy)

------
peterwwillis
What's the right way to cut a bagel?

The obvious solution: Grab a knife, put the bagel on end, and get to slicing.

The commercial solution: Flat and flip. [https://www.epicurious.com/expert-advice/best-way-to-cut-a-bagel-article](https://www.epicurious.com/expert-advice/best-way-to-cut-a-bagel-article)

The mathematician's solution: A Möbius bagel.
[https://www.youtube.com/watch?v=Ktfo8D3cCr0](https://www.youtube.com/watch?v=Ktfo8D3cCr0)

The engineer's solution: The bagel jig.
[http://www.freepatentsonline.com/5228668.pdf](http://www.freepatentsonline.com/5228668.pdf)
[http://www.freepatentsonline.com/3347296.pdf](http://www.freepatentsonline.com/3347296.pdf)
[http://www.freepatentsonline.com/4807505.pdf](http://www.freepatentsonline.com/4807505.pdf)
[http://www.freepatentsonline.com/4747331.pdf](http://www.freepatentsonline.com/4747331.pdf)

The consumer solution: The bagel guillotine.
[https://www.surlatable.com/product/PRO-1036557/Sur+La+Table+...](https://www.surlatable.com/product/PRO-1036557/Sur+La+Table+Bagel+Guillotine)

Which is the correct solution? Depends on who you are.

------
hoorayimhelping
I think clear is the most important. Most software projects I've worked on in
the past decade have gotten a lot of assumptions wrong. The projects that were
able to turn around quickly were the ones that could find busted logic and fix
it.

I've been using rspec-inspired testing frameworks to help with this clarity.
Whenever I implement some early business logic, I assert that the logic I
wrote does what I expect in words using rspec style tests (I've been writing a
lot of JS lately so I've been using Jest). The kind that read like sentences:
"the tax component applies 2% tax to all purchases above $400." Even if the
logic is initially incorrect, being clear in our incorrectness lets engineers
and (more importantly) product people quickly identify incorrect assumptions.

The logic to apply that tax in this example may not be simple. Oftentimes,
business logic can't be simplified any further and needs to be a bit thorny.
In those cases, clarity of purpose is much more beneficial than simplicity,
and I've found writing rspec style tests, and forcing myself to translate what
the logic is doing into words, helps immensely with clarity. It clarifies my
thoughts before shipping code, and it clarifies our business assumptions as a
whole when that code is running in production.
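
A minimal sketch of that style of test, using the 2%/$400 rule from the
comment above (the `applyTax` function and the tiny `test`/`expect` shim are
illustrative stand-ins for Jest, not from any real codebase; amounts are in
cents to sidestep floating-point rounding):

```javascript
// Hypothetical business logic under test.
function applyTax(cents) {
  // A 2% tax applies only to purchases above $400 (40000 cents).
  if (cents > 40000) {
    return cents + Math.round(cents * 2 / 100);
  }
  return cents;
}

// A tiny shim standing in for Jest's test()/expect() so the sketch
// is self-contained and runnable with plain Node.
function test(description, fn) {
  fn();
  console.log("ok - " + description);
}
const expect = (actual) => ({
  toBe: (expected) => {
    if (actual !== expected) throw new Error(actual + " !== " + expected);
  },
});

// The tests read like sentences, so even when the logic is wrong,
// the incorrect assumption is stated plainly for everyone to see.
test("the tax component applies 2% tax to all purchases above $400", () => {
  expect(applyTax(50000)).toBe(51000); // $500 -> $510
});

test("the tax component leaves purchases of $400 or less untaxed", () => {
  expect(applyTax(40000)).toBe(40000);
});
```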

------
ConceptJunkie
Anyone can write complex, brittle code, and often even get it to work.

The real superstars solve the same problems with simple code.

I recall a fellow student in CS in the 1980s who used to brag about how many
lines of code he wrote to solve an assignment. I never understood that
mentality. His programs were always 2-3 times longer than mine. But now that
I've had many years in the industry, it almost seems that a lot of people
believe more code is better.

------
susam
My approach towards software development has been pretty similar except that
my order of priorities is a little different. I prefer:

Correct, simple, fast, in that order.

In other words:

\- First get a correct solution working that solves the problem correctly for
typical as well as corner cases, hopefully with good test coverage for both
typical and corner cases.

\- Then improve the solution to make it simpler while preserving correctness.

\- Spend time on making it fast when there is actual evidence of performance
problems such as data from performance testing with current workload and
expected future workloads.

The reason why I like to ensure correctness before simplicity is that many
times what might seem like a simple solution initially might turn out to be
the wrong idea when all corner cases need to be accounted for. Ensuring
correctness first requires me to think through the corner cases well and write
test cases early during the development phase. With correctness taken care of
and protected reasonably well with test cases, it becomes easier to iterate on
the solution and increase its simplicity.

~~~
__mifflin
I like this approach, and it's usually the one I follow.

When I'm first building out a feature, I need to make sure that it actually
works (i.e. is correct). Once that's out of the way, I'll have a good idea of
what it takes to make that functionality actually work correctly, and can
begin making it simpler while preserving correctness.

Simplicity is easy if correctness is not a constraint.

------
sgt101
Software is rarely correct: if it is not correct today, it will only become
correct tomorrow through skilled intervention, and if it is correct today, it
will almost certainly not be correct tomorrow without skilled intervention.
The chance of skilled intervention is always low.

Simple software reduces the skill required to intervene.

------
gwbas1c
That mantra, taken blindly, leads to software that won't scale under realistic
circumstances.

I once worked on a project that failed because the lead programmers did not
want to learn how to use a database. They insisted on using an ORM
incorrectly, and almost all of their code needed to be rewritten in order to
handle a typical anticipated load.

Granted, the entire codebase was full of "simple" for each loops, but the
reality is that if they had started by writing correct database queries, the
project would have never failed.

Thus I say, optimize for your budget at the beginning of the project. You
should pick design patterns that handle anticipated load. Full optimization
can come in later, but if your project can't handle anticipated load at the
beginning, then you are misinterpreting what this "simple, correct, fast"
mantra really means.

~~~
kungtotte
There's a difference between simple and simplistic, and there's absolutely a
value in keeping things simple instead of complex. So much so that I think
it's reasonable to state categorically that if there is a simple solution
_that works_ it's always better than the complex solution.

Not using a database when the solution calls for one clearly violates the
"works" principle. And obviously, using a tool incorrectly (the ORM) trumps
anything else. That's a tautology.

~~~
gwbas1c
It's like some people take "optimize last" to the extreme. A project needs to
handle anticipated load at the beginning.

For example, if a web application needs to handle 20,000 requests an hour,
it's okay if early versions take 10-15 seconds to respond under unusually high
load. The optimization phase can bring that down to something more manageable.

Some people take "optimize last" so far that they ignore their basic
scalability requirements, or just assume there are no scalability
requirements. That's when a more senior dev needs to step in and demand basic
scalability in the design.

------
jmilloy
Understandably, a lot of people are balking at the idea that simple might come
before correct. Code that is simple but not correct is not a solution. But the
OP is not asserting that simple, incorrect code is a superior solution to
complex, correct code. Instead, I think the assumption is that there _is_ a
solution that is simple (enough), correct (enough), and fast (enough), and
that you are not going to stop until you reach such a solution.

Correct comes before fast not only because you should make it correct before
making it fast, but because you should not sacrifice correctness in order to
make it fast.

Similarly, simple comes before correct because you should not sacrifice
simplicity in order to make it correct (or fast). Instead, you should continue
looking for different ways to make it correct while maintaining simplicity.

------
dredmorbius
Some years ago I tracked down the origins of the phrase "complexity is the
enemy", and learned three things:

1\. The full phrase is "complexity is the enemy _of reliability_ ".

2\. It dates not to the 1980s or 1970s as I'd thought, but to the 1950s.

3\. It first appeared in print in _The Economist_ newspaper, 18 January 1958,
according to Google Books & Ngram viewer.

I'd very much like to have a copy of the article, though it's proved resistant
to obtaining. HN username at protonmail.com should anyone happen to have
access to a PDF.

[http://books.google.com/books?id=aDsiAQAAMAAJ&q=%22complexit...](http://books.google.com/books?id=aDsiAQAAMAAJ&q=%22complexity+is+the+enemy%22&dq=%22complexity+is+the+enemy%22&hl=en&sa=X&ei=_tZgU9ncAYGQyATBp4GYDg&ved=0CDUQ6AEwAQ)

------
Chyzwar
Simple is a property of the microscale. You can have a class, function, or
module that is simple while the whole system is incredibly complex. You can
sacrifice simplicity in many places as long as things are loosely coupled.

Correctness can only be achieved if the programmer has a good understanding of
the requirements and possesses the necessary discipline to write tests. I
cannot imagine a system that maintains correctness without tests.

Performance is usually something that only good architecture can bring. I
disagree with the article: if you focus on simplicity too much, you will miss
important requirements and make architecture choices that negatively impact
performance. More annoyingly, the author is trying to create another
silver-bullet approach.

Simple/Correct/Fast - it depends on the problem you try to solve.

------
josmala
Amdahl's law applies here too: if your code is slow everywhere, making the
slowest part faster isn't going to help much, so some basic minimal
consideration needs to be given to performance throughout the project. Or
better yet, have well-performing default ways of doing things and avoid badly
performing ones.

In the worst case, failing to do that will require rewriting the entire
project from scratch, because the basic data structures and the structure of
the code are the antithesis of performance and there is no single point you
can optimize. A reasonable default is using vectors over linked lists. A more
complex choice is using a struct of vectors rather than a vector of structs.
And neither of those choices is easy to make for the data structures that
really matter late in the project.
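
The struct-of-vectors vs. vector-of-structs choice can be sketched like this
(an illustrative particle example; the names are mine, not the comment's):

```javascript
// Vector of structs (AoS): natural to write, but iterating over one
// field drags every object's other fields through the cache too.
const particlesAoS = [
  { x: 0, y: 0, vx: 1, vy: 2 },
  { x: 5, y: 5, vx: -1, vy: 0 },
];

function stepAoS(particles, dt) {
  for (const p of particles) {
    p.x += p.vx * dt;
    p.y += p.vy * dt;
  }
}

// Struct of vectors (SoA): one contiguous typed array per field, so a
// loop over positions and velocities touches only the data it needs.
const particlesSoA = {
  x: Float64Array.from([0, 5]),
  y: Float64Array.from([0, 5]),
  vx: Float64Array.from([1, -1]),
  vy: Float64Array.from([2, 0]),
};

function stepSoA(p, dt) {
  for (let i = 0; i < p.x.length; i++) {
    p.x[i] += p.vx[i] * dt;
    p.y[i] += p.vy[i] * dt;
  }
}

stepAoS(particlesAoS, 1);
stepSoA(particlesSoA, 1);
console.log(particlesAoS[0].x, particlesSoA.x[0]); // 1 1
```

Swapping one layout for the other late in a project means rewriting every
loop that touches the data, which is the comment's point about making the
choice early.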

------
kyleperik
Something I would add that helps when writing simple software is testing. You
have to close the feedback loop early and often. Until you see what your
program does you won't really understand it.

Thinking you understand what you're making before you really know is one of
the worst mistakes you can make. It's under that misunderstanding that you'll
think you need to add to your program to make it more correct and fast, when
really you're over complicating it and making your life worse when you
actually need to get it running.

In contrast, when you can run a test as soon as possible, that's where I
usually see an alternative way of writing it that may be shorter, more
correct, or faster.

You'll never get to this point of course if it isn't simple in the first
place.

------
quadcore
I've found that what keeps many programmers from simplicity is that people
like things to be in order, like their desks or their living rooms. I've
found that seemingly messy code that is actually simple performs better than
complex, supposedly in-order code.

~~~
p0nce
Another problem is reliance on "big" frameworks, as if their being big and
successful would make your particular application successful. No, it's just a
bunch of code you don't understand and can't fix.

------
mindB
Interesting that he hit on the top three points in the pony philosophy[1] but
in a very different order than they chose. For pony, the order is:

1\. Correctness

2\. Performance

3\. Simplicity

4\. Consistency

5\. Completeness

I think I tend to agree with that order more. While simplicity tends to be
helpful with performance and correctness, there are very few cases where you'd
sacrifice correctness/performance for simplicity if implementation time/cost
were not a factor. Let's not confuse a way of getting to the goal with the
goal itself.

> Incorrectness is simply not allowed. It’s pointless to try to get stuff done
> if you can’t guarantee the result is correct.

[1] [https://www.ponylang.org/discover/](https://www.ponylang.org/discover/)

~~~
cpburns2009
I think I agree with these points but I have a question. What is "consistency"
in this case? The article only says,

> Consistency can be sacrificed for simplicity or performance. Don’t let
> excessive consistency get in the way of getting stuff done.

I take consistency to mean consistency of results being correct. Wouldn't that
make consistency a subset of correctness?

~~~
mindB
Pony is a programming language, so I take consistency to be about how
consistent the pony programming language feels to the user (principle of least
surprise, etc.). That said, I'm not part of the pony core team; if you really
want an answer, you could stop by #ponylang on irc and ask.

------
davidhariri
I can't agree.

Simple is important, but it's not the most important thing. The overwhelming
evidence is that there's plenty of working software, in use, that isn't
simple. Users will shape their behaviour and memorize flows around a complex
piece of software if there is sufficient motivation.

My own principles for design state this order:

Functional, Simple, Delightful

Does it do the job? Is it simple to use (nothing more than what's required)?
Do people get excited to use it?

The first is non-negotiable and the last is optional. There's a lot of
software that is functional, but over-complicated. Lots that is delightful,
but doesn't work and everything in-between.

The software I admire most (and use daily) has a balance of all three.

------
zzzeek
How do you measure "simple"? Is PostgreSQL "simple"? If not, is PG not
considered to be "correct" or "fast"? Seems like it does a pretty good job to
me, and no, PostgreSQL isn't exactly "simple".

------
digsmahler
Yes! So much covered in so few words. This ordering works shockingly well for
constructing robust solutions to complex problems.

Keep it simple. Do what needs to be done now, and leave everything else for
later. By the time later arrives, what the project needs will have gone in
new and surprising directions. Refactoring complex and coupled code into
logical discrete units pays off in multiples down the road. Removing that
which is no longer needed is like giving your whole team extra time to
breathe and new room to think. For all the times that simplifying also solved
the two other problems, we used that time for making more cool stuff.

------
RogerL
It's only six paragraphs so you can't hope for much subtlety, but this is very
fuzzy thinking.

I always stress a hierarchy of Goals, Strategies, Objectives, and Tactics,
which is a fundamental and well known way to tackle strategic
thinking/planning. When you engage in fuzzy thinking, then it is easy to make
the wrong decision. For example: "we must be agile". Well, no, that's not our
goal. Our goal is "whatever", and agile might let us meet that goal, or it
might hinder it. "Move fast and break things" will (hopefully) never be the
motto of Boeing.

So, simple, correct, fast. That conflates different things. It is never my
goal that things be simple. A goal might be 'correctness' or 'don't kill
people'. A way to achieve correctness is, for example, easily unit testable
software, which is a strategy, the objective is a passing unit test, and a
tactic might then be the code be simple (low cyclomatic software is easier to
exhaustively test, for example).

When you conflate the various levels of this hierarchy, bad decisions and
religion ensue. "Code must be simple." "We must always be agile." "Everything
must be documented." "Functions must be < 10 lines long." You can see that
these are not goals, but often-chosen ways to get to your goal. The problem
is, goals change, and strategies don't get you to the goals in some edge
cases. Because you are focused only on strategies/tactics and haven't clearly
articulated the goals, you don't notice this and make bad decisions.

For example, short functions are generally a good thing. But, sometimes it
takes 50 lines to express a cohesive thought. Splitting that arbitrarily
across many functions can just obscure intent, and leave the developer
scrolling back and forth, which is proven to reduce comprehension and increase
the likelihood of errors.

In short, if you use these three criteria, in this order or any order, I don't
think you will end up making very good decisions, because you have at best
only implicitly stated and understood the problems you are trying to solve
and the interrelationships of how your strategies and tactics affect one
another.
------
xbtsw
The article and a lot of the comments implicitly assume that you must
sacrifice something to get simplicity or correctness, at least at the
beginning, and that performance can only be picked up “later”.

The fact is, retrofitting performance into a sufficiently complex system is
hard; more often than not the system has to be redesigned and rewritten to
achieve it, as we have seen with so many OSS projects.

I think with the right amount of forethought, all three can be achieved. That
doesn’t mean you never need to iterate on your software, because requirements
always evolve.

------
amatheus
It seems to me like what the author proposes fits with TDD: start building the
simplest case by way of a unit test, implement it in the simplest way, and
keep going that way; while the “correct first” approach many are advocating
feels more like taking a problem, really thinking about how it should be
solved, and then solving it all at once.

Now, while I think some small and contained problems may be solved the second
way, I think most complex problems of the kind we solve as programmers are
best solved the first way.

------
stefs
I do agree with the premise, but with at least one caveat: for some problems,
there aren't any simple solutions. In that case it's better to choose the
_simpler_ solution.

------
DannyB2
Making correct code simple(r) requires more time (e.g., money) than making
correct code in the shortest time possible.

Simple, and still correct, is often the result of doing MORE work to achieve
it.

------
jstimpfle
Yep, and that's why "worse-is-better" is better.

I've never read Gabriel's article in such a way that this "worse" means
actually, seriously, worse quality. I always thought it meant "YAGNI", "KISS",
and so on.

If you can't solve a problem, maybe you shouldn't try so hard? Find a better
problem! I'm confident this attitude helped me a lot in the past. Speaking as
a clearly-not-a-genius programmer.

------
tchaffee
My version of this is test cases (i.e. correct), simple, fast: in that order.
With test cases you can refactor your initially complex and naive solution as
many times as needed until the solution is easy to understand and elegant.
I've rarely made something simple the first time around. To get to simple I
need to work at it. And I'd like to know I'm not breaking things when I do
that work.

------
l0b0
It looks like the point about correctness has been misunderstood. The post
argues that 1) perfect simplicity, correctness, or speed is unachievable in
most software of any complexity, and 2) given that there is necessarily going
to be a trade-off, focusing on simplicity makes the other two easier to
achieve. There is nothing in there about simply writing `return 1` and
considering that the perfect solution.

------
rdiddly
Been working on some legacy code that is so convoluted, you sometimes can't
even tell if it's correct, without starting to rewrite. Once you grok what
it's doing, there's almost always a simpler & more direct way. This in turn
usually makes it faster, AND sometimes more correct. It's pretty gratifying,
though there are many SMH moments.

------
jasonmaydie
This made me laugh. We were tasked with writing an app for a customer, and one
of the main requirements from their product manager was that the app had to
open within 1 second. Not a problem, we thought; then their technical guy
came up with all these requirements that had to be resolved during the
startup sequence.

So it was not simple and it was not fast. Nobody wants simple... ever.

------
BatFastard
I find it depends on how well you understand the problem space. If you are
well versed in it, then making a simple solution first is possible, which can
be corrected to handle edge cases later.

However, if the problem space is new to you, I find I usually have to write
more complex code to understand it first. Then I can come back and simplify
the complex code.

------
jorgeleo
mmm... I disagree. The order should be correct, fast, simple.

First of all, the software needs to be correct; if the software produces
incorrect results, then there is no point in it existing. It needs to be
debugged until the results are correct.

Then fast. The point of using software is to help the user accomplish a task.
The better the help the software can provide, the more useful the software
is, and performance is often critical to helping the user.

Then simplicity. Simplicity helps future developers understand how the
software works and how to properly maintain it. But if some complicated
procedure is needed to produce the correct results, then a complicated
procedure it is. If a different algorithm needs to be used to speed up the
software, then the new algorithm it is. It is up to the next developer to
study and understand what was done and why.

The purpose of software is the end user. The next developer sits second to the
end user.

~~~
skj
Sounds like a recipe to deliver hard-to-use software late. The easy-to-use
software that was delivered on-time will eat your lunch, even with some bugs.

------
ioulian
You can apply this rule to the structure of your codebase/classes: make a lot
of small, focused classes rather than one big class that does everything.

This way you can easily debug it, or replace the logic of one class without
breaking your whole application.

One of the best pieces of advice I received was: your function must do one
thing only, and do it well.
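
As a toy sketch of that idea (all names here are illustrative): instead of one
class that parses, validates, and stores, split each responsibility out so
any one piece can be debugged or swapped alone.

```javascript
// Each class does one thing only.
class Parser {
  parse(raw) {
    return JSON.parse(raw);
  }
}

class Validator {
  isValid(record) {
    return typeof record.name === "string" && record.name.length > 0;
  }
}

class Store {
  constructor() {
    this.records = [];
  }
  save(record) {
    this.records.push(record);
  }
}

// The application wires the focused pieces together. Replacing the
// Store (say, with a database-backed one) touches nothing else.
const parser = new Parser();
const validator = new Validator();
const store = new Store();

const record = parser.parse('{"name": "Ada"}');
if (validator.isValid(record)) store.save(record);
console.log(store.records.length); // 1
```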

------
mwkaufma
OP opines, but cites no evidence, examples, or even anecdotes. If your work is
slow and incorrect, your work isn't simple, it's incomplete. Also, some
problems are irreducibly complex. Save us from "simple" and "lightweight"
"solutions."

------
nautilus12
At the end of the day your opinion on this doesn't matter. The company you
work for is still going to want it as fast as you can possibly manage, and
they honestly don't care if you have to write spaghetti code to get there,
because they don't have to deal with it.

I think I need a new job...

------
bluetomcat
Simple means that there is less to go wrong, but not necessarily correctness.
A seemingly simple implementation can also hide a lot of behavioural
complexity behind abstraction layers. The program might not crash, but it
might not behave properly either.

------
Veedrac
_In the boardroom, three of Ford 's top brass are in the midst of heated
debate..._

"As you see, our main priorities, in order of importance, are simplicity of
design, reliable operation, and good performance. For now we're focussing on
simplifying our designs and manufacturing process, we'll figure out the rest
afterwards."

"In that order?!" screamed another, sitting to the left. "Why you might as
well replace the engine with a brick! If it's not able to drive you around,
what even makes it a car?! Scrap your ramblings, first we make a car that
works, with an engine and all, _then_ we can figure out how to manufacture
it."

"Pah," scoffed the engineer across the table, "and when is it in this story
that you realize we're in the business of building cars, not mars rovers? If
it doesn't get you from A to B faster than a bike nobody is going to give a
hoot that it can run for three years without maintenance at the bottom of the
Mariana Trench."

"Idiots!", the original butted back in. "If you make it simple _first_ , then
changing it to make it _fast_ will be easy, and of course only simplicity
begets correctness."

"You think the Mars Rover was simple?!─"

"Ha, because nuclear reactors are the paragon of simplicity─"

"CHERNOBYL IS _EXACTLY_ THE POINT I'M MAKING HERE!─"

"Though wasn't it economic factors that led to the disuse of nuclear power? I
hear solar is getting popular, we should really stick to my plan─"

"A car is not a solar panel, you bumble-headed fool─"

"You might as well be though─"

_ _Ahem,_ _ sounded the man at the head, drawing the room's attention, "I'm a
little lost, so forgive the stupid question, but why have you not just
considered... doing them together?"

------
nickjj
I wish more people followed those 3 steps for deploying apps.

A lot of people want to jump straight to the auto scaling, self healing, super
deluxe cluster + tax edition to handle a billion requests a week before their
app even has 1 visitor.

------
Const-me
TLDR: there’s no universal criteria for code quality.

Imagine you have 5 programmers. #1 works in a web development company, writing
a CRUD web app in SQL, PHP, and JavaScript. #2 codes in C, developing embedded
firmware for a PC component, and also works on the driver for that component
in the Linux kernel. #3 writes some COBOL for a half-century-old system
running in a bank. #4 works in a game studio developing a level editor for a
yet-to-be-announced videogame. #5 is a researcher working in a university,
mostly writing Matlab but occasionally some Python and R.

The ideal simplicity/correctness/performance tradeoff is totally different
between the five. Just like any other tradeoff, or generalization, or
methodology, or approach. All 5 are writing code, but the software they're
creating has totally different requirements, expectations, lifetimes, and
budgets.

I never saw articles generalizing stuff across all engineers: aircraft,
biomedical, marine, telecommunications, etc. However, I've seen many articles,
this one included, that try to generalize across all software development.

------
h0p3
Correct, [Simple|Fast]: In That Order.

The above is definitely the correct algorithm. Surely you don't think it is
wrong! Come at me, bro!

If you aren't trying to make your representation correct, then why bother?
Correctness is either the telos or closest of these in the pursuit of the
telos (the assumed intrinsic purpose). To make it simple and fast are the
instrumental (and heuristical) means to increasingly correct versions of that
correctness end(s). You can only ever simplify a representation (hello, Kant!)
because you never have complete (hello, Gödel!) access to the thing-in-itself
directly.

The representation is the only thing you can consciously (the Daseinic
emergent result of the non-conscious [that you know of: problem of other
minds, OOO, dualist's hard problem of consciousness, etc.] aspects of your
brain using language to talk to itself) attend to; it's the only way to tell
and retell these stories to yourself. How correctly can you recursively
represent correctness? I can only begin to meaningfully compute by starting
with a meaningful notion of correctness as my foundation (however flawed it
may objectively be), else it is meaningless. What does it mean to correctly
tell yourself about an object if you don't assume the concept of correctness
in how you tell yourself about an object?

In a sense, you beg the question of the Ontological Proof (hello Kantian
idealist vs Gödelian realist in dialectic!), the reality of the possibility of
the goodiest good of your program (and I suggest even beyond), in thinking
about the telos of your program (and, clearly, you change your mind about what
counts as that).

Simplifying and/or optimizing a representation already begs the question of
having something to be correct about. "Simplify" according to what standard?
The pursuit of correctness is the necessary precondition to having a reason to
take the means. Epistemic justification in coding computers (be they silicon
or brains) is inevitably tied to this telic process and metanarrative. We do
not escape the chain of sublations. Be a transcendental coder! I believe in
you, folks. I know you care about correctness, deep down. Don't you want to be
correct about this code too?

It's dangerous to go alone! Take this:
[https://plato.stanford.edu/entries/dialetheism](https://plato.stanford.edu/entries/dialetheism)

------------------------------

[https://philosopher.life/](https://philosopher.life/)

------
throwanem
A valid observation, but hardly novel:
[http://wiki.c2.com/?MakeItWorkMakeItRightMakeItFast](http://wiki.c2.com/?MakeItWorkMakeItRightMakeItFast)

------
gilbetron
It's not an order, it's not "or", you must strive for simple AND correct AND
fast, that's how you compete. If you don't have one, it is extremely difficult
to compete.

------
amorphid
For any non-trivial program, I usually end up with: working jank, maintainable
jank, fast jank. If I can write something that works AND isn't a ball of mud,
I'm pretty happy.

------
patientplatypus
How sure are you that your program is simple enough for the next guy to grok
it? If it works and the next guy doesn't see how, it doesn't work.

------
slx26
and funnily enough, writing simple code is the hardest part to learn, because
you can't measure it until it falls down on you

now that I think about it, that would be a good exercise for programming
students. after writing a program that does a certain task, make the student
rewrite it to be simpler/easier to understand/maintain. then take a look at
the best solutions in the class.

------
AstralStorm
Simplicity only trumps actual correctness if you want to hire more cheap
maintenance programmers, thus pay more for the software.

------
JackFr
I read this as "Don't start with the edge cases." Which is _sometimes_ good
advice, but is not a universal rule.

------
therealmarv
What does "simple" mean? When I speak with Java developers it means enough
abstraction for all kinds of stuff (UML graphs) :/

------
betacat
How do you know how far to simplify if you don't pay attention to correctness?
Simple has correct as a dependency.

------
mike_ivanov
The simplest possible program that is not correct is no program at all, so why
bother?

------
denzil_correa
“Everything should be as simple as it can be, but not simpler” aka Occam's
razor.

------
newswriter99
These are also the attributes that news should follow.

------
sharpercoder
Make it work, make it good, make it fast.

------
an_d_rew
I disagree. Correct, THEN simple, then fast.

Why? Aside from the excellent reasons given by others below, remember that
even conceptually simple systems can and often have surprisingly complex
behaviours.

The emergent complexity of interacting simple systems is... often breathtaking
in the scope of how whacked out the unexpected can be.

tl;dr - "simple" programs do not necessarily have "simple" behaviour or
"simple" interactions with other "simple" systems.

Make sure the damn thing works before simplifying it.

------
ythn
I subscribe to this philosophy, but unfortunately simplicity is surprisingly
difficult for most software devs. It's because if you don't have the tool in
your toolbox, you won't even know it _can_ be simplified.

I once did a code review for a function that parsed some Linux file for
Ethernet stats. It was incredibly convoluted with tons of substring finding
and indexing. I told the author to simplify it and he declared he already had
and it was as simple as it could get. I then showed him the existence of
regex and his mind was blown.
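The thread doesn't show the actual code, but the regex approach being described can be sketched in Python. Everything below is hypothetical: the sample text imitates the general layout of Linux's /proc/net/dev, and the pattern pulls out an interface name and its first two receive counters, replacing what would otherwise be a pile of manual substring finding and indexing:

```python
import re

# Hypothetical sample in the style of /proc/net/dev:
# header lines, then one "iface: counters..." line per interface.
sample = """\
Inter-|   Receive                  |  Transmit
 face |bytes    packets errs drop  |bytes    packets errs drop
  eth0: 1234567   8910    0    0    7654321   1098    0    0
    lo:    4321     56    0    0       4321     56    0    0
"""

# One pattern per data line: interface name, rx bytes, rx packets.
# Header lines have no "name:" prefix, so they simply don't match.
line_re = re.compile(r"^\s*(\w+):\s*(\d+)\s+(\d+)")

stats = {}
for line in sample.splitlines():
    m = line_re.match(line)
    if m:
        stats[m.group(1)] = {
            "rx_bytes": int(m.group(2)),
            "rx_packets": int(m.group(3)),
        }

print(stats["eth0"]["rx_bytes"])  # → 1234567
```

The win isn't performance; it's that the format is stated once, declaratively, instead of being implied by a chain of index arithmetic.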

~~~
Jtsummers
Exactly. And this is a major reason to always be on the lookout for new things
to learn.

I had a similar experience (but earlier in his coding process) and managed to
change what was about to be a months-long effort of adding epicycles upon
epicycles to the code into a one-week task. I have most of my library at work and
constantly speak of design and theory concepts with the younger folks. Based
on more recent code it's paying off.

------
rhapsodic

> The reason is straightforward: if your solution is not simple, it will not
> be correct or fast.

A very hand-wavy statement, and not always true.

~~~
perl4ever
"Everything should be made as simple as possible, but not simpler"

