
Skills Poor Programmers Lack - rspivak
https://justinmeiners.github.io/the-skills-programmers-lack/
======
apo
The skills cited in the article are:

1\. Understanding how the language works. Additionally understanding how the
language infrastructure interfaces with the computer.

2\. Anticipating problems. Prefer solid foundations over veneers that appear
to get the job done.

3\. Organizing and designing systems. Essentially, SOLID.

Two things on this:

First, bad code often results from conflicting goals. Moving goalposts _and_
on-time shipping, for example. The result appears to have been written by a
poor programmer, when this may not be the case.

Second, the most valuable skill a programmer can have isn't technical but
rather social: empathy. The best programmers I've seen have it, and the worst
completely lack it.

Lack of empathy leads to poor communication. If a programmer can't anticipate
or read his/her audience's perspective, there's no way s/he can communicate a
complex concept to them. The temptation will be to blame the audience when in
fact the failure lies squarely with the programmer doing the speaking or
writing.

Lack of empathy also leads to disregard for the needs of users and future
maintainers. Systems get built that don't need building, and systems that
should be built aren't. Should the two happen to coincide, the system is a
nightmare to maintain because the programmer simply didn't care about the
people who would need to maintain the contraption.

A lot of the 10x programmer discussion focussed on people who lack empathy.
For some reason, it's easy to conflate lack of empathy with technical skill.

~~~
rjpc
Yeah, context and empathy are two things that I'm only appreciating more and
more as my career goes on.

I once had this small but terribly written module, written by an inexperienced
developer who wasn't given the kind of feedback and code review that he should
have been given. It was still running in production years after that person
had left because it was in a corner of the code base that was basically never
touched. What made it interesting to me was that it was badly written at
almost every level, from high-level separation of concerns to low-level
coding practices, while still basically getting the job done.

I started giving this module as an exercise during interviews for a certain
position, with the framing of "This was written by a beginner developer on
your team. What kind of feedback would you give them to help them improve?"
This sort of thing was actually a major part of the job, as it was a position
that would be a kind of consulting resource for other teams and would involve
many code reviews and encouragement of best practices -- basically providing
subject matter expertise to full stack, cross-functional teams.

The results were fascinating to me because the exercise acted like a Rorschach
test of sorts and told me a lot more about the interviewee's focus than about
the code they were criticizing. More junior candidates immediately jumped on the low
level issues like the boolean assignment example, naming conventions, or small
snippets of code duplication, and spent all their time there. More experienced
folks often mentioned the low level issues but spent more time on the higher
level problems -- the class should be broken out into two, extending it in the
most obvious way would be hard because XYZ, and so on. Some of the best
candidates would ask for more context on the situation and overall codebase.

It also helped weed out the jerks who, despite the prompt, decided that the
exercise was an opportunity to show off and insult the original author (who
was of course anonymous), venting about how stupid something or other was or
using it as a springboard to attack their current co-workers. Everyone starts
somewhere. It's fine to wince a little at something that's poorly written, but
the point is to actually help them improve. The better candidates were there
trying to understand what their gaps in understanding were that would cause
them to make certain mistakes. The very best candidate was trying to map out a
staged plan of more easily digestible things to work on so that they're not
overwhelmed all at once -- extrapolating a whole technical mentorship model
out of what they could glean from the code review.

~~~
techslave
bookmarked.

i don’t suppose you could publicly share the module? this is a stellar
interview question.

or maybe there’s a library of such code? (yes yes, jquery/openssl or your
favorite true but not useful reference comes to mind)

~~~
rjpc
I don't think I can, sorry. Though I imagine that if you have a code base of
any size that more than a couple dozen people have touched, you'll be able to
find something similar if you ask around.

------
klenwell
As an addendum perhaps to the Organize and Design Systems section, I'd propose
including mise-en-place as it applies to project maintenance and the
development environment.

This centers on tooling and documentation, especially the README. I want a
README that gets me set up and running as quickly and in as few steps as
possible. Recently I needed to test out some stuff one of my teams was working
on. It was a somewhat complex Wordpress project. I was able to get set up
using our documentation, but a couple of key steps were missing, making it
error-prone and unnecessarily aggravating.

Anthony Bourdain explains it as only he can:

 _The universe is in order when your station is set up the way you like it:
you know where to find everything with your eyes closed, everything you need
during the course of the shift is at the ready at arm’s reach, your defenses
are deployed. If you let your mise-en-place run down, get dirty and
disorganized, you’ll quickly find yourself spinning in place and calling for
backup. I worked with a chef who used to step behind the line to a dirty
cook’s station in the middle of a rush to explain why the offending cook was
falling behind. He’d press his palm down on the cutting board, which was
littered with peppercorns, spattered sauce, bits of parsley, bread crumbs and
the usual flotsam and jetsam that accumulates quickly on a station if not
constantly wiped away with a moist side towel. “You see this?” he’d inquire,
raising his palm so that the cook could see the bits of dirt and scraps
sticking to his chef’s palm. “That’s what the inside of your head looks like
now.”_

[https://books.google.com/books?id=XAsRYpsX9dEC&lpg=PA65&ots=...](https://books.google.com/books?id=XAsRYpsX9dEC&lpg=PA65&ots=dOy2B3EFxH&pg=PA65#v=onepage&q&f=false)

~~~
afarrell
Mise-en-place is so important.

Inexperienced engineers sometimes find themselves in teams which don’t
appreciate it. Because confrontation can be nausea-inducing, they sometimes
don’t endure the discomfort needed to insist on it.

———

When a customer walks into a restaurant, does he ask about the temperature of
the fridge the chicken is stored in? Does he ask if the kitchen is clean and
organized? No. That’s the chef’s job to insist on. But the customer does care
if his food is late or laden with salmonella.

Likewise, it’s an engineer’s job to care if she has automated tests and well-
structured code.

~~~
saidajigumi
I'd argue that missing on _mise en place_ is a symptom of a team leadership
failure, as well. Allowing excessive tribal knowledge vs. repeatable process
(docs, dev tools, tests, well organized code, etc.) is an organizational risk.
There's the efficiency loss side, but also the possibility that an
incapacitated team member becomes a critical business loss – the team becomes
unable to meet goals or worst case, to ship at all. Not empowering new team
members to efficiently onboard is just one facet of this. With new hires,
there are also the power issues that result, particularly blaming new hires
for poor onboarding performance.

I "fondly" recall one early job in my career starting with the promoted-from-
the-ranks dev lead filling a whiteboard with boxes and acronyms... an
overwhelming brain-dump to a newcomer that took hours of time spread over
days. By the time I'd sorted out what we really did, months later, I could
explain the whole thing to a new hire in fifteen minutes. One easily
understood diagram, pointers to the two actually-important directories in our
codebase, and some context – and they would be off to the races. Funny enough,
I met a number of other people at that gig who thought of their work as
"irreducible". One was, to use the recent meme, a "10x engineer" for whom the
company maintained a rotating roster[1] of tech writers to follow around
_everywhere_ and try in vain to record what the heck they were doing.

[1] "rotating" as in "hired, then fled screaming"

~~~
afarrell
I agree deeply.

------
humanrebar
Most programmers do not follow the Golden Rule, which is to write code you'd
like to maintain with minimal training. It's a principle, but there are
various skills involved in doing it successfully, including writing,
automation, design, seeking quality peer review, and a few other things.

Similarly, writing code that can be deleted is an important skill.

Using "good design" and "knowing your language" are fairly nebulous and
somewhat tautological goals out of context. What is good design? Design that
works well, I suppose?

Most code is actually fairly easy to change in the future. Just slap another
"if (isLeapDay(today)) dailyProfit = dailyProfit * 28 / 29;" wherever it will
solve the immediate bug. Is this a good idea? Nope. Why is it a bad idea? Well, it's the
S in SOLID, probably, but the real reason is that someone would be surprised
and chagrined to discover that design choice the hard way. Hence the Golden
Rule.
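A quick sketch of the contrast (hypothetical names, in Python rather than the C-style snippet above): the "slap it in" version buries a calendar rule in reporting code, while the Golden-Rule version names the rule and gives it one home.

```python
# The quick fix: a calendar rule hidden where the next maintainer
# won't think to look for it.
def report_profit_quickfix(daily_profit, is_leap_day):
    if is_leap_day:
        daily_profit = daily_profit * 28 / 29  # surprise!
    return daily_profit

# Kinder to future readers: the rule has a name, a docstring, and one home.
def leap_day_adjustment(profit):
    """Spread February's profit over 29 days instead of 28."""
    return profit * 28 / 29

def report_profit(daily_profit, is_leap_day):
    return leap_day_adjustment(daily_profit) if is_leap_day else daily_profit
```

Both behave identically today; only the second tells the reader why the number changed.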

~~~
zwp
> writing code that can be deleted is an important skill

Are you saying "I can easily delete this code from the project" is a metric
for modularity? That's really interesting; is it your own thought?

~~~
spawarotti
I've seen it before. Deletion-Driven Development is one of its many names. For
me, a good rule of thumb is: removing a feature should result in a diff that
deletes a bunch of files (classes, in the case of OOP) completely, and only
single lines in other files: the call sites of that feature.
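A minimal sketch of what that rule of thumb looks like in practice (hypothetical module and function names): the feature lives entirely in its own file, and the rest of the code touches it through a single call site.

```python
# --- exports.py (imagine this as its own file: deleting the CSV-export
# feature means deleting this file wholesale)
def export_to_csv(rows):
    """All the logic the CSV-export feature needs, in one place."""
    header = ",".join(rows[0].keys())
    lines = [",".join(str(v) for v in row.values()) for row in rows]
    return "\n".join([header] + lines)

# --- app.py (the only line the deletion diff touches here is the call site)
def handle_request(rows, want_csv=False):
    if want_csv:
        return export_to_csv(rows)  # single call site: one deleted line
    return rows
```

The deletion diff for this feature is exactly one removed file plus one removed line.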

------
quadcore
Most programmers work in passes and don't write everything in one shot. But a
common mistake I still see is not doing enough passes. Programmers jump
straight to making polished commits they can show off to their peers.

Instead, what should be used more often is what I would call "postit commits".
Simple strokes of code that are certainly not final (if any code can be final)
but achieve a purpose. Their postit nature makes them very easy to change.
Those commits still need to point in the right direction, though, because they
may stay around for a while (think 15 years). As long as they aren't hurting the
customer or the codebase (like a supposedly-temporary hack, even though those
are sometimes required), then it's all fine, a program is never finished
anyway.

Working with more passes allows you to think more at every step and shape a
more efficient solution. For example, take a piece of code that's too slow for
the requirements, for obvious reasons. Don't necessarily jump on optimising it
right away. You may find later that you can make the whole subsystem that
piece of code belongs to better by using another, more powerful idea. And when
you come up with that new idea, the only thing standing in your way is "postit
commits". You can change things now because you're not facing polished ones.
You can change things now, not in 10 years when you find out every one of your
competitors has finally implemented that idea, in which case you would pay the
cost of a full rewrite.

This is essentially how you do things well from scratch (and by extension, how
you do anything well with code). Keep postits as long as you can, because it's
a better strategic position. Only harden a solution when you can't give it
more time or because it just hardened itself with cool ideas. At the end of
the day, you will save time, if you worry about that. You will even crush your
competitors. They will have 3 times more code with bad solutions; you will
have a third of the code with all the super cool stuff and nice subtleties
(it doesn't always play out like that, but often enough).

~~~
metroholografix
This forms part of what Alan Kay calls “late binding” [see link below] and
describes as part of what makes Lisp and Smalltalk great. It is something of a
lost art today, mainly because most contemporary programming languages (the
usual suspects in the statically-typed camp, but also languages one would not
expect, like Python and Javascript) go out of their way to de-emphasize it. I
feel that Perl was the last massively popular language that promoted late
binding.

[https://ovid.github.io/articles/alan-kay-and-oo-programming.html](https://ovid.github.io/articles/alan-kay-and-oo-programming.html)

------
JohnBooty

        > Programmers who only work on small temporary 
        > projects (like an agency) may get by without 
        > ever improving how to design programs.
    

Amen to this. We recently ended our engagement with a _very_ well-known Ruby
consulting shop for exactly this reason.

Their engineers were very smart and wrote very pretty code.

However it was not suitable for the real world once even a little bit of
scaling was required, and this was an app that needed to store and move a fair
bit of data as a Day 1 requirement.

The real tragedy is that they were fairly arrogant about it. _They didn't
know what they didn't know._ It's OK to not know how to build things at
scale.... as long as you know that's not one of your core competencies. I
don't know how to fly a plane, or create a design system, or write assembly
code. That's OK, because I know my limitations there.

However, these folks were arrogant and dismissive about scaling concerns.

Well, they were dismissed as a result of their dismissiveness.

Which is a shame, because they did have some talent there.

~~~
cosmodisk
This is the problem I've got at my current job. While I try my best to write
code in a way that will work many years ahead, the lack of scale requirements
and the relative freedom I've got over my own decisions are a double-edged
sword most of the time. I'd love to work in a large shop with more senior
people than I am.

~~~
JohnBooty
Both kinds of experience are so valuable. "Consulting shop" experience where
you get to play with a lot of stuff, often very cutting edge stuff.

"Long lived legacy app" experience where you really get into the nuts and
bolts of engineering some software, but you are often locked into a particular
stack and the cutting edge is your enemy and not your friend. Can suck big
time when you look for your next job and your skills are 5, 10, whatever years
out of date...

------
bryanrasmussen
Well, where he says

"You may have seen code which misunderstands how expressions work:

      if isDelivered and isNotified:
        isDone = True
      else:
        isDone = false;

Instead of:

      isDone = isDelivered and isNotified"

I think that's a matter of style. I prefer the isDone = isDelivered and
isNotified style myself, and I think the people who write the other way have
very poor style, but as arrogant as that sounds, I don't think I would be so
arrogant as to say they don't know how expressions work.

~~~
moedersmooiste
In college they taught me that every line of code should do one thing and one
thing only, so doing both a test and an assignment on one line would not be
preferred over the first example (split over multiple lines). I don't have a
lot of experience with programming, so for now I do what I was taught. :)

~~~
sidlls
The code is doing one thing: assigning the result of a boolean expression.
It's not different in structure from, say, `x = a + b`.

~~~
bryanrasmussen
It's sort of an interesting problem of intention. If the reason the
expression

"isDone = isDelivered and isNotified" was written is that the programmer
saw "if isDelivered and isNotified: isDone = True else: isDone = false;"

and thought "I can improve that", then what they have done is wrap the
checking and the assignment into one expression. They are logically still
doing two things in the one line - it just so happens that they can, because
one of those things was not really necessary to do.

------
arwhatever
The most lacking skill that I tend to see is an inability to think in types,
and to design software accordingly. Too many software developers never
progress beyond primitives and basic control structures - I call them Int,
String, and For Loop Developers.

Second biggest issue I see is failing to incorporate our cognitive
shortcomings into code design, assuming that you'll remember these dozens of
little details from today forever, and failing to enforce or make explicit
one's "today knowledge" in the code.

~~~
atmosx
Can you give a clear example on “thinking in types” approach ?

~~~
kyberias
It's the anti-pattern "Primitive obsession"

[https://refactoring.guru/smells/primitive-obsession](https://refactoring.guru/smells/primitive-obsession)

Let's say, for example, that we need to handle distances in our code with
different units (meters and inches, for example). I've seen code bases that
use integers or floating points for this, and it's always confusing and
error-prone because the unit is unclear. In fact, you could accidentally use
some other values (money) as distances.

Instead, if your programming language allows it, you could define a type
Distance and make sure that all the code that handles distances uses only this
Distance type. Everyone is forced to think about distances when maintaining
the code, conversion routines are implemented and instantiated in one place,
etc.
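A minimal Python sketch of that idea (hypothetical API): the raw float never escapes, conversions live in exactly one place, and mixing a Distance with a bare number fails immediately.

```python
class Distance:
    """Wraps a length so the unit can't be confused with money or time."""

    def __init__(self, meters):
        self.meters = float(meters)

    @classmethod
    def from_inches(cls, inches):
        return cls(inches * 0.0254)  # the conversion lives in one place

    def to_inches(self):
        return self.meters / 0.0254

    def __add__(self, other):
        if not isinstance(other, Distance):
            raise TypeError("can only add Distance to Distance")
        return Distance(self.meters + other.meters)

total = Distance(1.0) + Distance.from_inches(100)   # fine: ~3.54 meters
# Distance(1.0) + 5.0  would raise TypeError: the mix-up is caught at once
```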

~~~
Insanity
This is why I like type defs in a language.

------
gravypod
> Using sleep(), cron jobs, or setTimeout is almost always wrong because it
> typically means you are waiting for a task to finish and don’t know how long
> it will take. What if it takes longer than you expect? What if the scheduler
> gives resources to another program? Will that break your program? It may
> take a little bit of effort to rig a proper event, but it is always worth
> it.

While I understand the sentiment of this post, which I feel is mostly correct,
it's interesting to see how divergent this statement is from my experience
building "modern" microservice architectures, devops, and distributed systems,
given that I used to hold this viewpoint as true myself. Async background
tasks that self-heal and are entirely outside of the serving path? Yes,
please.

~~~
kevsim
I initially had the same reaction, but I don't think that's the author's
intention. He's saying: don't use sleep() and cron to constantly poll whether
some asynchronous thing has completed; have a proper event fire at the end and
handle that. I don't think that's in conflict with what you like about your
systems :-)
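As a sketch of that distinction (hypothetical task, using Python's standard threading module): instead of sleeping and hoping the work has finished, the worker fires an event the moment it is done and the waiter wakes immediately.

```python
import threading

done = threading.Event()
result = {}

def background_task():
    result["value"] = 21 * 2   # stand-in for the real asynchronous work
    done.set()                 # fire the event when the work actually finishes

threading.Thread(target=background_task).start()

# Instead of `while not finished: time.sleep(5)` guesswork:
done.wait(timeout=10)   # returns as soon as set() is called, not 5s later
print(result["value"])  # -> 42
```

The timeout is only a safety net; the normal path wakes up exactly when the event fires.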

~~~
yoz-y
Yes, but in the article the author also argues that proper definitions are
important. In the more general case, I'd say that expressing ideas in an
unambiguous manner is important.

------
gordaco
"I believe OOP and relational database get a lot of flack because programmers
tend to be bad at design, not because they are broken paradigms". This is very
true. Another, more charitable interpretation is that sometimes deadlines
prevent programmers from thinking their models through, and so an imperfect
model ends up being used, causing problems in the future.

------
dpeterson
This article says more to me about the authors' own inexperience than it does
to shed light on the practice of software development. I'm
imagining some recent boot camp graduates attempting to conflate their months
of programming experience into something more than that. "Hey old dudes in
company I just joined, I found some things I think are basic so I'm going to
write an article to indirectly shame you in hopes our manager will see how
valuable I am already." They sense something unimpressive about their
older co-workers, but this list isn't it, and the authors don't come close
to being experienced enough to put a box around it and be thought leaders of
any kind.

~~~
komali2
I don't think these accusations are fair. The author has been programming
since at least 2009 and has a degree in math. They also wrote the "think in
math, write in code" article that people here seemed to like quite a bit last
week.

~~~
virtualwhys
> The author has been programming since at least 2009

Which is to say, since the author was 12 years old if they followed a standard
K-12 + undergrad program (they graduated from Utah Valley University last
year, it seems).

I think the irony in the article is that the author is very likely 22 or 23
years old and opining about how developers who have been coding for longer
than he's been alive still just-don't-get-it. You'd expect this kind of
article from someone with more experience in the field.

I did like this, however:

> Poorly designed software lacks conceptual integrity...It usually looks like
> a giant Rube Goldberg machine that haphazardly sets state and triggers
> events.

That is, it seems, the modern web :)

~~~
komali2
>Which is to say, since the author was 12 years old if they followed a
standard K-12 + undergrad program (graduated from a Utah Valley University
last year it seems).

Graduated from the university in 3 years (check the dates), which doesn't
nullify your proposition, but it makes other possibilities just as likely,
such as that they went to university well into their adult life, applying
credits from a previous (unlisted) university career.

------
0xDEFC0DE
>However, there is a certain level everyone should know

Sadly that's still debated and you can only get piecemeal ideas from blogs and
job postings.

There's no central authority that approves developer skill levels, says
"these are the skills every programmer should know before joining the
profession - after that, it's up to the specific job", and then _legally
enforces_ them via certifications and official exams.

Right now, it's up to specific jobs and blogs and you have to figure that out.

It's ambiguous and that frustrates people who like exact and measurable goals
when learning things.

I know it seems like one blog isn't a big deal, but I've read many blogs that
add things like "10 things every programmer should know", and the list just
never ends.

If you took the union of all the things from every "things programmers should
know" article, you'd be studying for about 6-8 years before you got a junior
position.

>For example, if a successful login generates a session token and it collides
with another token, you could reject the login and have the user try again

Eh, don't even go back to the user. Collisions should be a rare enough
occurrence that you can make a second call to the token generator and just
replace the bad one. If you get frequent collisions, you need a better token
generator.
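A sketch of that retry-at-generation approach (hypothetical in-memory store): the user never sees the collision; the server just draws again, and repeated failures point at a broken generator rather than bad luck.

```python
import secrets

def issue_token(existing, max_attempts=5):
    for _ in range(max_attempts):
        token = secrets.token_hex(16)  # 128 random bits
        if token not in existing:      # collision check against issued tokens
            existing.add(token)
            return token
    # Reaching this repeatedly means the generator is broken, not the user.
    raise RuntimeError("repeated collisions: check the token generator")

sessions = set()
token = issue_token(sessions)
```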

------
adamparsons
Kinda frustrating that an article calling out where programmers fall short was
a non-responsive document that was almost unreadable on mobile. Mildly ironic?

~~~
afarrell
Do blocks of text need responsiveness in order to be readable on mobile?

What problem are you experiencing?

~~~
codycraven
The text is very, very small on a mobile device (XL devices may fare better),
requiring anyone with a small or normal-sized device and less-than-perfect
vision to zoom in and swipe left/right to read a single line of text.

For example I'm on a normal sized device and can't read the text at a zoom
level that shows an entire line without my glasses (and my vision isn't too
bad, I'm legally allowed to drive without my glasses).

~~~
mjw1007
This is what "reader mode" is for.

The text is using good old-fashioned html markup (basically just h2, p, code,
a, em).

So there's no reason why you should have to use the stylesheet provided by the
site, and with a sensible browser you don't have to.

------
LaserToy
Well, I do agree with some points in the article, but I don't believe even the
author really understands how the environment he runs his code on works. There
is more to it than just knowing the language: one might argue that you have to
know the OS (see the fsync misuse by Postgres), how all the involved drivers
work, how the CPU works, and the list never ends.

It is quite an idealistic view. In reality you have to accept that you never
know how everything works; otherwise you will spend a lifetime in academia.

On the software engineering side (not just programming), the most important
skill to me is curiosity about the business, not just the code. Endless devs
are dying on the hill of tabs vs. spaces, when their job is to deliver for the
business, not for their ego. Surprisingly, it is a common disease (based on my
experience at Google and other companies). They create their wonderful world
of logic and sense, and think that everything that violates it is stupid (like
a business person asking for a feature).

The skill poor engineers lack is actually common sense.

------
cjfd
In all fields there is a minority who are not very good at what they do. I
think, though, that in programming that minority might actually be a majority.
The thing is, if you are a poor plumber who causes floods in people's houses,
you are not going to be in business for very long. In programming there seems
to be no such discipline, because it actually takes somebody good at
programming to perceive the difference between good and bad programming. And
if the programming project is a disaster right now the cause of this might be
bad decisions made 5 years ago and the people who made them are already gone
thereby shielding themselves from the consequences of their actions. In a
healthy programming department there should certainly be a small influx of new
people and new ideas but also the stability of people who see the project as
their personal responsibility.

~~~
pmichaud
Programming is more like trades than you think. A bad plumber might
occasionally cause a flood, just like a bad programmer might occasionally
cause a hard crash during an important sales presentation, but actually bad
plumbers use subpar materials and install them incorrectly, and you don't
know until 5 years later, when your first floor is flooding because a pipe
joint inside the wall finally burst after the 1000th time water hammer slammed
it.

And, like programming, sometimes you'd find someone who actually just didn't
know what they were doing because they were inexperienced, and sometimes you'd
find someone who was rushed and underfunded, but in either case they are
generally long gone, and the cause of the ultimate failure isn't perfectly
clear, so the world turns and everyone keeps plumbing, for better or worse.

I can tell a similar story for almost all trades.

~~~
cosmodisk
Used to work in plumbing. One morning I came to work and saw a waterfall
coming from the 3rd floor. The apartments were complete, with kitchen
furniture installed. Turns out one idiot didn't solder an elbow, another
didn't test it properly, and a third signed it off and gave the green light to
turn on the water for the entire block. It took many people and a lot of time
to undo this. Not much of a difference when things go wrong in code.

------
avip
The example is interesting.

    
    
      if isDelivered and isNotified:
        isDone = True
      else:
        isDone = false;
    

is not the same code as

    
    
      isDone = isDelivered and isNotified
    

in ruby, python, js (and more?).

[edit: while I'm nitpicking... breaking webpage text selection is also a clear
sign of poor programming]

[Edit2: more unsubstantiated "absolute truth" from OP]:

    
    
      Using sleep(), cron jobs [...] is almost always wrong.
    

_false_. Any hardware-related code (most of the code humanity has created?)
relies on such timing considerations.

    
    
      Another common mistake I see is generating a random file or identifier and hoping that collisions will never happen
    

_false_. This is a 100% valid approach; the odds of a UUID collision are 0 for
any practical purpose.
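A back-of-envelope birthday-bound check of that claim (assuming version-4 UUIDs, i.e. 122 random bits): even after generating a trillion IDs, the collision probability is vanishingly small.

```python
import math

n = 10**12                         # a trillion generated IDs
pairs = n * (n - 1) / 2            # candidate colliding pairs
p = 1 - math.exp(-pairs / 2**122)  # standard birthday approximation
print(p)                           # ~9e-14: vanishingly small
```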

~~~
chpmrc
Assuming those variables are all booleans, why wouldn't it be?

~~~
Retra
They're not booleans in ruby, python, js; they're objects that can be null.

~~~
coldtea
null (or None) is still a valid Falsey value, and in both examples isDone will
be a boolean (strict) value, as it is the result of a boolean operation (and).

So the point of the comment is moot...

~~~
detaro
JS: undefined && undefined -> undefined

Python: None and None -> None

~~~
Izkata
On top of that, truthy values also propagate, so if isDelivered or isNotified
is something else that's truthy (such as a 1 from MySQL), isDone will not be
True either and risks failing in more interesting ways down the line.
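The behaviour described in this subthread is easy to check directly in Python: `and` returns one of its operands, not a strict boolean.

```python
isDelivered, isNotified = None, None
isDone = isDelivered and isNotified
print(isDone)                      # None, not False

isDelivered, isNotified = 1, 1     # e.g. integer flags out of a database
print(isDelivered and isNotified)  # 1, not True

# An explicit conversion is needed to get a strict boolean:
print(bool(isDelivered and isNotified))  # True
```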

------
ravenstine
> Organize and design systems

To extend that point, I think that better programming happens when you don't
simply apply prefab design patterns to every problem or get overzealous with
those patterns.

This is where things like object-orientation, MVC, SOLID, DRY, abstraction,
inheritance, overly-organized code with lots of categories for different
things, decoupling, etc. can be taken too far or be applied to the wrong
problems.

Basically, don't _believe_ in any one thing. Just understand different
approaches and realize that nearly all of them have a % probability of
succeeding. That can even mean writing things in a more procedural way.

------
d_burfoot
> I believe OOP and relational database get a lot of flack

Who gives relational databases flack? The RDBMS and SQL are the cleanest,
simplest, and most productive technology stack I've ever used.

~~~
jacoblambda
I don't think you understand. Maybe this video would help enlighten you on the
subject.

[https://www.youtube.com/watch?v=b2F-DItXtZs](https://www.youtube.com/watch?v=b2F-DItXtZs)

------
PopeDotNinja
My standard for writing "good" code is...

\- can my team read & understand it now

\- would my team likely be able to read & understand it later

Earlier in my career I wasted a lot of time trying to impress people with
clean code, only to later learn those people were never going to care. Now I'm
more in the camp of getting it done, which is a heck of a lot easier when you
work on a team, and at a company, that shares enough of my values. Getting
hung up on "what is good?" is a waste of time, because everyone cares about
different things. When you ship stuff, the people who don't care don't speak
up, and the people who do care do. When the people who care speak up, that's
when you have an opportunity to learn what is important to them.

Don't write complete crap, and don't try to make everything perfect. Just try
to be productive and not ridiculous; what that means will vary from project to
project. Your polished diamond is someone else's stinky turd, and vice versa.
For example, I initially learned to write code while hanging out with people
who really valued testing everything and aggressive decomposition (e.g. a line
of code should only do one thing). I wrote code like that on a new job, and
the team hated the number of classes and functions I was asking them to read
in a code review.

------
pmarreck
Some of the bad code cited is produced by high management pressure, in my
experience. Be careful how you manage programmers, and try to incentivize
optimally.

------
mirceal
i call bs on this kind of observation and on all advice that claims you need
to know A, B, C to be a “real programmer”

imho, you need to 1) be curious 2) continuously learn and want to improve 3)
don’t make the same mistakes over and over again 4) share your thought process
and be willing to both learn and teach others

yes, sometimes the delta in level of experience is inconvenient but we are all
somewhere in our journey. be nice to others.

that is all.

------
yellowapple
Advising against sleep(), cronjobs, etc. seems insane to me, and I don't
understand the rationale here. Wanting to yield until there's more work to do
(or to poll periodically), or to run things on a schedule that need to run on
a schedule, etc. are all very common and very valid use cases. Unless the
author's recommending using something else?

Maybe that's just because I'm a poor programmer, though :)

EDIT: I guess in the context of waiting for something to asynchronously finish
happening, it'd be more ideal to check for an actual indication of success
(e.g. via an await, or by listening for a response message) instead of doing a
sleep(5) and hoping for the best. Unfortunately, there are disturbingly many
scenarios where that ain't exactly possible (or it's "possible" but not
practical), especially when interfacing with external systems written by poor
programmers :)
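To make that concrete, here's a minimal sketch (with a hypothetical `startJob` standing in for real async work) of the sleep-and-hope approach versus awaiting the actual completion signal:

```javascript
// Hypothetical `startJob` stands in for real async work; it resolves
// with its result after a short delay.
function startJob(result) {
  return new Promise((resolve) => setTimeout(() => resolve(result), 10));
}

const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Fragile: fire the job, sleep a guessed amount of time, hope it finished.
async function sleepAndHope() {
  startJob("done"); // result is dropped; we never learn the real outcome
  await sleep(50);  // 50ms is a guess about how long the job takes
  return "hopefully done";
}

// Robust: await the actual completion signal instead of guessing.
async function awaitCompletion() {
  return await startJob("done"); // resolves exactly when the job finishes
}
```

The second form has no magic number to tune and can't race against a slow job, which is presumably the author's point.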

Still, the author should probably clarify how the "cronjobs are bad" opinion
fits into that context, because without further elaboration it sounds _really_
silly.

------
craigsmansion
The question is: if making a schema to distinguish between "good" and
"poor/naive", can one do so whilst keeping biases in check, viz. without
inadvertently putting one's own understanding (or unknown lack of
understanding) in the "good" camp?

Unless you're EWD, the answer is "no". No you cannot.

------
cgriswald
> You may have seen code which misunderstands how expressions work:

> if isDelivered and isNotified: isDone = True else: isDone = false;

> Instead of:

> isDone = isDelivered and isNotified

Are people actually finding code like this in professional work or is this
just an example? I'm self-taught and know I've got some gaps, but this example
is so fundamental I find it shocking.

~~~
munchbunny
I've seen it in professional work. If it's actually a single variable
conditional, that's irritating but by itself not worth more than a comment in
the PR. However, in practice it's rarely this cut and dry. I usually see
conditions expanded like this for clarity reasons. For example, I consider
this kosher if it's in some business logic... instead of writing:

> return a && !(b || c) || d;

I've seen (pardon the formatting, I'm typing on my phone):

> if (d) {return true}

> else if (a) {return !(b || c);}

> else {return false}

It's usually a choice to be verbose for clarity.

~~~
userbinator
I think that would be clearer as "return a && !b && !c || d". Also, your
second example doesn't directly correspond with the first, because condition
'a' is evaluated first.
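For what it's worth, the two forms do agree on every truth assignment; the divergence is evaluation order, which only becomes observable when the conditions have side effects. A quick brute-force check (a sketch, not from the thread):

```javascript
// The one-line expression from the parent comment.
function expr(a, b, c, d) {
  return a && !(b || c) || d;
}

// The branchy version from the parent comment.
function branchy(a, b, c, d) {
  if (d) { return true; }
  else if (a) { return !(b || c); }
  else { return false; }
}

// Enumerate all 16 combinations and confirm the truth tables match.
function equivalent() {
  for (const a of [false, true])
    for (const b of [false, true])
      for (const c of [false, true])
        for (const d of [false, true])
          if (Boolean(expr(a, b, c, d)) !== branchy(a, b, c, d)) return false;
  return true;
}
```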

The major problem with code that's "overly branchy", as in containing a lot of
if/else, is that you're forced to go through each case when trying to
understand how it works, and it often proliferates into even more branchy code
as someone makes a bugfix (with another if/else) in one of the cases, but
neglects to see that a similar if not identical change must be made to some of
the others.

In other words, if/else cases like your second example optimise for _micro_
-readability when what's often important in debugging and understanding is
_macro_ -readability.

~~~
munchbunny
While I get your point, several thoughts:

1\. I'm talking about acceptable code, as opposed to "the best choice". There
are good reasons to go for verbosity. Depending on the problem domain I might
agree with your expansion of the parentheses, but when we get to this type of
discussion, usually my overwhelming reaction is "this is bikeshedding,"
because...

2\. If it's "overly branchy" code and you're worried about causing macro-
readability issues, the answer is to refactor, not to compress. Modifying
inline conditions runs just as much of a risk of becoming inscrutable. You
choose between following branches and enumerating binary tables. If the same
condition shows up in too many places, you should likely extract either
expression into its own function.

When overly branchy code happens, my experience is that the root cause is
underthinking or overthinking the method, not taking the time to design the
right high-level abstractions, or, as you mentioned, a case of repeating
yourself too much. Generally, the fixes for those issues don't have much to do
with your choice of boolean expression vs conditional.

------
vkaku
I read it. To me, it came off as more edgy, brash, and inexperienced than what
it intended to be: an attempt to influence people on what skills they should
pick up so that they may, according to the author, not be a 'poor programmer'.

Often, there are two factors that lead to very different code than usual:

\- Lack of time and hence taking shortcuts

\- Simple code is different from clever code

And, the TL;DR that I wish to convey back is:

\- All code sucks given the right circumstances

\- A skill learnt over time: being a snob is not the way to influence people

\- Everyone starts lousy and gets opportunities to learn and become better -
even better than you

\- The way people think and envision things is very different, and operating
philosophies are very different. There are multiple pathways to succeed and
the skills you mentioned do not appear in all of them.

Any time you wish to state such an opinion, please think about what would
influence 'poor programmers' to do better than status quo.

------
29athrowaway
Poor programmers are optimistic:

* computers will always run my code fast

* infrastructure problems will not happen

* our team will always have plenty of time to understand my code

* users are not malicious and the libraries I depend on are not malicious

* I will always have plenty of time to diagnose and fix problems in this code

The author covers some of that.

But you can care about those, and still be a poor programmer: being too
pessimistic is problematic. You need to pick your battles and spend your time
mindfully.

Now, I have a problem with this:

> Naive programmers think that design means “don’t make functions or classes
> too long”. However, the real problem is writing code that mixes unrelated
> ideas.

How do you enforce that policy using a linter? How do you audit a large
project for these issues? You can count characters in a line, you can count
lines of code in a function or file, and you can count import statements. All
those are indicators of coupling unrelated ideas together.
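As a sketch of what one of those lintable proxies might look like, here is a toy check that counts import-like lines in a source file; the regex and threshold are illustrative, not a real lint rule:

```javascript
// Count lines that look like imports (ES `import` or CommonJS `require`).
// A crude proxy: a file importing many modules may be mixing unrelated ideas.
function countImports(source) {
  const importLike = /^\s*(import\s|const\s+\w+\s*=\s*require\()/;
  return source.split("\n").filter((line) => importLike.test(line)).length;
}

// Flag files whose import count exceeds an (arbitrary, tunable) threshold.
function flagHighCoupling(source, threshold = 15) {
  return countImports(source) > threshold;
}
```

Like line counts, this only measures a symptom; it can't tell a well-factored aggregation module from a grab-bag.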

~~~
machiaweliczny
You don't. (to both questions)

I don't know about you, but I can spot it by glancing through an MR, and I
just expect the code's author to explain their solution in the description or
for it to be "obvious" to me. So the solution to not having these problems is
code review.

And if you have this problem, then you have a bigger problem (bad CR). But
fixing it all at once is a bad idea. Just fix the modules you are working
with.

EDIT: I spot it by checking imports/exports and API surface, so it could be
automated.

~~~
29athrowaway
You cannot do CR retroactively.

------
FrankyHollywood
"There are no tricks or rules that you can follow to guarantee you will write
good software. As Alex Stepanov said, 'think, and then the code will be
good.'"

Excellent conclusion; I can't think of anything to add. I don't know how many
times I've seen programmers apply the latest patterns and use trendy libraries
only to make a terrible mess of things.

I literally heard a guy say last week, "we should use MongoDB, it is better
than SQL". No context, no arguments; he had just read a blog entry or I don't
know what.

There are no silver bullets: think, and then the code will be good :)

------
elwell
> On the other hand, what if you generate storage files with random names and
> you have a collision? You just lost someone’s data! “This probably won’t
> happen” is not a strategy for writing reliable code.

If the entropy is high enough, and the likelihood of collision low enough,
then this is a very useful tool for certain situations, particularly
distributed systems. I suppose IPFS (and even Ethereum) was written by "poor
programmers"?

~~~
jdsully
There’s a lot to get right here. I wouldn’t call using a UUID or its
equivalent a random name. It has random components - sure, but the format is
intentionally structured to reduce collisions.
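The relevant math here is the birthday bound: for n identifiers with k random bits, the collision probability is roughly n²/2^(k+1). A UUIDv4 carries 122 random bits (the other 6 encode version and variant), so a quick back-of-the-envelope sketch:

```javascript
// Birthday-bound approximation: for n random identifiers with `bits` of
// entropy, the probability of any collision is about n^2 / 2^(bits + 1).
function collisionProbability(n, bits) {
  return (n * n) / Math.pow(2, bits + 1);
}

// Even a billion UUIDv4s (122 random bits) keep the odds vanishingly small.
const pBillion = collisionProbability(1e9, 122);
```

So "random names" in this sense aren't hope-for-the-best; the entropy budget is the engineering decision.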

~~~
elwell
Aren't Ethereum wallet private keys 'completely' random?

------
povertyworld
Skills are nouns, but the list was all verbs. Weird.

~~~
afarrell
Skills which someone has put a name to already are nouns. But if you are
trying to say something meaningful about the skills required in a rapidly-
evolving field, it is going to be hard to find prepackaged nouns and so you
are going to need to use verb phrases.

——

EDIT: for example, the notion of a language “working” in a mechanical sense
has only been around for the past 60 years. If there is a specific noun to
refer to understanding those mechanics as you type, it is obscure. This
28-year-old software engineer, who has worked in both the US and UK, has not
heard it.

------
r4mbini
> In JavaScript, this is often indicated by new Promise inside a .then().

I have found myself doing this sometimes; why is it considered bad practice?

~~~
kevsim
I guess because inside of a then() you can simply return the success value and
throw in the error case; there's no need to create a new Promise and call the
resolve/reject functions.
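A small sketch of the difference, with a hypothetical `fetchUser` (the names are illustrative):

```javascript
// Hypothetical async API used by both examples below.
const fetchUser = (id) => Promise.resolve({ id, active: id > 0 });

// Anti-pattern: wrapping the then() work in an unnecessary new Promise.
function getStatusWrapped(id) {
  return fetchUser(id).then((user) => new Promise((resolve, reject) => {
    if (user.active) resolve("active");
    else reject(new Error("inactive"));
  }));
}

// Simpler: return the success value, throw the failure; then() already
// wraps the return value in a promise and turns the throw into a rejection.
function getStatus(id) {
  return fetchUser(id).then((user) => {
    if (user.active) return "active";
    throw new Error("inactive");
  });
}
```

Both behave identically to callers; the second just drops the redundant ceremony.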

~~~
rounce
I was trying to think of exceptions to this when I read it and could only
think of one: when you need to wrap a callback API with unusual callback
arguments, where a `promisify`-like helper won't work. Then again, I still
feel the
wrapping function should be defined outside of the `then`, as it feels like
this is a separate utility to the work being done in the Promise chain.

~~~
r4mbini
This is the context I’ve used it in

------
beardedman
> Programmers who work only on old programs never learn to write new ones.

Maybe give Linus a shout and ask him to call it a day. ;)

------
curiousgal
Another skill: making a website readable on mobile.

------
crimsonalucard
>I believe OOP and relational database get a lot of flack because programmers
tend to be bad at design, not because they are broken paradigms.

OOP has fundamental and intrinsic problems that can be described in a very
concrete way.

If you believe OOP gets a lot of flack just because programmers are bad at
design, then you are the one that is also bad at design.

I will say this: OOP is bad for many, if not most, design problems, and most
people who hate OOP don't even know why OOP is bad. They just have a gut
feeling and bad experiences, but they can never pinpoint the concrete reason
why OOP tends to lead to bad designs. A lot of people who like OOP really
like the mind bending design patterns but they don't realize that these
patterns often offer limited flexibility and the catharsis of creating a
design pattern abstraction is just an illusion.

So really what's going on is nobody knows the true nature and goal of design
in programming. There's no theory behind it. To first know why OOP is bad you
need to know the fundamental nature and goal of design when it comes to
programs.

The goal and nature of design is abstraction. When designing programs we want
to start from primitives, then compose those primitives into higher level
abstractions, then take those abstractions and also form those into even
higher level abstraction until we achieve the final level of abstraction that
represents the program itself. The key insight here is because everything
starts with a primitive, the way your abstractions are designed, depends
entirely on your primitives and choice of primitives.

A good primitive must be able to additively compose with other primitives to
form every other possible abstraction that the program may possibly need.
"Additively" is the keyword here, because if you subtract information from
your primitive during composition, it means your primitive is not "primitive"
enough, and that in actuality the "thing" you're dealing with may be two
primitives: one representing what you tried to "subtract" from the primitive,
and the remaining part of the original primitive itself.

Bad design often involves dealing with bad primitives. You may find yourself
realizing that you have abstractions that cannot be built out of the
composition of the primitives that you have. You may find that you have to
split up one of your primitives and realize that since it's all encapsulated
in a micro-service there's no easy way to do this, so you take on technical
debt by making a redundant component that does what you need. You may not even
have a notion of what the primitives of your programs are and designed things
at the highest level of abstraction with zero code re-use. 99.999% of all
programmers will not have a notion of what primitives are, and design programs
in a way where they have a potpourri of components that are a mishmash of high
level logic combined with low level logic and no notion of forming higher
levels of logic from lower-level compositions of primitives. Yes, 99% of
programmers are like this; literally take a look at yourself and your
colleagues and tell me who out of all of them has actually linked the notion of
the "design of programs" with "choice of primitives." In fact, the blog post
never mentions the word "primitive" or "axiom" once in its entire write-up on
design.

Which brings me back to the initial topic of why OOP is bad:

OOP is bad because the object is a bad primitive.

Objects are actually arbitrary mixtures of lower level primitives: functions
and data. It is far easier to compose functions with functions and data with
data than it is to compose these arbitrary mixtures called objects with other
objects. Object compositions often involve surgical grafts of one object into
another object, resulting in a hideous dependency. A lot of people like to use
tricks to have this happen at runtime; people tend to call it dependency
injection, a very abstract and clever concept, but also a very, very bad one.

Meanwhile composing two arrays:

    
    
       [1,2,3].concat([4,5,6])  // [1,2,3,4,5,6]
    

Composing two functions:

    
    
       function compose(f, g) {
          return function(x){
             return f(g(x))
          }
       }
    
       b = function(x){return x+1}
       c = function(y){return y*2}
       a = function(j){return j-3}
       d = function(e){return e*e}
    
    
       g = compose(b,c) // g(x) = (x*2)+1
       t = compose(a,b) // t(x) = (x+1)-3
       l = compose(d,d) // l(x) = (x*x)*(x*x)
    

... you get the picture.

~~~
foobar_
The real idea of OOP is quite similar to the actor model à la Erlang. OOP as
implemented in C++ and Java is an abomination. I still don't get how Bjarne
Stroustrup did not steal coroutines from Simula.

It's very easy to implement actual actor model style OOP with procedural code.
You can get most of the benefits by using a message bus.
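A minimal sketch of that idea (`MessageBus` and the counter actor are illustrative names, not a real library): an "actor" is just private state plus a handler registered on the bus, reachable only through messages:

```javascript
// Toy synchronous message bus: addresses map to handlers.
class MessageBus {
  constructor() {
    this.handlers = new Map(); // address -> message handler
  }
  register(address, handler) {
    this.handlers.set(address, handler);
  }
  send(address, message) {
    const handler = this.handlers.get(address);
    if (handler) handler(message);
  }
}

// An "actor" built from procedural code: its state is a closure variable,
// never touched directly, only changed by messages arriving on the bus.
function makeCounterActor(bus, address) {
  let count = 0;
  bus.register(address, (message) => {
    if (message.type === "increment") count += message.by;
    if (message.type === "report") bus.send(message.replyTo, { count });
  });
}
```

Sending `{type: "increment", by: n}` messages and then a `{type: "report", replyTo: ...}` delivers the accumulated count to the reply address; the counter's internals never leak, which is the encapsulation benefit the comment is pointing at.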

~~~
crimsonalucard
>C++ and Java is an abomination.

That's also the version of OOP I'm talking about. Most people mean this when
they talk about OOP, not Smalltalk. Why does everyone take it in this
direction... yes, Smalltalk came first, but nowadays the term in common use is
not OOP as defined by Smalltalk, it's OOP as defined by Java.

------
kissgyorgy
This is just an arbitrary collection of a couple of things. A lot of
programmers lack a lot of skills.

