Hacker News
Don't write clean code, write CRISP code (bitfieldconsulting.com)
157 points by bitfield on April 18, 2023 | 167 comments



Since we’re adding new backronyms every day, I propose SIMPLE.

S - Spaghetti: weave a tapestry of code like a chef.

I - Interlinked: if the project has modules, they should all depend on each other (we are strongest when we can depend on one another).

M - Micromanaged: if the product owner doesn’t expect reports in the daily stand-up, do they even care?

P - Perplex: diversity for the codebase.

L - Lazy: Bill Gates once said “I choose a lazy person to do a hard job, because a lazy person will find an easy way to do it”, for example, without testing, collaborating with team members, or ensuring the feature works with anything else in the codebase.

E - Opinionated: because I believe E should stand for opinionated and everyone else will have to work around this with adapters. But E should mean Opinionated because Uncle Bab said so.


S - Spaghetti: Weave a tangled web of code, just like a master chef crafting a delicious pasta dish. Complex code is the key to intrigue and job security.

P - Precarious: Make sure the code is fragile, such that any minor change could lead to a cascade of issues. This keeps everyone on their toes and ensures that only the bravest dare to modify it.

A - Ambiguous: Write code that leaves others guessing about its purpose and functionality. Code should be a puzzle to solve, and the more obscure, the better.

G - Gratuitous: Don't be afraid to add unnecessary features and lines of code. After all, more code means more functionality, and who doesn't like more functionality?

H - Haphazard: Consistency is overrated. Embrace the chaos and develop without a plan, jumping from one idea to the next as inspiration strikes.

E - Entangled: Ensure that all components of the project are intricately connected, like a delicate lattice of spaghetti noodles. This guarantees that every change will have far-reaching consequences and keeps the team on high alert.

T - Time-consuming: Write code that takes a long time to understand, modify, and debug. The longer it takes, the more valuable it must be.

T - Tangled: Never refactor or simplify. The more convoluted the code, the more creative it appears, and the more impressed your colleagues will be.

I - Impenetrable: Write code that is difficult to test and verify, so that only the most determined and adventurous developers will dare to attempt it. This maintains a sense of exclusivity and mystique around your work.


> Write code that is difficult to test and verify, so that only the most determined and adventurous developers will dare to attempt it.

And then you create job security for yourself!

Nobody will dare go onto your turf, I mean, the beautiful junk you created. With all the complexity and smelly code in place, only you are "smart enough" to know how everything functions. Since people who love to code will be horrified, you will be the only leader, and maybe at some point the only expert and junk man, I mean, coder of this junk. Since you are now so integrated into how the system works, nobody can fire you. You can still advance in your career. Thanks to all the complexity and buzzwords you added to the pile of junk, you can demonstrate you know what you are talking about to not-so-good managers in tech, with words like microservices, Kubernetes, Kafka, Istio, Falco, Vault (add the latest flavor)... even though none of this has any real value for the current state of your projects, which are barely doing X events/s while the complexity of the system goes from O(n) to O(n^2).

But who cares? You have the most secure job, and your career is possibly great.


job for life


I'd like to extend it to SIMPLED:

D - Duplicated: it's always good to have backup code at hand.


I'd go with SSIMPLEDD - because it needs to Scale and more Duplication must be better?


Great! Let me propose SSIIMPLEDD:

I - Iteration: As we showed over the last 2 hours, iteration is always the way to go.


All you need to do is get Y in there and we'd have recursion covered as well.


SSIIMPLEDD + Y + keeping redundancy apart to avoid merge conflicts + spaces for readability

SIMP DED SILY


Don't miss the opportunity to add in some of those fancy SIMD instructions as well somehow!


Good old Uncle Bab! Love it. I have a borderline weird hatred towards the concept of "clean architecture".


Just watch his videos on "clean code" or the "Scribe's Oath". You can't make this shit up. He literally glorifies tribal rituals, calls devs "his tribe", etc... Such zealous simpletons are aplenty in our pop-culture-driven industry.


I wish I had more context for Uncle Bab.

The Báb (born ʿAlí Muḥammad; 20 October 1819 – 9 July 1850) was the messianic founder of Bábism, and one of the central figures of the Baháʼí Faith. He was a merchant from Shiraz in Qajar Iran who, in 1844 at the age of 25, claimed to be a messenger of God. He took the title Báb (/bɑːb/; Arabic: باب; meaning "Gate" or "Door"), a reference to the deputy of the Hidden Imam, while instigating a religious revolution that proposed the abrogation of Islamic laws and traditions, and the establishment of a new religion.[1] Though he was popular among the lower classes, he faced opposition from the orthodox clergy and government, which eventually executed him and thousands of his followers, known as Bábís.

https://en.wikipedia.org/wiki/B%C3%A1b



I think "clean code" is essentially a meaningless phrase. It's a nebulous label that sounds like a thing we all want our code to be. But, much like living a "good life", what it actually means varies from person to person.


Hmm, can we have this included in the next ChatGPT training round? That would be great..


I laughed.



DRY is probably my least favourite programming meme. There are far too many overzealous juniors who learned it and have a bee in their bonnet about creating absurd abstractions around any two lines of code (or config) that have vague or imagined similarities, locking in all sorts of annoying indirection.
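A hypothetical Go sketch of the pattern being described: two lines that merely look alike get welded into a helper (the name `formatThing` is made up for illustration) whose parameters encode everything that differed, adding indirection without capturing any real concept.

```go
package main

import "fmt"

// formatThing is a naive "DRY" helper: it exists only to avoid the
// visual repetition below, so every difference between the call sites
// leaks back in as a parameter.
func formatThing(label string, value int, suffix string) string {
	return fmt.Sprintf("%s: %d%s", label, value, suffix)
}

func main() {
	// Before: two plain, readable lines that merely look alike.
	fmt.Println(fmt.Sprintf("retries: %d", 3))
	fmt.Println(fmt.Sprintf("timeout: %dms", 500))

	// After: the same output, now routed through an abstraction
	// whose only purpose is to hide the repetition.
	fmt.Println(formatThing("retries", 3, ""))
	fmt.Println(formatThing("timeout", 500, "ms"))
}
```

The "after" version is no shorter, and any future difference between the two lines now has to be threaded through `formatThing`.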


My take is that all this WET/DRY wisdom makes people think about abstractions in the wrong way -- it makes us debate them in terms of when, rather than what and why.

In my mind, the point of abstraction is to transform models such that we can build solutions in a way that is a better fit for the problems at hand. Reducing LOC and repetition is explicitly not the goal; sometimes a good abstraction may actually result in more lines of code (though more often it's fewer). So reducing LOC and repetition is commonly a happy byproduct, rather than the reason we do it in the first place.

I see slightly more experienced programmers rebound from DRY to a point of having a pathological distaste for abstractions, and I find those codebases to be far more stressful to work in than ones which just happen to have a few bad abstractions.


I agree with rather having good or correct abstractions than saving LOC. However, a bad abstraction can give readers/devs the wrong idea about what is behind it, how it works or how it can be used and when to rely on it. Unreliable abstractions are terrible. Leaky abstractions are also terrible. Both introduce a lot of mental load overhead.


Some people choose “no abstraction” over “poor abstraction,” which I don’t understand. Working without any abstraction is the same as working with an infinitely leaky abstraction.


Yeah. A pervasive poor abstraction can be painful, but in my own experience most misguided abstractions tend to be somewhat superficial and easy to unwind. It does worry me to see so much rhetoric in the wider programming community encouraging people to avoid abstractions, because this is the kind of lesson that is easy to imprint on newcomers -- but I suspect the easiest way to get good at them, is to do them badly a few times.


Yes, that happens, but I've also had to take over a large iOS app written by a self-taught programmer who never learned DRY. They didn't factor out any shared code whatsoever. Any time they needed to do something that had already been done, they just bounced over to the other controller and copied the code over.

Of course, the copied code was often buggy, and they would only fix the instance in the controller where the bug was reported. When bugs were fixed it was pretty much random chance which version would be chosen to copy when needed again, so creating the right abstractions involved piecing together an entire phylogenetic tree of the code to decide which pieces were supposed to be the same and which were legitimately different.

Add a commit history that looked like "v1", "v2", "v3", and... yeah. We finally decided it would be cheaper to scrap the app and rewrite it from scratch.

So, yes, premature abstraction is a huge problem, but so is the opposite extreme, and after that experience I personally would choose overly-indirect code over the mess I inherited. It's easier to untangle a function that has too many callers than it is to find code that should have been shared after months of divergent evolution.


For me, DRY is one of the most important principles for maintainable software. Imagine having to fix a bug in two independent places, or having to add or change the same conceptual functionality in two different places (which you may not even be aware of, because you don't remember that you once copied the code somewhere else). Totally unnecessary bugs guaranteed, and the code is ready to be thrown away after some time, because a change will be harder than a rewrite.

It's painful enough when the effort to avoid it is too big (e.g. JS frontend, Rust backend).

Juniors will learn the difference between actual and imagined similarities at some point.


DRY should really be "DRA" - don't repeat abstractions (which admittedly isn't as catchy).

The problem with DRY is when abstractions with currently identical implementations are given a single interface, even though they're logically distinct.

Then, when those distinct abstractions' implementations need to diverge, you've got a rat's nest of references to manually pick through and separate out.

Or worse yet, the mistakenly-shared interface becomes parameterized, leading to a horrible mixing of requirements and concepts that may never be untangled if the original intent is lost to time.

Obviously DRY is great when you can consolidate multiple implementations that are actually a single abstraction, but it can really go off the rails if the motivation for the mantra isn't understood.
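A minimal sketch of the "logically distinct but currently identical" trap, with made-up business rules:

```go
package main

import "fmt"

// Two rules that happen to share an implementation today, but are
// logically distinct concepts.

// qualifiesForDiscount is a pricing concept.
func qualifiesForDiscount(total int) bool { return total >= 100 }

// qualifiesForFreeShipping is a logistics concept. The body is identical
// today, but merging the two into one function couples pricing to
// shipping: the day one threshold changes, every call site of the shared
// function must be re-audited to decide which concept it actually meant.
func qualifiesForFreeShipping(total int) bool { return total >= 100 }

func main() {
	fmt.Println(qualifiesForDiscount(120), qualifiesForFreeShipping(80))
}
```

Keeping two names for two concepts costs one duplicated line; merging them costs an audit later.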


Premature dryness is the real issue, not the principle itself. When you catch yourself fixing the same code in multiple methods again, it's a good time to stop and refactor.


It is fundamental but it's in constant tension with other principles - e.g. cohesion and coupling.

I'm pretty sure all of the most fundamental principles of good code are in tension with each other.


Funny enough, this happened to me yesterday. I had to make a small change in a component, and when I pushed the code I got an error back. After searching for what was happening, it turned out to be code I had copied and named "Component2" two years ago when I was starting out. Today I would not have done it the way I did, but in my first month...

And thank god for github actions! :D


True, but then you also get times when somebody who should know better implements an "in current financial year w/ special cases" function 3 times.

Then your tester says it isn't correct in 2/3 places, you find out you've only updated the function in 1 place, and you either abstract the other 2 cases out or update it in those 2 places as well.


The worst part of DRY is developers worrying about the number of characters in a source file rather than readability, as if the compiler had a hard time reading those extra few kBs.


Totally agreed. I've tried to explain exactly this issue in a talk [1], trying to find a useful synthesis between different paradigms like KISS, DRY and YAGNI.

[1] https://www.youtube.com/watch?v=0FIZn2trkoA&pp=ygUXbWljaGVsI...


The trick is to understand these paradigms: why they came to be, what they want to achieve, and, just as importantly, when not to apply them.

Then program with the things you have learned in mind, instead of blindly following some "paradigm rules", 'cause that never ends well.


I made a DRY/YAGNI yin/yang sticker for my laptop a little while ago.

https://i.redd.it/16qm2a3m7m991.jpg


I asked everyone's favorite chat AI to come up with a new buzzword using the letters from those.

The best was KINDergartener (uses all the letters). I would describe this as a programming paradigm for people who are new to programming and real work, and haven't found out that getting shit done and shipping the code to users is the most important state for code to be in.


For best results, set your DRY dial to 8 or 9 out of 10. Settings higher than 9 result in premature or excessive abstraction.

But if you hate DRY so much, imagine a world where people set it to 1 or 2. I wouldn't want to work in that code base.


Whatever you do, never turn it up to 11.


But what if you just need that extra bit of dryness and you're already at 10?


heck, sometimes even 7 is better than 9


Often the best way to DRY something is to let it sit in the sun a while where everyone can see it.


Seems whenever someone popularizes a rule of thumb, expressed in a catchy way, some newbies will take it as inviolable absolute law that everything else must be bent around, at any cost.


But remember you have to get WET (write everything thrice) before you get DRY.

I apologize for my dry humor.


I put it second behind "premature optimisation" which is used to justify never caring about performance at all (until it's too late and you can't do anything about it).


Actual premature optimisation is harmful for the same reason as premature abstraction. In both cases, improvements that come "for free" should be done without thinking, but once they start to impact readability, we should consider how beneficial those changes really are.


If you don't know what you are building, any action is premature.


My take is the opposite. DRY can be overused, but it is much easier to "undry" than it is to consolidate duplicate code which should have been DRY from the beginning. Sections of duplicate code tend to drift apart over time, making it more difficult to consolidate down the road.


DRY only paired with the rule of three (or more)


That, or the code block is sufficiently big and similar.

I remember an ex-colleague of mine who copied a 1000-line function, changed a parameter, and didn't see anything wrong with it.


I had a coworker who did similar. It was essentially the entire frontend for an order application (back in the jQuery days), a few thousand lines. That was all copy/pasted, changed to deal with the case of when there are no orders (which only ever happens once since this was an in-house system) and those massive chunks were wrapped in an if/else.


I recently had a junior copy paste a 20 line code block rather than add an extra 'or' to the if statement.

Hey, rule of 3 ;)

Anyways, guidelines are just guidelines; regardless of what you come up with, you can break it.


Why do you have a 1000 line function in the first place? Sounds like an entire class of methods in a single structure.


The function contained 300 lines which he copied and pasted twice, because DRY and extracting into a function right away is dumb /s


Created by the same person. But given the way he doubled his LoC contribution in this PR, you can imagine why.


I’m not swayed by this rule of three. The experience I have is the reason I reach for a hammer immediately, instead of first trying three rocks to demonstrate that’s still a bad idea.


When I was younger, I really thought this was the be-all and end-all, and probably committed more programmatic sins trying to achieve DRYness than anything else.

Oftentimes it seems like we aim for DRY at the expense of simple, or idiomatic, code. It also has a nasty habit of making code difficult to change, since it leads to a lot of premature coupling, where someone sees a repetition and naively assumes that it needs to be eliminated, when it might literally be a naturally independent value.


It's definitely something that people overdo. But sometimes, especially with things like Terraform, you have to struggle against the DSL to avoid having thousands of lines of repetitive code to do what could be done in 100. I'll go quite far to avoid the situation I saw at a previous company, where they had 36k lines of trash to run a few data centers. Deploying a new data center was next to impossible.


DRY is priceless, and if your argument is to avoid DRY in very specific cases, state that specifically.

DRY, when used with functional, well-named code, is the number one thing keeping a codebase easy to read. There should be one way to achieve something well-defined like saving/editing/deleting an entity, checking a file type, URL-encoding a string, etc.

This ALWAYS leads to much easier code fixes AND much easier refactoring.


It is laughable to talk about the "Don't Repeat Yourself principle" in an article strongly biased towards Golang. A language that took 13 years to add a simple function (Index) that helps the programmer find an element in an array (slice). Before that you had to write a for loop every time you wanted to find something in an array. Talk about "DRY"... That is anything but "simple" as described in the article.


An amusing aspect of DRY is that the moment you think "great, with that last patch I'm no longer repeating myself," you now have an opportunity to look around and really just take in how much repetition happens everywhere, at multiple levels.

You are still repeating yourself, and will likely continue to repeat yourself, forever.

(Phew, look at that word "everywhere", it should just be "vrywh")


Great example of how refactoring is a neat way to introduce bugs and break features


Huh?

> But there's nothing wrong with repetition in itself. I say again, there's nothing wrong with repetition in itself: a task we do many times is probably an important one. And if we find ourselves creating new abstractions to no purpose other than avoiding repetition, then we've gone wrong somewhere. We're making the program more complex, not more simple.


Literally why I stopped writing Go. I was going crazy repeating myself so much. If err != nil also never felt nice to use.
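The repetition being complained about looks roughly like this (`sumStrings` is a made-up example):

```go
package main

import (
	"fmt"
	"strconv"
)

// sumStrings parses two integers and adds them. Every fallible call is
// followed by its own `if err != nil` block, which is the boilerplate
// the comment refers to.
func sumStrings(a, b string) (int, error) {
	x, err := strconv.Atoi(a)
	if err != nil {
		return 0, fmt.Errorf("parsing a: %w", err)
	}
	y, err := strconv.Atoi(b)
	if err != nil {
		return 0, fmt.Errorf("parsing b: %w", err)
	}
	return x + y, nil
}

func main() {
	n, err := sumStrings("2", "40")
	if err != nil {
		panic(err)
	}
	fmt.Println(n)
}
```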


> Before that you had to write a for loop every time you wanted something from an array.

How often do you do that? I mean, it comes up - but if you're linearly scanning every time you want to select something from a list, that sounds like a surefire way to write slow code to me. Is there a reason you aren't using a map?


Linearly scanning a small array is very likely to be faster than looking up a key in a map. Especially if we factor in memory cost and not just speed.
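Both lookups sketched side by side (hypothetical data; which one is actually faster depends on size, hashing cost, and cache behavior, so this makes no benchmark claims):

```go
package main

import "fmt"

// containsScan is the linear-scan version of a membership check:
// contiguous memory, no hashing, often fine for a handful of items.
func containsScan(s []string, v string) bool {
	for _, x := range s {
		if x == v {
			return true
		}
	}
	return false
}

func main() {
	colors := []string{"red", "green", "blue"}
	fmt.Println(containsScan(colors, "green"))

	// Map lookup: O(1) on average, and it stays fast if the collection
	// grows, at the cost of hashing and extra memory per entry.
	set := map[string]bool{"red": true, "green": true, "blue": true}
	fmt.Println(set["green"])
}
```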


Maybe. That depends on the map implementation. In JavaScript I'm pretty sure small maps are implemented as lists anyway. I wouldn't be surprised if Go is the same.

Searching through a list is definitely more complex to write, and it carries the danger that your list will grow and you won't change your code.

The speed difference will only matter at scale - so if you have a lot of small lists with items you’re searching for. That happens, but it’s uncommon that it’s the best approach. I probably use find() / indexOf() about once every thousand lines or so.

The commenter above implied it’s a very common operation in their code. (So much that they’re angry about Go not having it in the standard library). I don’t know about the commenter above, but I’ve certainly seen a lot of novices at programming massively overuse lists not because they’re performance experts, but because they don’t yet understand when a map might be a better choice.

So I must say I understand Go's choice here. Go is a paternalistic language where the obvious choice should usually be the right choice. Go is actively against clever optimizations philosophically. I can imagine Rob Pike being quite pleased that slow, linear scans of lists are awkward in his language. This sort of judgemental paper cut is sort of Go's whole thing.

If you don’t like being looked down on by the compiler, use a different language. Or use maps in your Go code. Go isn’t designed to be microoptimization friendly.

(Source: I sat about 2m away from Rob Pike for nearly a year while he worked on Go, before it hit 1.0. Go isn’t designed to be a language for people who think about cache lines.)


At the sizes/numbers of items where this holds true, maybe the choice of data structure is not as important. It becomes important once the number of items in that collection increases, and then always linearly scanning the whole array will become a problem. Just use the appropriate data structure and be safe in the future, taking a negligible hit for small input sizes.


Now I understand why the modern web is slow as hell.


Then you probably misunderstand just how much work a CPU can actually do in the time it takes to read a new cache line from main memory.


Now you miss the point that you need to read that list from RAM in order to do the for loop too.


Go does actually embody the DRY principle (Do Repeat Yourself).


I prefer calling it WET (We Enjoy Typing)


These days Copilot tremendously helps me with that kind of repetitive code.


This is my experience as well. GitHub Copilot has been very efficient at generating variants of boilerplate code. It allows you to efficiently repeat code so you can delay abstraction or generalisation until you are confident a generalisation makes sense.

Having worked as a developer for many years, I now much prefer repeating code a bit to making the wrong abstractions too early. The tedium of writing out code is now reduced thanks to well-performing language models.


reading multiple lines of what would amount to a single scan/find/map/reduce is still a pain in the ass though


A summary.

Correctness: the primary objective. Code that does what it's supposed to, bug free.

Readable: strive to write code that will be understandable by others (and your future self). Review what you wrote for clarity and be your own critic.

Idiomatic: Write code the way people in your community expect to read it. Don't surprise them with your own quirky and clever conventions when there are well-established and perfectly acceptable ones already.

Simple: Your code should do what it says it does directly, with no funky side-effects. Repetition is not always a sin, don't DRY things up just for the sake of it. Don't do too many things. Be parsimonious.

Performant: be aware of RAM (a bit of a letdown, was expecting "code to data structures" advice).

---

Author concludes with a few interesting take-away points. Some highlights (paraphrasing):

- don't trust foolproof software engineering methodologies sold using a neat backronym.

- words like "clean", "simple", and "readable" are as vague as "freedom", "justice", or "equality".

- neat slogans like "don't repeat yourself" or "clean code" are moot.

- correctness above all else.

- tests only prove that the code can pass the test.

- No info is better than bad info. No tests is better than bad tests.

- all software has two types of bugs: those you've found and those you have yet to find.

- we all want readable code, but we have different definitions of "readability".

...


This essay is great. Regarding simplicity, a couple of amusing observations: "I apologize for such a long letter - I didn't have time to write a short one." - Mark Twain (a variation of the French original by Blaise Pascal)

Similarly for writing software: First, get it right. Then, as with writing in general, rewrite ruthlessly until it is clean, beautiful, and simple.

Simplicity might be "defined" the way Justice Stewart (non-)defined pornography: "I can't give you a definition of it, but I know it when I see it."

There is the well-known programming experience of struggling with a crufty, complex piece of code. Then a new idea pops up, the code gets rewritten in a flash, and two thirds of it disappear. The result is obvious and inevitable.

Pair programming can really help here: As the author, you develop internal state that leads to a form of myopia about your code. A pairing partner can look at the code with fresh eyes and say, "WTF is going on here?" Can we rename that method or class or file to give the reader a clue?

It also reminds me of Paul Erdos' idea of "The Book", where God keeps all of the simplest and most beautiful proofs. As Erdos said, "You don't need to believe in God, but you should definitely believe in The Book!" His (and Selberg's) elementary proof of the prime number theorem, developed 60 years after the original proof, would be a great example of this.


Reminds me of the quote by Kent Beck (allegedly): "Make it work. Make it right. Make it fast".

First throw something together that works, then refactor it to make it "right" (clean, readable etc). Finally optimize it if it's necessary.

Like the author of this article describes it "Write more bad code. [..] Get answers. Learn as much as you can about the problem you’re trying to solve [..] it’s much faster in the long term to write bad code then make it good, than to try to write good code in the first place" https://medium.com/swlh/coding-faster-make-it-work-then-make...


I like to read these kinds of prospective acronyms, because sometimes they provide new perspectives or leverage points that other, existing acronyms don't. And they are easier to remember than, say, an essay.

DRY, for example, is instantly transferable to a bunch of other life practices and disciplines; people from dry cleaning to graphic design will hear about it and go "oh my god, I'm not really DRY but I totally could be" so it was really neat to discover.

In the case of CRISP, the "Correct" criterion somehow seems far less tractable to me than the other terms, and even less tractable than "Clean" for some reason.

To me it reads like a hint at subjective, self-contained logic. That's great, insofar it's instrumental to how code either works or doesn't.

But "Correct" is also kind of getting negative connotations these days, for a lot of reasons.

(Imagine also, receiving "your code could be more DRY" feedback, vs. "your code could be more CRISP," wherein you look up the latter and think, "oh right, my code could be more _correct_!")

And then the author even takes the argument in the ad-infinitum direction by referring to e.g. what _else_ isn't correct here? My tests? My purpose? My gut biome? (Ok not the last one). But there's a reason why Correctness is a thing in science, and a big part of that is scope constraint.

Maybe "Cogent" is more fitting in such a case? It has less of an absolutist ring to it. It expresses a bar to measure up to, with more of a qualitative, less-checkboxy feel.

I would also guess that a word like "Contractual" or "Compliant" would provide more leverage toward the same outcome. But those are already used around code in other ways, I guess...

Anyway, it's interesting to think about, because these little acronyms can really help when applied, if they reveal some traction that's been missing.

(This also made me wonder...why is one of the world's most popular crispy rice chocolate bars called a Crunch bar, and not a CRISP bar. Hmm)


> how code either works or doesn't.

That's all "correct" is trying to mean here. As a synonym for "working". Can you share a bit more on the negative connotations of "correct"? Nothing comes to mind for me.


That's all? Good argument for RISP then.

I have to ask, what's the point of telling coders--presumably experienced ones if they are familiar with the concept of clean code--to try to get their code to work?

"Have you tried getting it working?"

(I really think the author must mean more than that, given the segue into testing one's tests? Why do that, if the thing is working? Or does one assume at all times, forever, that it's broken? That's not really mentioned)

So IMO the energy that goes into justifying the word probably isn't worth the carnival ride to the exciting world of status quo, so to speak.

"Correct" as a word has also had quite a social shadow side ever since Political Correctness became a thing at the very least. What it amounts to is de facto social resistance to the topic of whether one's creation is correct.

These days it's even more encumbered by the ongoing reconciliation dialog regarding "incorrect or just different?" E.g. divergence from pack or narrative in various ways being not so bad a thing by default anymore, especially insofar as the resulting support structure makes new room for new ideas that move past the worst parts of who we all used to be.

But that's just one aspect; telling a coder to check that their code is correct is also an awkward way to communicate on its face, especially if that's the first answer to "how can this model help me to be a better coder" for example. It's going to drop the author into literal mansplain territory for huge swaths of audience. It's like first principles, for people who don't know what code is meant to accomplish.

Just some ideas though.

So, where's the value of having this word in there? Does the model really need the word at all? If so, why?


> to try to get their code to work?

Because some people, even experienced coders, can get caught up caring about performance first or elegance first instead of getting something working correctly first.

> "Correct" as a word has also had quite a social shadow side ever since Political Correctness became a thing at the very least.

Interesting, I've never noticed that. Going back to the '90s, I can't recall a time where "correct" has carried any kind of extra baggage like that. We must have had some very different experiences.

>These days it's even more encumbered by the ongoing reconciliation...

Okay, you've actually lost me, I'm not really sure what it is you're trying to say here or what it has to do with programming or software development.

>It's like first principles

Anything reduced to an acronym in a context like this is going to be about fundamentals, not some nuanced deep dive. They're rules of thumb, not a comprehensive guide.

>how can this model help me to be a better coder

Maybe it can't help you, and that's okay if others find it useful.

>Does the model really need the word at all? If so, why?

If it had no value here, that would mean people write working code by default. The proven value of having tests (specific approaches aside) to demonstrate the code is working means this is not the case. But there's a spectrum of effort to be had here: I can either not really think about it and hope things work out, because I'm experienced and how dare you imply the code I wrote isn't perfect, or I can spend effort thinking critically about my work, the code, the task, the desired outcome as written, and the desired outcome as intended. Having "correct" here is saying you should err on the side of the latter rather than the former.

It's not an attack or a slight, it's just some advice. Advice isn't universal, and where you are in your journey as a programmer may mean this advice isn't valuable to you right now, but that doesn't mean it's fundamentally bereft of value.


> Who wants to write dirty code, unless maybe it's for a porn site?

Even porn sites need to write great code to handle huge traffic. I am offended by this


It's a bit of an awkward protrusion to thrust into an article that's aimed at general tech audiences.

And I did wonder, reading that: IS there actually a dirty coding language for porn sites?

Certainly the simplest of erections in that area could arouse some latent skill. Contributions would likely be furious.

Though some annoying friction is perhaps practically guaranteed, depending on which executive discovers, in which way, which particular feature set of their corporate software project was conceived...


Now I want to create this language, thanks!


Would love to see it. May my freshly earned downvotes be good for something, this is the world I like to be a part of.

It's interesting how people love to hate on plain old passion, with passion... this is quite an irrational kink really, though the bar isn't generally set very high.


I did a stint for a website y'all know very well and believe me, the amount of optimizations we had to do were on par with Netflix. That codebase was much more well organized and sane than any Fortune 500 project I've seen in a long time.

Some days I used to wonder if I could get a book deal off of this but that's thinking too big xd.


If that's meant seriously rather than as a joke: I think you're misunderstanding. They aren't saying "porn sites don't care about code quality", they're making a pun on the word "dirty".


What CRISP is supposed to mean:

Correct

Readable

Idiomatic

Simple

Performant

Notice that all this is just well-wishing and suitably vague, so that you feel the need to buy a book & hire a consultant to explain it to you. The trick is there's nothing to explain.


It makes sense if you read the article.

Correct: the primary objective. Code that does what it's supposed to, bug free.

Readable: strive to write code that will be understandable by others (and your future self). Review what you wrote and be your own critic when it comes to clarity.

Idiomatic: Write code the way people in your community expect to read it. Don't surprise them with your own quirky and clever conventions when there are well-established and perfectly acceptable ones already.

Simple: Your code should do what it says it does directly, with no funky side-effects. Repetition is not always a sin, don't DRY things up just for the sake of it. Don't do too many things. Be parsimonious.

Performant: be aware of RAM (a bit of a letdown).

The last point was disappointing, as I was expecting something about coding to data structures when it matters. Instead the author went on about being aware of memory.


This is kind of how I felt about the S in single responsibility of SOLID too. I wonder if there are any principles out there which are actually objective/measurable.


The only objective rule in SOLID is L:

"Objects of a superclass should be replaceable with objects of its subclasses without breaking the application."

And when you decipher it, it literally means "all X should act like X", which is already obvious. I.e. if "Y extends X" then Y is also X, so Y should act like X.

The other rules are so vague and subjective, that you can spin them any way around to prove someone's code sucks if you choose so. Which is ideal for selling people books & consultation services. And for Internet arguments.


When possible, I prefer to do away with objects and have functions with types. X in, Y out.


A style I like as well, but methods are exactly that as well, only that `X` is implied, and there's a danger of abusing state: `f(X) -> Y` can be (and often is) pure, whereas the entire point of methods is to be impure, i.e. to modify instance state.


> methods are exactly that as well, only that

Exactly, except for implicit types and modifying state, indeed. Those are the reasons why I prefer the above.


Start with simple. Don't involve objects and classes and whatever, until there is a need for them and you can justify them. Don't use them because OOP. If your program or part of a program is fundamentally just a complex function, then write it as function(s).


There are plenty of code metrics. I'm not aware of any that I would recommend, though (other than basic ones like keep function LoC <= 50, keep indentation level <= 3). But if you're interested, you can research this topic. I saw tools for Java which measure those code metrics.


> well-wishing and suitably vague

That way, no matter what you do, they can say you did it wrong, and no matter what they do, they can say it's right.


Well, since we're in a fancy acronym thread, here are some for your attention:

Beneath Utterly Ludicrous Labels, Salespeople Hawk Ineffective Tricks.

Bogus Utterances (with) Little Legitimacy, Sold (to) Helpless IT Teams.

Business Units Love Lingo, Selling Hyped IT Truisms.

Credit: ChatGPT.


Very often it's the design pattern people that make code unreadable.


So true. Sometimes, you read some source and you think "surely, there was a simpler way to implement whatever this is". Then you notice that the class name ends with "Visitor". Mystery solved. Now go make yourself a coffee, a pattern has been deployed.


There's a good phrase to use in code reviews...

"This code is Design Pattern unreadable. Please redo simpler."


Introducing the new coding acronym, CHAOTIC!

C - Confusing: Write code that is hard to understand, ensuring that only you can maintain it, thus securing your position in the company.

H - Hodgepodge: Mix programming paradigms, libraries, and styles to create a unique blend of code that reflects your artistic side.

A - Arbitrary: Make decisions about architecture, data structures, and algorithms without any specific reasoning, allowing for a more spontaneous and whimsical development process.

O - Obfuscation: Deliberately make your code difficult to read and comprehend by using cryptic variable names, nested loops, and a lack of comments.

T - Tinkering: Constantly make small, undocumented changes to the codebase to keep your teammates on their toes and to give yourself an excuse to fix "bugs" that you secretly introduced.

I - Inconsistent: Use different naming conventions, indentation styles, and language features throughout your code to keep things interesting and unpredictable.

C - Cluttered: Avoid modularization and organization, and instead cram as much functionality into a single file as possible. This will create an exciting challenge for anyone attempting to decipher your masterpiece.

Embrace the CHAOTIC way and create a memorable, one-of-a-kind codebase that will leave a lasting impression on your colleagues and ensure your job security!


Uh, another acronym to be tortured by in interviews.


Yeah, I cringed and I think I chipped a tooth just by reading the headline. I'm so sick of these "let me sell you a book & some consultation" BS constructs.


The first take-away of the article is:

> Anyone who has a foolproof software engineering methodology to sell you is a charlatan, especially if the methodology has a neat backronym.

Some of the others (there's more take-away than content, perhaps) are also noteworthy, except that I don't believe in correctness. Some functions may be well tested, but most must be assumed partially faulty. It just isn't feasible to make them all correct. The article is jumping between these two stances, so this is not a great take-away.


What an excellent post! What I would add, though, is that you can gain some simplicity by removing abstractions (e.g. interfaces), and sometimes it's just worth it. For example, a good practice may be to abstract away all implementations and use interfaces instead, but for things like a logger (which may be a bit complex to abstract depending on its API), you may be just fine logging to some buffer instead of maintaining mocks (and in Go you'll probably make expectations about what they should be called with). Focus on things that matter: if you want to check logs in your tests too, make it as effortless as it can be (e.g. use snapshots for those).


Don’t write clean code, write the best way to express your idea without caring too much about how dirty that looks like, and let the AI chat bot rewrite it in the cleanest version that follows all the best practices!

I’m not even joking, that’s basically my process since ChatGPT-4 has been released :) That’s a dream for boilerplate heavy languages like Go, C#, and C++.


Completely agree, except that simplicity is kind of redundant. If the code is correct and readable, I don't see why I need to simplify it any further; to me it's probably simple enough if the complexity isn't causing problems. It seems like engineers, and humans in general, really love simplicity for its own sake, in some deep way that makes me wonder if I'm secretly a bad person, since it seems to so often be praised as connected to so many other virtues, and most every philosopher ever seems to love it.

I'd much rather have DRY than simplicity, because repetition creates the possibility for later divergence and incompatibility and stuff left behind. What if there's a bug in the thing you repeated and only 9 out of 10 get fixed because of an oversight?


In fact there are only a few things that matter:

- code, when executed, should do what is expected.

- code should be maintainable.

All the rest are just some ways to get there, but not must-haves.

One great way to get there is:

- discuss architecture, nomenclature and design choices. Document those.

- have your code and docs reviewed and discuss/rework accordingly.

The above in a rinse and repeat manner.


So, which of these criteria were contrived, and which were left out, all to make this fit into “CRISP”?


SOLID


Don't follow the concrete practices in the best-selling book* vetted by industry leaders for years, instead follow this set of 5 high level goals that I've generally described and grouped together into a catchy acronym. Yay me!

* - Granted the book was written by an asshole.


Gentle reminder that industry leaders can be wrong too, and domain-specific knowledge, even if gained from experience, may not necessarily be a universal truth, especially on platforms that follow different patterns or paradigms. Programmers really should be more discerning about the people they look up to.


I've worked with people following the book to the letter and the quality of their work would vastly improve if they took the time to understand the catchy acronym. The author is not the only problem here.


> Don't follow the concrete practices in the best-selling book*

Argument by sales number. Good start.

> vetted by industry leaders for years

Self-appointed industry leaders, sure. They could by no means be wrong, right?

> instead follow this set of 5 high level goals that I've generally described and grouped together into a catchy acronym

Sounds like Uncle Bob's SOLID. If that's the case then I agree with you.


Could you elaborate on the last point?


Uncle Bob has been widely criticized for his political stances, basically being a MAGA republican.

This also spilled into some critiques of Clean Code which are sometimes colored by this.


Eh, I don’t even know anything about his politics but I read his Clean Architecture and I saw immediately what a hot pile of garbage it is. The book is full of platitudes, the author is so conceited that he cites his other works to justify his statements, and he’s far more concerned about being immortalized or revered in the industry for his coinages of principles, than by coming up with a mechanistic model of what makes a good computer program and ways of measuring that quality.


Sure, but being a bad programmer does not make you an "asshole", which is what the original commenter asked about.


Uncle Bob has been widely criticized for his political stances, basically being a MAGA republican.

I don't think that's a fair representation of his critics. I've been reading (and even occasionally writing) critiques of Uncle Bob for many years, and you are literally the first person I've ever seen even mentioning is political stance. Up until 30 seconds ago I'd not once reflected over who Uncle Bob might vote for.


All non-political critiques which I have seen (here and elsewhere) always seem to turn out to be based on misunderstandings or exaggerations of what he actually has said or written (sometimes wildly so). Or, as it sometimes turns out, people hate him not for anything he has said, but because of how other people have misunderstood and misinterpreted him.

From what I have seen, every time he has been criticized sincerely, and he has become aware of it, he has engaged his critics in open debate, and they have come to amicable results, with him sometimes altering his views.


But were those critiques about him being a horrible person?

I disagree with some of his technical ideas, but that in itself would not warrant such a characterization.


I have honestly never heard (or made) a critique of him as a person. In fact the little I have heard about him as a person has generally been positive.


ok but the original comment called him an "asshole", which is about the person, not his technical ideas.

I can dig up some instances of people being angry at him, but honestly, he's a 70+ Republican; you can easily imagine the situations he gets into.



> performance genuinely matters. Not as many as you might think, though: computers are pretty fast these days, and since most of what they do is in the service of us humans

what's wrong with this false assumption?


Death by 100 poorly chosen algorithms?


Code should be CODE: "Clean, Optimized, Documented, Efficient".


Just curious. How is optimized different from efficient


It's a bad word choice, but fitting it into an acronym is more important than giving good advice.


Probably he was just fishing for adjectives, but generally, optimization is just the minimization of some function. Efficiency could be the minimization of a function describing resource usage.

In the real world, in real time systems one often optimizes for low jitter, which is often not contradictory to optimizing for efficiency, as one may reserve resources to keep them available in order to process requests faster, even though they're not required.


"The problem, of course, is that few of us can agree on what "clean code" means, and how to get there. A rule like "methods should only do one thing" looks great on a T-shirt, but it's not so easy to apply in practice. What counts as "one thing"?"

I don't agree with this statement at all. From my experience this is perfectly possible. Maybe I'm misunderstanding the statement... why would it be hard to write methods that only do one thing?


Because it's an artificial constraint that makes code worse. You end up with a whole bunch of functions that have only a single call-site and half a dozen parameters that don't make much sense. If you can only understand what a function does by looking at the call site then the function is no longer a self-contained piece of functionality and it shouldn't exist.

When you write very simple code you can have short functions that do one thing. When you work on more complex projects some functions will just be 300 lines long and breaking them up will just make the code harder to understand and harder to work with.

Take sqlite for instance: https://github.com/smparkes/sqlite/blob/master/src/vdbeaux.c

You'll find plenty of cases where functions do multiple things in sequence and those functions are long-ish because of it, and some "clean code" type programmers would feel compelled to refactor the code and make it way worse.


"If you can only understand what a function does by looking at the call site then the function is no longer a self-contained piece of functionality"

Wow, this is a solid guideline. Alright perhaps "SOLID" isn't the best adjective to use, but it's great advice :)

I find this in line with John Ousterhout's "Philosophy of Software Design", where there's a guideline saying that modules (classes/functions/components/etc) should be deep and interfaces simple. Instead of dividing methods/classes due to their size in lines, you should be dividing where interfaces can be simpler. Because a complex interface imposes a lot of complexity in the consumers of the module.


His book was pretty good, and I very much agree about the importance of good interfaces. It's the essence of computing, because file formats, data types, and protocols are just interfaces by another name.


why would it be hard to write methods that only do one thing?

Two big problems with this approach. First of all, reasonable people can disagree on what "one thing" actually means. Let's say you want to take a CSV file of numbers and return a numeric array-of-arrays. How many 'things' is that, 1 or 4 (read, parse, validate, convert)?

Secondly, it is many times both computationally more efficient and more 'aesthetic' to do everything in one in-line sweep rather than:

  x=f(x)
  x=g(x)
  x=h(x)
  ...
While both approaches can be taken to extremes, I generally agree with what John Carmack wrote on the topic many years ago http://number-none.com/blow/blog/programming/2014/09/26/carm...


The point (which I don't 100% agree with, but can see) is that what it means to "only do one thing" is sometimes debatable.

Say I write a 1000-line method that implements a red-black tree data structure, including returning closures that allow you to search for, add, or remove nodes. I could claim that this method does precisely one thing: it implements a red-black tree.

Or, say I write a ten-line method that takes a list of names and returns the unique names ordered alphabetically. Someone could complain that the method does too much, because it both finds unique names and sorts them alphabetically.


Does the `main()` function in GCC only do one thing? It compiles a program, that's one thing right? Alright, have fun writing an entire compiler in one function.

Besides, Bob Martin even mentions this in his book "Clean Code". Page 35:

> The problem with this statement is that it's hard to know what "one thing" is...

He finally concludes that a function is only doing one thing if:

> you can extract another function from it with a name that's not merely a restatement of its implementation.

Which is just dumb. Let's call our function `compileCProgram()`; well, according to the second heuristic, this is only doing one thing. Anyway, if even the guy who wrote the book about clean code admits that it's hard to figure out what "one thing" is, I'm inclined to say you may be slightly disingenuous here.


It seems like the author could be looking at your method at the statement level, for example. What does that if/else do? One thing or two? What does this say about the method?

But as I commented elsewhere, there's not a lot of attention to topical scope in the article either, so it also sets out to do the metaphorical "one thing", and then somehow merges with the broader philosophical world...


“One thing” can mean different things depending on the level of abstraction.

If you don’t understand that, I can understand how the rule could seem confusing.


I can't stand the insistence on "idiomatic". Taken to extreme it is essentially gate-keeping and dogma: "We have always done it so."


I think it has value for code to be idiomatic. It's a very dynamic goalpost for sure, which is a problem in a rule but not in a principle like in the article.

To me, code that is not idiomatic sounds wrong, which is at least a cognitive hurdle (you need to spend more thinking power to unravel/decode it to figure out the meaning). I believe this is in line with the usage regarding natural (human) languages: a sentence might be syntactically correct, but if it's non-idiomatic it sounds wrong and throws the listeners for a loop understanding it.

A very simple example is that in C, I greatly favor allocations to be written according to:

    P = malloc(N * sizeof *P)
where P is the pointer variable receiving the return value, and N is an expression yielding the number of desired elements. The sizeof is completely idiomatic, always present unless I'm allocating an actual byte buffer, and it always dereferences P like that to lock the size to the receiving pointer. So this is my idiom for how an allocation is written. It might (should!) be yours too, but that's not a requirement in a personal principle, I guess.


Ah but idioms change as languages evolve. To me idiomatic is ‘the way it’s done now’ rather than ‘the way it’s always been done’


If you could just write good code that would be fine.


Make it work, make it simple to understand for people with no context, make it robust to edge cases and bad inputs.


Ok.

* starts writing CRISP code *


Pretty decent article. People on this site fucking love getting riled up about stuff.


The "simple" part is so hard to understand, even for people with high seniority. Once, asking for simplicity, I got back a mass of low-effort code that seemed written out of spite, even if that was not the original intent.


I like John. What I don't like is dogma. There's a lot in HR. Also hype.


This article was way too long for what it said. Not very crisp, one might say.


Don't write XYZ code, write code that works and that you can maintain.


Considering most software already fails on the first aspect, do the other four points really matter at all?


Any manager would tell you, don't develop clean or CRISP, develop by the deadline.


> So, while readability isn't quite as important as correctness, it's more important than anything else.

In most cases, readability is more important than correctness. If you have readable code, you can always make it work correctly later. But if you have code that merely works correctly, it isn't necessarily easy to back and make it more readable.


Don’t write CRISP, write PERFECT.

Plain, exciting, refreshable, fast, entertaining, code (with) tabs.


Is this another acronym I'm going to have to learn for interviews?


I'm feeling more crunchy today, I will try crisp coding tomorrow


S - Simple

It's never simple, because people just can't agree on what simple means, even experienced programmers. This isn't even a question of training: it's deeply embedded in human nature, because people's thought processes differ, and in turn so does what they find simple.

P - Performant

While a certain baseline performance often does matter (not always), anything beyond that yields increasingly diminishing returns. Discarding even large chunks of performance in rarely run, non-time-critical code in favor of otherwise "better" code can sometimes be a very good decision.

Somewhat derived from "simplicity" and readability but not any less important is making code easy to change (by literally changing it, not by making it configurable or parametrized).

The problem with DRY and similar isn't that they're bad practice, but that people take them too far and try to have rules they can just blindly and strictly apply, ignoring the imperfections of, e.g., the other tools they use.

For example, DRY is about not repeating "business logic" (1), but people try to forcefully apply it to language structure and to things which just happen to be similar. They then often do so by using abstractions which make future changes to the code harder by putting constraints on what is possible. Another common failure is not diverging (i.e. copying) your code when business logic diverges through a change of requirements, but instead trying to handle it with additional parameterization/genericity, often using means which make the code, again, harder to change.

I.e. TL;DR: DRY isn't bad but like most things it's not a rule you can blindly apply but a tool/skill you need to learn to properly use, including when to not use it.

(1): What counts as "business logic" differs depending on the abstraction level/context you are looking at the code; That doesn't make it easier to use.


DRY exists so that when you change one place, you don't forget to change the other place, likely resulting in a bug or bugs. There is a simple (for some people's definition of simple) solution to this, which is to document this dependency in a comment in all four places, the extra two being the ends of the duplicated blocks as well, so it doesn't get missed by a reviewer if the starting comment has scrolled past the top of the screen. Having this checked by your automated testing system is left as an exercise for the reader.


you are not going to solve the problem that the meaning of "clean code" is vague and subjective by introducing a new set of terminology that is also vague and subjective


Mind-blowing: I had been writing incorrect, unreadable, idiosyncratic, complicated, and slow code this whole time, thinking that was the thing to do. But now I see my mistake and will cease forthwith. Thanks, Captain Obvious!


Don't write crisp code write deep fried code.


i just try to write code that can be deleted later

the sooner the better


Americans love their acronyms.

I don't foresee this one catching on


Maybe CHIP would suit the Americans better


I hesitate to speak for every American but I think FRY would go over better.


That'll work too. Correct Human-readable Idiomatic Performant



