> If you can’t explain why you’re doing something, and you’re just doing what you’ve been seeing others do, maybe rethink it and educate yourself on the topic.
This is very interesting to me, and it applies to our entire lives, not just IT. I started out with the above opinion about 25 years ago. I was as sure of it as the author, and I was similarly outspoken about it. With time, though, I started noticing some problems with the theory/philosophy. I will fast-forward and simplify so this doesn't get too long:
- people notice a pattern: if problem X then solution Y
- the Y solution spreads in adoption
- problem X is not really experienced any more because it is "presolved"
- future people end up doing Y without knowing why
At this point you have NO clear optimum path:
1. you can keep doing Y, but you will inevitably do it when X is not present, so you expend energy for nothing and even make things worse in some cases
2. you observe point 1 and stop doing Y, but then you will stop doing it even when X is present, at least until you start noticing X reappearing as a problem and the X->Y pattern is understood again
Now you may say: just rethink everything before doing it. But that is not possible; it would take several lifetimes just to rediscover the absolute basics of everyday life. Most things you cannot deduce; you must go through trial and error for significant periods of time.
So what do we do about tradition? Do we follow it blindly? Do we reinvent the wheel at every step? Clearly those are just the extremes, but I do want to mention that if I had to choose between just those two, I would follow tradition blindly. 20-year-old me would absolutely think I am an idiot for this! But I have paid the price of thinking I could outsmart past generations so many times that I have become very skeptical about how smart I am at reinventing things. And this is not me being stupid; it's just that most cycles of pattern recognition take generations to observe and optimize for, and no one person can do that alone.
That top quote is enough to turn me off reading this article, and your comment perfectly explains why. Ironically, telling everybody else to think about what they're doing is something a lot of people have been doing since forever, seemingly without realising that thinking about everything you do is the ultimate analysis paralysis. I'm going to bloody well learn the best practices, thank you very much, practice them until I notice the flaws and then figure out whether the practice was anywhere from plain bad to needing a few exceptions for corner cases.
- Learn the rules
- Master the rules
- Break the rules
You can't go straight into life, or anything else, and break the rules, because you know fuck all about why those rules are in place and what purpose they serve. The only people who really should be breaking rules are those who have mastered them and understand everything about them.
The most obvious place to illustrate this is art. You can't, as an 18-year-old, paint a canvas red and call it art; people will laugh at you. But once you've built a reputation as a master, like Mark Rothko, you can paint a canvas basically red, which doesn't take much skill, and you might be able to get away with breaking all the rules.
This is excellent, but I do want to add one thing: very few people can/will reach the level of mastery required to break the rules with benefits and no significant downside.
And we should also keep in mind that Bruce Lee himself likely died rather young as the result of some unconventional choices - https://en.wikipedia.org/wiki/Bruce_Lee#Death (a combination of substances used, over-exertion and the surgical removal of his sweat glands)
I am not so sure regarding the top comment. I think I am probably about as old as the person who wrote that but I find it rather abstract and not very relatable. I would like to hear an example to decide whether or not there is merit in this comment.
In software, I think the so-called best practices really don't deserve the status of traditions. Take Git Flow: some random person writes a blog post, and now, for some reason, we suddenly have a 'best practice'.
Also, you do not really need to think that much about 'everything'. The KISS principle tells us to first try the simplest thing that could possibly work. Maybe KISS should be a best practice, I suppose.... What is rather interesting (and questionable), though, about the so-called best practices is that they generally lead to less simplicity and more stuff that all needs to be maintained. The cynic in me suspects a large overlap between 'best practice' and 'someone is trying to sell something'.
> Some random person writes a blog post and now we have for some reason suddenly a 'best practice'.
That's not giving other developers much credit for critical thinking. Famous articles such as Git Flow or The Joel Test are usually written by experienced developers (or a representative of a team of such), often with more understanding of the subject than the vast majority of other developers. They present a reasonable case for why the practice is better than what came before, and don't pretend that they are presenting some sort of universal, timeless solution. What happens next is the same thing that happens to every innovation: a messy, slow, mostly random walk from the original idea to something better, which may be closer to the original practice than the recommendation. But it's often worth trying.
> Also, you do not really need to think that much about 'everything'.
How do you work out which things to dig into though? You can't possibly dig into everything, and if you don't have much experience you have no reason to think that any particular rabbit hole is going to lead to somewhere better. And that leads straight back to best practices: they are probably going to work for a reasonable fraction of the problems being solved, because most of us (let's face it) are solving extremely similar problems over and over again.
> What is rather interesting (and questionable), though, about the so-called best practices is that they generally lead to less simplicity and more stuff that all needs to be maintained.
That's not been my experience in general, but I try pretty hard to make sure I understand why they are considered best practice and when it's OK to ignore or go against them. For example, a lot of shell scripting "best practices" are terrible remnants of the dawn of time, when ASCII ruled the world and nobody had even heard of Bobby Tables: assuming that every filename matches ^[A-Za-z0-9_-]{1,8}\.[A-Za-z]{3}$ or something similarly naive, that limited error handling is worse than no error handling at all, or that portability is more important than maintainability.
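To make that concrete, here's a quick Python check (the filenames are invented examples) showing how an 8.3-era pattern like the one above rejects perfectly ordinary modern filenames that any robust script still has to handle:

```python
import re

# The naive 8.3-style assumption: short ASCII stem, one dot, 3-letter extension.
NAIVE = re.compile(r'^[A-Za-z0-9_-]{1,8}\.[A-Za-z]{3}$')

# All of these are legal filenames on modern filesystems.
candidates = ["report.txt", "résumé.pdf", "my notes.txt", "archive.tar.gz"]

# Only the first one survives: non-ASCII letters, spaces, and
# multi-part extensions all fail the pattern.
matches = {name: bool(NAIVE.match(name)) for name in candidates}
```

A script built on that assumption silently skips (or worse, mangles) three of the four files above, which is exactly the kind of "best practice" worth retiring.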
> The cynic in me suspects a large overlap between 'best practice' and 'someone is trying to sell something'.
Buy my book[1]! ;) Or slightly more seriously, read these[2][3][4][5], spend too much time figuring out how to loop over any files safely, find a blog post explaining how someone else came to the same conclusion, realize how bonkers the solution is, weep a bit, and long for something better than POSIX.
> That's not giving other developers much credit for critical thinking.
Au contraire, I think it's you who is not giving tech evangelists and marketers enough credit in their skills at persuading people to adopt bad ideas.
If you compare different programming communities, they have very different best practices.
Things one community swears are necessary to making good software, another community views as irrelevant or harmful.
Programming is littered with examples of people getting overly attached to some dubious principle and then running as far with it as they possibly can. Some of those examples have become dominant frameworks!
>thinking about everything you do is the ultimate analysis paralysis
One thing I've noticed is that people who try to live according to a particular philosophy or philosophical understanding tend to be less effective than people who either ignore it entirely or only apply it to limited situations. This observation seems to hold even amongst public intellectuals, which one would assume would skew heavily towards philosophers.
If you can’t explain why a so-called “best practice” is useful in a specific context, and what the pros/cons are, then you should honestly just keep quiet and wait until you have more experience.
It's not that I can't explain it; it's that it does me no good to do so, and in some cases it makes things worse.
Where human factors are concerned, software developers in the large (with noteworthy exceptions) are so deeply in denial about what is easy and what is difficult that it's downright toxic, especially when considered as a whole. Yes, this task is not particularly onerous by itself, but it's one of twenty tasks we have to juggle all the time, and that's simply ridiculous.
I have reasons for most of the things I do. I could write chapters for (someone else's) books about them. But I'm not going to explain them to people, because they'll just try to line-item veto everything to protect their ego, turf, or both. Complementary practices (those with a multiplier effect) typically have very little payoff when you eliminate half of them. It becomes a self-fulfilling prophecy when you block enough of them that the overall improvement is in the low double digits, instead of, say, 50% spread across six behaviors.
Blindly doing anything without first having tried to understand what the pros/cons are, and in which contexts the advice applies and where it doesn’t, is crazy.
Could you give some examples of traditions that an average American follows which are beneficial and would be hard to explain? I honestly couldn't come up with anything...
It made perfect sense to strive for increases in GDP during the 19th and 20th centuries, where economic gains directly correlated with improvements in material wealth and a better quality of life.
Now, I don't think anyone really understands why we're still doing it.
As a society, we measure, celebrate and strive to achieve economic growth (in terms of increased per capita GDP, inflation-adjusted). People do this on an individual as well as a broader group level.
In the past, the push for growth was an effective solution to the very real problem of widespread absolute poverty caused by the inability to provide sufficient consumer goods to the broader population.
In more recent times, pushing for economic growth seems to be causing more problems than it solves.
Yet, we still do it anyway because that's what we've "always" done (at least, since the start of the industrial revolution).
I understood the "traditions" GP mentioned as cargo-culted practices among the everyday decisions a single person makes going through life: like not eating pork, brushing teeth or boiling pasta uncovered.
I agree that you can easily find abstract ideas people hold true about the world which would require serious re-education in order for the opinions to change.
I don't mean to nitpick, but on the off chance you share the same views as Richard Feynman on this matter, please understand that tooth brushing is not a cargo-cult behaviour!
People who do not brush and floss regularly undergo extreme pain (physical or financial) later in life and genuinely repulse others whenever they open their mouth to speak.
Don't worry! I gave it as an example of a behaviour we learn as kids which is easily explained, because we know about microbes and the process of tooth decay.
However, people have been cleaning teeth for thousands of years. My (naive) belief is that people associated the procedure with good health but weren't able to explain why it works. It was mostly tradition or good habit passed on through generations. Citing Wikipedia[1]:
> The Indian method of using wood for brushing was presented by the Chinese Monk Yijing (635–713 CE) when he described the rules for monks in his book:
> Every day in the morning, a monk must chew a piece of tooth wood to brush his teeth and scrape his tongue, and this must be done in the proper way. Only after one has washed one's hands and mouth may one make salutations. Otherwise both the saluter and the saluted are at fault.
>My (naive) belief is that people associated the procedure with good health but weren't able to explain why it works.
Also a naive assumption, but I suspect our ancestors understood completely that bad dental hygiene led to both unpleasant smells and toothache. People are very good at avoiding both disgusting stimuli and pain!
No, not really, since I'm not an American, but mostly because I don't want to label some traditions as "beneficial" as opposed to others; it kind of goes against the point I was trying to make.
But I can tell you what I meant by tradition: basically everything you did not explicitly decide about in your life and that you mostly do as everybody else does and did before you. Take wearing socks, for example, or hard-soled shoes: you can easily find people these days who go against these. And you can think about it yourself. Is the barefoot movement right? It seems healthier for your foot. But what about a single accident, like a cut through a very thin, soft, or even nonexistent sole?
Cooking beans. Cooking potatoes. People may be able to explain that they taste better that way, but I doubt everyone would say they are cooked so they don’t make you very sick.
Another habit is buying single-use water bottles wherever they go, or drinks and food on road trips, as if it weren’t super easy to bring your own. Yes, in some cases buying is more efficient, but definitely not in all cases, and yet some people never bring food/water.
I don't think this is really true if we apply an example:
- We need a way to travel faster and I've made a car to do that
- cars spread in adoption
- travel is much easier
- people now use cars without knowing why
I can't think of many solutions where we don't still have the original problem they solve.
The real problem is people do not have enough knowledge in a domain to infer whether a new proposed solution is better or not. So to take the example further:
- Cars should no longer be used because they are causing climate change
- People fight against it because they do not have the knowledge to infer the right decision
- Person takes a decade or longer proving out why they are right
It’s not a 1-1 analogy but reminds me of English sailors preventing scurvy by using lemons only to later start getting scurvy again because they used limes and copper pots.
But for a more 1-1 example: a lot of food is cooked by habit. I’m not sure how many people actually know that uncooked beans or potatoes will make you sick, but they all do “y” and cook them. Skipping “y” in some cases will make you sick or poison you.
My preferred solution, in a situation where you've forgotten (or never learned/experienced) X, is to keep doing Y until someone on your team perceives problems with it, at which point you can acknowledge that you're not familiar with the reasons for Y and invest some time finding out what they were while prototyping not doing Y.
> There exists in such a case a certain institution or law; let us say, for the sake of simplicity, a fence or gate erected across a road. The more modern type of reformer goes gaily up to it and says, “I don’t see the use of this; let us clear it away.” To which the more intelligent type of reformer will do well to answer: “If you don’t see the use of it, I certainly won’t let you clear it away. Go away and think. Then, when you can come back and tell me that you do see the use of it, I may allow you to destroy it.”
Even that isn't a perfect solution. Cultural evolution is smarter than you are; a cultural practice may provide value that even its own practitioners aren't consciously aware of. Joseph Henrich's The Secret of Our Success discusses the traditional processing of manioc, a cyanogenic tuber in South America:
> In the Americas, where manioc was first domesticated, societies who have relied on bitter varieties for thousands of years show no evidence of chronic cyanide poisoning. In the Colombian Amazon, for example, indigenous Tukanoans use a multistep, multiday processing technique that involves scraping, grating, and finally washing the roots in order to separate the fiber, starch, and liquid. Once separated, the liquid is boiled into a beverage, but the fiber and starch must then sit for two more days, when they can then be baked and eaten. Figure 7.1 shows the percentage of cyanogenic content in the liquid, fiber, and starch remaining through each major step in this processing.
> Such processing techniques are crucial for living in many parts of Amazonia, where other crops are difficult to cultivate and often unproductive. However, despite their utility, one person would have a difficult time figuring out the detoxification technique. Consider the situation from the point of view of the children and adolescents who are learning the techniques. They would have rarely, if ever, seen anyone get cyanide poisoning, because the techniques work. And even if the processing was ineffective, such that cases of goiter (swollen necks) or neurological problems were common, it would still be hard to recognize the link between these chronic health issues and eating manioc. Most people would have eaten manioc for years with no apparent effects. Low cyanogenic varieties are typically boiled, but boiling alone is insufficient to prevent the chronic conditions for bitter varieties. Boiling does, however, remove or reduce the bitter taste and prevent the acute symptoms (e.g., diarrhea, stomach troubles, and vomiting).
> So, if one did the common-sense thing and just boiled the high-cyanogenic manioc, everything would seem fine. Since the multistep task of processing manioc is long, arduous, and boring, sticking with it is certainly non-intuitive. Tukanoan women spend about a quarter of their day detoxifying manioc, so this is a costly technique in the short term. Now consider what might result if a self-reliant Tukanoan mother decided to drop any seemingly unnecessary steps from the processing of her bitter manioc. She might critically examine the procedure handed down to her from earlier generations and conclude that the goal of the procedure is to remove the bitter taste. She might then experiment with alternative procedures by dropping some of the more labor-intensive or time-consuming steps. She’d find that with a shorter and much less labor-intensive process, she could remove the bitter taste. Adopting this easier protocol, she would have more time for other activities, like caring for her children. Of course, years or decades later her family would begin to develop the symptoms of chronic cyanide poisoning.
> Thus, the unwillingness of this mother to take on faith the practices handed down to her from earlier generations would result in sickness and early death for members of her family. Individual learning does not pay here, and intuitions are misleading. The problem is that the steps in this procedure are causally opaque—an individual cannot readily infer their functions, interrelationships, or importance. The causal opacity of many cultural adaptations had a big impact on our psychology.
> Wait. Maybe I’m wrong about manioc processing. Perhaps it’s actually rather easy to individually figure out the detoxification steps for manioc? Fortunately, history has provided a test case. At the beginning of the seventeenth century, the Portuguese transported manioc from South America to West Africa for the first time. They did not, however, transport the age-old indigenous processing protocols or the underlying commitment to using those techniques. Because it is easy to plant and provides high yields in infertile or drought-prone areas, manioc spread rapidly across Africa and became a staple food for many populations. The processing techniques, however, were not readily or consistently regenerated. Even after hundreds of years, chronic cyanide poisoning remains a serious health problem in Africa. Detailed studies of local preparation techniques show that high levels of cyanide often remain and that many individuals carry low levels of cyanide in their blood or urine, which haven’t yet manifested in symptoms. In some places, there’s no processing at all, or sometimes the processing actually increases the cyanogenic content. On the positive side, some African groups have in fact culturally evolved effective processing techniques, but these techniques are spreading only slowly.
> A reasonable person would have asked why everyone was wasting so much time preparing manioc. When told “Because that’s how we’ve always done it”, they would have been unsatisfied with that answer. They would have done some experiments, and found that a simpler process of boiling it worked just as well. They would have saved lots of time, maybe converted all their friends to the new and easier method. Twenty years later, they would have gotten sick and died, in a way so causally distant from their decision to change manioc processing methods that nobody would ever have been able to link the two together.
That book is so similar to my post that I suspect myself of plagiarism. I must have read these arguments in the past and just don't specifically remember them, but I guess that supports the general thesis.
Regarding the last quote: one must also give credit to the "reasonable person" who died because they thought themselves smarter. These kinds of people are the necessary guinea-pig sacrifices of technological progress. In the trial-and-error process, the errors are mandatory; otherwise there is no learning.
In my mind, here's a Good Tradition (TM): before crossing a street, look to your left then your right then your left again.
The only reason I as a small child learned this was because in a world where cars drive on the right side it makes perfect sense.
But there are so many traditions that are senseless: shaking hands, wrapping food in plastic, celebrating Christmas, abiding by corporate hierarchies and working for a dictatorship while living in a democracy, documenting code, bullying the weak (as was tradition when I went to school), having stand-ups when you're not doing SCRUM, and so on and so forth.
Traditions are not necessarily good things. Trial and error has been my only strategy for finding out which are good and which are bad. I wish I could stand on the shoulders of giants, because the head start would probably have made me much wiser than I am today.
> But there are so many traditions that are senseless
A tradition is not senseless just because you don't like it or understand the reason for it. Often, there are very good reasons a convention like shaking hands or abiding by hierarchies becomes a tradition. By all means, get rid of worthless traditions, but take care you don't cause unintended consequences when you fiddle with traditions without understanding their motivation.
Also, perhaps a little humility is in order. It's just possible the people who instigated a tradition were smarter than you or knew something you don't.
> but take care you don't cause unintended consequences when you fiddle with traditions without understanding their motivation.
Reminds me of The Office: while Michael is in the woods, Jim tries to consolidate all birthday parties into a single day to increase productivity, but then faces a revolt and realizes separate birthday parties are the only thing maintaining office cohesion. When Michael returns to the office, he reveals he made the same mistake when he was starting out.
The trouble is in deciding which traditions are useless and which are not. Let's take some of your examples:
- shaking hands - any form of greeting involving physical contact (hugging) is a lot more powerful than just words
- celebrating Christmas - you can have a family reunion any time, or rest on any day, not just Sunday. But do you? Or do you need a pretext, a common free day for everybody, and a guarantee that at least one day per week you do not work and nobody can tell you to work?
- corporate hierarchies, dictatorship - any form of flat organization fails to make important decisions in a timely manner because nobody can enforce them. Dictators started in Rome, where the fiercely republican senate recognized that during crises it could not unite to do anything, so it appointed limited-term dictators and willingly submitted completely to their decisions. Also take a look at the outcomes of the Arab Spring: did the removal of dictators improve the lives of the people?
It depends on the team, but even if the team doesn't follow scrum, I feel like having stand-ups is useful for sharing information across the team.
A daily stand-up is good for some dispersed teams, or for close-knit teams that for some reason (forming, norming?) have not developed the ability to communicate effectively or to get a feel in real time for what their colleagues are doing. Any higher-order stuff should be disseminated by a team leader or senior team member, and that can be done as and when needed. Does this require a mandatory daily stand-up?
On some occasions, I have reduced the daily stand-up from 30 mins (yes, I know) to 20, then 15, etc., and it's common for them to reach 0 without any loss of knowledge transfer, because the team members have started to communicate effectively by other means.
Granted, the pandemic has changed the nature of the communications challenge a bit.
I am happy to see teams having a daily stand-up because they need them. I take more notice when teams have a daily stand-up because they have been told they should have one.
I do too, and this is partly my point. If I had taken for granted that traditions are good, then I wouldn't have questioned waterfall project stand-ups, a concept that is essentially good, and I would have been much wiser for it.
For me, the non-SCRUM stand-up is still in the "trial" phase. My mind still resists the very thought of it.
"I wouldn't have questioned waterfall project stand-ups"
Back in 1956, Teams, Slack etc. were still pre-alpha and it was possibly difficult or technically impossible to have a group phone call, so a stand-up for larger teams spread round a building might have been the only way to quickly share info + it got the folks out of the noisy computer rooms (plus ça change there!)
As with everything, it depends on the context. Following a set of rules blindly, just for the sake of it, when you and your team are producing good-quality code and have a good team culture is absurd. However, relying on “best practice” guidelines can be useful in scenarios where you don’t want to fall back on your own (or someone else’s) subjective opinion, which doesn’t necessarily imply the best solution.
As a personal story: not so long ago, I was the technical leader of a team that had a tendency not to follow even the most basic common-sense standards. The code-base consistency and the practices around that code were an absolute mess. Chaos and opinions everywhere. Everyone had his/her own way of doing things. So, in order to get everyone on the same page and producing something predictable and consistent, I had to heavily impose a set of best-practice guidelines. I felt kinda bad at the beginning, as I don’t like to behave like a dictator, but the good thing is that, in the end, everyone felt the benefit, and new team members were able to catch up and become productive quickly because everything followed a pattern.
So, yes, best practices can be a useful resource for drawing a line when everything else is just an opinion without technical support behind it (or with poor support).
> As everything, it depends on the context. Following a set of rules blindly just for the sake of it when you and your team is producing good quality code and have a good team culture is absurd.
It is absurd, yes, but this is exactly how every single team I have ever worked with has functioned. And it's been promoted by managers who don't want to make much effort to actually understand the work, and by developers who will do anything to avoid using their judgement and possibly being "blamed" for making a wrong decision.
You sound like a good leader, but honestly, that change you made optimised for planning purposes (predictability) and for developer turnover, and there might have been a "custom practice" solution that would have been much more productive in terms of actually delivering value, although it would have taken much more careful investigation into the code base and people's opinions.
> It is absurd, yes, but this is exactly how every single team I have ever worked with has functioned. And it's been promoted by managers who don't want to make much effort to actually understand the work, and by developers who will do anything to avoid using their judgement and possibly being "blamed" for making a wrong decision.
I can understand your point of view given your experience, and I completely agree. However, in my case, believe me, it was not a question of careful investigation or anything like that. My team was a mix of lack of experience, lack of knowledge (and the annoying habit of not reading the bloody documentation of the tooling), duct-taping mentality, years of technical debt, folklore mentality, etc. And I was the newest member and in charge. So, in order to fix that, I sometimes had to use "best practices" as tablets of stone; otherwise literally every PR would have been an endless back-and-forth. On other occasions, we simply reached an internal consensus.
The author of the article uses the word "hate" which I think is too extreme. I truly believe that everything depends on the situation and general "best practices" guidelines might be a useful resource. There are no golden hammers in anything, even less in computer science.
In my case it’s a question of habit. But yes, I can see the term “best practices” being perceived as very conclusive, like implicitly saying “this is the very best, and if you don’t follow it you’re doing something wrong”. So I agree on “good practices” or “guidelines” (or even just a simple “recommendations” or “conventions”) as a better option.
I don't agree with everything in the article (ML or blockchain are not "best practices" in any sense), but I do think cargo culting in software engineering is definitely alive and kicking as ever.
For example, there's been a definite shift in programming-language trends away from dynamic languages and towards "new static languages" like TypeScript and Rust. In web development I've noticed that these languages, and all the promises of safety they bring, have also brought a pretty significant change in mindset. Pragmatism and "move fast and break things" are now totally out of fashion, and the new guardians of ultimate code quality would probably hate to admit that what they're practicing is a fashion. Compiler safety is heady stuff.
Dynamic languages are easy to start out with, but if the project becomes large enough, the codebase becomes extremely hard to maintain, and then you have performance problems on top of that.
So yeah, compiled languages are better for writing maintainable code.
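As a toy sketch of the maintainability argument (the `Invoice` type and its field names are invented for illustration), type annotations give a static checker like mypy something to verify during a refactor, while purely dynamic code surfaces the same mistake only at runtime:

```python
from dataclasses import dataclass

@dataclass
class Invoice:
    customer: str
    total_cents: int  # hypothetically renamed from `total` during a refactor

def format_invoice(inv: Invoice) -> str:
    # A static checker flags any caller still constructing the old shape
    # (e.g. a dict with a `total` key) before the code ever runs; a
    # dynamic codebase finds out in production with a KeyError.
    return f"{inv.customer}: {inv.total_cents / 100:.2f}"
```

With the annotated dataclass, every call site that missed the rename is reported by the checker in one pass; with an untyped dict convention, each one is a separate runtime failure waiting to happen.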
I agree in general. But what I've seen is that type safety is now being taken to quite an extreme: it is given a much higher priority than delivering software. I've seen this at multiple organizations now.
*gasp* Software engineering is now becoming more like actual engineering! Before you know it we'll have to start thinking about the ethical implications of our code!
25+ years of professional experience here, having worked for different companies in different countries. I have never ever seen that to be the case. However I have noticed that the least experienced developers prefer non-typed languages.
One of the key benefits of "Best Practices" is, when provided by the documentation of something you're using, you can reliably consider it as good advice and follow it, even if you don't necessarily understand why it is the best practice.
An instance of this would be "Best Practices for Cloud Firestore"[1] which spends a whole lot of time discussing different ways of avoiding "hotspotting" without really going into any detail about what it is or what specifically causes it, other than that it adds latency. If my project manager asks me why we're not using sequential IDs, and I say it's a best practice according to the documentation, he'll happily accept that answer even if neither of us understand why it's a best practice.
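For what it's worth, the rough idea behind that particular practice can be sketched in a few lines. This is a hypothetical illustration of the key-distribution argument, not Firestore's actual API: sequential document IDs cluster consecutive writes into the same key range, which can concentrate load on one storage shard, while random IDs scatter writes across the keyspace.

```python
import uuid

def sequential_ids(n):
    # Monotonically increasing IDs: every new write lands at the "end"
    # of the key range, so one shard absorbs all the write traffic
    # ("hotspotting").
    return [f"doc-{i:08d}" for i in range(n)]

def random_ids(n):
    # Random IDs: writes are spread roughly uniformly across the
    # keyspace, so no single shard becomes a hotspot.
    return [uuid.uuid4().hex for _ in range(n)]
```

Firestore's auto-generated document IDs behave like the random variant, which is presumably why the docs steer you away from your own sequential scheme.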
I'd argue that's the worst kind of best practice, because in order to get an entire team to follow it, they now have no reason other than "their manager said so". It's one thing to skim over what you know your project manager won't understand (mine doesn't understand programming at all, so really they just take our word on the implementation side of things), but to try and push it to the rest of the team and just hope they remember a long list of do's and don'ts, with no justification other than "a book said to do it", is stifling their personal development.
Even worse, if one of them were to go off and read up on what you've told the team to do, and find that they don't agree with it, you've now got to handle:
- a discussion that should have happened earlier
- possible backtracking of recent work to try and reverse the pattern
- OR a backlogged ticket to reverse it while the system now has multiple different patterns sitting around with no push for consistency in either direction due to lack of priority
Obviously becomes much more of an issue the larger the system is, but in general you shouldn't be following (or asking others to follow) best practices without knowing the benefits behind them versus what you're doing now.
> One of the key benefits of "Best Practices" is, when provided by the documentation of something you're using, you can reliably consider it as good advice and follow it, even if you don't necessarily understand why it is the best practice.
If you don't understand why it is a best practice, you cannot reliably consider it good advice (though, sure, you can cargo cult it anyway) for your use case. Even if the source of “best practices” is reliable (and many are not), if you don't have at least a basic understanding of the rationale, you don’t have any idea if your use case is in the neighborhood where you might need to dig deeper when considering whether it applies or is counterproductive.
I get that lots of people do use best practices this way, and feel reassured that some authority said so for reasons that remain murky and unexplored, but this is absolutely the worst possible use of “best practices”.
> my project manager asks me why we're not using sequential IDs, and I say it's a best practice according to the documentation, he'll happily accept that answer even if neither of us understand why it's a best practice.
I have to say, I think I would be happier with an answer like "because it avoids hotspotting, according to the docs".
This limits the applicability of the "don't use sequential IDs" rule to where it actually helps.
Otherwise, the rumour spreads, and later you find teams doing huge damaging workarounds to avoid using sequential IDs in contexts where hotspotting is totally irrelevant (perhaps because they are not even using Firestore), all blissfully unaware, because they think they are following "best practice".
I've seen this happen so often in large companies, the words "best practice" let people abdicate their professional responsibility to _think_ and it's quite dangerous IMO
I have seen plenty of instances where people write documentation in which they label something as a best practice, even though it's just something they preferred to do and weren't really able to articulate the reasons for, unfortunately.
Yes, a lot of people like to be the person who says what the "best practices" are (even, or maybe especially, if they've just made them up). It sounds much more impressive than "I like to do it this way".
A “Best Practice” that doesn’t spell out the pros/cons of applying the practice, and the contexts where it is applicable and where it isn’t, is not a useful “Best Practice”. Blindly doing something you don’t understand is not recommended. Ask hard questions!
This article is written in an alien universe. "Best Practice" is the well-trodden path. It's typically easy to find, many people know about it, it avoids most of the pitfalls you don't know about, you can get help if you find yourself slightly lost, it's easy terrain to navigate, you can probably walk it or ride it, it gets you there without you putting in much more effort.
It is probably not the shortest nor the fastest path, nor the most efficient path for a 4-horse carriage.
Here's a best practice: Don't build Splunk infrastructure on Windows. Is it bad? No, it's fine, but fine-grained control, observability, the permission and logging framework, tooling: they're all a minus compared to being on *nix. Don't build Splunk infrastructure on BSD, or Solaris, even if they're *nix. Not enough people use them, so when you run into trouble, there's no help. Does one have to "defend" and "explain" that every time? Does the author hate that principle? There is such a thing as best practice (no capitalization) whether one admits it or not. It's not for the old seadogs, it's for the newly initiated; you don't need to know the whys at the very beginning. At the beginning, the best advice is not to fall into the ditch.
> Does one have to "defend" and "explain" that every time?
That's not my point. If your colleague is building Splunk infrastructure on Windows and you tell them "best practice is not to do it on Windows", they've learned nothing. If you tell them "it's fine, but fine-grained control, observability, permission and logging framework, tooling, they're all a minus compared to being on *nix", at least they know why, and they've learned how to make an informed decision in the future, as well as how to push back against their managers when they insist on doing it.
> There is such a thing as best practice (no capitalization) whether one admits it or not.
I admit it. My main problem is that a large portion of the time I'll see people not get why it's a best practice. And that is a problem.
There are plenty of “best practices” that contradict each other. Which shows that you need to think and understand the pros/cons of each “best practice” and not just blindly follow the latest advice you happened to read on HN.
What I hate about "best practice discourse" is how it's used to shut down rational consideration of alternatives the "best practice promoter" doesn't like.
People at my org are currently attempting to forbid the use of a certain reactive framework because they claim it is incompatible with the "best practice" of MVC. In reality, reactive frameworks have been developed in part to provide a simpler, more predictable, more productive alternative to MVC.
I'll leave you with a long quote from Scott Adams that I'm always reminded of in this context:
> When you are trained in the ways of persuasion, you start seeing three types of people in the world. I’ll call them Rational People, Word-Thinkers, and Persuaders. Their qualities look like this:
> Rational People: Use data and reason to arrive at truth. (This group is mostly imaginary.)
> Word-Thinkers: Use labels, word definitions, and analogies to create the illusion of rational thinking. This group is 99% of the world.
> Persuaders: Use simplicity, repetition, emotion, habit, aspirations, visual communication, and other tools of persuasion to program other people and themselves. This group is about 1% of the population and effectively control the word-thinkers of the world.
I always see these types of quotes brought up on Hackernews; an overly simplistic worldview that normally groups humans into three sub-categories. In these sub-categories there's always a group that's semi-impossible to be in, a group that you don't want to be in, and a group that you -- of course -- are in because, well, we're on _Hackernews_ and not Facebook.
But, and I know the irony is dripping right now, where is the empirical data to back any of these claims? The numbers and groups seem to be things simply thrown at the wall.
It's just a theory that explains some patterns I see in my life. It's definitely exaggerated and oversimplified (Adams is a humorist).
I try to practice rationality, and I admire people that I think do it relatively well. But I agree with Adams that I probably don't do a great job all the time, and we're all influenced by word-thinking. I don't see myself as a great persuader.
A more accurate way to say “best practices” is to say “minimally acceptable”. This one trick is especially helpful when talking to management.
If everyone is doing it, then it means it’s the bare minimum that you should be doing, in order to be the “best”, you’ll need to go far above and beyond “minimally acceptable”.
When management hears “best practices”, they think “If we do that, we’re done!”, which is not true, but they don’t understand that. When you try to improve things, management becomes an impediment, but they usually don’t explain themselves, because they think it’s obvious: we’re already doing the “best”, why do we need to do anything else?
My life has become much easier since I figured this out. There definitely are best practices, but the phrase is a lot of times used by people as a tool to justify what they do. It's also used a lot by people who don't actually want to do good work for their customers, only to impress their peers. "I have now refactored my code into proper modules to follow best practices. My design is now preventing me from implementing a feature my users actually need." The same goes for other stress-inducing phrases like "code smell" etc.
Scrolled through half the thread before finding the first comment that looks like someone read and understood the article, instead of just writing an opinion on how the author is wrong.
> When you want to specifically point out that maybe the code calculating taxes should not be referencing MongoDb, be specific: say it’s transgressing the Dependency Inversion Principle
I would argue that even that is still evading the actual argument. Why do we need the Dependency Inversion Principle?
Rather, I would argue that a good-faith argument would answer the questions:
- What (potential) problem does it solve?
- How likely are we, specifically, to actually run into that problem?
This post is written from my own point of view, where I often get annoyed by the rules people make for programming, engineering, and business generally. But to play devil's advocate: it's often very difficult to know exactly why we do some things, and when it would be bad to change a practice. And it's good, in my opinion, that most companies have a mix of the more independently minded engineers and more humble engineers that just want to follow the 'best practices' or trends.
Those people are going to CYA, you might not always appreciate it, but if everyone was always trying to pull the rug out - it would be chaos. They give engineers like us that like to question these things room to occasionally try new approaches to streamline or improve the 'best practices', without as much risk.
I would not call the engineers that follow 'best practices' and trends humble. Quite the opposite, in fact. They are the kind that wants to tell everybody else how to do things.
Best practices and received wisdom can be useful, but only when interpreted by someone with a degree of wisdom and judgement. As ever, the answer to whether you should do A or B is 'it depends'. Applying a best practice blindly is, ironically, a 'worst practice'.
Best practices are incredibly useful, we need to learn from the accumulated wisdom of those who've gone before, but you need to know when to modify or ignore them too.
One example in networking of a "best practice" that quickly turned into a "worst practice" is disabling auto negotiation. Back in the mid-1990s, when 100 Mbps NICs first came out, implementations of the new auto negotiation protocol were quite buggy. The workaround at the time was to "lock down" network ports by disabling auto negotiation. Somehow this quickly became canon, and an entire generation of network engineers swore this as their first commandment.
Meanwhile the bugs with auto negotiation were worked out within a few years and by year 2000 there was no reason to do this anymore and every reason not to:
Disabling auto negotiation causes duplex mismatches. If not immediately, then eventually it always does. There are too many ways one side of a link can get reset to the industry default (auto negotiation enabled): replacing hardware, automatic NIC driver updates, staff turnover, and simply not checking each and every interface every time a change is made to the network. There is no practical way to keep such an inherently unstable system in sync long term.
Ironically these inevitable duplex mismatches only seem to reinforce the belief that you must "lock down" every port everywhere resulting in a vicious cycle of bad practice.
The best way to avoid duplex mismatches is to never disable auto negotiation unless you are absolutely forced to by some very old or very poorly designed device.
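The failure mode described above can be modeled in a few lines. This is a deliberate simplification of the IEEE 802.3 parallel-detection behavior, not any vendor's implementation: a side with auto negotiation disabled is assumed here to be forced to 100/full, while an autonegotiating side that detects no negotiation partner falls back to half duplex.

```python
def link_duplex(a_autoneg: bool, b_autoneg: bool):
    """Return (duplex_a, duplex_b) for a 100 Mb/s link."""
    if a_autoneg == b_autoneg:
        # Both auto (they negotiate full duplex) or both forced
        # to full (in sync, at least for now).
        return ("full", "full")
    # One side forced to full, the other autonegotiating: the auto
    # side sees no negotiation partner and falls back to half duplex,
    # producing the classic duplex mismatch.
    return ("full", "half") if not a_autoneg else ("half", "full")

# The mismatch appears as soon as one side gets reset to the default:
print(link_duplex(False, False))  # both forced: ("full", "full")
print(link_duplex(True, False))   # one side reset: ("half", "full")
```

Which is exactly why every reset-to-default event listed above silently degrades a "locked down" link, while an all-auto network just renegotiates.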
The other issue with disabling auto negotiation is that by the late 2000s, gigE was affordable to the point that servers just came with it, and gigE switches were pretty affordable too, but if you were forcing 100M/full, using the hardware you already had required at least one service event to change the ports to auto/gigE.
So the problem is that the practice is called "best"?
The idea is that of all practices for solving a problem the "best practices" were those which often if not always proved best for as many contexts as possible (also in hindsight). Version control is usually considered a best practice.
I don't see a problem here. When you work with people who blindly follow what is called "best", cargo cult will not stop when you stop calling it best.
And beside that I will always prefer "cargo cult" over "not invented here syndrome".
A great blog post echoing the same sentiments (that no longer appears to be there):
> If you have ever dealt with me directly as a customer, attended one of my presentations, or even simply stomached one of my diatribes in a casual, technical conversation, you have no doubt heard about one of my pet peeves (no, not “Tips –n- Tricks” – we’ll visit that another time) but the term “best practices.”
> I loathe that term. I know we are guilty of it at Microsoft – so is just about every single technology-based organization. In our ever-evolving industry, to put something down in print as THE ABSOLUTE BEST PRACTICES goes against the very nature of the word “practice.” It can seem arrogant. It implies way too much finality. It also can mean serious disruption when a practice changes. I work in a consulting practice. As products, regulations, politics, and trends change, so do our recommended practices. This is why I purposely try to use “under the current recommended practice” or simply just “current recommended practices.”
"Best practices" is something you can advise on, write on, etc. - i.e. it is sellable.
Following (mandating) best practices immunizes (management) against being accused of doing things wrong. Might still fail, but then it failed despite doing everything "right".
Also, following "best practices" is much faster than creating solutions on your own: it will only get you to some sort of "average" in most cases, but that is what most places settle for these days.
I see this all the time when digging down into why. "It's ok to shut off my brain and just do X, regardless of whether it's situationally even useful." Just as bad as "Policy". It's ok to not think and find a good solution, as long as the bad solution looks "normal" enough.
Best practices are inherently decades old. It takes a long time for a good idea to spread and get people used to it. It's very likely there are better ways to solve the issue around, if you stop to think and read.
Because even the best decisions can fail, and unless you can be totally convincing not only to engineers but also people who are less technical (management or other stakeholders) when there's massive fallout, you're not going to want to explain why you weren't following "best practices".
Very often so-called best practices are really quite bad and ended up considered best practices more for marketing reasons than because of merit. Let me list some 'best practices' that are actually quite bad in many, if not most, circumstances: git flow, comments for every method parameter, microservices, feature branches, automated code formatting, scrum.
The rest I could see arguments for, but could you explain how automated code formatting is a bad practice in most situations?
Personally I find a consistent looking codebase makes it so much easier for team members to come in and reason about code they've potentially never seen before.
Well, you know, it sounds kind of nice to have your codebase auto-formatted so it looks nice without the programmer having to do this by hand. Problem is, current formatters never seem to exceed the borderline-passable level of formatting and I actually really like nicely formatted code. E.g., when the parameter list of a function no longer fits on the same line as the function name one can either split it over two lines or give every parameter a line by itself. I tend to make this depend on how complex the parameters are. If they are very simple, like just a short variable name I put them on one line and if they are some complicated expression I will put each of these expressions on a line by themselves. I think this really makes a difference in readability. Auto formatters generally have some fixed one-size-fits-all policy for this. And this is just one example. There are many more cases where a tasteful approach yields more readable code than a very uniform approach.
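For what it's worth, here's a concrete version of that judgement call (illustrative stubs, not from any real codebase):

```python
def area(width, height):
    return width * height

# Simple arguments: a single line is perfectly readable.
small = area(3, 4)

# Complex arguments: one expression per line is easier to scan.
# This is the case-by-case choice a one-size-fits-all formatter
# won't make for you.
large = area(
    (10 + 2) * 2,
    max(5, 3) + 1,
)
print(small, large)
```

An auto-formatter typically picks one of these layouts based on line width alone, regardless of how complex the individual arguments are.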
Besides this, I have also seen situations where the formatting guidelines become a way for one developer to force his preferences on another developer. This is really not nice. I have been in a situation where I really had difficulty reading the code just because it was formatted in a strange way. 'We just have to pick something and picked this'. 'No, we cannot have exceptions because we don't want to waste time on discussing trivial matters like code formatting'. 'Yes, my personal exceptions to the standard that we use were applied, but yours cannot be'.
This is always the case when somebody invents a word for something people should follow. If the vocabulary changes marketing will change with it. The problem is people are searching for silver bullets, not marketing that serves those desires.
Used to have to argue with my dev team manager about this, he was a big fan of reading about some new-ish concept that we weren't using yet and then trying to tell the team he needs to spend a couple weeks changing everything over to it. I'd be the only one trying to force him to give us a list of "pros" for it over what we currently do and he wouldn't be able to answer, then go off and half-implement it across the system without telling anyone, causing the degradation of our ability to maintain the system (since it's now less consistently written) and wasting his own time.
People need to take best practices with a pinch of salt, and actually think through whether they'll provide any benefit to them in practice for whatever software they're working on.
Completely agree with the article. And as with such, all best practices have a time and a place.
Here's one great example of what not to do: mindlessly enforcing line lengths. No, your hand won't get chopped off if you go over 80 chars (unless it's the Linux kernel, but they used to have good reasons for that).
If the first thing a dev does to your Python code is mindlessly chop it at 80 chars, he's just cargo culting.
I've dropped the "best" for "better" or "good" for a while now. I also don't have a problem going to a colleague with more experience and who I know puts out good stuff, and asking them what they think the "best" solution is. I gain some knowledge from asking them and looking at how they use it, and I don't feel stupid or like I'm doing something wrong by not understanding it inside-out. There's a middle ground that provides productive results. And at the end of it all, if you didn't know what a plane or an airfield was and you saw flying metal contraptions delivering cargo, building an airfield isn't such a bad first step to understanding what's going on.
This is a problem I've heard others talk about but I think I've been really lucky because across the three places I've worked we've had code standards and best practices as living documents with ongoing discussions for reasoning about them and refining them.
By which I don't mean that there's a constant bickering about how to do things, of course. We have standards and people adhere to them in their daily work week. But these are established not by deferring to some vague authority, but by actual debates with plenty of examples and reflection on how it applies to our own code base.
A bugbear of mine for a long time has been the best practice of always having SQL Server in full logging mode, with highly regular log backups.
It's great for important production services. In practice, I run test databases I could rebuild from scripts. Archive databases that never see a write. Databases that load and drop automatically from other sources. Every single time someone looks at these, there's someone panicking about best practice. Something about this particular issue just makes critical thought non-existent.
The writer is correct in what (s)he is saying, but these are not "best practices". Best practice is not naming a variable "asdfg", or not using AI just for the sake of the trend.
The best developers I have ever worked with never said “Best Practice”. The worst/most inexperienced developers I have ever worked with say it all the time. Because they don’t have any clue what they are talking about. If you can’t explain why a so called “Best Practice” is useful in this specific context, and what the pros/cons are, then you should honestly just keep quiet and wait until you learn/know more.
Best practice is what you won't get fired for doing. Most of us want to be able to support ourselves and our families more than we value exercising our individual engineering judgement.
"Best Practices" is a marketing trick. The fact of the matter is that software engineering (and programming, which I consider to be a technically separate subfield) is still coming out of its alchemy stage. We're someplace between the pre-jouleian thermodynamics and phlogiston phase. Like, it's better than wizards sitting in dark rooms experimenting with things which normal minds were not meant to see. However, we've got a bunch of rules that are either flat out wrong or almost definitely wrong.
So, we can either tell everyone that we're currently riding on a train built by crazy wizards and we're just now figuring out all the stuff we don't know. OR we can hype things up a bit to keep the muggles from freaking out too much.
Personally, I would like to just get by in life by telling the truth. However, many of my more successful peers are able to achieve much more than me by dressing up what they're doing with a good-sounding story. Part of that is "best practices". I don't really like it, but I have a hard time begrudging them because it looks suspiciously like everyone knows what's going on and in fact is comforted by the language usage.
The other side of this is that the author wants to replace 'best practice' with 'pattern'. Honestly, I don't see what the difference is. I doubt you can actually define that term so that it's any better. And the true hucksters (given 5 minutes) can find out how to spin 'pattern' to be just as problematic (if not more so) than 'best practice'. "I don't like the words you use. Use the words I like!" is a position that I'm rarely going to get behind. Like, from my point of view people are just making a bunch of noises from their mouths either way.
The second one is the assertion that if you can't explain it then you're cargo-culting. I don't like it when people can't explain what they're doing. I've been plagued by it for my whole career. Someone suggests something. It makes zero sense to me. I probe with questions. Eventually I hit the bedrock of their knowledge and there's still an embarrassing amount of blank lines lying around. However, they're still able to be productive.
The very idea of "experience" is that you've got some know-how which you're unable to explain. If you could explain it, then it would be "technique". And this is something that permeates our lives. Like, how are you even reading this right now? Can you actually explain how your brain is recognizing letters, words, sentences, ideas? You're definitely not cargo cult-ing as you read this. The whole basis of ML is that we don't actually know how to explain a bunch of stuff to the computer so we just find clever ways of throwing statistics at the problem.
I find the cargo cult assertion wrong. Even though I really don't like it when people are unable to adequately explain what they're doing and/or telling others to do.