> Axiom is a fancy word, but popping a few layers off the etymology stack we arrive neatly at the ancient Greek word ἀξίωμα, or "that which is thought fit or worthy".
Pet peeve time. It would be so great if people used the word "axiom" to mean exclusively "I need to blindly assume this is true, otherwise I do not see how I can make any progress". This article contains, I think, the author's advice.
Dictionaries (and people) usually conflate "a proposition which cannot actually be proved or disproved" with "a self-evident principle"/"an established principle", which looks like a serious confusion to me. It's like mistaking math for physics. It's like claiming Euclid's fifth axiom is self-evident, when others have already shown it was just an implicitly made decision (hence non-Euclidean geometries), one which cut off some very interesting, but difficult, mental models. (Quotes are from https://en.wiktionary.org/wiki/axiom)
I wouldn't protest, but I see educated people repeating this confusion over and over like a broken record.
Or is there a more precise term in English for those things?
The problem is that language is not precise, pretty much by evolved design. I totally get the desire for a language with maximum precision and lack of ambiguity, but no one has ever been able to do that successfully.
I think it's reasonable to wonder if it's even possible.
One thing that feeds into it, and is due to the poor way we teach language, is that people think that dictionaries prescribe how we should use language. But that's not what dictionaries are.
Dictionaries are descriptive, recording how people are using language.
This is a very important distinction.
Instead of trying to remove the possibility of ambiguity from language, we should instead focus on gaining the skills to reduce ambiguity through discourse, which we are extremely bad at.
As programmers, I think we are especially inclined to attempt to formalize natural language into something precise and reduce it to syntax and semantics.
Yet like you say, natural language is imprecise. And the more I think about how much of a bodge the way we communicate is, the more impressed I am we can have conversations at all. All those layers of indirection, wtf?
But of course, for almost every layer of indirection there is a safety net that ensures the meaning of what is said is passed on: if I mumble, my body language will help you figure out what I'm trying to say. If I write a gbbrsh sntce, context makes you understand. And if you can't hear me, you can probably make out what I'm saying by lip reading.
English is rife with metaphorical or imprecise uses of words with precise meanings, so the rigorous definition of 'axiom' doesn't become relevant in a (non-math-focused) essay very often.
But I agree that this list of opinions stretches the term enough that it's a distraction - these are just well-framed observations about the nature of engineering.
In my head there is a clear distinction between the axiomatic mode of thinking and the non-axiomatic one. It's fundamental to any mental activity. I cringe anytime I hear "but how can you even prove <non-axiomatic theory>?" or "but why is <an actual axiom> true?".
It would be a shame not to have a very precise term for that, so I stick to "axiom" anyway.
IMHO, given the author labelled them "my ... axioms", it's natural to infer that they are principles which the author personally considers foundational. It's a context that's orthogonal to scientific terminology (as in Euclid's 5th). And in natural language, context is everything.
1. The 80-20 rule - remember that meeting 80% of the spec takes just 20% of the time and plan accordingly.
This cuts both ways. In most cases, it's not too much work to build a prototype and ascertain whether it's worthwhile to pursue further, but attaining feature completeness, even asymptotically, requires a lot more effort.
2. Be mindful of the sunk cost fallacy - it is important to identify the point of diminishing/no return and turn back immediately.
Corollary - if you think you are most likely wrong, admit to this openly, to yourself and your peers.
3. Talk the walk - If you are convinced that you have a good product/feature, don’t hesitate to push for its adoption, even in the face of inertia/resistance.
4. Remember that integrity is doing the right thing even when this may be disadvantageous to you in the short term.
Truly beautiful list. With a few twists of the knob, makes excellent axioms for living, not just engineering. “Change is constant” and “Don't pick a solution until you've thought of at least one more.” are far more important than most business school learnings.
I disagree with #25. It seems to me that change is subject to a kind of no-free-lunch principle: if everything in your system is easy to change, then the system doesn't do anything useful. IMHO you always have to be opinionated, predict which changes are more likely, and adapt to those.
Speaking of change, regarding #1. I am not clear how to reconcile (Agile's?) "embrace change" with the well-known observation that fixing a problem in a SW system seems to be more difficult later in the design-implement-test cycle. Commonly, this is addressed by making that cycle smaller, but it seems to only constrain the design space (large and consistent designs are ruled out). (There is probably a good reason why we don't build buildings using Agile, but we gather requirements first and then build it.)
Finally, I would add a rule - Don't change what works.
The distinction between "easy to delete" and "easy to replace" is subtle but important. The latter can lead you to write interfaces, proxies and factories. The former, to stateless, pure functions.
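To make the contrast concrete, here is a minimal sketch of the two styles. All names (`TaxCalculator`, `FlatTax`, `flat_tax`) are hypothetical, invented for illustration, not taken from the article:

```python
from typing import Protocol

# "Easy to replace": an interface plus an implementation. You can swap
# FlatTax for another TaxCalculator later, but the indirection itself
# becomes permanent plumbing in the codebase.
class TaxCalculator(Protocol):
    def tax(self, amount_cents: int) -> int: ...

class FlatTax:
    def __init__(self, rate: float) -> None:
        self.rate = rate

    def tax(self, amount_cents: int) -> int:
        return round(amount_cents * self.rate)

# "Easy to delete": a stateless, pure function. If the requirement
# disappears, you delete it and its call sites; nothing else is entangled.
def flat_tax(amount_cents: int, rate: float) -> int:
    return round(amount_cents * rate)
```

Both compute the same thing; the difference is what happens when requirements change versus when they vanish.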
Also:
> 25. A good design is one in which you can change your mind without changing too much code.
In my own experience, "14. A good type system is worth its weight plus some." really depends.
If you are trying to make a mobile app, I agree. If you are trying to make a deep learning framework for researchers, I think the evidence is pretty clear at this point that type systems have more cost than benefit. Pytorch, Tensorflow, Jax, Keras, etc. are all far more popular than their strongly typed alternatives and I would be willing to bet this remains the case for many many years.
Very good list. Especially love 8, 9 and 10! I think these are practical points that one can actually try to follow. As someone mentioned in the comment section, it would be nice if we could test these in interviews.
I love this list. In fact, I think my list has most of these, perhaps worded differently. I would very humbly add one of mine: "Statistics beat opinions". I suppose this is a condensation/generalization of "if you can't measure it, it didn't happen". Many times devs will "feel" the performance or stability is "just fine"... but having a few weeks of (relevant) perf stats is a lot more damning. Or confirming.
It's either a tool or a weapon depending on who wields it. I recall Nokia folks pooh-poohing the iPhone because in the metrics they valued, their Symbian phones were beating it.
Completely agree. I try not to let people cherry-pick metrics though. Things always change, and it is a good thing to question everything to test for relevancy...at least as far as a "smell test".
Abstractions are often formulated even when there is only one call site that currently demands it in anticipation of a future need- don't. Just write the whole thing in a straightforward way and observe which parts chafe whenever you need to make a code change. Then refactor those parts out. Don't be prematurely abstracting into higher concepts before you're sure those are the right concepts to abstract.
Right, so I'll just go ahead and start setting MTU packet sizes and opening sockets when trying to calculate and send an invoice.
Instead write code operating at the level that best expresses the problem you're trying to solve. As if you are writing a business document to explain it. Fill in the concrete details afterwards.
If you're doing invoices, solve the problem at the level of invoices, transactions, percentages, addresses, tax and emails.
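A hypothetical sketch of what "operating at the level of invoices" could look like; every name here is made up for illustration, and the transport details (sockets, MTU sizes) are deliberately pushed behind a helper to be filled in afterwards:

```python
from dataclasses import dataclass, field

@dataclass
class LineItem:
    description: str
    amount_cents: int

@dataclass
class Invoice:
    customer_email: str
    tax_rate: float
    items: list = field(default_factory=list)

    def subtotal_cents(self) -> int:
        return sum(item.amount_cents for item in self.items)

    def total_cents(self) -> int:
        # Business-level arithmetic: subtotal plus tax as a percentage.
        return round(self.subtotal_cents() * (1 + self.tax_rate))

def send_invoice(invoice: Invoice) -> str:
    # Deliberately abstract: how the email actually travels (sockets,
    # packet sizes) is a concrete detail filled in later, elsewhere.
    return f"Invoice for {invoice.total_cents()} cents to {invoice.customer_email}"
```

The code reads like the business document it implements: items, totals, tax, a recipient.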
I have definitely been on the code review for abstractions which did nothing other than remove some duplicate lines of code. The author admitted as much and was certain it was still a good idea.
This ties into gold plating. Sometimes a little foresight makes the road a lot smoother, when you predicted it right. It only turns out to be premature when you chose the wrong path.
Gold plating is insurance. It can be insurance against failures or insurance against requirements changes, but either way, you don’t judge its value by whether you use it but by whether it mitigates a risk enough to improve your expected outcome.
Elf rules:
#1 Do not make "blind assumptions".
#2 To avoid "blind assumptions", do your research and ask good questions.
#3 Document and communicate the answers you find.
I think we should be careful when using the engineering term.
The author is writing specifically about software engineering, but by dropping the word 'software' from the title he is doing a disservice to real engineers (those professionals who have to go through a rather demanding set of requirements to be considered as such, e.g. mechanical, structural, chemical, etc.).
We software developers love to be called engineers because it gives us gravitas and status. But we fool ourselves and, most importantly, the public, as our field is not regulated and it has no clear mandate or ethical rules like those of registered engineers.
I completely disagree, and I have a degree in electronics engineering.
Engineering is simply using knowledge & resources to develop systems or processes that solve human problems. When you're developing software to solve human problems, that is absolutely engineering.
What scientific knowledge exactly goes into writing a list like this? Or into determining our "best" practices? Last I checked, none. So no, "software engineering" is not engineering.
EDIT: I think this is OK in such a young field, but we should not deceive ourselves into believing that what we have today is anything like more mature engineering fields. We should also strive to attain their level by investing more in high-quality studies and less in "axioms".
Engineering uses knowledge, only some of which is scientific. A good engineer will use the best data where available, and where pragmatic will try to gather more (e.g., by making small prototypes to test a concept). But even today, a lot of engineering is based on "rules of thumb" and past experiences that are not the result of rigorous scientific study.
I agree that software development (engineering) is a young field, and thus there's even MORE we don't know compared to many other more mature fields.
We should certainly strive to increase knowledge via high quality studies. We currently have a serious problem: There is little incentive to do useful high-quality studies. Many people in CS are in a "publish or perish" mode. Publishing ideas is what's important, while scientifically showing whether or not something is true is not required (and takes much longer than posting an untested idea). Just look at paper after paper, and then ask how many show the results of scientifically rigorous experiments. There's remarkably little about the control vs. the experimental group (as an example). There is work that does real science, and I highly praise it - we just need more of it.
Designing, developing and maintaining software systems is most definitely a form of ‘real’ engineering, and should be referred to as such. Qualified software engineers have gained their qualifications in the field through various pathways including tertiary study and professional accreditation, which are rigorous. Not all forms of engineering are regulated, because they don’t need to be.
What many neglect as well is that in many cases, software is regulated. I’ve worked on software where there were real laws about how and what I could produce, how the data must be managed and where, and what degree of quality and testing must be maintained.
I had to write an oath to my province to execute on that faithfully. I was working with private health records. I know lives weren’t at risk, but it took a lot of knowledge and resources to execute on that properly, and maintaining quality and privacy around those systems is genuinely very important.
I don’t call myself an engineer, but I don’t think my job is a joke either.
Not to mention there is plenty of software which does put lives at risk. Aviation, medical, financial etc... And if it's not lives, it's huge amounts of capital.
Words change, and there is no such thing as a “true” or “phony” engineer. The original engineers built siege engines, and were weaponsmongers:
Middle English (denoting a designer and constructor of fortifications and weapons; formerly also as ingineer ): in early use from Old French engigneor, from medieval Latin ingeniator, from ingeniare ‘contrive, devise’, from Latin ingenium (see engine); in later use from French ingénieur or Italian ingegnere, also based on Latin ingenium, with the ending influenced by -eer.
So the extension of the word "engineer" to encompass software engineering is its emergent, and I'd imagine by the middle of this century its dominant, form.
As a person with an engineering degree from a highly accredited engineering university, working as a software engineer: I disagree. Engineering is using formal methods to bring about a solution, and simply writing code is doing that. No certification is going to change that act. And if you really want to say someone isn't an engineer for the four years they were engineering before their P.E., I think that's offensive to junior engineers.
In Canada, "engineer" is a protected term: you legally can't use it unless you are a P.Eng. The idea is that engineers are bound by a set of standards and assume responsibility for their work, especially around designs where human health and safety are concerned. And I get that. But to say that engineering, as a rigorous discipline, can only be practiced by engineers seems wrong to me. Good practices are (or should be) universal and executable by anyone, not just the blessed few.
I had my boiler replaced by a gas engineer a few years ago. I don't think I'd call the work they did engineering; they were plumbers and "handymen". There was nothing in the realm of formal methods or design. It was a fairly rote job for them (which they did well), but they're still formally called engineers.
Funny, as most programming is plumbing too (e.g. how do I get this value out of the database and into this UI widget; or how can I connect this library to that program).
Some of us make the systems that power spaceships and satellites. Some make critical infrastructure like email, messaging, video chat, payments, and so on. Just because there is no official body handing out titles doesn't make the engineering any less "real".
It is debatable if software engineers are "real" engineers or not, but in any case, the article is titled incorrectly as the axioms listed do not apply to any engineering discipline.
We could call ourselves software "engineers" if we used the correct practices. I believe everyone should watch the following video.
https://www.youtube.com/watch?v=NNGaXxMueiw