Code is run more than read (olano.dev)
811 points by signa11 on Dec 1, 2023 | 310 comments



Some users are not using a system because they like it but because their company bought it.

In those situations biz > user by definition, and the developers end up having to cater to the needs of the middle management of their customers rather than the needs of the actual users. The price of not doing this is failing to win the contract. Users then get locked into whatever crap you have time to provide for them while you're really busy implementing new features that middle management likes.

You essentially only need a nice looking login screen and some sort of reporting and the rest.....doesn't matter much.

I am being a bit cynical but it does pay as an engineer to know if that's fundamentally the kind of company you're in.

An online retailer, for example, is hypersensitive to its users and I know of at least one that has different versions of its website for different countries because they know that Germans like X and Americans like Y - small changes make a huge difference to sales.

Other companies have no sensitivity to the usability of their products because the people who buy their products are never the users.


I used to work at a company that sold SaaS to big corporations.

We needed to win contracts, so we needed to tick their checkboxes, but we also cared for the user experience (good UX was almost never a strict requirement from our customer).

Our competitors' software was very painful to use, so we wanted to differentiate in this regard.

This made our own lives easier as the training was easier; the users we interacted with (who usually had no say in whether our solution or our competitors' was bought for them) were happier and, where they could, recommended to their managers buying more stuff from us.

In the end this was 80% driven by pride (our software doesn't suck) and empathy (I couldn't stand using the software if it was as bad as our competitors') but to some extent this was also in our interest (especially in the long term, where your brand is built).


> the users we interacted with (who usually had no say in whether our solution or our competitors' was bought for them) were happier and, where they could, recommended to their managers buying more stuff from us

And then the happy users switch to another company and start recommending your stuff to their new managers. It's an extra source of leads and sales :-)


> This made our own lives easier as the training was easier

Unfortunately that works until some brilliant MBA shows up with a plan to sell 100x more consulting and more training by making the product intentionally much more difficult to use.


I wish more companies realized your last point.


I tried to be that guy who recommends better software whenever I changed companies. Unfortunately, being new also means you're not always taken as seriously, and you get stuck with whatever crap is already in use instead of the better experience you used to have.


I worked at a company in a market with similar purchasing dynamics, but we focused exclusively on the users. We committed to a "product-led growth strategy," which meant no salespeople. Product focused entirely on the user experience. The problem was, we weren't selling to users. We were selling to people in the users' organizations who purchased software. These people did not have the same job as the users, and they had no first-hand experience with the product.

It was a doomed approach. We needed salespeople to get inside the heads of the purchasers, learn how to explain the benefit to them, and coach the users on explaining the benefits to other people in their organization. We needed salespeople to bridge the chasm between the users and the purchasers.


Those who write the checks are also users, just with different expectations than the daily users. Management is yet another kind of user, distinct from both the check writers and the actual end users.


And this is why we need to get rid of middle management. Purchasing software for others to use reeks of inefficiency


Typically middle management are users as well, but they are a minority of the user base and use a different set of features (like reporting). So now this becomes a question of which users are prioritized and finding a balance between prioritizing the experience of the small number of users who hold power over the rest, and keeping the product usable enough for the rest of the users to provide some value in data to management.


Or as I've told the devs in the place I work "Don't fuck up the data path to the reports, it is a cardinal sin".


I've experienced this. I worked for a company that sold software to municipal governments. All that mattered was the Mayor's/Town Manager's/City Council's opinion. If the reports looked good and the price was right, they were going to renew.

I remember being in site meetings where people who used it every day would tell us to our face how terrible it was. Without fail, that site renewed with some promises to fix a couple specific bugs and a minimal price increase.


> the developers end up having to cater to the needs of the middle management of their customers rather than the needs of the actual users

this is why all enterprise software sucks


> Some users are not using a system because they like it but because their company bought it.

This is why I use Windows and Visual Studio at work. I don't like either of them, but it's not my call.


This is short-term thinking - users who hate software can voice enough complaints to get things changed in at least some situations (not all: many SAP programs with garbage UIs exist).


The user of SAP you describe is not the 'user' in the sense of the article. The user is the one who pays, i.e. some other business.

SAP absolutely delights its users to the tune of a $200 billion market cap.


The article is missing that part - the two are sometimes different.


I wouldn’t say you have to cater to middle management instead of the end user. You just can if you want to. Of course you need to consider what middle management needs, since they’re paying you, but there is usually room for craftsmanship to bring a truly great UX to the end user. Most software engineers are lazy and lack a true sense of craft, so they usually skip building a great UX when it’s not a requirement.


In enterprise software, you cater exclusively to management, they will turn a blind eye to 99% of issues as long as the team is able to get the work done.

Take one look at EMRs and realize that they're sold and marketed to 0.01% of the hospital, despite 80% of the health system using them.


> In enterprise software, you cater exclusively to management, they will turn a blind eye to 99% of issues as long as the team is able to get the work done.

Again, you can exclusively cater to management, but you don’t have to. Look at Datadog. It’s a great product, but still ultimately purchased by management.


Also, look at their competition. New Relic's product is so painful and broken that despite successfully winning deals with many large companies, they still manage to lose business to Datadog as their end users work extremely hard to convince management to not renew, and switch to Datadog, for the good of the company. Ask any insider.


Some organizations listen to their engineers/users too in the buy decision. Effective products do both - cater to management checklists, and cater to end user experiences.


Catering to a mixture of management and users helps sales. Having a product that users (reports) love will give your product internal advocates. Some managers are self-centered jerks, but most will at least consider their reports' advocacy if the product also has the features they need.


> Most software engineers are lazy and lack a true sense of craft, so they usually skip building a great UX when it’s not a requirement.

From my observations, it's usually not that devs are lazy or lack a sense of craft, it's that their employers are not willing to spend money building something that isn't actually a requirement.


That doesn’t explain why the functionality that is there is usually buggy and lackluster.


Because being bug free and lustrous isn't something that people are willing to pay for.


the explanation is simple - the immutable law of resource allocation:

"good, fast, cheap: pick two"


> so they usually skip building a great UX when it’s not a requirement

or worse, think they are good at it, and then guard that GUI like it is their firstborn.


> I know of at least one that has different versions of its website for different countries because they know that Germans like X and Americans like Y - small changes make a huge difference to sales

Can you speak any more to this? Do you or anyone else have any examples? I would be very interested to see.


That's one interesting explanation of why EDA software like Cadence Virtuoso and ICCAP have had crappy UIs for decades.


I guess "online retailer" example could be extended to many of the companies that are creating consumer products.


Seems that has always been the case. A good example would be SAP.


I've never felt more seen as a user of AEC design software.


This is a very narrow-minded take. Every big software success story - Gmail, Slack, Dropbox, Zoom… - was business-to-consumer, explicitly or in disguise.

Then again, I’m not saying much that vendors screw up pricing when they choose a price other than “$0,” which is an essential part of B2B-disguised-as-B2C software. Easy for me to say.

Anyway, the boomers you are talking about are retiring out of the workforce, and it will be more likely than ever that the audience will be extremely sensitive to UX, desiring something more like TikTok and Instagram than ever before.


Slack is a weird choice of example, and so is Zoom. They're both currently getting their lunches eaten by MS Teams for specifically the reasons laid out in the grandparent.

Slack in particular had to take a just-OK buyout from Salesforce and the product has seriously stagnated.


I'm curious how you'd define "success story". From my POV, it seems like there are definitely some B2B successes, such as Oracle or Salesforce.


Most software is not written for such large audiences. Typical enterprise software is used by tens to hundreds of people. There is simply no budget to create user interfaces at TikTok standards.


> Every big software success story - Gmail, Slack, Dropbox, Zoom…

Your uses of "every" and "big" here seem idiosyncratic.


TIL of ≹, which "articulates a relationship where neither of the two compared entities is greater or lesser than the other, yet they aren't necessarily equal either. This nuanced distinction is essential in areas where there are different ways to compare entities that aren't strictly numerical." (https://www.mathematics-monster.com/symbols/Neither-Greater-...)


The example z_1 ≹ z_2 for complex numbers z_1, z_2 is weird. IMO it would be clearer to state |z_1| = |z_2|, that is, both complex numbers have the same absolute value.

> To conclude, the ≹ symbol plays a crucial role in providing a middle ground between the traditional relational operators.

As a PhD student in math, I have never seen it before. I do not believe that it plays any crucial role.


It sounds like the symbol "≹" just means "incomparable", which is a well-known concept in math. https://en.wikipedia.org/wiki/Comparability

This symbol for it may be useful, but it's the concept that matters.


Might play a crucial role in some arcane subfield of set theory or something of the sort


I suppose that this glyph should result from a combination of emojis for apples and oranges.


Reminds me of the concept of games in combinatorial game theory. They are a superset of the surreal numbers (which are themselves a superset of the real numbers), in which the definition of the surreal numbers is loosened in a way that loses the property of their being totally ordered. This creates games (read: weird numbers) which can be "confused with" or "fuzzy" with other numbers; the simplest example is * (star), which is confused with 0, i.e. not bigger or smaller than it, a fuzzy cloud around zero (notated 0║*). More complex games called switches can be confused with bigger intervals of numbers and are considered "hot". By creating numbers from switches you can create even more interesting hot games.


Here's a relevant video on the topic: https://www.youtube.com/watch?v=ZYj4NkeGPdM

I really love that video.


Exactly this video made me read more into this topic; I'm currently reading Winning Ways and Lessons in Play simultaneously. It's quite fun! I've just gotten started and am looking forward to what's left.


Thanks for sharing this. I just discovered it now and I love it too.

The space of possible abstractions for any given phenomenon is vast, yet we almost always just assume that real numbers will do the trick and then begrudgingly allow complex ones when that doesn't work. If we're not lucky we end up with the wrong tool for the job, and we haven't equipped people to continue the exploration. It's a bias with some pretty serious consequences (thanks... Newton?).

I don't think I've seen the inadequacy of number-systems-you've-heard-of demonstrated so clearly as it is done here.


> By creating numbers from switches you can create even more interesting hot games.

Well, don't leave us hanging! What are some of your favorite hot games on top of switches?


I'm just starting to learn about all this stuff, but IIRC the game of Go is famously "hot". Also, I will emphasize that when talking about "games", usually what is meant is a game position. What specific game you are playing isn't too important, as it can be shown that some positions in different games are equivalent.


I find this concept is important in understanding causal ordering for distributed systems, for example in the context of CRDTs. For events generated on a single device, you always have a complete ordering. But if you generate events on two separate devices while offline, you can't say one came before the other, and end up with a ≹ relationship between the two. Or put differently, the events are considered concurrent.

So you can end up with a sequence "d > b > a" and "d > c > a", but "c ≹ b".

Defining how tie-breaking for those cases is deterministically performed is a big part of the problem that CRDTs solve.
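
Here's a minimal sketch of that partial order using a toy vector clock; the dict-of-counters representation and the compare() helper are just illustrative, not any particular CRDT library's API:

    # Toy "vector clock": one counter per device (illustrative only).
    def compare(a, b):
        """Return '<', '>', '=', or 'concurrent' for two clocks given as dicts."""
        devices = set(a) | set(b)
        a_le_b = all(a.get(d, 0) <= b.get(d, 0) for d in devices)
        b_le_a = all(b.get(d, 0) <= a.get(d, 0) for d in devices)
        if a_le_b and b_le_a:
            return "="
        if a_le_b:
            return "<"
        if b_le_a:
            return ">"
        return "concurrent"  # neither <= the other: the ≹ case

    print(compare({"laptop": 2, "phone": 0}, {"laptop": 2, "phone": 1}))  # '<'
    print(compare({"laptop": 3, "phone": 0}, {"laptop": 2, "phone": 1}))  # 'concurrent'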


Likewise, TIL. In your link however it states

"Example 1: Numerical Context

Let's consider two real numbers, a and b. If a is neither greater than nor less than b, but they aren't explicitly equal, the relationship is ≹"

How can that be possible?


It is false - real numbers fulfill the trichotomy property, which is precisely the lack of such a relationship: any two real numbers are comparable, with one being less than, equal to, or greater than the other.

But the numerical context can still be correct: (edit: ~~imaginary~~) complex numbers for example don’t have such a property.


As I've written in another comment here, a great example of a number-y field which is not totally ordered is Games ⊂ Surreal Numbers ⊂ ℝ. There you have certain "numbers" which can be confused with (read: incomparable to) whole intervals of numbers. Games are really cool :)


You've got the subset relationships backwards: Reals are a subset of the field of Surreal Numbers which is a subset of the group of Games. (Probably better phrased as embeddings, rather than subsets, but the point remains...)

Note that Games do _not_ form a field: there is no general multiplication operation between arbitrary games.


i'm learning about this right here as I read, but do you mean complex numbers rather than imaginary?


Same question.

Or more generally, vectors. They don't have a total order, because if we define "less than"/"greater than" in terms of magnitude (length), then for any vector V (other than 0) there are infinitely many vectors that are not equal to V but whose length is equal to the length of V.

Is this what ≹ is talking about?


Yep, that’s what I meant, sorry!


This doesn't really pass the smell test for me either, but to play devil's advocate:

Imagine you have 2 irrational numbers, and for some a priori reason you know they cannot be equal. You write a computer program to calculate them to arbitrary precision, but no matter how many digits you generate they are identical to that approximation. You know that there must be some point at which they diverge, with one being larger than the other, but you cannot determine when or by how much.


Maybe you will find the proof that the infinite series 0.9999... exactly equals 1 interesting:

https://en.wikipedia.org/wiki/0.999...


Wow, can't believe I've never realised this. How counterintuitive.

The 1/3 * 3 argument, I found the most intuitive.


It's a flawed psychological argument though, because it hinges on accepting that 0.333...=1/3, for which the proof is the same as for 0.999...=1. People have less of a problem with 1/3 so they gloss over this - for some reason, nobody ever says "but there is always a ...3 missing to 1/3" or something.


The problem is that there are two different ways to write the same number in infinite decimal notation (0.999... and 1.000...).

That's what's counterintuitive to people; it's not an issue with 1/3. That has just one way to write it as a decimal: 0.333...


Another intuition:

All the decimals that recur are fractions with a denominator of 9.

E.g. 0.1111.... is 1/9

0.7777.... is 7/9

It therefore stands to reason that 0.99999.... is 9/9, which is 1


That's a good one! Might replace my current favorite which is:

Let x = 0.99...

Then 10*x = 9.99...

And if we subtract x from both sides, we get:

10x - x = 9.99... - x

And since we already defined x=0.99... when we subtract it from 9.99..., we get

9x = 9

So we can finally divide both sides by 9:

x = 1


I like the argument that observes "if you subtract 0.99(9) from 1, you get a number in which every decimal place is zero".

The geometric series proof is less fun but more straightforward.

As a fun side note, the geometric series proof will also tell you that the sum of every nonnegative power of 2 works out to -1, and this is in fact how we represent -1 in computers.
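
A quick, illustrative way to see the finite-width version of that last claim (Python, with 32 bits picked arbitrarily):

    # Sum the first 32 nonnegative powers of 2: 2^0 + 2^1 + ... + 2^31 = 2^32 - 1.
    total = sum(1 << i for i in range(32))
    assert total == 2**32 - 1          # all 32 bits set: 0xFFFFFFFF

    # Reinterpret that bit pattern as a signed 32-bit (two's complement) integer:
    signed = total - 2**32 if total >= 2**31 else total
    print(signed)                      # -1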


How can the sum of a bunch of positive powers of 2 be a negative number?

Isn't the sum of any infinite series of positive numbers infinity?


https://youtu.be/krtf-v19TJg?si=Tpa3EW88Z__wfOQy&t=75

You can 'represent' the process of summing an infinite number of positive powers of x as a formula. That formula corresponds 1:1 to the process only for -1 < x < 1. However, when you plug 2 into that formula you essentially jump past the discontinuity at x = 1 and land on a finite value of -1. This 'makes sense' and is useful in certain applications.
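
For reference, the formula being described is presumably the geometric series sum:

    \sum_{n=0}^{\infty} x^n = \frac{1}{1 - x} \quad (|x| < 1), \qquad \text{formally at } x = 2:\ \frac{1}{1 - 2} = -1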


The infinite sum of powers of 2 indeed diverges in the real numbers. However, in the 2-adic numbers, it does actually equal -1.

https://en.wikipedia.org/wiki/P-adic_number


Eh, P-adic numbers basically write the digits backwards, so "-1" has very little relation to a normal -1.


Any ring automatically gains all integers as meaningful symbols because there is exactly one ring homomorphism from Z to the ring.


-1 means that when you add 1, you get 0. And the 2-adic number …11111 has this property.


\1 is a good question that deserves an answer.

\2 is "not always" ..

Consider SumOf 1 + 1/2 + 1/4 + 1/8 + 1/16 + 1/32 ...

an infinite sequence of continuously decreasing numbers, the more you add the smaller the quantity added becomes.

It appears to approach but never reach some finite limit.

Unless, of course, by "Number" you mean "whole integer" | counting number, etc.

It's important to nail down those definitions.


> \1 is a good question that deserves an answer.

The same argument I mentioned above, that subtracting 0.99999... from 1 will give you a number that is equal to zero, will also tell you that binary ...11111 or decimal ...999999 is equal to negative one. If you add one to the value, you will get a number that is equal to zero.

You might object that there is an infinite carry bit, but in that case you should also object that there is an infinitesimal residual when you subtract 0.9999... from 1.

It works for everything, not just -1. The infinite bit pattern ...(01)010101 is, according to the geometric series formula, equal to -1/3 [1 + 4 + 16 + 64 + ... = 1 / (1-4)]. What happens if you multiply it by 3?

        ...0101010101
      x            11
    -------------------
        ...0101010101
     + ...01010101010
    -------------------
       ...11111111111
You get -1.
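
You can check the same arithmetic at any finite width; for instance, the 32-bit truncation of ...010101 is 0x55555555 (a quick, illustrative Python check):

    # 0x55555555 is the 32-bit truncation of the pattern ...010101.
    pattern = 0x55555555
    product = (pattern * 3) % 2**32    # multiply by 3, keep 32 bits
    print(hex(product))                # 0xffffffff
    print(product - 2**32)             # -1 when read as 32-bit two's complement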


But if you look at limits you get "0" and "diverges".

And decimal "...999999" is an infinity, which should immediately set off red flags and tell you that you need to be extra careful when analyzing it.

In computers your series of 1s is not infinite, there's a modulus that steps in. And this analysis depends on the modulus being an exact power of the base. But you could make a system that's decimal but has a modulus of 999853, for example, and then "-1" would be 999852.


> In computers your series of 1s is not infinite, there's a modulus that steps in. And this analysis depends on the modulus being an exact power of the base.

That isn't quite correct. The series of 1s really is conceptually infinite. That's why we have sign extension. The analysis (of the sum of all natural powers of 2) will work for any modulus that is an integral power of 2, including a modulus where the integer to which 2 is raised is infinitely large. Such an infinite modulus will still be evenly divided by a finite power of 2 -- as well as by itself -- and so it will disappear whenever you're working in any finite modulus that is a power of 2 -- or when you are working modulo 2^ℕ. The modulus of 2^ℕ will prevent any distinct finite integers from falling into the same equivalence class.

This is what enables you to have an infinite series of leading 1s, or leading patterns, without problems.


In an introductory course to String Theory they tried to tell me that 1+2+3+4+... = -1/12.

There is some weird appeal to the Zeta function which implies this result and apparently even has some use in String Theory, but I cannot say I was ever convinced. I then dropped the class. (Not the only thing that I couldn't wrap my head around, though.)


The result isn't owed to the zeta function. For example, Ramanujan derived it by relating the series to the product of two infinite polynomials, (1 - x + x² - x³ + ...) × (1 - x + x² - x³ + ...). (Ok, it's the square of one infinite polynomial.)

Do that multiplication and you'll find the result is (1 - 2x + 3x² - 4x³ + ...). So the sum of the sequence of coefficients {1, -2, 3, -4, ...} is taken to be the square of the sum of the sequence {1, -1, 1, -1, ...} (because the polynomial associated with the first sequence is the square of the polynomial associated with the second sequence), and the sum of the all-positive sequence {1, 2, 3, 4, ...} is calculated by a simpler algebraic relationship to the half-negative sequence {1, -2, 3, -4, ...}.

The zeta function is just a piece of evidence that the derivation of the value is correct in a sense - at the point where the zeta function would be defined by the infinite sum 1 + 2 + 3 + ..., to the extent that it is possible to assign a value to the zeta function at that point, the value must be -1/12.

https://www.youtube.com/watch?v=jcKRGpMiVTw is a youtube video (Mathologer) which goes over this material fairly carefully.


That doesn't seem possible with the reals. An example from programming that comes to mind is NaN ≹ NaN (in some languages).


The floating-point specification mandates that NaN does not compare to NaN in any way, so it should be all languages. If you want to test for NaN, use isnan().
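
For example, in Python (the behaviour should be the same in any IEEE-754-compliant language):

    import math

    nan = float("nan")
    print(nan < nan, nan > nan, nan == nan)  # False False False
    print(nan != nan)                        # True (the one comparison that succeeds)
    print(math.isnan(nan))                   # True: the reliable way to test for NaN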


Isn't this what happens with infs (in maths and in many programming languages)?

Edit: Not in many programming languages. In IEEE-754 inf == inf. In SymPy too oo == oo, although it's a bit controversial. Feels sketchy.


It isn't. The real numbers are a totally ordered field. Any two real numbers are comparable to each other.


Think how any number on the z axis of the complex plane isn't equal to a number of the same magnitude on the x and y axes.

Now if you really think about it, a number of a given magnitude on the x axis also isn't exactly "equal" to a number of the same magnitude on the y axis, or vice versa. Otherwise, -5 and 5 should be equal, because they're the same magnitude from 0.


But |5|=|-5| so I don't exactly see your point.

Edit: oh, I see what you mean. 1 is not larger or smaller than i, but it also doesn't equal i.


This is just imposing a chosen (and in this case, pretty natural) order on a set without a total order. But as you note, you have to give something up. In this case the order identifies each number on a circle in the complex plane with a single value, their magnitude.

I've never really seen this notation used, but it could have some use in partially-ordered sets.


It is formalisms all the way down.


Probably does not apply for real numbers, but could totally apply to, e.g., fuzzy numbers, whose 'membership function' bleeds beyond the 'crisp' number into nearby numbers.

You could imagine two fuzzy numbers with the same 'crisp' number having different membership profiles, and thus not being "equal", while at the same time being definitely not less and not greater at the same time.

Having said that, this all depends on appropriate definitions for all those concepts. You could argue that having the same 'crisp' representation would make them 'equal' but not 'equivalent', if that was the definition you chose. So a lot of this comes down to how you define equality / comparisons in whichever domain you're dealing with.


Perhaps this: if they represent angles, then 1 and 361 represent the same absolute orientation, but they're not the same, as 361 indicates you went one full revolution to get there.

Contrived, but only thing I could think of.


I think they mean the case where a and b are variables for which you don't know the values.


Yeah, that's how I understood it. E.g one might write

  (a+b)^2 != a^2 + b^2
To mean that in general the equality doesn't hold. Despite exceptions like a=b=0

Strictly you should write something like

  ¬[∀ a,b  (a+b)^2 != a^2 + b^2]
But shorthand and abuse of notation are hardly rare


Noticed a copy and paste error too late - the != in the second expression should of course be =


No, that can be the case also for mathematical entities for which you can know the values, not just for "unknown variables".


I would imagine trying to compare a purely imaginary number (3i) to a real number (3) would suffice.


An imaginary number wouldn't obey the stated constraint of being real.


No, but if the parent's question goes beyond "how can this happen with reals" to "how can this happen with numbers in general", this answers his question.


The very next example on the page is "imagine two complex numbers with the same magnitude and different angles". For that to answer the parent's question, you'd have to assume he stopped reading immediately after seeing the part he quoted.

The question is why the page says "imagine two real numbers that aren't comparable".


for the Reals, it is only hypothetical, the domain has a total order.

\inf and $\inf + 1$ comes to mind but I don't think it really counts


> \inf and $\inf + 1$ comes to mind but I don't think it really counts

That just depends on the numeric structure you're working with. In the extended reals, +inf is equal to +inf + 1.

In a structure with more infinite values than that, it would generally be less. But they wouldn't be incomparable; nothing says "comparable values" quite like the pair "x" and "x + 1".


I guess it depends on the exact definitions, but the reals usually don't include the infinities. At my uni we introduced infinities precisely as an extension of the reals with two values defined by `lim`.


Hmm, what about +0 and -0?


+0 = -0


The jargon from category theory for this phenomenon is - partial ordering.

It really is an interesting thing. In fact, as human beings who by nature think in terms of abstract, non-concrete units (as opposed to mathematically precise units like a computer program), we tend to compare two related things. They might belong to the same category of things, but they might not be eligible for direct comparison at all.

Once you internalize partial ordering, your brain gets a little more comfortable handling similar, yet incomparable, analogies.


One example would be if you define one set A to be "less than" another B if A is a subset of B. Then ∅ < {0} and {0} < {0, 1} but {0} ≹ {1}.

Such a thing is called a partial ordering and a set of values with a partial ordering is called a partially ordered set or poset (pronounced Poe-set) for short.

https://en.wikipedia.org/wiki/Partially_ordered_set
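
Incidentally, Python's set comparison operators implement exactly this subset partial order, which makes for a quick illustration:

    a, b, c = set(), {0}, {0, 1}

    print(a < b, b < c)                      # True True   (proper subset)
    print({0} < {1}, {0} > {1}, {0} == {1})  # False False False -> {0} ≹ {1}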


I was so hoping I could win the day with 0 ≹ -0. But alas, 0 == -0 and 0 === -0.


NaN ≹ NaN


That article is misinterpreting the meaning of the symbol. It isn't useful in mathematics because it is a contradiction in terms: if "neither of the two compared entities is greater or lesser than the other" then they are equal.

The author of the original article uses it correctly - think about it more in regards to importance for their example.

The business is no more or less important than the developer, but they are NOT equal.

It doesn't have to mean importance though, just the method by which you are comparing things.

Monday ≹ Wednesday

Come to think of it, it should be called the 'No better than' operator.


> if "neither of the two compared entities is greater or lesser than the other" then they are equal.

Not in a partial order.

For example in this simple lattice structure, where lines mark that their top end in greater than their bottom end:

      11
     /  \
    01  10
     \  /
      00
11 is > all the others (by transitivity for 00), 00 is < all the others (by transitivity for 11), but 01 is not comparable with 10: it is neither lesser nor greater given the described partial order.

You can actually see this kind of structure every day: Unix file permissions, for example. Given a user and a file, the permissions of the user are an element of a lattice where the top element is rwx (or 111 in binary, or 7 in decimal, which means the user has all three permissions to read, write, and execute) and the bottom element is --- (or 000 in binary, or 0 in decimal, which means the user has no permissions). All other combinations of r, w, and x are possible, but not always comparable: r-x is neither greater nor lesser than rw- in the permissions lattice, it's just different.
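
A tiny sketch of that order on permission bitmasks (treating "a ≤ b" as "every permission in a is also in b"; the leq() helper is just for illustration):

    R, W, X = 0b100, 0b010, 0b001

    def leq(a, b):
        """a <= b in the permissions lattice: every bit set in a is also set in b."""
        return a & b == a

    rwx, rx, rw, none = R | W | X, R | X, R | W, 0b000
    print(leq(rx, rwx), leq(none, rx))   # True True
    print(leq(rx, rw), leq(rw, rx))      # False False -> r-x ≹ rw-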


Yes, or for more familiar examples: coordinates and complex numbers. The "default" less-than and greater-than don't have any meaning for them; you have to define one, which may be "imperfect" (because one can't do better), hence the concept of partial order.


> It isn't useful in mathematics because it is a contradiction in terms: if "neither of the two compared entities is greater or lesser than the other" then they are equal.

That’s only true for a total order; there are many interesting orders that do not have this property.

It holds for the usual ordering on N, Z, Q and R, but it doesn’t hold for more general partially ordered sets.

In general one has to prove that an order is total, and this is frequently non-trivial: Cantor-Schröder-Bernstein can be seen as a proof that the cardinal numbers have a total order.


Example: alphabetic ordering in most languages with diacritics. For example, "ea" < "éz", but also "éa" < "ez". That's because e and é are treated the same as far as the ordering function is concerned, but they are obviously also NOT the same glyph.


That’s only true for linearly ordered structures, but isn’t true for partially ordered ones.

For example, set inclusion. Two different sets can be neither greater than nor smaller than each other. Sets ordered by inclusion form a partially ordered lattice.


Is that really a contradiction? What about complex numbers?


That suggests to me that you've got a multi-objective optimization problem with conflicting objectives and a Pareto-optimal solution that balances the tradeoffs between the two objectives. If you swing it too far one way you've got a '>' and need to swing it back the other way, but go too far the other way and you've got a '<'. And they're definitely not equal since they pull in different directions.



Apples ≹ pears.


For many of us, running our code 1 billion times will cost less than a few minutes of a developer's time.

Hell, I could spend $200 for a month of server time on AWS and run a lot of my (web API) code 100 billion times.

Optimizing for human readers is always better until you're working on something that proves itself to be too slow to be economical anymore.


It seems like the author agrees with you and picked a confusing title for the article. The article ends with a set of equations:

    user > ops > dev
    biz > ops > dev
    biz ≹ user
The conclusion seems to be that code exists in service to the end-user and the business. The last equation (≹) is a neat way of describing that both end-user and the business are equally important to the existence of the code, even though their needs aren’t the same.


> picked a confusing title for the article

Or they picked a title that would let you rapidly spot who didn't even skim the article.


Of all the functions a title could have, I feel this is not one of them. That would result in intentionally misleading titles.


Neat summary. I think many developers experience the degree to which biz and user diverges as a source of problems: the more divergence, the more problems.


The problem with "costs less than developer's time" math is that, usually, it's not you who's paying. Your users are, often in nonobvious ways, such as through higher electricity bills, reduced lifespan[0], lost opportunities, increased frustration, and more frequent hardware upgrades.

(And most of your users don't have developer's salary, or developer's quality of life, so it hurts them that many times more.)

--

[0] - Yes, wasting someone's time reduces QALY.


I've been calling the thinking of the parent "trickle-down devonomics": that by making the code better for the user, the benefits will trickle down to the users. Obviously, as the name suggests, that never happens. Devs get the quality of life and users end up paying for our disregard.


Loved the analogy. Also tracks with the fact that devs hold all the power in the user x dev relationship.

> that by making the code better for the user,

Did you mean the dev?


Users can "vote with their feet (or wallet)"... Sometimes.


Sorry, yes I did.


This implies that if a developer works half as long on something, all of that money that would be spent on them is spread out amongst their users. Which makes absolutely no sense.

Abstractions MIGHT make your code slower. But there's a reason we're not using assembly: Because the minor efficiency hit on the software doesn't match up with the bugs, the salary towards the experts, the compilation errors, etc.

A VM is a pretty good tradeoff for users, not just devs.


FYI: QALY = quality-adjusted life year[0]

[0] https://en.m.wikipedia.org/wiki/Quality-adjusted_life_year


From the article:

> When I say “run” I don’t just mean executing a program; I mean operating it in production, with all that it entails: deploying, upgrading, observing, auditing, monitoring, fixing, decommissioning, etc


So op use this point turn back to support the opinion that readability is more important.

the inference process in article is interesting, but title tempt us to debate about a less related topic.

thanks for your comments let me finish reading.


In my experience, you need to care about latency. That affects user experience. It's quite hard to pay for better latency.


That depends on the application and the use case, but good performance and good readability aren't mutually exclusive. Easy to read software might not always be the most performant, but it's far easier to improve performance in an easy to read codebase than it is to make a hard to read but performant codebase easier to read.


Yeah, that's right. I just feel that latency is sometimes missing from the discussions re. developer efficiency vs paying more for compute.


Yep. There's a huge experiential difference between something happening instantly and it happening in 200ms or so. Especially when typing or playing video games.


It's everywhere. Next day delivery vs in a week. Streaming vs DVD rental. CI jobs. Notification about relevant news for stock traders. Switching files in editor. Jump to definition in IDE. Typing and videogames you mention have a very tight latency budget.

If you don't fit in the budget for the specific task, your product features don't matter much.


I see far more examples of premature optimization (resulting in horrible code) than of performance problems.


You do that by paying for better developers ;)


I can often get better latency by throwing a few extra bucks at vertical scaling. Still cheaper than a few hours of dev time a month.

Like I said, it works until it doesn't, and then you do have to optimize for performance to some extent.


The universe puts a hard speed limit on latency, but will give you all the bandwidth you want. There’s something almost mystical about latency, we should be very prudent when spending it.


That's the response to the article I was expecting to read. This is a different article, though. Go ahead and read it, worth it!


Also, for my compiled code (in Go), the code that I write is not the code that the compiler generates. I can write simple code that's easy for {{me in 3 months}} to read and let the compiler do the fancy stuff to make it fast.


For what it’s worth, Go might not be the best example here as its compiler does very little (hence the fast compile times).

Some LLVM-based language would fit the bill better, like Rust, C (also true of the intel compiler and gcc), C++, etc.


1 million users waiting even 1 second longer is about a month of a single developer's time.

This sort of calculation you preach is inherently untrue, as it completely ignores that 1 second times a million. After all, nobody bothers to economically evaluate just a single second. But it adds up to a lot when multiplied by the number of users. And when we multiply again by the number of times a single user uses your software, and then again by the time a user spends in different software from other developers who also thought "it's only 1 second, nobody cares", we end up living in a world where software usability gets lower and lower despite hardware getting faster and faster.

We end up living in a world where literally weeks are wasted every day waiting for the slow Windows File Explorer. If you wanted to evaluate that honestly, you would probably come to the conclusion that Microsoft should have a dedicated team working for a decade on nothing but Explorer startup optimization, and it would still pay for itself.

But they don't. Because at the end of the day, this whole "let's evaluate the developer's time spent on a given improvement" is just a cope and a justification for us being lazy; it only pretends to be an objective argument so we can make ourselves feel better.
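
As a rough sanity check of the figure at the top of this comment (assuming something like 160 working hours in a developer-month):

    seconds = 1_000_000 * 1        # one extra second, once, for a million users
    hours = seconds / 3600         # ~278 hours of cumulative user time
    dev_months = hours / 160       # ~1.7 developer-months at 160 working hours/month
    print(round(hours), round(dev_months, 1))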


I think the headline is misleading in that regard. I think that having easy-to-run software is part of having a well-designed one. Just last week I heard about a software upgrade that would require a downtime of 5 days. The software has about 100 users and is mostly a flexible web application to collect basic information (nothing fancy). Imagine the cost this creates for the business compared to an upgrade that takes a few hours.

So running the software includes more than just the server costs.


It seems like a false trade-off in the first place. The point of writing readable, maintainable code is that your team will be able to edit it later. Including adding performance enhancements.

Another way of stating the relationship could be something like: You have fewer brain-cycles to apply to optimization than the combined sum of everyone who will ever read your code, if your code matters. But that is a mouthful and kind of negative.


This isn't what the article is about.


I think the corollary to the title (to turn it around on the author) is not 'Code is read more than written' but 'code that can't be read won't run for long'. Disclaimer: Experienced sysadmin trying to make a lateral move to development and as such a complete noob.


> code that can't be read won't run for long

There's plenty of ossified code people are scared to touch because they don't understand it, but stake their business on it :)


Or just code that works and nobody wants or needs to spend money on changing. I've written such code, very crappy stuff. And then coming back to the client many, many years later, finding it all just humming along, driving their business critical applications without it or the computer it runs on ever having been updated once in 5 years or so. I was so surprised, and a bit scared.

Sometimes when you don't change anything, it just keeps working.

So I guess that makes it a very boring:

   code that can't be read won't be changed and will not be relevant for long, but not always


One of my clients is convinced that a short hunk of JavaScript code that nobody can clearly describe the behavior of is their most valuable asset.

Non-coders are weird.


I (very briefly) worked in a startup whose business was search (on a certain domain) and they had no tests for their "Search.java" file (the real name was 300x longer, because java devs…).

I had found some operations to tweak the scoring, except that some were multiplications by one, so I removed them. But I got told to not touch them because they wouldn't know if I had broken it until some customer complained.

The CTO told me off for my completely irresponsible behaviour.


And even executables for which the source code has been lost but which are still important to business operations.


Proprietary software you don't have sources for (say e.g. a third-party library), or just about any black-box system, are counterexamples to your corollary.


Well sort of, I can't read the binaries but I expect the vendor to have source and maintain it properly right?


There's a lot of software that only exists in binary and whose vendors/maintainers are long gone, and that is used for decades because it just does its job.


You’re right, the sibling to the comment I responded to mentioned this as well and it gave me flashbacks to emulating an Amiga system to get a multi million dollar industrial system up and running.


I've had a coworker commit a binary on git and keep the source on his machine.

I'm sure it has happened more than once.


Yeah, some code runs for so long that the system's owners have issues with repairing/replacing the hardware it can only run on if it fails (and then sometimes resort to emulation of the old hardware to be able to keep running it).


The entire financial industry disagrees. Also, can I interest you in coming out of retirement to explain your COBOL code to other developers?


Or to help us understand that mission-critical MS Excel 5.0 spreadsheet? We need to retire that Windows 95 machine under John's desk.


I think it can be run for as long as you have the right infrastructure.

I'd say it's more 'code that can't be read won't be modifiable for long'.


Your point isn't a bad one, but it's really a separate topic. Assuming we aren't dealing with deliberate obfuscation, most code can be read by people who can be bothered to try, and there are always code formatters if necessary.


That would be nice if it were true :(


Welcome to this side of the shop, I hope we can all make you feel welcome. :)

Bad news: the already-too-few experienced ops people just became one fewer!


I have a corollary to this: there are a series of exponential increases in usage counts between each of:

    1. Language designers & standard lib developers.
    2. Shared module or library developers.
    3. Ordinary developers.
    4. End-users.
For many languages, the ratios are on the order of 1,000x at each stage, so for each 1x language designer there might be 1,000 people designing and publishing modules for it, a million developers, and a billion users. Obviously these numbers change dramatically depending on the specific circumstances, but the orders of magnitude are close enough for a qualitative discussion.

The point is that the tiniest bit of laziness at the first or second tiers has a dramatic multiplicative effect downstream. A dirty hack to save a minute made by someone "for their own convenience" at step #1 can waste literally millions of hours of other people's precious lives. Either because they're waiting for slow software, or frustrated by a crash, or waiting for a feature that took too long to develop at steps #2 or #3.

It takes an incredible level of self-discipline and personal ethics to maintain the required level of quality in the first two steps. Conversely, it deeply saddens me every time I hear someone defending an unjustifiable position to do with core language or standard library design.

"You just have to know the full history of why this thing has sharp edges, and then you'll be fine! Just be eternally vigilant, and then it's not a problem. As long as you don't misuse it, it's not unsafe/insecure/slow/problematic." is the type of thing I hear regularly when discussing something that I just know will be tripping up developers for decades, slowing down software for millions or billions.


So, the author hijacks a perfectly good rule of thumb to build their grand theory of everything. It is all nice, clean, and wise, but besides the tortured turn of phrase it is just a rechewing of popular truisms.


so:

  theory > /dev/null


Beautiful.


    blog < /dev/random


pragmatism > theory


> the tortured turn of phrase

Usual reminder that many people in our industry are not native speakers and don't live in an English speaking country, and yet they make the effort to write in English, which may explain the "tortured turn of phrase".

> just rechewing of popular truisms

And yet these "popular truisms" are particularly well put together in a coherent way, which makes this post a useful reference.


Honestly, I think there's value both in riffing off rules of thumb and using that riff to revisit and re-contextualise things we already think we know.

Everything is new to someone, and even if this was just confirming my own biases I found it an interesting take.


To say it more precisely, their grand theory of everything that can go wrong in software development. But interesting read nevertheless...


Did you read to the end? Everything else is backstory.


The author's framing can be misunderstood in so many ways that it is not a useful shorthand at all. There can be no absolute rank order of these tokens.

Firstly, in this framing, the "dev" is not one person but it is a collective for lots of people with varied expertise and seniority levels in different orgs – product, engineering and design orgs.

Then, "ops" is again not one thing and not just engineering ops. It could be bizops, customer support etc. too.

Then, "biz" isn't one thing either. There's branding/marketing/sales/legal etc. and execteam/board/regulators/lenders/investors etc.

All of these people affect what code is written and how it is written and how and when it is shipped to users. Everyone should be solving the same "problem".

A lot of the time, a lot of people within the org are just there to make sure that everyone understands/sees the same "problem" and is working towards the same goals.

But that understanding is continuously evolving. And there is lag in propagation of it throughout the org. And hence, there is lag in everyone working towards the same goal – while goal itself is being evolved.

Finally, "user" is not one thing either nor any one cohort of users are static. There are many different cohorts of users and these cohorts don't necessarily have long-term stable behaviors.

So, it helps to understand and acknowledge how all the variables are changing around you and make sense of the imperfect broken world around you with that context. Otherwise it is very easy to say everyone else sucks and everything is broken and you want to restart building everything from scratch and fall into that and other well-known pitfalls.


I'm glad to see something approaching ethics discussed:

> There’s a mismatch between what we thought doing a good job was and what a significant part of the industry considers profitable, and I think that explains the increasing discomfort of many software professionals.

"Discomfort" is quite the understatement. This leaves so much unsaid.

I will add some questions:

- What happens when your users are not your customers (the ones that pay)?

- Does your business have any ethical obligation to your users -- all of them -- even the ones that do not pay?

- What happens when your paying customers seek to use your business in ways that have negative downstream effects for your users?

For example, what if:

- Your platform makes fraud easier than the existing alternatives?

- Your platform makes it easier to misinform people in comparison to the alternatives?

- Your platform makes it easier to shape user opinions in ways that are attractive (habit-forming) but destructive in the long-term?

All of these have proven to be successful business models, over some time scales!

Given the reality of the dynamic, should a business pursue such an exploitative model? If so, can it do so more or less responsibly? Can a more ethical version of the business mitigate the worst tendencies of competitors? Or will it tend to become part of the problem?

A key take-away is clear: some classes of problems are bigger and more important than the business model. There are classes of problems that can be framed as: what are the norms and rules we need _such that_ businesses operate in some realm of sensibility?

Finally, I want to make this point crystal clear: a business inherently conveys a set of values: this is unavoidable. There is no escaping it. Even if a business merely takes the stance of 'popularity wins', that is in itself a choice that has deep implications on values. Political scientists and historians have known for years about the problems called 'tyranny of the majority'. Food for thought, no matter what your political philosophy.

I don't know 'the best' set of ethics, but I know that some are better than others. And I hope/expect that we'll continue to refine our ethics rather than leave them unexamined.

[Updates/edit complete as of 8:03 AM eastern time]


I believe this is a different problem than the essay is speaking about. You can choose what problems and domains fit your ethics. This is about how you build a system and how you prioritize the work.


> I believe this is a different problem than the essay is speaking about.

Hardly. I'll quote the last paragraph and the three inequalities:

> There’s a mismatch between what we thought doing a good job was and what a significant part of the industry considers profitable, and I think that explains the increasing discomfort of many software professionals. And while we can’t just go back to ignoring the economic realities of our discipline, perhaps we should take a stronger ethical stand not to harm users. Acknowledging that the user may not always come before the business, but that the business shouldn’t unconditionally come first, either:

    user > ops > dev
    biz > ops > dev
    biz ≹ user
First, I want to emphasize "perhaps we should take a stronger ethical stand not to harm users". The author did a nice job of "throwing it in our faces" but the underlying ethical currents are indeed there.

Second, "the user may not always come before the business, but that the business shouldn’t unconditionally come first". This is very much aligned with my question "what are the norms and rules we need _such that_ businesses operate in some realm of sensibility?"

...

Ok, putting aside debates around the author's intent or 'valid scope' of this discussion (which by the way, is a rather organic thing, computed lazily by the participants, rather than by fiat), I'd like to add some additional thoughts...

In much of the software world there is a mentality of "We'll figure out Problem X (such as a particular problem of scaling) if we get to that point." I'll make this claim: naively deferring any such problems that pertain to ethics are fraught. Of course there are practical considerations and people are not angels! For precisely these reasons, ethics must be something we study and put into practice before other constraints start to lock in a suboptimal path.

I often look at ethics from a consequentialist point of view that includes probabilities of system behavior. This could be thought of as computing an 'expected future value' for a particular present decision.

If one applies such an ethical model, I think the impacts of choices become clearer. And it becomes harder to use false-choice reasoning to demonize others and exonerate ourselves. For example, if a particular business model has a significant probability of harming people, one cannot claim ignorance much less complete innocence when those harms happen. They were no surprise, at least to people who pay attention and follow the probabilities.

Ascribing blame is quite difficult. I like to think of blame as being a question largely of statistical inference. [1] But even if we all agreed to a set of ethical standards, the statistical inference problem (multicollinearity for example) would remain. There is plenty of blame to go around, so to speak. But certain actions (very highly influenced by mental models and circumstances) contribute more than others. [2]

To what degree is ignorance an ethical defense? This is a tough one. Not all people nor entities have the same computational horsepower nor awareness. I don't have the answers, but I have not yet found a well-known ethicist in the public eye that speaks in these terms to a broad audience. To me, the lack of such a voice means I need to find more people like that and/or contribute my voice to the conversation. The current language around ethics feels incredibly naive to my ears.

[1] I agree that for most people, ethical rules of thumb get us 'most of the way there'. But these heuristics are imperfect.

[2] To be clear, I see blame as often overused. I care relatively less about blaming a person for mistakes. I care much more about what a person's character tells us about how they will behave in the future. That said, a corporate entity is not a person deserving of such generosity. Corporate entities are not first-class entities deserving human-rights level protection. A legal entity is a derivative entity; one created in the context of laws which should rightly function to better the society in which it is formed. A corporate entity can rightly be judged / evaluated in terms of its behaviors and internal structure and what this entails for its future likely behavior. We don't expect corporate entities to be charities, for sure, but we also didn't consciously design the legal environment so that corporate entities can actively undermine the conditions for a thriving society with impunity.


I meant to say the author did a nice job of _not_ "throwing it in our faces".


Businesses don’t really exist, they are an imaginary construct that we’ve come up with to help organize resources, ultimately in the interests of working together.

Business isn’t more important than anything. There are multiple users, sometimes with competing interests; you can’t be everywhere and everything, so you have to prioritize. Going after more profitable users or users that align with some long-term strategy could be seen as “good for the business,” but really the goal is to serve the users (it might just take a couple extra steps).

When the internal politics get confused to the point that people are making decisions just to further the interests of the business without figuring out how it leads to user happiness, the organization has become poisonous. It shouldn’t exist anymore. It might lurch on in a zombie state for quite some time. But it is on the decline, and all the good people will leave.


Businesses don’t really exist? That's like saying feelings don't exist: they're just constructs that we come up with to explain our reactions to situations. I suppose that's true to a degree, but something doesn't need to be made of atoms to be "real".

Businesses exist in as much as they are the primary determinant of most people's lives. They shape our cities, our media, laws, politics, foreign policy and have a huge impact in just about everything else that matters. Real or not, they have _real_ impacts all around us.

Outside of OSS, I think it's pretty clear that the entity that's footing the bill gets to make the call on how things get made. Even if that's: bad for said entity, bad for the user, bad for the public generally, the environment, etc. (of course, there's industry and governmental regulations, but by and large the company calls the shots).


> Businesses exist in as much as they are the primary determinant of most people's lives. They shape our cities, our media, laws, politics, foreign policy and have a huge impact in just about everything else that matters.

By this standard, ancient religious figures (now mostly regarded as mythological) exist. Thor, or Zeus (if anyone is a pagan here, I don’t mean any disrespect to your beliefs, but let’s think about whichever one you don’t believe in).

> Real or not, it has _real_ impacts all around us.

Sure. But people were much more devoted to these figures than any business! Religious wars were fought, people lived and died for these gods. But some of these figures still were imaginary. Being imaginary doesn’t mean they aren’t important. But it means they don’t have interests. In reality, these organizations (religions, businesses) have members, users, and leaders/owners, and the business is an abstract representation of those people.

That’s why it doesn’t make sense to ask whether “the business,” which exists entirely as an abstraction for some of their interests, ought to be prioritized above or below them.


This is simply not true. Businesses exist as legal constructs and there are many things that are good for the business but bad for almost everyone else. Businesses also do not exist to serve users, whatever definition of user you might have.

Businesses, unfortunately, exist to serve their owners. In most cases (I'm talking primarily about larger companies, not <5 people micro-businesses) the owners want money, so everyone at the company is there in order to get the owner more money. Happiness of anyone else, let alone the users, is entirely irrelevant, unless it happens to correlate with revenue. The only other universal incentive inside a company is self-preservation, so besides doing whatever makes money, decision-makers will also take their own job security into account when making decisions.

Employees won't leave, because the company will make sure they're happy enough. It's surprisingly easy to keep people working for evil and/or faceless organisations if you pay them well, make them feel like part of a "community", etc. (see any FAANG office for an in-depth catalogue of these HR tricks)

I agree that this is "poisonous" and that such a company "shouldn't exist anymore", but this is how companies work in practice. This is not a sign of any kind of decline, but of a mature and healthy business that can keep going for decades. Execs change, products change, even owners change, but the business remains.


Pretty sure the <5 ppl micro businesses just want money too


I think our perspectives are not really too far apart. Maybe it is just a matter of emphasis.

> This is simply not true. Businesses exist as legal constructs and there are many things that are good for the business but bad for almost everyone else.

It “exists” as an imaginary construct. The imagining of it exists.

Imaginary constructs have been hugely influential through history. Just think of the mythical figures in whatever religion you don’t believe in. People live and die for these figures. But the figures themselves don’t have any interests because only the belief in them exists, not the actual figures.

> Businesses, unfortunately, exist to serve their owners. In most cases (I'm talking primarily about larger companies, not <5 people micro-businesses) the owners want money, so everyone at the company is there in order to get the owner more money.

I sort of disagree here. From the worker’s point of view, the business exists to sign their paychecks. Why is this not just as valid as the owners’ point of view?

Realistically, lots of work-politics are driven by the interest of individuals to keep those paychecks coming in. For example, people will make it look like they are working harder than they actually are. Managers will inflate their headcount to appear more important. Etc etc. All of these things happen inside businesses, and if you look at the man-hours spent on them, I don’t think it is obvious that the interest of the owners is a bigger focus of energy. Particularly if we treat “the interest of the owners” as some distinct thing, separate from customer happiness or writing artful code. I think people don’t actually “think of the shareholders” much while doing their jobs, day-to-day. How else can we think of the priorities of a non-sentient thing, other than the aggregate priorities of its sentient, priority-having members?

> Employees won't leave, because the company will make sure they're happy enough. […]

> This is not a sign of any kind of decline, but of a mature and healthy business that can keep going for decades.

For these two points, in retrospect I was terribly vague to the point of just being either wrong or so open to misinterpretation as to be indistinguishable from wrong, so sorry for that. By “good people,” I meant people who were not interested in doing the things you listed. For “zombie state,” I meant shambling on profitably, but not doing anything interesting anymore. See IBM for most of our lives. I accept blame for this one! In particular, I regret writing “good people” because of course “good” is wildly open to interpretation.

> Execs change, products change, even owners change, but the business remains.

I mean, this is sort of a “business of Theseus.” It is a matter of perspective whether, having replaced every component, it is really the same business. Except, in this case, there isn’t even a physical ship in the end to point at.

—-

Also, it might be worth noting—the article is about what one ought to do, when weighing interests of different parties (user, dev, ops, “business”). If you disagree with me and think the business exists, maybe you agree that prioritizing it over users is not what we ought to do.


I had the same initial reaction. Seeing something written that amounts to $ > person looks wrong.

Yet, importance is subjective. If you're working on your own pet code for your own pleasure, business has no importance. If you want to transform that into your main revenue source, business is the most important thing, because no amount of user love will transform into practical revenue if your software serves no-one.


If your project is useful to users and requires money to continue, it is in their interest that you make money. The business is just an abstraction around the fact that your users need you to be able to do the thing.


It's not really $ > person.

It's group of people A, who are conducting a [series of] transaction(s) with group of people B. It so happens that the wares being exchanged are currency vs. software but that's really beside the point.

Group of people A ≹ Group of people B


Let's charitably interpret 'business' here as "a sustainable funding model that can support maintenance, support, and future development". Without a business model, even great user-pleasing, deployable, maintainable software can fizzle out.


It is a useful imaginary construct. It captures the idea that your users, if you are doing something useful, have an interest in you continuing to be able to do it.

IMO the reason the article had to bring up this obscure ≹ operator is that it treated this particular set of user interests as somehow separate from the users. The reason it is hard to rank business vs user interest is that business ∈ user.


I was sceptical but you know what, I love this mental model.

Don’t follow it blindly of course; there are exceptions where dev > biz (see the OpenAI debacle) and where dev > ops (early-stage startup, move fast; in particular dev > ops because biz).


I think what I am seeing in this article is a curious mixture of should and does. For example, the "user > dev" formula is a clear example of "should"; but when he gets to "biz > user", he surreptitiously switches to "does". He explains the biz > user by pointing out stakeholders, and investors, and personal interests, and politics, and all other sorts of crap that, in the real world, puts biz before the user. Very understandable. But should it? And why isn't the same explanatory method applied to the "dev > *" formula? After all, there are clearly very strong personal interests involved in that too.


> But should it?

How would it work if it didn't? He explains what he means by that. If you spend time and resources on all the things the users require and you run out of money and go out of business, everybody loses.

Of course you can take any of the "rules" and take them to an extreme where they become wrong. But I think if you don't push them to their breaking point they are good rules of thumb :)


> How would it work if it didn't? He explains what he means by that. If you spend time and resources on all the things the users require and you run out of money and go out of business, everybody loses.

I think there is also a bit of a conflation of values vs ability. The formulas in the article represent values. Real life adds constraints based on what's possible.

Consider his formula for dev vs user: "user > dev". You could argue that, just as a business is constrained by time and money, so is the developer constrained by time and skills. And yet, the author is happy with turning the greater than sign towards the user in this formula. Why?


It's the same thing. If you bias your time and resources towards the things the devs require and/or would like and nobody actually wants to use the resultant software, you equally crash and burn and everyone loses.

It's a priority ranking, not a zero-sum game.


Very nicely explained; as I kept reading, it covered all my experiential "yes, but" thoughts.

I will add that _knowing_ all this is helpful, but implementing it when you are early in your career is hard.

For example, it's good that Business is the big priority, but when you have no experience of what is good, or bad, in business, it can be hard to understand the ramifications of decisions made now.

Equally, business priorities should win, but your goals and the business goals may not be aligned. An individual may need to resume-pad (learn that new framework) while the business may want homogeneity (everything built on one framework.)

Finally, of course, this refers to commercial software. Open Source software is the exact opposite flow, dev > maintainer > user > business. Which ultimately explains why it's so hard to get funding for OSS.


Then again, the original ideal for OSS is that (just like owning a home is having the tenant and the landlord be the same person) devs and users are the same, so:

(dev = ops = user) ≹ (biz).


> Open Source software is the exact opposite flow, dev > maintainer > user > business

That's a pretty sad assumption.

"dev > user" is why a lot of projects have very poor usability.


devs have to use it to hoist it to a platform that can be accessed by others before users can use it, right? imo users don't typically use open source software, they use generated artifacts from an upstream development process representing the capabilities of the software. users don't "use grafana", they use the implementation of how you deploy it.

it should be improved upon by responses from the users who use your implementation, but what you're saying suggests that efforts like architecting your software in ways that improve/maintain development standards or packaging your software in a dependency manager before delivering any level of user-facing feature is a sad assumption to you. I don't think the concern of usability here takes the entire picture of a project into account.


Obviously it's not true for all projects. But it's true for most of them, no?


> There’s a lot of software being produced that just doesn’t care about its users [... caused by] a mismatch between what we thought doing a good job was and what a significant part of the industry considers profitable [... so] perhaps we should take a stronger ethical stand not to harm users.

Or, we could ditch the "biz" part. It just so happens that software not written to serve business interests tends to also respect users.


If we ditch the business part what's your recommendation to pay rent?


It's possible to get paid for writing software that secretly doesn’t solely serve business interests but also, and even primarily, respects users.


Just ditch the paying rent part.


I happen to believe in subsidised/affordable housing (I assume a UBI is what cybrox meant by not paying rent; also cracking down on housing-as-investments[1] sounds good to me, a layman, though apparently it wouldn't be enough[2] in our case, supply just needs to increase)—but I'm sure that other, less drastic societal changes would also be sufficient. For example, medium to large businesses could fund any open-source projects they make direct use of[3]. With the likes of OpenCollective[4][5] and GitHub Sponsors[6], the only roadblocks to adopting such a policy are corporate red tape and corporate greed. (And if it leads to smaller SBOMs, well, that's just a bonus as far as I'm concerned.)

All that said, I was actually referring to individuals (who aren't necessarily developers) choosing which software to use. Over the last 20 years, we've all taken free downloads and sign-ups for granted, ignorant to that whole "if you're not paying, you're the product" thing, and a lot of us now have a $500+ supercomputer in our pockets which is to some extent owned by a tech giant and not its nominal owner. It's apparent that such centralised platforms have some intrinsic problems with regards to users' freedom. That's what I'm cautioning against—not the idea of selling your programming labour to a business, which is fine.

[1]: https://en.wikipedia.org/wiki/Australian_property_bubble#For... (This article is outdated. From a quick web search, it seems foreign real estate investment fell during the pandemic but has now picked up again.)

[2]: https://www.abc.net.au/news/2022-09-02/housing-property-aust... (1M unoccupied houses in a country of 26M, ~120k of whom don't have secure housing...)

[3]: https://stackoverflow.blog/2021/01/07/open-source-has-a-fund...

[4]: https://blog.opencollective.com/funds-for-open-source/

[5]: https://docs.opencollective.com/help/financial-contributors/...

[6]: https://docs.github.com/en/sponsors/sponsoring-open-source-c...


I thought you'd have something practical in mind like specific industries or working for non-profits or something that would work today, not some theories of how the world could work. I can't go tell my landlord he should believe in subsidised housing like you do unfortunately.


To ditch the biz we have to make dev and ops accessible to ourselves. Nobody does that. Making a proper login form still requires a team and a week.

The biz sits there rubbing its hands, watching us make the means of production more and more complex and expensive to use, so that it has a complete monopoly on creating software.

Devs are so used to relatively big paychecks from the biz that they unconsciously tend to ignore these issues.

It’s not implementing dark patterns and cookie popups that makes you on the biz > * side. Tolerating the complexity of software development is. “Biz > complexity > *”. Push the complexity to the right as much as possible to ditch the biz.

Due to developer deformation it should be explicitly stated what the complexity is:

- Pointless “constant evolution” (version.major++ in less than 5-7 years, previous versions abandoned),

- Embracing epoch-breaking changes (2/3, ESM)

- Low-level as the norm (scaffold, config, listen, route, ddl, connect, hash, jwt, css, setstate, useeffect, fetch, … in place of business logic)


It's interesting to hear you say that reducing complexity and building to last(?) is the solution. Do you know of any case studies (not even peer-reviewed necessarily, just examples) showing a "recovery" in terms of user freedom after making such changes? My view has always been that complexity should be reduced, but because it makes maintenance easier—reducing cost if you're paying maintainers—and can prevent bugs/vulns. Only tangentially related to privacy, via the latter.

Also, I don't understand your last point. Are they all React builtins or something? If you're suggesting that the "shape" of an app's navigation, or of network or system calls, etc. should be how business logic is made concrete, I'd have to disagree. That sounds like microservices but with worse intrinsic documentation (the purest documentation there is).


The kind of complexity I mentioned never gets reduced; or, so to say, the reduction happens in volume, not in levels. E.g. an app that uses fetch() to access a backend that uses SQL/DDL to init and access a database. It never gets reduced into a thing that models data graphically and then uses out-of-the-box objects in the frontend like `await main_obj.get_linked_obj()`. But it may be simplified by removing all the abstractions that were supposed to simulate this but did it poorly. Maintainer skill is still required to be high; only their mental capacity is taken care of.

Keywords in the last point are from all over the stack. They should be erased from code in favor of much simpler concepts. How you name the users table, which routes access it, whether that should be fetched or cached, how to structure code to display it — all that is not the business's business. It should look like e.g. “root.current_user = await User.login(username, password)”. Everything else is low-level.
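
To illustrate, a minimal TypeScript sketch of what that higher-level surface could look like; the names are the illustrative ones from above, not a real framework, and the backend call is stubbed so the snippet runs standalone:

    class User {
      private constructor(public readonly username: string) {}

      // In a real system this would talk to a backend over fetch(); it is
      // stubbed here so the sketch runs on its own.
      static async login(username: string, _password: string): Promise<User> {
        return new User(username);
      }

      // Out-of-the-box navigation to linked objects, instead of hand-written
      // routes, SQL and caching code.
      async getLinkedOrders(): Promise<string[]> {
        return [];
      }
    }

    const root: { current_user?: User } = {};

    (async () => {
      root.current_user = await User.login("alice", "hunter2");
      console.log(await root.current_user.getLinkedOrders());
    })();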


I think it might be enough to tax all electronic ads per viewer-second. It's what directly or indirectly drives most mistreatment of users. Such a tax could be pretty popular too.


Because we as programmers need to know how to program: learning trumps all. Thus:

    learning > biz > user > ops > maintainer > author
There was that bit

    dev > *
meaning resume-driven development, but in reality it is

    learning > *
And that's the conflict programmers experience in corporate structures, for example the hassle programmers experience when they interview for a position. Employers know it too but they try to shirk it.


My gripe with most programmers is that they keep re-learning the same narrow set of skills while ignoring the rest of what actually makes someone an efficient expert.


It's surprisingly easy to have one year of experience despite working for ten years.


That one year is clearly more important than the other nine.


Not necessarily. I think at some companies you can just fall into local maxima.

For example maybe you've spent 10 years building the same CRUD front-ends over and over. You're probably really good at that. And the companies that you worked for needed that skill. However, you'd be a lot more marketable if you had other skills that you could put to use at future jobs.


That would be nice, but people keep hiring based upon specific programming languages and other programming specific skills.


Out of my last five positions, only two required a programming language I'd already used professionally. You just have to know enough to pass the screening, and no competent interviewer will reject you for lack of experience if you are a good fit otherwise. Hence the other skills, like basic soft skills, leadership, systems and ops knowledge, and office politics, matter more in the long run.


I mean, that's more information for companies than for me. And yes, every job I've been made an offer for and accepted has been much, much looser on the exact languages, or has actually looked at my submitted code samples if they were interested in understanding whether I could actually write code.

One of the worst examples was Amazon. They gave no indication ahead of time, but I was presented with a coding test that supported only a small subset of languages. My chosen language was F# because that's what I was most comfortable with, but of course it was unsupported in their online tool. I solved a problem using discriminated unions, pattern matching, and pipelines. The interviewers were very confused, a bit antagonistic that I'd chosen anything other than Java (even though no one told me ahead of time that there was a limited selection), and proceeded to show their lack of knowledge. For example, the interviewer kept bringing up a violation of the open-closed principle in my code. However, as I explained to them, that principle is only really applicable to object-oriented code. Since I was using a union data structure and pattern matching, the principle doesn't really apply, as the functional approach is a transpose of sorts of the OOP approach. But I just got blank stares, and the room deflated as the interviewers signaled they were ready for the interview to end. What was mind-blowing about it is that my resume lists nothing but functionally oriented programming languages, with not a single mention of Java, and I have plenty of code examples posted. And yet I got asked some dumb question about reading a file using some terrible online coding platform.
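
For anyone unfamiliar with the pattern in question, here is a minimal TypeScript sketch (not the interview code, which was in F#) of why the open-closed principle doesn't map cleanly onto a discriminated union plus pattern matching:

    // Behaviour lives in one exhaustive match instead of being spread
    // across subclasses.
    type Shape =
      | { kind: "circle"; radius: number }
      | { kind: "rect"; width: number; height: number };

    function area(shape: Shape): number {
      switch (shape.kind) {
        case "circle":
          return Math.PI * shape.radius ** 2;
        case "rect":
          return shape.width * shape.height;
      }
      // Adding a new variant to Shape makes this function fail to compile
      // (under strict settings) until the new case is handled, which is a
      // different trade-off from the OOP "closed for modification" rule.
    }

    console.log(area({ kind: "circle", radius: 2 }));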


You usually don't want to work at companies that do resume matching.


That is indeed true, but there is sometimes a mismatch in how a team hires versus how it actually works and performs. So sometimes, there are some missed opportunities to work in interesting places or fields just because their hiring process is uninspired.


I think learning some narrow skills is fine, but the real issue comes when you don't deploy code to production, because production is the best place to learn about actual problems.


I could expand that, though, to a much larger set than just programmers!


I have the inverse problem, and it's why I don't have a tech job.


What do you think programmers ignore that would be beneficial?


Their working domain. If you're a developer in financial services, learn finance. If you're a developer in oil exploration, learn seismology. Don't constrict yourself to only being a programmer who writes whatever code you're told to. Be a holistic problem solver.


What's in it for me?


It's easier to write code when you understand the context. The customer requirements aren't always flawless and might make assumptions about things that you don't know.

During my career I've had to study:

  - electrical engineering (phase angles, current transformers etc)
  - finance (payroll and accounting)
  - building management (maintenance schedules etc)
  - mining (terminology, how explosives are set in underground and open mines)
I'm not an expert on any of those, but I know enough to be productive and know _why_ I'm doing the task I'm assigned.


Usually, a promotion from "one of the coding peons" to "the software architect who tells coding peons what to do"


Even if not an architect, it will make you a very senior and respected developer among the other smart people that can actually evaluate this. And if in your current company there are no such smart people, it will open the doors for you to find another company where those skills will be rewarded.


Understanding your users, and what you have to solve for, better than they can probably explain it to you. This makes both your and their lives easier.


To have more experiences and expand your horizon. You never know where it will take you next. Even if it does not translate to more money, it will definitely give you a sense of self-accomplishment and MAYBE make you wiser.


Maybe not needing to rewrite your code several times after being told that it's not what the client wanted to have?


People are downvoting me, but I am really enjoying the answers.


Which would the rest be? Asking genuinely as a 4 YOE dev


This is why blindly turning folks loose with AI is scary. They don't know what they don't know.


... and the "learning" is left to the AI


Learning trumps all until it leads to resume-driven development, and a CRUD app needs to be using all the shiniest new technologies that no one is actually an expert in.


biz > user

This one sounds like there is only the HN reality of dev: startups, VC, growth hacking, free until it's not, client=product, burn every $ to reach monopoly, ...

But there is another one, with small businesses, craftsmanship, you get what you pay for, client=client...


> And while we can’t just go back to ignoring the economic realities of our discipline, perhaps we should take a stronger ethical stand not to harm users. Acknowledging that the user may not always come before the business, but that the business shouldn’t unconditionally come first, either.

This is why software engineering needs licensure. Not to check if people understand design patterns, or whether people know the fad library of the day, but to set clear ethical standards and hold people to those standards by disbarring them if they violate those standards.

Or more to the point:

  society > biz > user > ops > dev


IMHO, the article makes the wrong call to action. It calls for dev ethics when the problem is biz ethics. Instead of licensure, wouldn’t it make more sense to have more rules and oversight for the companies?


That’s really interesting. Barbers need a license (in Australia), lawyers need one, and yet people writing potentially crucial software do not.


There's no shortage of hairdressing "institutes" in Australia promoting certificates awarded after passing their training courses and no doubt existing shops would look to see some evidence of skill before hiring .. however:

    Licensing/registration of hairdressers, beauty therapists and barbers is currently not required, legally binding or a symbol of any kind of standard in our industry.

    In previous times, there was a hairdresser’s trade license which was abandoned, without consultation with the industry.

    Re-establishing such a license would need to be done in conjunction with governments and education partners like TAFE’s and Private Colleges to ensure that the necessary awareness and legal processes are followed.

    Simply announcing that you are creating a license/registration for employees, and charging on the spot, simply won’t cut it.
Hair & Beauty Australia Industry (2017): HAIRDRESSER LICENSE NOT NECESSARY AND NOT LEGAL

https://www.askhaba.com.au/hairdresser-license-not-necessary...


"biz > user" is valid on paper because someone has to pay for the party; there are a finite number of software devs and capitalism is great at allocating scarce resources where they're most needed. Or at least the "spherical cows" version of free-market capitalism is great at that.

In that version of capitalism, if your OS spends too much time putting spam in the Start Menu and not enough on fixing bugs, you can just switch to that other OS with full support from hardware vendors, that runs all your closed-source legacy software. If your favorite bird-themed social media site starts doing boneheaded stuff, you can just switch to that other site that has all your friends and data and lots of posts from experts and celebrities. If your search results are full of spam, you can switch to that other search engine with 10,000 highly paid engineers and 20 years of experience in Web search and integration with all your devices.

And all the businesses you moved away from would have to clean up their act or go out of business. If only those cows were perfectly spherical.


Just because this doesn’t happen instantaneously doesn’t mean it’s a fantasy. Large changes in an ecosystem always take time. It may be years before the incumbents are truly displaced, but that doesn’t mean they won’t be. MySpace and RIM also seemed like they’d be there forever.


You'd have to explain your optimism of things eventually getting better, given that they've largely become worse over the past 10-15 years. Maybe it'll just be meandering around a mediocre-at-best mean level long-term.


"In the long run, we're all dead."


It is not about the comparative values of humans.

It is about costs.

The reason we write readable code is because we spend more time reading it than writing it. Engineer time. Which is money.

Yes, other things are money considerations, too. Considerations the advice to write readable code is not meant to address.


From the OP:

> When I say “run” I don’t just mean executing a program; I mean operating it in production, with all that it entails: deploying, upgrading, observing, auditing, monitoring, fixing, decommissioning, etc. As Dan McKinley puts it: "It is basically always the case that the long-term costs of keeping a system working reliably vastly exceed any inconveniences you encounter while building it."

One of the things separating a great developer from a good one, IMO, is the ability to treat the documented API boundary of an open-source library as something that isn't sacrosanct and impenetrable.

How comfortable are you with command-clicking into, or outright cloning and reading, a library's source to understand the nuance that the documentation may not convey (or situations where the documentation is outright wrong)? Depending on your language's support for it, have you monkey-patched issues, or forked a library (and ideally submitted patches upstream) to add extension points you might need? Are you used to setting breakpoints inside of libraries, not just your own code, so you can understand how data is represented internally when debugging?

And when you're evaluating whether to use a library in the first place, do you just look at the API and examples, or do you read the source to understand whether the underlying code prioritized maintainability, extensibility, and test coverage? Do you read changelogs and think about whether the library maintainers prioritize your ability to upgrade without breaking even your patched/forked code?

The brilliance of open source is that we can do this. We're not just fitting gears together - we're able to fabricate iterations on those gears as we need to.
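
As a concrete (if contrived) TypeScript sketch of the monkey-patching case, with `ReportClient` standing in for a hypothetical third-party class and defined inline so the snippet runs on its own:

    // Stand-in for a library class you don't control.
    class ReportClient {
      fetchReport(id: string): string {
        // Imagine an undocumented quirk: ids containing dashes throw.
        if (id.includes("-")) throw new Error("unsupported id format");
        return `report:${id}`;
      }
    }

    // Monkey-patch: keep the original method, then wrap it to normalize
    // input before delegating. Ideally this also becomes an upstream patch.
    const originalFetch = ReportClient.prototype.fetchReport;
    ReportClient.prototype.fetchReport = function (this: ReportClient, id: string): string {
      return originalFetch.call(this, id.replace(/-/g, ""));
    };

    const client = new ReportClient();
    console.log(client.fetchReport("2023-12-01")); // works despite the quirk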


This article has opened up a new perspective for me. It articulates very well that software is a means to an end.

It's a craft that can be used to solve a problem.

In the past I often emphasized the craft part too much, as if writing perfect code were all you need to do in order to be successful. The really important stuff is understanding the problem you want to solve and being sure that software is the tool to solve this particular problem.


The person understanding the problem to solve and the person crafting the solution don't necessarily need to be the same person, though.

They can, in fact, be two people with completely different skill sets and if one of them ("you") can "only" write perfectly beautiful code, they can still succeed by relying on the other for breaking down the particular problem.


This article codifies general power structures within organizations (who trumps whom when making software decisions in an org) and sells that status quo as some sort of useful rule of thumb, when it in fact has no utility at all.


Great article! I make a living as a software engineer but my formal training is in electrical engineering (the applied math of the discipline).

In EE, the mantra "if your design doesn't work, it's useless" was repeated ad nauseam. There was little to no credit for half-working designs. Also, in most EE products, especially semiconductors, the final product cannot be patched, so the emphasis on good design was immense.

Come to the world of software engineering, where the low barrier to entry attracts a lot of people interested mostly in making easy money.

I have seen more horror code in my 10+ years as a software engineer than I can count.

Some developers, in particular those of the "I will spend 3 days coding nonstop on Red Bull" kind, produce such shitty software that there is a name for them: John Ousterhout calls them "tactical tornados". I have had the misfortune of working side by side with a couple of tactical tornados, and my life was miserable, left to fix the mess they made once they left the company.


I know this is considered wrong in so many ways, but my personal job satisfaction depends on dev > *, and in an ideal world, with open-source and end-user programming, we get user == dev > *


I don’t think I agree with the sentiment that user doesn’t equal business, to some degree maybe, but a business is what its users produce. I buy the point that some users within a business can have a demand for processes that will not be good for the business because they may be too focused on their specific department, but as a whole I think they are largely the same thing. Happy users work better, better work is good for the business. Aside from that, I think a lot of the “bad” decisions are unavoidable as businesses grow, so too with the bureaucratic demands that are often not producing value, but it’ll be very hard for the “IT department” to do anything about it because it lives as a “service” department similar to HR, but less understood by management. Cost-centres never get political power to match money makers.

I do think it’s a very interesting article, perhaps with a bit of a baiting headline. But I think it would be even better if it included the perspective of the modern CI/CD pipeline. What I mean by that is how we’re building more and more automation into it, things like RenovateBot that will automate certain parts of your maintenance processes. Things that won’t work automatically if you don’t write clean, testable code. If you focus on delivering business value, spaghetti be damned, you’re going to end up spending so much time maintaining it that you’re eventually not going to be capable of doing what you set out to do. We used to live in a world where this was sometimes OK, because you’d clean it up once the value was created, but with how much you can gain from a highly automated pipeline, I just don’t think that paradigm holds anymore. Because not only will you have to clean things up, you’re also going to miss out on so much efficiency by not being capable of using these automated tools. You can make the argument that you don’t need to update things as often as those tools help you to, and 5 years ago you would have been correct. In many places you may even still be correct, but in large organisations in the EU this is just no longer the case because of the added legislative bureaucracy that IT security needs to play into.


Hey, author, if you end up reading this thread: put the conclusion on a sticker. I'll put one on my laptop and give a bunch to friends.

For anyone that didn't make it to the end, the conclusion was:

    user > ops > dev
    biz > ops > dev
    biz ≹ user


In the long run user > biz, otherwise users will be using a product of a different biz via the enshittification cycle.


Assuming a non-shit business, you could think of "biz" as "customers integrated over the long term". Therefore putting "users" and "biz" as incomparable could be interpreted as prioritising users, sustainably.

At least that's how I interpret it.


There are enshittified monopolies and oligopolies, with lots of moats.


I'm not sure I've ever read an article where so many times I thought "ok fine, but ..." only to have the next paragraph address that exact point each and every time. The ending point especially.

As others here have pedantically pointed out (as HN is wont to do), there are ways to misinterpret or muddle some of the phrasing used here. But I think that's true of any reasonably simple (and therefore broadly useful) mental model. To me, this is a useful way of framing these ideas at a high level.


Also cars are driven more than repaired. Highways are used more than they need to be maintained. Yet if you don't give some thought to how to deal with a situation where something breaks, this could (and very likely will) eventually come back to haunt you.

But code is not the same as your car. If money is not an issue and your car breaks you could just get another one that drives equally well. Try doing that if your code breaks.

So whether values like readability, maintainability, etc. can be important for good code is not in question — what remains an open question is (on a per-file basis) how to prioritize between those two if such a need arises.

And that isn't a small "if": One should guard themselves against the simple thought that these are inevitable tradeoffs that are mutually exclusive. You can easily make code that is both unreadable and doesn't run well. You can also make extremely readable code that runs nearly perfectly. The latter is of course harder than writing the same code to just run well, but if you are not coding alone or your project is beyond a certain scale it might be well worth the extra thought.

The solution space for "clear structure" and "good performance" is smaller than for either of those alone, so finding a good solution here is a bigger challenge, but depending on what you plan to do it could be worth a try.


The article is good on the whole but the title "Code is run more than read" seems false in many situations I've been in. A lot of software doesn't have users. Often the target users don't like the product enough and are just trialing it for you as a favour.

In these cases, it's not very meaningful to say it's being "run more than read", because either you should have made the code readable enough for other developers to delete it, modify it or rewrite it until users want to use it, or the code has to be valueless and able to be completely thrown away (the knowledge of how to build it is either written down elsewhere or tacit).

I also agree with what somebody else said about the times in which the features/fixes you create aren't seen by users but just benefit a business-to-business sales process. In these cases, you behave differently. You do the work because there's a very clear signal to your company meeting their expectations promptly.

I guess one way I would agree with the article is that once something has users/businesses and is somewhat readable you apply your extra effort on polishing the user experience as prioritising the happiness of your core users is generally a good idea.


I'm not sure I agree with everything, but I love the mental models this introduces.


> maintainer > author

> usually a good investment to make the code maintainable by keeping it simple, writing tests and documentation

I recently inherited a project where the leadership 100% believed this and tried to do it.

The problem is that the copious junior developers they hired, with all of their good intentions, just couldn't write maintainable code.


My car is parked more than driven. Still, driving is the more important, valuable, and difficult part. Cars should be optimized for driving first, and only for other things when needed.

Likewise code should be optimized for reading, which is to say for maintenance first, and for other things only if/when needed.


I generally agree with the thesis here that "users and business are under-represented by people that are building and maintaining software", but the greater-than comparison oversimplifies it, right?

It's something like "Net revenue" and "net user happiness" trump "code prettiness". Additionally, in the "maintainer > user" scenario, is it all maintainers? Is it one particular maintainer that has strong opinions (which maybe don't reflect what the business cares about)?

I'd suggest to replace the inequality mentality with an optimization problem, like: Find a version of the code that maximizes user happiness and minimizes maintenance cost.

If there are two solutions that give roughly the same values for the above, then accept either.
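
To make the optimization framing concrete, here is a toy TypeScript sketch; the scores and weights are made-up placeholders, not a real methodology:

    interface Candidate {
      name: string;
      userHappiness: number;   // estimated, higher is better
      maintenanceCost: number; // estimated, lower is better
    }

    // Arbitrary placeholder weights; the point is only that both terms
    // enter one objective instead of a strict ranking.
    function score(c: Candidate, wUser = 1.0, wMaint = 0.8): number {
      return wUser * c.userHappiness - wMaint * c.maintenanceCost;
    }

    const candidates: Candidate[] = [
      { name: "quick hack", userHappiness: 6, maintenanceCost: 8 },
      { name: "clean rewrite", userHappiness: 7, maintenanceCost: 3 },
    ];

    const best = candidates.reduce((a, b) => (score(b) > score(a) ? b : a));
    console.log(best.name); // "clean rewrite" with these made-up numbers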


To avoid violating my egalitarian principles, I choose to read ">" as "depends on."


Nice article.

I'd like to point out that a mature/legacy codebase contains every example listed under the "smell" section of the article, sometimes even within the same file. This creates a great deal of complexity to unwind beyond the mere coding of it all.


And an enterprising code keener will insist on rewriting that legacy codebase to eliminate all "smells", replacing battle-tested, bug-free-but-bad-smelling code with fresh, elegant, untested and nice code.

Every generation relearns this the hard way, this blog post is 23 years old today and still 100% on the money https://www.joelonsoftware.com/2000/04/06/things-you-should-...


This algebra doesn’t hold and the modals used are wrong.

You can certainly build software that has a business purpose. Or not. I build useless software all the time, to learn. So the modal matters. I can build useless software. I can build useful software. Whether I should or shouldn’t is up to me.

As for the rest, it’s a shakeup algebra. His inequalities don’t make sense in an academic environment, for instance. But maybe that academic software then becomes open source and suddenly it’s huge! But at no point were the biz inequalities used.

It’s a piece written from the environment of what looks to be private industry so maybe those algebras hold up better there.

Also, the whole ops inequality made me laugh. If only this were true.


The biggest stakeholders are change and accumulated cruft. The usefulness of code also drifts over time as the users, business, or paradigms/ecosystem change. Most new code should be considered throwaway until it's proven useful to end users. Once useful, it can be merged into a 'stable' tree that should have high standards. Ideally 'stable' is capped at a certain size per the number of engineers, so moving code in requires refactoring or deleting dead or less useful code. The balance you want is to maintain high velocity but also allow for maintainability and core stability.


> Code is a means to an end. Software should have a purpose, it’s supposed to provide a service to some user.

I will accept this as a premise for the sake of argument, but it's certainly not a universal truth.


Code being run more than read means this: code is read more by machine than by person. Users don't read code much, on the other hand. Thus:

   machine > maintainer > author > user
QED


Quality article – after years of developing software I have the bingo: I've heard and got used to every smell as long as there was air to breathe.

I'm always at the end of this chain, doing the dirty work :)


Kind of funny, but I have seen code that is indeed read more than it is run, at least for the first year or two of its existence. Code that is used for generating batch reports for some billing process would run once a month. But I had to spend a lot of time staring at it, wondering what is going on, and fixing things up as reports came in.

One could say that the test runs were run more than the code was read (of course), but in production? Definitely the same order of magnitude for a long time.


Cheap, fast, quality. Pick 2.

Which one to pick isn’t as linear as the article portrays and it changes over time. You can’t religiously plant yourself in one corner of the triangle forever.

Low quality will lose users and eventually also profit. Rushed development inevitably degrades quality over time, even if initially a high AWS bill hides this fact. No income, and eventually you run out of cash to do either of the other two.


> It took me a while to fully grasp this because, in my experience, much of the software being built never really gets to production, at least not at a significant scale.

If you work for organizations where most of the software built is never used, you have two options: influence your organization to greatly improve their efficiency, or look for a better place to work.


> I can’t make a Google search without getting back a pile of garbage.

I think SEO spammers share a lot of that guilt. Not sure how to attribute it correctly.


It's quite easy: Google giving better results than the minimum needed to make sure users don't switch is detrimental to its ad revenue.

If you look at how many people are working on search at Google, it's just a small part at this point; a vast number of people at Google work on different ways of monetizing it.


> There’s a lot of software being produced that just doesn’t care about its users, or that manipulates them, or that turns them into the product.

> There’s a mismatch between what we thought doing a good job was and what a significant part of the industry considers profitable

Feels like one of the main plot points of Tron. (1980s Disney movie where programming was a major plot point.)


The point is good. I found it a bit confusing that the article first collapses "author" and "maintainer" into "dev", and then randomly expands and recollapses them as the article goes on.

The full inequality seems to be:

    biz > user > ops > maintainer > author
That feels roughly right.


Code is run more than read. Code is read more than written. Code is written more than ...?


...correct?


"tested" feels more suitable


tested


I like the clean design of this blog. It is very readable on my smartphone.


"There’s a lot of software being produced that just doesn’t care about its users, or that manipulates them, or that turns them into the product. And this isn’t limited to social media: as a user, I can’t even book a room, order food, or click on the Windows start button without popups trying to grab my attention;"

I recently interviewed at Netflix and during the design portion of the interview I was tasked with navigating to a UI, creating a user-persona, and making arguments for or against how well the interface delivered on my goals as a user. I chose Amazon and "Parent of a family shopping for groceries". Mostly because I have a lot to say about the interface personally and I felt this would be useful in the interview.

Now it used to be that when you clicked on the account section in the top right of the interface - the page would refresh and nothing would change. I found this infuriating because the dropdown was just a hover. Turns out the interface now takes you to a landing page with the same info as the hover.

HOWEVER. During the interview, that hover (the one for Account) also advertised products to me. I mentioned that this causes friction for me as a user: what I want is to view my account information, but I am being bombarded with more irrelevant things to buy.

The interviewer - some lead of design within the company - said, "Well, you have to remember that all of these decisions are tested to death and there is likely good reason/research to back up that part of the UI." I wish I had had the presence of mind to say that this response betrays the exercise we were doing, but I digress.

Biz > dev > user. Biz ≹ user.


Counterpoint: machine code may be run often, but source code is run never.


Also code is read way more times than it is written (or edited).

So the corollary, in terms of concepts from my college programming languages class, is:

reliability > readability > writability


Just a side note: the article is written in the context of business. But sometimes there is none, like in the public sector.

Nevertheless, the article covered this context at the end as well.


Nice article; I always had the impression that we value DX a little too much. I mean, developers are paid to do their jobs, but users are not.


It is filled with wisdom. I cannot imagine the most genius programmers trying to sell lingerie to people at Google, Facebook, etc. purpose > *


Needed to define customers and users separately


I expected to be very defensive when reading the title but I actually fully agree with the article, thank you for sharing


A good article with a misleading name...


Then again, it doesn't mislead very far.


> Software should have a purpose, it’s supposed to provide a service to some user. It doesn’t matter how well written or maintainable the code is, nor how sophisticated the technology it uses if it doesn’t fulfill its purpose and provides a good experience to the user:

> user > maintainer > author

This is completely wrong. To see why, replace the > signs with = and work backwards from there.


Please elaborate.


Code isn't run, machine instructions are run. Code is for reading and writing.


> Imaginary software

Isn't this vaporware?


For enterprise software:

Customer > User ??


You should move as one, don't be so obsessed with the > then >


By that logic, I hope exception handling code is read more than run :)


Open source [not-for-profit] software says: Hello?


You have to have something worth running first.


I think the message is very profound. It is one of those messages like "eat your veg, exercise, sleep well, love your family". On the surface one would respond "yeah, who doesn't get that". Who doesn't know software is only as good as it gets used? Oh wait, but no, what about knowledge, wealth, career, fame, status, entertainment and so many more? What about architecture, maintainability, scalability, trendiness, developer experience, team management, OKRs, career prospects, and others?

A punch is a punch. Software is about getting used. (I swapped "user" for "used"; IME "user" can mean different things to different people.)


CPUs are lower in the hierarchy than developers.


Rolled my eyes at the capitalism comment. Was that really necessary?

Idealism > logic


Yeah man I press the gas more time than the brake but I still want the fuckin brakes to work...


That's not the point. The point is that a computer is not involved in the creative and qualitative process; it does not care. Humans are different, as you might all know.

So of course code should be readable over being runnable.


You haven’t read the article past the headline, have you? It’s not about computers running code, it’s all about humans.


"It doesn’t matter how well written or maintainable the code is, nor how sophisticated the technology it uses if it doesn’t fulfill its purpose and provides a good experience to the user"

This is what I was commenting on. And I do not agree with it because it underlines the dogma about how little craft quality means as long as someone finds it useful.

You could also hammer 3 wooden boards together and call it a chair.

But that's not for me, it's simply too shallow a mindset for being a professional software engineer.


>I do not agree with it because it underlines the dogma about how little craft quality means as long as someone finds it useful.

YMMV vastly based on industry (or, in the lens of this post, the demands and requirements on the user end to satisfy). For media, it's fine; very few users are going to suffer over a few minor bugs or a suboptimal system. For aerospace, it's fatal thinking, for obvious reasons (with, sadly, many real-world examples to point to). Both are valid career paths and both provide value.

>it's simply too shallow a mindset for being a professional software engineer.

sounds like you fall under "the right thing", as the author calls it. A sadly dying aspect as elephants and late stage capitalism (again, as the author named them) run wild.

But I argue more that this is just fine for an engineer. You're thinking more like a professional computer scientist. Scientists observe and model the nature around them. Engineers apply those models, while taking into account the imperfections of nature and, yes, humans. In my eyes, an engineer that can't adjust to the needs of the product (be it for biz, users, or someone who is not them) isn't an engineer, but a scientist. Or a craftsman, as you referred to earlier.

Again, not a slight nor compliment, just different roles with different philosophies.



