People expect technology to suck because it sucks (tonsky.me)
447 points by ivanche 33 days ago | 433 comments



For most people, technology is a haunted house riddled with unpleasant surprises. With no agency, they are at the mercy of other people's bad ideas that keep changing. Everything needs to be updated because everything else needs to be updated, because everything needs to be updated. Duh!

Software updates! Guess what! Here's a new UI for ya. We moved all the stuff! It's like someone threw you a surprise birthday party, but not on your birthday, on their birthday, and their idea of the best gift evar is to hire an interior designer (for free! lucky you!) who completely rearranges your house inside and out and springs it on you after you return from the grocery store. And there's no going back.

At first it was exciting (when I was 15), then slightly bothersome, then downright annoying, then infuriating, and now it's just tiring. Your brain learns that there is no point in learning anything anymore because they're just going to scramble it again anyway. Learned helplessness. People age out, ending up feeling old, useless, and jaded because their skillset becomes completely inapplicable after just a few years.

Yeah, I can understand why people hate tech.


I logged into Mailchimp yesterday and found that they moved the header navigation to the left side.

Instead of the previous menu labels like Campaigns or Audience, there were icons I had to hover over to figure out what they might mean. Then when I went to my Reports, the CSS breakpoints seemed to be wonky, making that screen hard to read and use.

Half-jokingly, it almost feels like constantly confusing people is a trick to boost engagement temporarily while people are forced to figure things out.


Also half-jokingly, I feel like the exact same thing happens in grocery stores.


That they frequently change the layout of their stores? I've never noticed that at any of the stores I shop at.


You aren't wrong at all. Stores regularly re-organize, and what it says to customers is "your knowledge is worth nothing". The disregard of customer knowledge is an absolute anti-pattern.


Some stores do it, some don't. However, when they do, it's intentional: it forces you to go through aisles you might not have walked through otherwise, exposing you to more advertising and chances for impulse buys in addition to what you actually planned to get.

Yes, it's a definite dark pattern, but not so much an antipattern.


> I can understand why people hate tech.

To add to that, now that I'm a retired lifelong techie I realize why "old folks" back in the day would hesitate to give up the old, outdated software that they knew how to use.

E.g., I'd prod older friends and family to give up WordPerfect, which they knew and loved, in order to progress to the feature-rich new MS Word.

Now I'm a Linux advocate with its archaic terminal commands, and I can empathize with anyone who wants their laptop, phone, TV, microwave, etc. to stop evolving!!


Linux is also far from stable. There is the mess that is Linux desktops like Gnome 2, Gnome 3, Unity (okay, this was only an Ubuntu escapade). The init system changed, and the result is that you have to think about things you usually don't want to. There are things like Snap and Flatpak, which pretend to make things easier, but ultimately lead to more complexity...


Water under the bridge now, but I bet you did some of them a real disservice.

WordPerfect had "reveal codes", so when the WYSIWYG gave you something you didn't want, you could pop open the actual document representation and wrangle the tags until What You Want Is What You See.

MS Word has no such function, so when it screws you, and it does, you're good and screwed.


Well, since I was their go-to tech support I paid the price!

Re: reveal codes - not being able to press Alt-F3 and clean up the formatting mess that MS Word would inevitably get into was torture!


Hot damn is this the most concise description of how I feel.

I’ve always described it as “the design team justifying their own existence after the job is done.”

Let software get stable and boring.


> I’ve always described it as “the design team justifying their own existence after the job is done.”

I actually think that's really what is going on. Wish I had first hand evidence though.

I do know of a tangential phenomenon at a friend's work place. Her org has a dedicated build tools team. So every 6 months every project's build infrastructure needs to change to something entirely new, because the build tools team keeps having to justify its existence.

I don't know why a company would let this sort of thing happen. It's a massive waste of time for every team.


(Late to the party but) Yes, this, absolutely this. It's almost a rule now that, above some very low threshold, the more expertise and hours you throw at UX, the worse the UX is.

Some of the most annoying UX I've had is on Quora, Facebook, and the reddit redesign, all of which spend a veritable fortune on it, while the best ones I've seen are something a non-specialist slapped together with Bootstrap.


The thing is, I do not really hate tech, if UNIX, and UNIX alone (no GUI), is considered "tech". Most of the programs in the freely available open-source UNIX OS I use do NOT need to be updated. They just keep working and are quite reliable (at least compared to their GUI alternatives).

I do sometimes wish that there could be alternative (not "replacement") ways to do things we use "tech" to do today, where the alternatives only required UNIX (no GUI). This way if we get frustrated with a graphical UI, and myriad "updates", we can just do these things the "old-fashioned way", with comparatively smaller, simpler, command line UNIX programs.

To me, the people who would be very opposed to this idea are not users, they are developers. Having been raised on computers in the 1980's I can attest that computer users never cared about "UI" or "UX", they just did what they needed to do to use the computer. It is developers, especially contemporary ones, who actually care about "UI" and "UX", not computer users. In fact, some of them are passionate about these aspects of using a computer.


Adam Savage was talking about a scribing tool for machining which was very expensive, but which he likes very much [0].

Before recommending it, however, he felt it important to mention that for people who don't machine very much, far cheaper scribes work well, because unless it's your job, your tooling is less likely to be the bottleneck, and you have fewer resources. When you machine professionally, your tooling is likely your bottleneck and you have more resources.

I think this holds for tech and software. Think of resources here as "time spent learning APIs, bash, and remembering tar mnemonics".

At first, dragging and dropping folders isn't going to be your bottleneck. Need to move 1000s of folders scattered on the hard-drive? If you're not using a terminal, you'll be in trouble.
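
(For what it's worth, the terminal version of that bulk move is roughly a one-liner; the directory names and pattern here are made up purely for illustration:)

    # Hypothetical example: gather every directory named "photos-*" scattered
    # under ~/Pictures into ~/sorted. The "echo" just previews each move;
    # drop it to actually perform the moves.
    mkdir -p ~/sorted
    find ~/Pictures -type d -name 'photos-*' -exec echo mv {} ~/sorted/ \;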

Everyone cares about UX, it's their experience when using tech. It's just that GUIs are better for some contexts than others.

[0] https://youtu.be/n5laGi3GO7M?t=356


Except with tar you don't even have to memorize anything; tar --help will tell you what you forgot.

   ~ $ tar --help
   BusyBox v1.31.1 (2020-03-26 00:59:22 -00) multi-call binary.

   Usage: tar c|x|t [-ZzJjahmvokO] [-f TARFILE] [-C DIR] [-T FILE] [-X FILE] [--exclude PATTERN]... [FILE]...

   Create, extract, or list files from a tar file

        c       Create
        x       Extract
        t       List
        -f FILE Name of TARFILE ('-' for stdin/out)
        -C DIR  Change to DIR before operation
        -v      Verbose
        -O      Extract to stdout
        -m      Don't restore mtime
        -o      Don't restore user:group
        -k      Don't replace existing files
        -Z      (De)compress using compress
        -z      (De)compress using gzip
        -J      (De)compress using xz
        -j      (De)compress using bzip2
        -a      (De)compress using lzma
        -h      Follow symlinks
        -T FILE File with names to include
        -X FILE File with glob patterns to exclude
        --exclude PATTERN       Glob pattern to exclude
   ~ $
And what's the recent surprise UI change? That xz decompression gets autodetected and doesn't need -J? Most software isn't even as friendly as that infamous command.
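
(For what it's worth, the two invocations that help text boils down to for most people, with made-up file names:)

    # create a gzip-compressed archive of a directory
    tar czf backup.tar.gz some_dir/

    # list the contents, then extract (compression is autodetected on extract)
    tar tzf backup.tar.gz
    tar xf backup.tar.gz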


> To me, the people who would be very opposed to this idea are not users, they are developers. Having been raised on computers in the 1980's I can attest that computer users never cared about "UI" or "UX", they just did what they needed to do to use the computer. It is developers, especially contemporarary, who are actually care about "UI" and "UX", not computer users.

... what? Are you suggesting computer users in 2020 - which includes everyone from your nana on her iPhone to a toddler watching YouTube on a tablet - want to use CLIs, and are being forced by baddie developers into using apps?


Remember that "alternative" is not the same as "replacement". This is similar to the idea of "more than one way to do it" in computer languages. Users have freedom to choose. Here, one of the ways is without GUI, using UNIX. Only applies where the task does not inherently require graphics.


> For most people, technology is a haunted house riddled with unpleasant surprises.

I'd change that to: "For most people, corporate neoliberal technology is a haunted house riddled with unpleasant surprises."

Writing that recognizes that we live with the most un-free market of all time:

"We are in the middle of a global transformation. What that means is, we're seeing the painful construction of a global market economy. And over the past 30 years neoliberalism has fashioned this system. Markets have been opened, and yet intellectual property rights have ensured that a tiny minority of people are receiving most of the income." [1]

And:

"How can politicians look into TV cameras and say we have a free market system when patents guarantee monopoly incomes for twenty years, preventing anyone from competing? How can they claim there are free markets when copyright rules give a guaranteed income for seventy years after a person’s death? How can they claim free markets exist when one person or company is given a subsidy and not others, or when they sell off the commons that belong to all of us, at a discount, to a favoured individual or company, or when Uber, TaskRabbit and their ilk act as unregulated labour brokers, profiting from the labour of others?" [2]

[1] https://www.youtube.com/watch?v=nnYhZCUYOxs

[2] https://www.resilience.org/stories/2017-08-03/book-day-corru...


I have worked for software companies for over 25 years, mostly on teams building software, and I hate software. I find bugs in every piece of software I use (my freaking microwave oven control panel!). In addition to questionable quality, software is often downright hostile (lose all the data you typed into a web form if you accidentally backspace while not in a text field, because it navigates off the page). Ironically, software engineering tools (build systems, etc.) are some of the worst. I don't know what has to happen for people to stop tolerating software as it is.


Doesn't reality suck the same?

My gas car stinks, destroys the planet, needs yearly maintenance, and crashes into everything the second I stop paying attention.

My house decays day after day. Floors need constant cleaning, walls have holes from small impacts, paint sheds inedible fragments and gives off noxious gases.

Bees are building nests on my balcony and it’s definitely not what it was built for, nor where they should be.

How can we tolerate such a life?


I live in an old house, and routinely discover ugly hacks that were done by the previous owner, presumably due to laziness, cost or just lack of skill. For example, they buried tons of stuff (toys, furniture, water heater etc) in the backyard and built a terrace on top of the pile to cover it up, apparently because they were too lazy to take it to the dump. The terrace decayed, so I had to tear it down, but in doing so I had to clean up their mess so I could actually use the garden. I'm not annoyed at the planks for decaying, as that is to be expected, just like you are expected to e.g. keep up with third party dependencies that you have chosen to include. Discovering a mess like the one I found in my garden, however, evoked the same feelings in me as when I look at a badly written code base and just wonder how anyone could ship something of such low quality to paying customers.

I guess my point is that there is a difference between things sucking because of the laws of nature, and things sucking because of incompetence, laziness or indifference.


To be fair, the previous person didn't know you were going to try to plant vegetables in their landfill.


But those same owners failed to mention the landfill during the handover.


Not to mention, it was almost certainly illegal.


Out of sight, out of mind!


To be fair, ulrikrasmussen didn't know the previous owner was trying to plant scrap trees.


A similar thing happened at a relatives' house: a long-disused storage space under a deck needed to be cleaned, and whatever natural forces were at work had accumulated enough new dirt to actually bury the items stored under there (a similar array of items, since no one had played with the children's stuff and a few old chairs and such had been thrown there).

It's a lot of work to dig a hole large enough for a water heater, so I wouldn't be surprised if something similar happened. (I probably would have also checked inside the water heater, since if you wanted to bury something and keep it dry, one might consider a water heater tank as a possible container; not sure it actually works, but it's a natural idea.)


When is the last time leaving your keys in the car caused your house to suddenly slide 10 feet southwest?

When is the last time you flipped a light switch, and suddenly your pool disappeared?

Have you ever had French doors appear in your dining room because of a "Windows Update" on Wednesday morning?

Have you ever had to wait for half an hour for your house to boot later on that same Wednesday?

When is the last time you closed a door, and were killed by a hailstorm of bowling balls?

At least with a light switch, you know it's very unlikely to cause structural issues, or plumbing issues, or drain your bank account. Computers are singularly horrible in the ways things can fail.


I agree with your underlying point, but it's also important to point out that computers are also singularly wonderful in that it's usually much faster and easier to reverse failures, and then to diagnose and debug in a non-impactful manner.

To take your second example - if I could then flip the light switch back, and the pool reappeared, then I'd be miffed but not particularly annoyed (assuming I was able to fix that obvious-bug either myself or with an update in a timely fashion). If the pool stayed gone, then yeah, I'd be pissed.

Of course, that whole argument goes out the window when the tech in question isn't controlled by you. Which is often the case.


Tell that to the 346 people who perished because of negligent and (in my opinion, malicious in terms of regulatory deception) undocumented, uncommunicated programming of the speed trim system of the 737 MAX.

Or the folks who perished because of badly programmed software interlocks on the THERAC-25 radiotherapy machine.

Just knowing or figuring out to flip that switch may be an insurmountable barrier depending on the circumstances when a failure state occurs. Especially when the implementation is intentionally hidden so as to facilitate continued market value extraction opportunities from the happy accident of information asymmetry.


I agree with the sentiment of the post and the replies.

Yet your examples hint at something more.

Those massive failures are by people not by tech. Mismanagement and incompetence and systems designed to obfuscate accountability.

Which happens aplenty in non tech fields.


In wiring a house, there is a built in assumption that something could go wrong and disrupt the wiring. That's why we had fuses, and now circuit breakers, grounding, ground fault interrupters, metal conduit, etc. All of these serve to limit the side effects of faults.

When you turn on a switch... it's part of a circuit which is current limited, and in fact there are several limits on that current, all the way back to the source... each designed to protect a link in the chain. Each of those breakers limits the capability to source current further downstream.

When you run a task in any modern OS, it runs with the full privileges of the user id with which it was launched. This is like hooking a generating station directly up to your floor lamp in the living room with no breakers. If the process has a fault, there is nothing the Operating System will do to prevent it from being used to subvert other parts of the system; there is no limit to what it can do.

There are systems that require you to specify how many resources a given task is to be allowed to access. It turns out that such systems can be just as user friendly as the ones we're used to, but they do require things be re-written because the ground assumptions in the security model are different.

Capability Based Security (also known as "Multi-Level Security") was born out of a need to have both Sensitive and Secret information shared on a computer that scheduled Air Traffic during the Vietnam Conflict (if I remember the situation correctly). The flights themselves were sensitive, and the locations of the enemy radar were top secret (because people risked their lives spying to find them).

It was extremely important that information could not leak, and solutions were found, and work!

About 10 years ago, when I learned about this and considered the scope of work required to make it available in general purpose Operating Systems, I estimated it would take 15 years until the need for Capability Based Security would be realized, and another 5 or so until it was ready. I think we're on track: around 2025 people will start adopting it, and by 2030 it will be the de facto way things are done.

Genode is a long-standing project to bring this new type of security to the masses... I'm still waiting until the point I get to play with it... and have been for a while.

Things will get better... these types of tools, along with "information hiding", getting rid of raw pointers and other clever but dangerous tricks will help as well.

[Edit: Re-arranged to clarify, and improve flow]


The problem with an increase in security is that it almost always comes with a tradeoff of higher complexity. Higher complexity means more difficulty tracing. It also means the state space of a general purpose machine ostensibly there to be configured to fulfill the user's goals is a priori heavily constrained.

Point being, I don't see a shift in the direction of security above usability or ease of mental modeling doing anything but worsening the problem. I could be wrong on that, but the last 20 or so years of further encroachment by industry on users' prerogative to configure their machine as they like doesn't inspire great confidence in me.

I can say I'm totally reading up on that though. I hadn't heard of it before, and it sounds interesting.


Completely agree - hence why I said _usually_. Another example of irrevocable harm is when ML algorithms dictate some medical treatment or social program.

But, _usually_, it's easier to reverse some changed-data somewhere than it is to reverse an actual change-of-state in the physical world. At least, the inherent effort required to do so is less - but policies or obfuscation may make it harder.


I'd argue computer programs' failure modes are often less gruesome than real life's gas and electric failures.

As a kid we had a gas range, and it was pretty easy to turn on a burner and just leave it open without lighting it. Or just start cooking something and forget about it; depending on your situation, your house is gone.


Normally the gas has quite a distinctive odor just for these kinds of situations. Sucks if you leave your house and enter it again lighting a cigarette though.


> When is the last time you flipped a light switch, and suddenly your pool disappeared?

Or the pool just disappeared for no reason and you couldn't get it back unless you sold your house and rebought it?


When's the last time you had a working car door, and it fell off when you opened it? (MVP, no tests.) [I'm not talking about a worn-out car]


I don’t know where you got these examples, but they were fantastic.


Over the years, I've just been trying to make analogies people can understand.

The current state of computer security.... is like building a fort, out of cases of C-4 explosive.

How so? Almost every program has bugs, many of which can be exploited remotely. It is effectively impossible NOT to have a zero-day exploitable hole in any given computer. Thus, every single computer can be exploited... and then used to attack more computers.... in a chain reaction.... like a fort built out of C-4.


I think the difference is that the entire software/hardware stack is a world created entirely by humans, untouched by "reality" for the purposes of all these annoyances, so it feels like we should be able to wrangle it better after so many decades. It's entirely our own creation, and we decide every iota of it, and yet it bites us (justifiably or not - turns out thousands of people each creating different layers of a gigantic state machine is hard to perfectly coordinate, but we may have been able to do better by now if we had been more thoughtful and patient throughout).


I hear you, but I feel like we are biased by what we accepted as normal in our formative years, and that filter doesn't apply to what we are discovering now that we're grown-up professionals.

For instance, books have been with us for centuries, and honestly most of them suck. Paper pages are thin and sometimes cut your finger (how many times did you get cut by an ebook?), most are weak to liquids yet our world is filled with liquids everywhere, sometimes coming down from the sky. Updates are painful, costly, and non-scalable. Font sizes can't be changed; you have to use an external device to deal with it.

Not saying there are perfect alternatives or that the tradeoffs don’t make sense. Just that we learned very early that books have these limitations and we’ll need to live with them to be a member of society. And we can agree all of these aspects could be and sometimes are fixed, but most people are just ok with books ‘sucking’ in those ways.


We’ve also had centuries to improve the technology of books and I think that makes a difference.

Although the weaknesses you cite seem like problems in search of a solution. No one ever expected books to have variable font size ... why would they?

Lastly, let's recall that the book of five hundred years ago is dramatically different from the book of today. For example, your point about liquids is now in many ways resolved by the cheapness and ubiquity of books. 500 years ago, not so much.


On book font size, there actually is a market solution for the issue: if enough sales are expected, the same book (same content) will be sold in different formats: pocket size, deluxe paperback, standard edition, etc.

Same for translations, with some books even offering dual languages side by side.

I find it fascinating how the arrival of ebook readers made us rethink how we relate to books, and how a lot of annoyances got surfaced only now because there was no comparison point before. My favorite is how you cannot ignore the length of a book while reading it: you can't pretend not to notice there are only a dozen pages left and the story must come to an end.


While nobody expected books to have variable font sizes, the fact that ebooks do allow it to be adjusted means that people with deteriorating vision may still read them.


A magnifying glass was the original solution to this problem. They never run out of battery.


And you can use any glass with any book!


Yeah... and once you're no longer paying by the page, there's really little upside to using a small font you have to squint at or margins that are too narrow to easily scan the page. I have no idea how long the books I read are, but I probably read them a few hundred words per screen simply because it's way easier to keep my place. The average for a small paperback page exceeds 300 words.


In the early days of the Gutenberg press, when most were illiterate, they would gather together and the one person who could read would read to the rest of the group. So, arguably, it was both easier and more inclusive for a blind person to read what there was to read then than now. At least they didn't have to rely on any special accommodations.


It's worth noting that it took 75 years after Gutenberg's press before some disgusted printer came up with the idea of page numbers. As the saying goes, all progress depends on the unreasonable man, who eventually becomes disgusted enough to make things change. Quality matters, pride in design and workmanship matters, and it's not at all bigoted to point out that China, which now manufactures most of the stuff in our 21st century world (or at least the components of it), has a culture of designing and producing absolute dung. We should not accept unacceptable quality just for apparently low prices.


Books are also capable of being copied before or when damaged, passed on trivially, and are not prone to sudden existential failure because a server on the other side of the world was deactivated.

They can't be stolen back by the publisher or seller, can be trivially transformed into different formats, can take annotations, can be rebound with wear, and even if paper has its faults, reading a well-maintained page in 1000 years is as easy as the day it was written, even if significant swathes of technological backslide occur, and is only prone to the challenge created by human cultural evolution as opposed to loss of the processor or software required to decode/convert/display it.

An HDCP protected chunk of media may as well not exist in 1000 years.


> I think the difference is that the entire software/hardware stack is a world created entirely by humans, untouched by "reality" for the purposes of all these annoyances, so it feels like we should be able to wrangle it better after so many decades.

Humans, as the makers of these systems, are part of that reality, which was not created by us. The reality is that we are great apes writing precise machine instructions with a general intelligence that was not purpose-built for being that precise but selected for survival. Our cognitive machinery cannot exhaustively predict all the possibilities of failure in what we write; if we are working in teams, we still have to transfer most of our technical ideas through natural language, in a combinatorially increasing manner as the team size increases, etc. None of this is user-hostile, it is just human fallibilities and limitations in play. And since we can't alter our cognitive capacity drastically, we can only make more machines to guard against these (e.g. unit tests), with their own limitations. I think the scale of what we have been achieving despite these limitations is just fantastic.

If anything users are becoming too egocentric, expecting the world to conform to their comfort, with a dash of construal level fallacy, underestimating from a mile away what it takes to write bug-free programs with perfect designs in the real world, by real people, with real budgets, etc.


> If anything users are becoming too egocentric, expecting the world to conform to their comfort, with a dash of construal level fallacy, underestimating from a mile away what it takes to write bug-free programs with perfect designs in the real world, by real people, with real budgets, etc.

Selection bias. You only hear from users who want new features. You rarely hear from users who don't want new features and just want software to stop being buggy and acting like a goddamn haunted house.


I was talking about the “users who don't want new features and just want software to stop being buggy and acting like a goddamn haunted house.” so the selection bias is yours.

Most bugs are just annoying, consequences are not catastrophic if your favorite tuner app forgets your settings, your word processor messes up the formatting, your pdf reader crashes. You can recover with some frustration and wasted time. The perception of these being catastrophic failures shows the sense of entitlement users have because they are used to a certain smoothness in their experience and expect everything to go their way. This doesn't match the underlying realities of the task; it is very easy to construe a sense of a working program in one's mind, but it is exponentially more difficult to make the implementation actually bug-free, usable, and functional the way the user wants.


> Most bugs are just annoying, consequences are not catastrophic if your favorite tuner app forgets your settings, your word processor messes up the formatting, your pdf reader crashes.

So what? Users get upset when your crap doesn't work. Stop being flippant and pushing back. Pushing back is not your (our) job. Complaining about how hard your job is isn't your job either. Griping and moaning about irate users is also not your job. Delivering a product that does what it says it will do on the tin is actually your job. Believe it or not, you produce something people depend on!

Imagine your power steering goes out on left hand turns going north downhill. You take it into the mechanic and all you get is "That's just annoying, not catastrophic. You can recover with just some wasted time. It's exponentially more difficult to make the implementation actually bug free!"

Users quite rightly spot bullshit excuses. And we have none. Save the settings, fix the formatting, stop the crashing.


> Pushing back is not your (our) job

Please tell me more about my job internet stranger.

You’re making the same arguments without adding substance, just emotional rhetoric and unnecessary personalizing.

> Imagine your power steering goes out on left hand turns going north downhill.

Imagine that the steering wheel stopped working depending on the highway you're driving on (software & hardware platform). Why didn't they test this on every single highway? Because that would be combinatorially explosive.

I’m glad you’re making a physical world analogy. Comparable physical world projects have orders of magnitude less moving parts that need to interfit, and assembly gives immediate feedback whether they can fit. They also have orders of magnitude less inputs they are expected to operate on, which makes it easier to exhaustively test their function.

"Shut up and just make it work" might have been popularized by certain tech personas, but unless you have Steve Jobs levels of clout, pulling that stuff in most dev shops will quickly make you very unpopular, whether you're an IC, a manager, or a product manager.


> Imagine that the steering wheel stopped working depending on the highway you're driving on (software & hardware platform). Why didn't they test this on every single highway? Because that would be combinatorially explosive....

Users guffaw at this point. They do not understand why your stuff is so complicated and broken. They think you suck at your job. Both you in the collective sense (you engineers) and you in the personal sense. They start plotting ways to stop using your stuff because you are so frustrating to deal with.

> They also have orders of magnitude less inputs they are expected to operate on, which makes it easier to exhaustively test their function.

I think you still do not understand my point. Users fundamentally do not care about it. Everything, to them, is a problem of your creation, and they'd quite rightly regard your long-winded explanations with complete skepticism. To you it always feels like it's someone else's fault, but to users it sounds like complete BS. No matter how right you are about it being someone else's fault. Someone else's fault is the very definition of a lame excuse from their perspective. They are still getting nowhere, and you are even more useless to them because you still can't fix anything and just confuse them and waste their time.

It's a very user-hostile attitude and makes for terrible customer relations. That attitude is also counter productive and helps no one. No wonder people hate tech.


Reality creeps into the tech utopia we're creating through "legacy" systems, where the internet, cryptography, multi-core architecture, and full program isolation didn't exist yet.

Software has a nasty habit of iterating on yesterday's ideas instead of rewriting for tomorrow. Not that there's anything wrong with that; it seems to have been the path of least resistance thus far.


I disagree - by and large, we don't need to "rewrite for tomorrow". Almost every significant problem in Computer Science and Software Engineering was solved (well) decades ago, often in the 1960s. Security is a bigger issue now, but it was largely solved years ago.

The problem is that we do engage in so much "rewriting", instead of leveraging known good, quality code (or at least fully-fleshed out algos, etc.) in our "new, modern, elegant, and trendy" software edifices of crap.

To me, this may be the one really good thing to come of the cloud (as opposed to the re-mainframe-ication of IT): the "OS" of the 21st century, allowing us to plumb together proven, scalable, and reliable cloud/network services to build better software. (Again, not a new idea; this was behind the "Unix Philosophy" of pipes, filters, and making each program do one thing well. Eventually, it will be in fashion to realize this really is a better way...)
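
(As a concrete reminder of what that pipes-and-filters idea looks like in practice, here's the classic word-frequency pipeline, small single-purpose tools composed with pipes; the input file name is made up:)

    # print the ten most frequent words in a text file
    tr -cs '[:alpha:]' '\n' < report.txt |
        tr '[:upper:]' '[:lower:]' |
        sort |
        uniq -c |
        sort -rn |
        head -10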

We need smaller, better software, not the latest trendy languages, insanely complex platforms that no one understands, and toolchains of staggering complexity that produce crap code so bloated that it requires computers thousands of times faster than the Crays of the 1990s just to do ordinary desktop stuff. (Seriously, folks, the Raspberry Pi 4 on the next table is a rough match for the Cray we had in Houston when I worked for Chevron in the early 90s! Think about that, and then think about how little data you really need to deal with to do the job, vs what you're actually shuffling around.)


You reminded me of this quote.

“Einstein repeatedly argued that there must be simplified explanations of nature, because God is not capricious or arbitrary. No such faith comforts the software engineer.”


We probably could make gas stink less just like we could fix that bug. The ROI just isn’t there.


Gas stinking is a feature, not a bug. It's a safety measure. Particularly when we're talking gas (and not gasoline), which is naturally odorless and made to stink on purpose.


> Doesn’t reality suck the same ?

No. A hardware product like a car has predictable wear and tear governed mainly by the laws of physics. The fact that I can no longer use my smart speaker because the manufacturer decided to stop supporting it, went out of business, or got bought is not at all the same. My car will still work through all of those things in the exact same way. It also doesn't throw up random dialogs (or whatever a physical equivalent would be) that stop the product from working until I interact with it. Not the same at all.


A car has tons of parts that, sure, are "governed by physics", but in effect just randomly fail. I can theoretically understand that there's a clunk in the front end of my car because I've exceeded the MTTF of a suspension bushing. To almost everyone, though, it's essentially just a random event based on nothing they've perceived.


John Deere has entered the chat.

Also, car companies have been tinkering with "electronic repossession" - remote kill switches due to nonpayment.

So ... get ready for other things to suck as we attach technology to them.


> John Deere has entered the chat.

Thank you for bringing up this point. The actual problem is not the software itself, but its proprietary nature and the infinite hunt for profit without any limits. Consider free software instead and you will see that it is improving year by year, albeit very slowly (which is logical, in the absence of infinite resources).

My Linux machine never forces me to reboot, shows ads, or suddenly changes its interface.


> It also doesn't throw up random dialogs (or whatever a physical equivalent would be) that stop the product from working until I interact with it.

I see you never had a (EU) Ford.


Sure there are things that don’t work, but it’s not nearly comparable. In my life I’ve never had a problem with a refrigerator, had maybe two problems with washer/dryer, my closet Just Works, books Just Work (with an ink smudge maybe every 50-100 reading hours) etc. I can expect each of those things to last decades. Looking at the goods I’ve bought on Amazon recently, digital electronics/software as a category doesn’t hold a candle to everything else I buy in terms of reliability.


> never had a problem with a refrigerator

These turns of phrase make me wonder what we are really expecting from software.

I can’t imagine you never slapped the door of your fridge and it didn’t properly close. You gave it a nudge when you realized it, and it was fine, but it must have happened. And your whole food supply would be rotten if you didn’t notice in time.

Or do we monitor energy consumption closely enough to realize it's eating much more than what should be expected, the same way people complain about Chrome eating too much memory?

It can also get pretty noisy but I’d assume most people just think it’s normal.

And we put the blame on ourselves for a lot of issues (didn't put the bottle in the right place, didn't use the right amount of force to close it, didn't set the right temperature, forgot to leave space around the fridge for ventilation, etc.). But few users blame themselves for not having understood the software and worked around its weaknesses; we just call it broken.

That behavior is normal, but I'd take a lot of "my appliances just work" with a grain of salt.


> I can’t imagine you never slapped the door of your fridge and it didn’t properly close. You gave it a nudge when you realized it, and it was fine, but it must have happened. And your whole food supply would be rotten if you didn’t notice in time.

But if the fridge was software it would randomly turn off and ruin all your food. The light would sometimes stay on, except when you open the door. It would require you to decline an update before you could get the milk out for breakfast. During an update the fridge and freezer compartments would switch places and then give tips about efficient ways you could manually move everything. If you bought a new fridge, part of it would be locked shut until you paid more money, but the one in the store was already unlocked. And god forbid you lose the 2FA device used to set up the fridge -- it will destroy everything inside (including irreplaceable heirloom tomatoes) upon reset. It will then update to a subscription model where custom temperature settings will require a monthly fee, or you'll be limited in the number of items you can store in the fridge or the number of times you can open the door per day.


We saw a failure case like this with a microwave in the workplace kitchen. It somehow got into a mode where it only turned on when the door opened. Needless to say we threw it out shortly after that was discovered. We didn't bother debugging it, but my guess is it was a hardware problem because the interlock should have obviously made it work the opposite of how it was and you'd hope that a software glitch couldn't get it into a mode like that!


Oh crap. That could blind a person. I thought microwaves had to have a hardware interlock for that reason.


Since this is critical health and safety stuff that can lead to serious injury, every one I've ever seen uses hardware interlocks - generally, the door-closed switch is in series with the microwave power supply, so it's impossible for it to make microwaves with the door open. Only an idiot would put a safety interlock under software control when a simple switch will do.


A lot of these look like pricing and marketing issues to me.

Fridges have been with us long enough in a ‘pay everything upfront’ setting that we‘d battle to the bitter end if we had to do micro-payments or aggressive planned obsolescence.

To your point, I lived in long-stay apartments where you put coins in to have the fridge and air conditioning work, because they didn't bother having pay-as-you-leave metered use. That's super rare (I hope? I'd expect the same in some predatory situations towards low-income people), but it's to say that alternatives exist.

Otherwise fridges randomly turning off is just a matter of time and/or build quality. Sooner or later it happens (or it stops turning on, which is arguably better, but you wouldn't say it's great).


> In my life I’ve never had a problem with a refrigerator, had maybe two problems with washer/dryer

I think blaming software for this is a little naive. Take a look at consumer reports for any modern fridge, stovetop/oven, washer/dryer, etc., and you will see complaints about fridge motors dying, touch panels going on the fritz, etc. -- none of which involve anything more than low-level firmware.

If you want to put a tinfoil hat on, you can consider that it may be planned obsolescence, but to put the blame squarely on software, I would disagree with.


> If you want to put a tinfoil hat on, you can consider that it may be planned obsolescence, but to put the blame squarely on software, I would disagree with.

You don't need a tinfoil hat when facing the truth :).

Also, while things you mentioned aren't software-related, they're modern tech-related. Like, last 20 years. Fridge motors dry out because they're being made cheaper, with not enough lubricant put into them and no option of opening them up and pouring in the grease. Touch panels are going on the fritz because touch panels suck (that's often very much software), and they shouldn't be put on appliances in the first place. But it's cheaper, so they are.

Worth noting that there wasn't some big price reduction happening from solid appliances 20 years ago to crappy ones today. Price remained roughly fixed, but appliances started to suck more and more.


Right, but the move towards cheaper and lower-quality is more the fault of the current economic system and its incentives than it is the fault of software.


It's very instructive to look back to the 70's when electronics running a little bit of software had just come into being.

The big deal, at first, was really with memory. Your alarm clock could ring at the same time reliably. If you invested in a VCR, it could record at a programmed time. If you had a synthesizer, it could store and recall exact preprogrammed patches. Pinball machines could downsize in weight and keep truly accurate scores instead of relying on temperamental relays and score reels. And so on, with every category of gadgets getting the computerization treatment. Although not everything succeeded, there were lots of straightforward cost and quality improvements, with the main downside being that IC designs are less obviously repairable.

And then pretty much every year afterward, the push was towards cheaper with more software, with decreasing margins of real improvement, with the "smart" device representing an endpoint where the product is often price discounted because its networking capability lets it collect and sell data.

What comes to mind is the Rube Goldberg machines and their expression of a past era of exuberant invention, where physical machines were becoming increasingly intricate in ways not entirely practical. Our software is kind of like that.


"Kind of" like that?

Every other week I read about someone's entirely-too-roundabout way of doing X via an IoT device (requiring packets to probably circumnavigate the globe). Meanwhile I'm sitting here opening my garage door with a physically wired switch like a pleb.


Why do we have a touch panel on a fridge in the first place? The only thing we need to be able to specify is desired temperature...


... and at that, one could argue that even that isn't necessary.

I just checked my fridge, it has six buttons and an LCD panel, and in all my (4) years of home ownership, I haven't touched the buttons a single time.


> Worth noting that there wasn't some big price reduction happening from solid appliances 20 years ago to crappy ones today. Price remained roughly fixed, but appliances started to suck more and more.

First, the "solid appliances" weren't 20 years ago, but more like 25-30.

And though there wasn't a big price reduction in the interim:

- Refrigerators are more energy efficient.

- Refrigerators have larger internal volume for a given size.

Equivalent improvements have been made to other appliance types such as washers and dryers, but not stoves, as far as I know.

Those improvements are largely orthogonal to declining design and build quality, but it should be noted that there are at least some ways in which newer appliances have been getting better (that aren't just gimmicky features) while prices remained the same.


Right, but my $400 Bose headphones have a broken integration with my $2400 MBP. Swiping the volume controls on the headphones also moves the balance.

Conveniently, because macOS is ass, it's nondeterministic whether the balance controls display in Sound Preferences to fix the balance issue. You just have to open and close the settings panel in the hopes that it will display.

I'm a software engineer and I don't even know where to begin to debug this idiocy.

Duolingo regularly freezes audio in Chrome. Once this happens, no audio will play in Chrome until you restart or kill "Utility: Audio Service" with the Chrome task manager.


This is the second time in as many weeks I've read a complaint about Bose headphones. The first was that their Bluetooth was so janky the audio itself was delayed multiple seconds and out of sync with video playing on the device.

That blew my mind, my $20 Amazon-purchased Bluetooth earphones just work™ with no delay.


I guess it depends how fussy you are. I've had several fridges/freezers/ovens that don't actually maintain the set temperature. For ovens this is merely annoying, but for fridges and especially freezers, this is a food safety issue. On fridges/freezers that have a dial instead of a digital temperature control, I've found that some just can't maintain a stable temperature, no matter how much I fiddle with them. After setting them "just right" with my own thermometer, I'll come back the next day to find an exploded bottle in my fridge or cold water in my ice tray.

Books work really well until a pipe bursts in your attic. Then you wake up and notice half your collection has been ruined (personal experience).


Well, digital services are much more complicated than a fridge, or a book. Not only that, but they also require a machine to run that is also orders of magnitude more complicated than a fridge or a closet.


Do you know how hard it is to produce a book from nothing? Making paper, all the same thickness, printing, and so on. Or just a steel tube... all of it can be done without computers and software. The process behind every step is difficult. I work in industrial automation and I can tell you, right now, with this quality of software and "computers everywhere", we are building a super-high tower on the softest sand.


Part of the complaint is that these things don't have to be as complicated as they are. The fridge with the touchscreen is usually more frustrating than the fridge from thirty years ago. The smart TV is usually more frustrating for its smart features.

Things are getting more complicated, like you say, but they frequently aren’t getting enough better to justify the added complexity, especially given all the issues that come along with it.


> Doesn’t reality suck the same?

To me, software is as if, when I open a book to read it, the book suddenly snaps itself shut, hurting my fingers.

Thereafter, the book gets wings, tries to fly away, but bumps into my coffee mug on my desk, so coffee spills on the floor. Then the book does fly out through the closed window — smashing the glass into pieces — and gets larger and larger wings, flying higher and higher until it disappears into the blue sky.

It's as if software was alive, and does random things I don't always agree with.

But actually — the bees building nests on the balcony: that feels pretty close to misbehaving software. Or the cat, bringing in a snake from outdoors. Or a squirrel chewing through a power cable, shutting down a city.


There is a difference between design trade-offs and flawed design or deviations from the design. Your car does what it's supposed to do, within the predictable limits imposed by the fact that it's a gas-powered car. Since macOS 10.15.4 or .5, my 16" MacBook Pro crashes waking up from sleep due to some glitch in waking up the GPU.

Of course, part of why people perceive that software sucks is that it's more complicated than they realize. I forget what book said it, but an operating system has more separate components than an aircraft carrier, and they're more tightly coupled. (I'm not sure that's true, but it conveys the idea.)


Houses, cars, etc are far more reliable and well designed than software. Think about all the extreme conditions cars continue to function in. How many people don't even follow basic maintenance schedules?

Another key difference is that in maintenance of your home, you have complete control. It's extremely easy to understand and act to improve or maintain it. When large software systems (like the IRS login) have problems, you are totally helpless.


> Houses, cars, etc are far more reliable and well designed than software

Buy software at the price of a house and you'll be right to expect the same build quality.

Then even at the price of your house you'll have fun with mold growing inside the walls, soil that degrades in unexpected ways after heavy rain hits the hill you're built on, and rooms that were fresh and bright enough on a hot summer day when you visited but turn out way darker and gloomier in winter than you expected because of the overall orientation. And you'll pay for that house for the next 20 years.

Cars are the same at a lower level, and you see small issues creep up as you lower your budget (or go buy a fancy vintage Italian car and you're in for a wild ride).

> Another key difference is that in maintenance of your home, you have complete control.

In the good old days people had timers on their desks to remember to restart programs before they crashed. Same for saving stuff, making backups, etc.

Of course online services are a different beast, but it's more akin to fighting bureaucracy, which I see as our society's software in a way, with the shitty forms with not enough space for your name and other niceties.


This is a straw man argument.

Cars vary widely in their product quality. Houses vary widely in their product quality. Some things in life are inevitable facts of nature, but product quality is not. Quality is to a large extent determined by the time and care taken by the manufacturer.


>My gas car stinks, destroys the planet, needs yearly maintenance, crashes in everything the second I stop paying attention.

That's not a good example, nor is it parallel to the dynamic the article describes.

Your car stinks a lot less than cars did 10/30/50 years ago (emits less in the way of pollutants or CO2 per mile driven), is less likely to kill you in a crash involving the same size cars/velocities (despite weighing less!), needs less maintenance, lasts longer, and can notify you of potential collisions and sometimes avoid them.

It's probably only worse in terms of education needed to perform maintenance or nominal sticker price.


But devices have also gotten smaller, lighter and more efficient, and software can also do much more today than it could a long time ago. I think the analogy is fine.


When the analogy was built off of saying that cars are bad by metric X, when the claim was that software has gotten worse by metric X, and cars have actually gotten better by metric X, no, it's not a good analogy.

And yes, there are some ways in which hardware has improved. But the claim is that, judged by what you're using it for, most UX-apparent aspects have gotten worse. Is there a clear way this is wrong? If you look at most UX-apparent metrics, it hasn't. Latency from keystroke to character render has gotten worse. App start time has gotten worse. Lots of other latencies have gotten worse.

None of the nightmares described in the article were typical of software UX.

These would be arguably fine if the additional features you get were a necessary cost, but they're not.

I'm also not sure that devices have gotten more efficient in all respects. Each version of iOS gets more energy-intensive, for example.


> Latency from keystroke to character render has gotten worse. App start time has gotten worse. Lots of other latencies have gotten worse.

Do you have sources for this? I mean, I'm not sure there aren't rose-tinted glasses here.

> These would be arguably fine if the additional features you get were a necessary cost, but they're not.

> Each version of iOS gets more energy-intensive, for example.

I would argue that multitasking, camera postprocessing, widgets, background app refresh, and others are all features worthy of more resource usage. Many of those are things you can choose not to use if you want to save power.


Increasing keystroke latency was discussed on HN before: https://news.ycombinator.com/item?id=15485672

>I would argue that multitasking, camera postprocessing, widgets, background app refresh, and others are all features worthy of more resource usage. Many of those are things you can choose not to use if you want to save power.

Even with all those features turned off (before and after), the usage increases with each version.


Not the same; we expect the problems you mention. There are just some laws of nature that we get used to dealing with. Tech has the tendency to produce random problems. The one we have all dealt with is: everything was working fine and then suddenly stopped. You call tech support, and after an hour of troubleshooting it with them we get the "We've never seen this before. It must be caused by one of your other SUCKEE tech toys." Ahhhhhhhhhh...


I think in 20~50 years those random software issues will be what we've known all our lives, basically what reality is, and we'll just give warm patronizing looks to kids complaining that stuff doesn't work.


software doesn't decay, it just sucks even if you preserve it


I know a product that can do more damage with an accidental backspace: the iMessage app for MacOS (Messages).

If you're any sort of power user, you likely know that you can backspace by the word instead of by the character, using Ctrl + BS on Linux or Cmd + BS on Mac.

In the Messages app, the shortcut to delete your _entire chat history_ is also Cmd + BS, and it works even if your caret is in the text box. So if you type five words and then Cmd + BS six times, you will be prompted to delete your entire chat history.

I do this almost every day. So far I've never compulsively hit return but I am dreading the day it happens.


Isn't alt-BS "delete word"? cmd-BS is "delete to beginning of line" for me.


This always annoyed me but it looks like this behavior is gone in Big Sur. You can ⌘⌫ to your heart's content.


It's funny how small a feature can make you want to upgrade.


Gnome Notes has similar behavior: whenever you use Ctrl + BS, the note you are currently writing gets put into the trash, even though you just wanted to delete a word quickly. You can recover the note, so it's not the worst possible behavior, but it still sucks.


Option-Delete is whole word delete.


Cocoa uses the Emacs-style GNU readline keys in all the text fields; just use C-w.


> lose all the data you typed into a web form if you accidentally backspace while not in a text field, because it navigates off the page

Nowadays, I would consider this a problem with the browser. How often does one navigate backwards with the backspace key?

Recently, I had some doubts over whether or not I should clobber the native browser behaviour for "ctrl-s", but then I realized that nobody anywhere EVER saves a web page to disk... and if they really needed to, the browser toolbar is right there.
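
For what it's worth, the page side can guard against both problems. Here's a minimal TypeScript sketch (purely illustrative; saveDraft is a made-up stand-in for whatever the app's own save action would be), assuming a browser that still treats a stray Backspace as "go back":

    // Hypothetical guard: swallow Backspace outside editable fields and
    // reroute Ctrl/Cmd-S to an in-app save instead of "save page to disk".
    function saveDraft(): void {
      console.log("draft saved"); // stand-in for the app's real save logic
    }

    document.addEventListener("keydown", (e: KeyboardEvent) => {
      const target = e.target as HTMLElement;
      const isEditable =
        target instanceof HTMLInputElement ||
        target instanceof HTMLTextAreaElement ||
        target.isContentEditable;

      // Don't let Backspace navigate away and discard the form.
      if (e.key === "Backspace" && !isEditable) {
        e.preventDefault();
      }

      // Intercept Ctrl-S / Cmd-S before the browser's save dialog appears.
      if ((e.ctrlKey || e.metaKey) && e.key.toLowerCase() === "s") {
        e.preventDefault();
        saveDraft();
      }
    });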


I for one fully expect the backspace button to work if I do not currently have a text field focused. Once you learn keybindings for an application, there's usually no way to perform that function as quickly/efficiently with a mouse. Please do not break the conventional ones.

Ctrl-S is probably fine to break though. Even when it does "save" the page, it rarely does so in a useful way.


On Mac the other default for navigating backwards is Cmd-[, if that helps any! (It's also a default in many applications for navigating backwards in whatever sense the app may intend.)

I don't use Windows other than for gaming, so I'm afraid I don't know if there are other shortcuts other than backspace.


Alt-Left is back and Alt-Right is forward. Sadly, their equivalents on Mac are also for navigating to the beginning and end of the line so if you're in a text box you lose that control.

Your alternative is handy. I wonder if it also works on Linux.


Command-[/] for history navigation and command-{/} for switching tabs are macOS conventional keybindings that work almost anywhere there’s a concept of history or tabs.

They may work on Linux in an attempt to support Mac users.


This is backwards. macOS inherited Emacs keybindings. They work in dialogs, etc.

https://stackoverflow.com/questions/25275598/a-list-of-all-e...


I could be wrong, but I don’t think these are Emacs keybindings, although the Cocoa text input system is definitely heavily influenced by Emacs, both in design and in this sort of detail.

The Emacs-derived keybindings use Control on macOS; the Mac-native ones use Command.


FYI, you can get pretty far on the basic shortcuts by subbing Ctrl for Cmd.


Oh, I fully expect CTRL-Q to close my browser window (with all the tabs) when I mistype CTRL-W. That doesn't make it any good.

Do you know at all times what element has the focus? An error there can have serious consequences. (Even though browsers do make an effort to refill forms when you navigate forward again, it doesn't always work.)

It is a very bad shortcut, and an alternative has to exist anyway, because backspace isn't always available (for example, when a text field has focus).


FWIW, on Chrome on Mac, you can configure it so that pressing Cmd-Q won't close the browser unless you hold it for a few seconds.


I’ve just tested on Chrome: backspace on its own does nothing, going to the previous page is bound to cmd-LeftArrow.


Chrome made this intentional change about four years ago. Prior to that, backspace had the described behavior.


I'm one of those holdout Firefox users. Every day I come across more websites that only work in Chrome though :/


Edge, or at least the Chromium version of it, has changed the navigate back key to be ALT+Left Arrow instead of the backspace key. It was annoying at first because I have over two decades of muscle memory for hitting backspace to go back. After a couple of days I got used to it and now am happy I can backspace without accidentally navigating away because focus wasn't where I thought it was.


Alt+Left/Right navigating through history has been in IE since at least IE4. I don't recall if it worked before then, and trying to look up ~25 year old documentation is somewhat difficult (especially since it would have shipped in a WinHelp file with the software instead).


Since backspace worked in every IE version up until Edge/Chromium, I never knew the ALT+arrows combinations worked. It wasn't until backspace stopped working that I did a search to find what was going on and saw the ALT key combos. So they may have been there all along but I never had reason to find out about them until recently.


Pretty sure I've been using Alt-Left as my back shortcut in Firefox for over a decade also


This was a change that Chromium introduced in July 2016. I remember it being slightly controversial at the time and the issue from the tracker being posted to HN :)


> but then I realized that nobody anywhere EVER saves a web page to disk...

Some people do it all the time. I was emailed a saved page the other day.

I was responsible for a single page web app, and the error detection code was stored in a <script> tag within the page, so I got plenty of “errors” logged for people trying to access saved pages.


https://xkcd.com/1172/ :)

Literally murdering children over here. I just knew somebody was going to come along and prove me wrong!


> I find bugs in every software I use (my freaking microwave oven control panel!).

My dishwasher, which has only buttons to select what to do during the next wash cycle, has a firmware bug.

Sometimes when the door is closed, it will start one of the pumps. If I cycle "heated drying" on and then off again, the pump will stop. I figured this out because, well, I've worked on firmware and I understand how software can be stupid.

After I learned to recognize watchdog resets, I started seeing them more and more often, and became even more terrified of how bad software is.


After I learned to recognize watchdog resets, I started seeing them more and more often

Yup, sounds like my TV. It's not even one of the smart ones, I was careful to avoid those. But once every few days, it stops responding to the remote control when performing some action (opening the EPG, switching channels). I then have to wait about ten seconds for the display to go dark and the TV to "reboot" itself, so I can continue channel surfing.


I prefer open-source tools because I know where the agency for pain lies: myself. With modern Rust and Go tooling you can download an application's source code, modify it, and compile it painlessly in half an hour.

So why do I tolerate bugs in software like that? Because I know I can fix them. And I also know I won't always. Small gods have handed me tools to remake the world as I would see fit and I do not use them. Are they at fault for not having made the world as I would prefer? Or am I at fault for not using the tools?

In any case, I've noticed a sort of dichotomy among users in their reaction to tools that fail. There are those who go "this tool sucks how can I do my work" and there are those who go "my work is what I want to do which tool can I use instead". The latter set get a lot more done. Since observing this, I have attempted to modify my behaviour to be like the latter, and have become more effective.


> Small gods have handed me tools to remake the world as I would see fit and I do not use them.

But they didn't give you the only tool you really needed: Time.

Having meaningful access to the source is very important, but its value is limited because even small improvements often take a large amount of time, especially in code you're unfamiliar with. Once you've made that improvement, maintaining it (or upstreaming it so someone else might maintain it) can take a tremendous amount of time.


Haha, yes. Time is a thing no one can make for me. But they cannot make it for themselves either. So I take what is offered gratefully and make what I can from it.


> "my work is what I want to do which tool can I use instead"

I can't wrap my head around this, could you explain further?


For amusement's sake I removed punctuation. The contrast is between people who find themselves unable to move when a tool sucks and people who just solve a problem. The latter just move on from tools that don't solve their problem or solve it poorly. If the problem is big enough, they solve it themselves.

They don't have to be engineers. They'll pay $100 / mo to solve it or hire a guy on Upwork to solve it or cobble something together on Zapier + Airtable. The thing is, the tool is insignificant. They don't really spend an appreciable amount of time on complaining about it because it's faster to stop using it.


I can easily see myself taking the latter course of action, but I do question how sustainable it is. I wouldn't personally enter a business just to make a bag of money and then quickly leave, so I'd mix approaches. How about you?


I think that sounds sound.


>I don’t know what has to happen for people to stop tolerating software as it is.

What we've got here is a question of cost and choice. If my choices are all equally bad, i.e. vendor one is no worse than vendor two, then inertia or cost become the determining factors. In terms of consumer software, consumers have been conditioned to have low expectations, and switching costs are further reduced because prices are so often free or very low. As for commercial-focused packages, we so often put up with it because the systems we're using are so complicated and specialized that the pool of options is limited, and/or the domain is so complicated that problems are inevitable.

So long as this is the landscape, few software producers have incentives to do the things necessary to improve, and/or they believe they can spread the cost of improvement over a long period, i.e. not make the investment until the pain is too great.


> I don’t know what has to happen for people to stop tolerating software as it is.

Deaths, a lot of deaths.

Software engineering needs a PE-style licensure and a union. We need a way to stand together to advocate for better working conditions, practices, and tools.


What is the evidence that professional licensing would help? Even the most skilled programmers produce bugs. Professional licensing would only raise the barrier to entry for a well-paying job.


I completely agree. But I fear that if this happens then the companies will look for ways to produce software with less friction compared to the now-stricter software development practice in general. And it would be "let's hire Indians for peanuts" all over again.

Really not sure how we get out of this corner that we've all collectively painted ourselves into.


I am the same. I've had a motto for more than a decade, actually: the more software you add to something, the worse it gets.

I coined it while watching a robotic soda vending machine crash and reboot frequently.


Discipline and professional responsibility on the behalf of programmers, coupled with patience.

Everyone is in such an irrational hurry; it's been built into the "culture" such that rushing and making messes is acceptable. And by extension, customers expect things to be shit, so you don't get in much trouble for doing it.

It's a feedback loop that only stops if companies (and individual programmers) start taking pride in craft > careless speed and money like they used to do back in the 50s/60s.


> It's a feedback loop that only stops if companies (and individual programmers) start taking pride in craft

Most programmers, and many companies, want to produce something of quality, well crafted.

The drive for low-quality kibble comes directly from consumers, and from the difficulty and cost of judging value.

A consumer can’t be expected to be a UI expert, and a slightly better UI might not drive sales because other factors are more important. I try to buy hardware with good UI, but I often make compromises for other factors.


That's a cop-out for not doing the work. The choice is still in the hands of the people making the thing or offering the service.


> accidentally backspace while not in a text field

Thankfully that can be disabled, but I find it to be one of the most infuriating 'features' of Firefox. If it weren't something that could be turned off, it would be a deal killer all by itself.


Well, Chrome used to do this too until they removed it in 2016. Internet Explorer does this. It was more of a default than a 'feature of Firefox'.


Also, in most normal web pages—at least those not using hideously over-complicated scripting—just pressing your "forward" button (Alt+Right, for example) will navigate forward and restore the edits you had in the form.


I didn't even know this "feature" was available on Firefox (despite using it since version 2). Turns out, it is only enabled on Windows:

http://kb.mozillazine.org/Browser.backspace_action

Maybe it was copied from IE to keep "closer to the platform"?


It's definitely not limited to Windows. I am exclusively a macOS user (for my workstation, at least) and I have had to disable it on every new Firefox install.


My Windows 10 Desktop runs very well.

My FreeNAS, Arch Linux and my Android phone as well.

I think we get paid because we are building new stuff and have to maintain shitty stuff. If my job were literally just designing it at a high level, clicking it together, and then it works, no one would need me.

Yes, it's frustrating sometimes.

I would like to cure cancer instead of debugging why this update broke our system.


My microwave will run for a split second if you hit the start button with no time entered. I suspect it runs the cook routine with the time variable at 0 seconds: there's no guard against 0, so it just runs until the countdown part of the code kicks in.

I’ve microwaved a burrito by mashing the start button hundreds of times.
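
If that guess is right, the fix would be a one-line guard before the magnetron ever turns on. Here's a toy TypeScript sketch of the suspected control flow (entirely hypothetical; the function names are made up, and real microwave firmware obviously isn't written like this):

    // Suspected bug: start energises the magnetron first and relies on the
    // countdown to turn it off, so a zero-second start still fires briefly.
    let remainingSeconds = 0;

    function magnetronOn(): void { console.log("magnetron on"); }
    function magnetronOff(): void { console.log("magnetron off"); }

    function startPressed(): void {
      // if (remainingSeconds <= 0) return;  // <-- the missing guard
      magnetronOn();
      tickCountdown();
    }

    function tickCountdown(): void {
      if (remainingSeconds <= 0) {
        magnetronOff(); // reached almost immediately when no time was entered
        return;
      }
      remainingSeconds -= 1;
      setTimeout(tickCountdown, 1000);
    }

    startPressed(); // with remainingSeconds === 0: turns on, then straight back off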


The thing that drives me absolutely bonkers is incompatible "upgrades".


Software has up to 2^n possible paths a piece of code can take when it has n branch points. Also, you cannot simulate the rest of the world (externalities).

If I design a bracket for a TV mount, do you blame the bracket when someone hangs a 3000 kg bookcase on it?

You expect software to be perfect, yet ignore the massive limits everything in the world has.

Not saying there aren’t quality issues with software. I’m saying software development is really difficult.
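
To make the combinatorics concrete, here's a toy TypeScript sketch (illustrative only, not tied to any real product): just three independent branch points already give 2^3 = 8 distinct paths, and real code has hundreds, plus external state you can't simulate.

    // Each boolean flag is an independent branch point, so n flags give
    // up to 2^n distinct paths through the function.
    function handleRequest(cached: boolean, loggedIn: boolean, mobile: boolean): string {
      let route = cached ? "cache" : "origin";
      route += loggedIn ? "/personalised" : "/anonymous";
      route += mobile ? "/mobile" : "/desktop";
      return route;
    }

    // Enumerating every path is easy with 3 flags and hopeless with 300.
    const flags = [true, false];
    const paths = new Set<string>();
    for (const c of flags) {
      for (const l of flags) {
        for (const m of flags) {
          paths.add(handleRequest(c, l, m));
        }
      }
    }
    console.log(paths.size); // 8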


Cobbler’s children have no shoes.


Software is indeed downright hostile sometimes... it's gotten better with autosave... but your IDE, MS Word, or Photoshop crashing would sometimes cost you half a day of work if not more. It was infuriating and even discouraging sometimes until you learned the subconscious behavior of pressing ctrl-s all the time.


> my freaking microwave oven control panel

You mean where 99 is greater than 100? (Punch in 99 and you get 99 seconds; punch in 100 and you get 1:00, which is only 60.)


As an experiment, I logged every bug I encountered for a few days. I averaged 5 bugs a day, not counting dark patterns or bad design.

Everything is broken, and nobody is upset. [1]

Some of the software I use is so unreliable that I expect it to fail. I expect the Vodafone login page not to work properly. I expect one of my airpods not to connect on the first try. I expect my banking app to show random error messages, even though it works just fine. Most online stock brokers have issues at the worst possible times. My bookkeeping app is frequently wrong, per my tax advisor. Since everything is broken, the best I can do is to mentally assign all those apps a trustworthiness score, and avoid betting too much on them.

The worst part is that support for all that software has been largely automated. If you have a problem that can't be fixed by a chatbot or a crowdsourced support community, you are largely helpless. Google can wipe everything you love, and there's no one to punch in the face (to borrow from Grapes of Wrath).

So far, my only solution to this is to be a late adopter, and to favour simplicity over sophistication. I was recently considering going from paper notebooks to a tablet. That initiative stopped at the electronics store. The Surface Go wanted me to go through a setup wizard (after dismissing a few notifications). Two of the 4 iPads had working pencils. The ReMarkable reviews mention a host of issues. I never encountered any bugs with my Moleskine. It pairs flawlessly with any pencil I want, including older models.

[1] https://www.hanselman.com/blog/EverythingsBrokenAndNobodysUp...


Thank you for mentioning AirPods! I was so excited to buy them after a ton of reviews saying they just worked. They said the days of fighting with Bluetooth audio were gone. I believed them, then reality showed up.

My Mac sometimes unpairs them and, worse, sometimes doesn’t find them at all. Sometimes while my baby is sleeping I put my AirPods in and play a loud video, only to realize they were not connected. My wife’s right AirPod just stopped working after one year of use...

If Apple is considered top tier in reliability, then technology in general really does just suck!


To provide an anecdotal counter-example, AirPods have worked seamlessly for me so far. Much quicker to connect and more reliable than Bluetooth. So their marketing isn't completely off-base :)


I was under the impression that AirPods works over Bluetooth, did Apple really invent their own wireless technology + protocol just for the AirPods?


It does (they can be used with Android devices); AFAIK only the pairing part is proprietary.


Shh nobody tell him AirPods are just bluetooth headphones :)

https://www.apple.com/airpods-pro/specs/


But they're not; they're one specific implementation of BT with some proprietary secret sauce for the multi-device pairing etc.


My manager had to buy new headphones: when she is WFH she used AirPods and an iPhone to connect to calls, and the AirPods constantly cycled, leaving garbage audio whenever she tried to speak.

I'm using a BT Jabra headset with noise cancelling that I got for about the same cost: 16+ hours of battery, easy pairing, a super useful phone app, great ANC, and solid audio quality, at least to a non-audiophile. My biggest complaint is that the closed-back design leaves my ears a bit irritated after 4+ hours of use. Not an AirPods competitor, but for the cost I am way happier.


I use my AirPods with multiple devices, and that process is also fraught with problems. Switching to another device takes an absurdly long time. About once per month, switching will make bluetoothaudiod peg a core and ultimately hard crash the entire computer. Yes, the kernel panics if it doesn't hear from this userspace process frequently enough.

Surprisingly, iCloud syncing works fine. If I pair my AirPods with one device, it always pairs with all of them.


I accidentally coughed whiskey through my nose when I read that.


They work well enough, but they don't work magically well.

The main issue is with the right pod not always turning on when I take it out of the case. The solution is to put it back in the case for 5 seconds and to try again.

The second most important issue is the airpods falling out of sync with each other. It seems like the signal from my Samsung S9 in my pocket is choppy. Looking left or right for too long will make the signal drop. Putting my hands in my pockets also will. If I put the phone in my backpack, it's okay.

This is still more pleasant than wired headphones, but it's far from a magical experience.


I really don't get it.

Personally, I hate ear buds and, as such, never bought ear buds. Rather, I spent ~$20 on SoundBot bluetooth headphones starting some five years ago (long before air pods, methinks) and haven't had problems with them at all.

I also have a seven year-old phone (HTC OneMax) running custom (unofficial/ported by a random hacker) Android[0], and it pretty much works.

Sure the battery life has degraded since 2014, but that's to be expected, no? I wish I could replace the battery (as I did with my 15+ year-old Panasonic cordless phones), but there really aren't too many mainstream mobile devices that allow that any more.

As for poor quality software/hardware, if you don't like it, vote with your feet and/or wallet.

If stuff doesn't work, why use it? Even more, if stuff doesn't work and you can't/won't fix it yourself, then don't use it.

Software devs and hardware manufacturers don't care about whiny blog posts or complaints on HN, they care about the bottom line. Impact the bottom line and you may have a chance at improvement.

Stuff that actually addresses the issue is useful. A great example is the HTC OneMax mentioned above: Android support ended at 4.4/KitKat, and CyanogenMod/LineageOS abandoned it in 2017. Those impacted (myself included, although I'd never hacked on Android before -- and failed miserably -- thankfully someone else did not) took action to bring the latest Android to an old, unsupported, discontinued device.

If you're not taking positive action toward making things better (whether that's fixing the problems or voting with your feet/wallet), then you're not going to have any impact.

While whinging about it on your blog may be a way to relieve the stress you feel about whatever issue(s) you may have, it's not constructive or useful.

That is, unless your goal is to get lots of comments on HN where the Apple fanbois sagely agree and lament that there's nothing to be done about it, because Apple is the pinnacle of tech, no one could possibly do anything better than Apple (or the apps that run on their gear), and therefore all technology sucks.

And that's objectively false. There's lots of tech out there that's quite good. I suggest using that and shunning rather than using, then whinging about the stuff that sucks.

[0] https://forum.xda-developers.com/htc-one-max/rom-lineageos-1...

Edit: Fixed typos/formatting issues.


> I was recently considering going from paper notebooks to a tablet. That initiative stopped at the electronics store.

Good, you dodged a bullet there.

I mean, I love my 2-in-1 Dell (a slightly cheaper but still high-end Surface-like device). The pen, as useful as it is (I'm not even considering buying a touchscreen-enabled device without solid pen support anymore; it's so much better UX than fingers), still has lots of subtle and annoying bugs. Maybe in 20 years people will work out the kinks. More likely, the concept will be abandoned in favor of some new modality that will also never be perfected.


>Everything is broken, and nobody is upset.

Most software is still net positive in productivity. We tend to place more emphasis on failures as users.

Remember you're running millions of lines of code that talks to other computers running millions of lines of code that communicates over a network running millions of lines of code to deliver some information on the order of seconds to minutes -- and then something responds to that information and everything happens all over again.

All day, every day, trillions of packets of information get delivered just fine. Try doing that as a human, delivering letters. You probably won't even approach a million packets delivered in your lifetime. And people have the audacity to say, "oh my, some things didn't work, this is completely broken."

In only a single generation, we went from voice communicators to super computers in our pockets. The utility vastly, vastly, vastly overshadows the glitches that come with frenetic advancement. How long did it take humans to invent basic numbers?


Everyone loves to complain and take for granted what good they have!


> So far, my only solution to this is to be a late adopter, and to favour simplicity over sophistication. I was recently considering going from paper notebooks to a tablet. That initiative stopped at the electronics store. The Surface Go wanted me to go through a setup wizard (after dismissing a few notifications). Two of the 4 iPads had working pencils. The ReMarkable reviews mention a host of issues. I never encountered any bugs with my Moleskine. It pairs flawlessly with any pencil I want, including older models.

Not to mention that it:

- doesn't need charging

- never freezes or crashes

- is much cheaper than a laptop or tablet

- is distraction-free (no Internet, no apps, etc.)


All of those were on my list of cons, particularly the lack of distractions. I avoided the Surface completely because Windows is anything but quiet and maintenance-free.

The iPad seemed pretty solid, but I'd have to turn it on and unlock it to see my notes, unlike a notebook.

The ReMarkable seemed nice, but there are lots of complaints that paper doesn't have.

The Supernote A6X was the most promising, but it was hard to get in Germany.


I researched the ReMarkable 1 and 2. I ended up just getting a Rocketbook. It's a very simple concept: "paper" in the form of hard plastic. The pen it uses is the Pilot FriXion, which is an erasable pen. Hence you have a notebook to record notes for a while. If there's anything important, I'll manually transfer it to my OneNote (I don't use Rocketbook's picture-taking app).

Most notes I take only need to exist for a few weeks and then I erase...so transferring it to "long term storage" is rare.

I do have an iPad and note taking apps like Notability if I know something will need to go to "long term storage" but I find I use the Rocketbook more.


I was looking to replace the A6 sketchbook I carry everywhere, and the A5 notebook that always sits in front of my keyboard.

I thought it would be nice to access my notes when I don't have my notebook on me, and to have layers, zooming, undos etc. However, the more I look into it, the more absurd it seems.

I'm replacing a 15€ notebook and a 2€ mechanical pencil with a 400€ gadget that doesn't quite work. Why? So that I can spend my time organising notes in a digital space. Why? I don't really know.

It would be cool to have layers, zooming and an undo button. It would also be cool to have access to my notes even when I don't have my notebook. However, it would just be cool. It doesn't actually solve a serious problem.


The big draw for me for a Remarkable is being able to hand-sign and annotate PDFs without printing them out or using Adobe Reader's atrocious support for annotations. I'd love to be able to say that e-signing is the future but nobody really accepts it without question.

I'd also be replacing the piles and piles of legal pads I go through every year. Most of the time the notes are ephemeral except when I'm working across from someone in which case I really need them to exist digitally so I don't lose them.

I just wish I didn't have to wait 6 months for the second version.


The problem is with the idea of "continuous delivery". Many people fail to understand that technological advances only increase productivity if the innovations, the great leaps forward, are relatively rare, with long periods of stability and refinement in between.

There's always an adjustment period, where people have to spend time learning a new technology, and any issues with the new technology need to be resolved. The gains in productivity happen mainly after the adjustment period. But we've eliminated the periods of stability and are constantly pushing for more "innovation", which means we're in constant periods of adjustment and resolving problems, where the promise of increased productivity is never fully met.

The worst idea ever in technology is regularly scheduled updates. Innovation has never and will never happen on a schedule. This is simply greed-driven, promotion-driven, pointy-haired-boss-driven development.

Produce something new and great... but then let us all enjoy the new thing for a while. Novelty for its own sake is not productive.


> The worst idea ever in technology is regularly scheduled updates. Innovation has never and will never happen on a schedule. This is simply greed-driven, promotion-driven, pointy-haired-boss-driven development.

this is sort of uncharitable. the development/maintenance cycle for software is incompatible with the traditional way of monetizing a product (ie, design it up front, manufacture at scale, and then the buyer gets what they get, barring severe safety defects). buyers of software expect the product to at least mostly work in the first place, but they also expect bugs to continue to be fixed after the sale, even if the bugs are introduced through unforeseeable interactions with other software.

imo, subscriptions are actually the ideal way of aligning incentives for products that involve ongoing maintenance. but buyers tend to consider this a ripoff if they don't actually see a stream of new features in development. while it introduces some unfortunate constraints in the dev cycle, bundling up features in a scheduled update is a good way to make it visible to users that their subscription dollars aren't just falling into a black hole. trickling out new features "when they're done" earns the respect of engineers, but results in the average user simply not noticing that progress is being made.


> buyers of software expect the product to at least mostly work in the first place, but they also expect bugs to continue to be fixed after the sale,

Contrast that to traditional physical goods, where buyers expect the product to work as advertised, right out of the box, or their money back. Software in the Internet era has it easy, because it gets to release shitty half-finished versions, and then keep charging money while never quite finishing the software.

> subscriptions are actually the ideal way of aligning incentives for products that involve ongoing maintenance. but buyers tend to consider this a ripoff if they don't actually see a stream of new features in development

Because software does not decay on its own (despite the misleading term "bitrot" being popular in tech circles). That's literally why digitizing data has taken the world by storm: digital data does not decay; as long as the physical medium is readable, you can make a perfect copy of the data it contains. As a buyer, I don't expect my software to need maintenance. I expect it to work out of the box (just like I expect every physical product to work out of the box), and once I find software that fulfills my needs, I expect it to work all the way until computing technology moves forward so much that it's no longer possible to run the software. Which, in the era of virtual machines, may take decades.

So yeah, there's a need to clearly justify to the customers why you're charging subscription, because software in its natural state does not need maintenance.


>As a buyer, I don't expect my software to need maintenance. I expect it to work out of the box (just like I expect every physical product to work out of the box)

Software is far more complex than most physical products. There are only so many failure modes for a screwdriver or a couch and they're all pretty foreseeable. The most complicated physical systems, like a car, house, or even a human body, do need maintenance.

I'm frequently frustrated with software bugs like everyone else (my building and apartment have this awful smartlock system that's riddled with bugs and which bricked me out of my own home due to a bad app update a few weeks ago--shoutout to Latch) but I'm not sure I'm on board with an anti-maintenance attitude. If there are bugs I'd like them to be fixed!


> I'm not sure I'm on aboard with an anti-maintenance attitude. If there are bugs I'd like them to be fixed!

Me too! I'm not trying to be anti-maintenance (though I do wish technology would develop towards requiring less and less maintenance, but that's another topic). I'm pro-quality. The impression I have is that the maintenance burden on software is being created in order to justify the subscription model - and that the ability to do post-release updates has made vendors and devs no longer care about delivering reliable, quality software (customers become the new QA, and bugfixes can always be added later, except they tend to be deprioritized in favor of new features).

Note I'm not postulating a conspiracy theory, just a spontaneous coordination of the entire industry due to market incentives. But the effect is still there, and I feel it needs to be countered.


> Because software does not decay on its own (despite the misleading term "bitrot" being popular in tech circles).

in theory yes, in practice no. I work on a product that targets windows and macos. on windows yeah, a version of our software from 2015 probably works as well as it did the day it was released. apple deprecates stuff in their API every year that we have to go back and update. they also break a lot of stuff that isn't formally deprecated and we have to find workarounds for that too. "our software will work forever as long as you never update your OS" is not acceptable to most customers.


Sure, but until quite recently, OSes were not updated at such a breakneck pace. At this point we're hitting what can be seen as an industry-wide, self-justifying scam: software needs subscriptions because it's being continuously updated, and it's being continuously updated in big part because every other piece of software is being continuously updated.

Still, as a Linux and Windows user, I've absolutely grown to expect my desktop software to work 10 years or more without updates. After that, I can always spin up a VM with an older Windows version.


I find that what causes SW to break most often is changes to the OS, either other app updates or OS updates.

I work in a hardware company, and for any important function I usually set up a dedicated computer, install the software, and then never touch it, ever.

This is how the more sophisticated oscilloscopes etc. work. They often have Windows XP installed if you buy them used. Simple, doesn't break, no internet; if it's mission-critical or expensive, it's worth a dedicated and frozen computer.


> they also expect bugs to continue to be fixed after the sale

I wasn't disagreeing with that. I mentioned "long periods of stability and refinement" — refinement including bug fixes — and "any issues with the new technology need to be resolved". But again, bug fixes don't magically happen on a schedule either. Maybe fixes are easy, maybe they're hard, you never know in advance.

> bundling up features in a scheduled update is a good way to make it visible to users that their subscription dollars aren't just falling into a black hole

This is exactly why it's not true that "subscriptions are actually the ideal way of aligning incentives for products that involve ongoing maintenance". Instead of maintenance, subscriptions incentivize continuous delivery of new features, and consequently continuous delivery of new bugs.


I suspect this is a reversal of cause and effect.

Did consumers demand subscription services? Or did vendors (led by Adobe) decide to change to subscriptions to get uniform cash flow?

At the agencies I have worked at, all the creatives I worked with would prefer to spend $200-400 and have a permanent software license. Perhaps this isn't a representative group.


I'm not saying consumers demanded subscriptions. vendors push them because it's a saner way of managing revenue for products that require maintenance post-sale anyway.

that said, I think consumers would prefer subscriptions if they understood how it aligned incentives. one way or another, a product will stop receiving support when the money stops flowing in. with a permanent license, it ends when people stop buying licenses. with subscriptions, it continues as long as enough people keep paying.


> I think consumers would prefer subscriptions if they understood how it aligned incentives.

Funny how consumers tend to despise them though.

The term "subscription" is itself typically a euphemism for "rental". There are a small number of companies who offer a year of updates that you get to keep forever (which makes consumers play the game of when exactly to buy to maximize the new features in that year), but most so-called subscriptions disable the software entirely if you stop paying. In other words, rental.

Long-term rental is almost always a bad deal for consumers. One of the few exceptions is housing, because many consumers can't afford to buy a house, and also because a house is one of the least liquid assets you can own if you ever have to move (it and yourself). Otherwise, rental is going to cost you a lot more in the long run.

Financially, rental can work well for the seller, of course, but we end up with "subscription fatigue", where the market can't sustain as many sellers, and the few rich companies get richer (which is exactly why they were "pioneered" by BigCos such as Adobe, Microsoft, and Apple).


> Funny how consumers tend to despise them though.

sure, and as an individual I behave the same way. I always want to solve my problem in the cheapest possible way. still, I can't help but notice that products with stable ongoing revenue tend to get much better support.

I think the clearest example is with games. most games get released with a pile of bugs. a bunch get fixed in a release day patch and then there are a few more patches over the next few months (when most of the sales happen). once the initial wave of sales subsides, you tend to be stuck with whatever bugs remain. cs:source had several game breaking bugs for years (defuse kit over bomb blocking defuse, defusing through walls, etc.) despite being one of the most popular FPS titles of its time. AFAIK, most of these still exist fifteen years later. csgo, which is monetized through microtransactions, gets bugs fixed almost as fast as they can be posted to reddit/youtube. microtransactions aren't quite the same as subscriptions, of course, but they generate revenue proportional to the current userbase, rather than the rate that people buy the game for the first time (which will inevitably dry up).


> that said, I think consumers would prefer subscriptions if they understood how it aligned incentives.

Actually, I think subscriptions misalign incentives. With subscriptions, it becomes important for the vendors to keep releasing updates (so that the customers feel like they're getting value out of the subscription), which means having bug-free software is a terrible idea. You'd need to either release intentionally buggy software (so you can ship a follow-up version to fix it) or go on a feature treadmill (in which case trying to stabilize has rapidly diminishing returns and high opportunity cost).

As a consumer, software that was developed knowing it could never be fixed and had to be right the first try is much better (even if it still has bugs). Mario 64 had bugs (e.g. the backwards jump going really fast), but they weren't really noticeable in normal gameplay, because the developers couldn't just ship an update nearly the size of the whole game before you start to play.


No, subscriptions introduce the perverse incentive to release unfinished products and slowly drip-feed fixes. Having to test extensively before a one-and-only release may not drive nearly as much revenue, nor provide a running deliverable stream for a given engineer/product manager's CV, but it is clearly the better user experience.


you say this, but given a choice between a "finished" product and a competitor with more bugs and more wanted/needed features, users will almost always pick the latter unless it catastrophically affects their workflow. engineers care about quality; customers care about the fastest way of solving their problem.


> engineers care about quality; customers care about the fastest way of solving their problem

Almost all customers care about quality. The problem is that many customers have only very limited information about products, so they have a hard time judging quality vs. competitors before (or even after) purchase.

This reveals a general problem with the market: it doesn't select for quality. Otherwise we wouldn't be having this conversation. The market is really good at producing cheap crap. So the truth is, yes, engineers have to care about quality. The motivation for quality has to come from pride in your own work, not from outside market forces. If you care about quality, then you have to strive for that over quantity, and also charge sustainable prices instead of trying to lowball. You may not be the market leader, but there are many profitable niches. Some customers are definitely willing to pay for quality.


> Almost all customers care about quality.

yes, but not to the exclusion of features. I work on a B2B product where our customers bill their customers by the hour, so they tend to have a pretty good idea of how much time a feature saves them. if a competitor adds a feature that cuts the time needed for a project in half (not unrealistic) but crashes and forces them to start over a quarter of the time, the customers will still buy their product instead of ours. they'll complain incessantly on the competitor's forums about the crashes and threaten to switch back to our product, but they won't actually do it unless we come up with something new that saves them even more time.

customers care about saving time and/or money; they only care about quality to the extent that it furthers that fundamental goal. if there is some bug-ridden alternative that solves their problem faster, they will do their best to find it and purchase it.

edit: to be clear, I mean "reliability" when I say "quality"; there are many other "qualities" a product can have, one of them being "cheap".


There's a big difference between not caring and making a tradeoff. You can care about A and B but decide B is more important than A. But if you don't care about A, then there's no tradeoff, you just choose B no matter what.

The question is, why do we make consumers make that tradeoff? Why are we shipping junk at all? There shouldn't be a reliability tradeoff. All products should be reliable. It ought to be a bare minimum standard.


> There's a big difference between not caring and making a tradeoff. You can care about A and B but decide B is more important than A. But if you don't care about A, then there's no tradeoff, you just choose B no matter what.

fair enough, I probably overstated my point with some of the wording.

> The question is, why do we make consumers make that tradeoff? Why are we shipping junk at all? There shouldn't be a reliability tradeoff. All products should be reliable. It ought to be a bare minimum standard.

everything in life is a tradeoff. we could make software more reliable, but then we would have to spend less time adding new features, or we would have to hire more/better engineers and charge more. maybe we even get to pay down tech debt and gain the ability to add features faster in the long run. doesn't really matter if someone else dumps a bunch of buggy new features in the meantime, converts our customers, and forces us out of business. in the absence of some industry-wide gentleman's agreement or regulation, we have to observe the behavior of customers and do what their behavior (not words!) indicates they want.


> doesn't really matter if someone else dumps a bunch of buggy new features in the meantime, converts our customers, and forces us out of business

This is always presented as the doomsday scenario, but how often does it actually happen?

The story of Apple in the Tim Cook era is unrelenting annual releases, more and more "subscriptions", massive returns of cash to AAPL shareholders, but decreasing product quality. Did Apple make that tradeoff because they were scared of going out of business? No, they were doing very well before Cook took over. Cook simply has lower standards than Jobs; there's no other reason. He's been great for investors, not so great for customers.


You state that "unless" as if catastrophically affecting the workflow weren't generally the common state of the "competitor with more bugs."

No, users pick a finished product.


Developers get paid every couple of weeks, so an agile business tries to track the value gained from that expense on a similar cadence. The process has become more about corporate governance than delivering customer value.


> Innovation has never and will never happen on a schedule. This is simply greed-driven, promotion-driven, pointy-haired-boss-driven development.

For consumer goods, it is the hype cycle.

New updates and releases get press. They also restore consumer confidence.

If Samsung announced that next year they weren't releasing a new Galaxy phone, the entire industry would freak out. Consumers would lose confidence in buying Samsung phones, journalists would write articles questioning if Samsung was pulling out of the market, a lot of bad things would happen.

Give it 18 months without a release and people would start to think of Samsung as "that company that used to make phones."

They would have to fight like heck to restore their image.

Software is the same way. In the Vista/7/8 era, Microsoft looked like they were falling behind because their competitors started releasing yearly, or even twice-yearly, feature updates.

Sure, every Android version up until 7 was kinda-sorta-terrible, but it kept Android in the news. Likewise, Apple got huge free press every time they announced a new revision of OS X (now macOS), and every time they came out and announced a new version of iOS.

The result? "The desktop is dying, phones are where the real innovation is at!" articles being published even faster than those software updates came out.

You can of course release too fast, rarely do Chrome or Firefox's releases get any press (unless there is a controversial UI change), but in general frequent updates are free advertising.

Tesla is also great at this, I'm nowhere near being in the market for a Tesla, but at least a couple times a year I still end up hearing about software features they are rolling out!


> If Samsung announced that next year they weren't releasing a new Galaxy phone, the entire industry would freak out.

The smartphone industry has only themselves to blame for setting up this expectation. But it is possible to get off the train. I remember when Apple announced they were dropping out of the annual MacWorld San Francisco conference, because they didn't want to constrain their product release cycle. Apple survived that just fine.

> In the Vista/7/8 era, Microsoft looked like they were falling behind because their competitors started releasing yearly, or even bi-yearly, feature updates.

Competitors? Windows and Mac had near 100% market share on desktop. There was only 1 competitor. Vista was released in January 2007, but Mac OS X 10.5 Leopard was infamously delayed until October 2007 because of iPhone, so this telling of history doesn't seem entirely accurate. Moreover, Mac OS X releases were already slowing. Here's a list of months since the previous major .0 release:

10.1.0 6

10.2.0 11

10.3.0 14

10.4.0 18

10.5.0 18

10.6.0 22

10.7.0 23

10.8.0 12

10.9.0 15

10.10.0 12

10.11.0 11

10.12.0 12

10.13.0 12

10.14.0 12

10.15.0 12

Thus, major Mac OS releases were slowing down year by year — which is totally sensible — but then Steve Jobs died after 10.7 was released in 2011, and only then did they switch to a yearly schedule.


The competitor in the 7 era wasn't more PCs, it was tablet mania. Fear of tablets replacing PCs led to Windows 8.

Of course that didn't come to pass, but everyone acted like it was the future and the market responded accordingly.


This criticism is based on the assumption that all updates are created equal. But they aren't.

One week, the update could be a relatively minor bug fix. The next week, a major feature upgrade that's been in the pipeline for months.

You also remove the ambiguity of "Is this worth pushing out? When should I push this out? Should I do some more fixes or push this one out first?" You've got fixes? Push them out in the next update.

Your criticism also assumes a small team. If you have a large enough team where you can split them into new feature development and current bug fixing, they're going to work at different rates and be ready at different times. If instead your entire team just works on "the product", then there is no effective difference between fixing issues and creating functionality.


I feel like the slow decline of software quality has been in lockstep with the gradual transition from (expensive and non-measurable) manual software/hardware testing and QA to automated frameworks and rollout-based quality assurance.

I constantly encounter broken functionality, buggy or unpleasant UIs, just as the author has. It feels like many of these problems could be avoided if you just had one person whose job it was to sit there and look for broken stuff. (I'm sure I'm biased as someone whose first job out of college was to sit there and look for broken stuff.)


I would tell a slightly different version of this story, focusing in on "rollout-based quality assurance".

I would say that effortless, automatic updates are to blame.

When you can always just push an update, the impact of a given bug goes way down. It's no longer mission-critical to exterminate flaws before shipping; a totally broken feature becomes a mere annoyance. So project prioritization shifts from polishing an artifact to outweighing the (presumed inevitable) constant stream of little annoyances with fixes and features. I think the shift towards automated testing is just a symptom; an attempt to bridge the gap in this brave new world.

For a clear-cut example of this phenomenon, look to the video game industry. Until around 2007, games received no updates. Ever. Once a game shipped, it was shipped. There wasn't even a mechanism for installing an update from physical media.

Right around that time, "glitches" went from very rare unicorns that people would spend lots of time actually seeking out, to nearly everyday occurrences. As long as it doesn't corrupt someone's save file, they mostly laugh it off and upload a clip to YouTube to show their friends. This is just how things are now.

(Edit: I should have scoped this to "console games")


> Until around 2007, games received no updates. Ever. Once a game shipped, it was shipped. There wasn't even a mechanism for installing an update from physical media.

Sure, but they still (sometimes) released (a few) extra revisions of a game. They were just targeted at people who bought physical copies after the revision date, rather than at existing customers.

Or said updates came on the 1.0 version of the game as shipped in markets that got the game later than others. (Just imagine — per-market release versioning. Every market effectively got its own fork of the codebase!)

Or said updates came in the form of a re-release port. There are patches made to the emulated downloadable app-store re-releases of some games, that never made it into any physical edition of the game.

Also, before home consoles, arcade game machines did receive bug-fix updates regularly. Arcade machines were essentially provided “as a service” from their manufacturers, with support contracts et al. Sort of like vending machines are today. If you reported a bug, they’d fix it and send you a new EEPROM chip to swap out in your cabinet. If there was a critical bug that affected all units, they’d send techs out to everybody’s machines to swap out the ROM for the newest revision. (For this reason, it’s actually kind of hard to do art-conservation / archiving of arcade games. The cabinets almost never have fully “original-release” components inside.)


I'm sure this happened occasionally, but it wasn't advertised. Nobody was buying a new copy to get an update. Reviewers weren't revising their reviews in accordance (something which does actually happen now). It was still absolutely mission-critical to get things as polished as possible the first time around.


I think you've both got a piece of it. I've programmed PC software, embedded software, and mobile software, and my gut feeling (without data) is that the shipped software quality is inversely proportional to the update frequency and ease of updates. Had nothing to do with how smart or skilled the developers and testers were. Had nothing to do with management's priorities. Update frequency and ease changed immensely once we could feasibly deliver patches over the Internet. Before easy updates, you'd actually quality check every corner of the application, you'd actually fix those P2s and P3s. You'd do exploratory testing off the test plan rails to find things. There was even a concept of "done" in software, as in, you eventually stop constantly jamming features in and tweaking the UI in maddening ways.

Now, it's just "LOL just ship it, users will just deal with it until the next release!" Now, it's "Do experiments on N% in prod and use end users for A/B testing. If something's broken we'll update!"

In several industries, it's actually totally expected that v1.0 of the application simply won't work at all. It's more important for these companies to ship non-working software than to miss the deadline and ship something that works! Because who cares? Users will bear the cost and update.


I agree that games used to have far fewer bugs when shipped, but it's not true that games never received updates back in the day. I distinctly remember queuing on file-sharing sites as a kid in the early 2000s to download Half-Life updates and updates for other games.


I suppose my statements above should be mostly limited to games on consoles, not PC. That's what I had in mind.


I remember when DLC used to be called “a patch” and it was both bug fixes and also huge amounts of new content. I wish I could remember what games this pertained to.


Absolutely this. Back in the early 2000s, quality assurance was insane. A release I was working on (a AAA title from a major studio) was blocked because players' eyes were rendered incorrectly, and you had to really zoom in to even see the glitch. And of course you had to find and fix all bugs before October, or else you wouldn't be able to hit the Christmas sales.

Once internet updates became the norm, it all became pretty much like the rest of the software industry. (At least game companies still have QA departments; a lot of mainstream web companies have dispensed with those as well.)


Or, as I tell my wife, I am playing my favorite game, "updating PlayStation". I turn the thing on so rarely that by the time I do there are 1-2 GB of updates waiting. Glad I have a semi-decent internet connection these days...


I have witnessed this first hand comparing two systems.

One system has no patching, and updates incur some non-trivial amount of effort on the part of the installer. Releases are a few times a year, at most.

The other system has patching, updates are lighter weight, and as a result, the system has THOUSANDS of patches released over the last decade, north of 2 per work day.

Guess which system is higher quality? The former.

Much, much higher quality.


> Until around 2007, games received no updates.

Warcraft II, from 1995 received multiple patches. So did many other games from that era.

Do you perhaps mean console games when you say "games"?


Um, how long have you been in the industry?

Software has always sucked and had these issues. It has nothing to do with automated QA. The reason you see more issues is that 'way back in the day' your software had a very limited number of things that it did, and in general it did not involve accessing a network or chugging down massive volumes of data from untrusted sources.

I work for a company that has a lot of individuals who test for QA issues; they have lists miles long of things to check and write reports on.

The problem is more of "It's much easier to write mountains of code than it is to ensure that it works in all cases"


I worked for almost 15 years in embedded and have 25YOE, and no, software has not always sucked as much as it does now.

I agree with your last point a lot, though I would modify it slightly: it's much easier to write mountains of code now than it was, and it's now more common and much easier than ever to import external dependencies (especially at the system level), and those dependencies have tens of millions of lines of mediocre code all by themselves.


The one thing that has changed is that updates can be delivered easily. This takes some of the pressure off in terms of QA, because rolling out a fix to a centralised service delivered through the browser is quick and painless. The cost of pressing millions of CDs kept developers in check in the past.


That's really not the only thing that changed.

Games are also massively more complicated today. It's one thing for three people to get a 2.5 MB single-player DOS game reasonably bug-free. It's an entirely different thing to do the same for a 5 GB game made by a team of 100 or 1,000 people.


Most of the size difference is assets anyway, right? Games today use off-the-shelf engines. It should be a lot easier to make robust games when most of the technically hard parts are already done.


And, as you’ve alluded to, the scope of what software does for us on a daily basis has expanded by several orders of magnitude. Not only have a number of devices that used to be purely electro-mechanical been reworked to use microcontrollers (cars, everything in the kitchen), but the scope of activities that have migrated onto the web or our phones is truly massive.

30 years ago, software bugs might interfere with you professionally, but they wouldn’t stop your ability to get money from the bank, cook food, or do any other day to day tasks.


> Not only have a number of devices that used to be purely electro-mechanical been reworked to use microcontrollers (cars, everything in the kitchen),

Yup, and in most cases this not only did not improve them, but made them less useful and more fragile. Let's be honest: the software is there only because it can save on manufacturing costs, and sometimes can be used for extra marketing benefit. No attention is being given to providing value to the customer.


Depends on what you’re talking about.

Car engines are vastly improved in reliability, cleanliness, and efficiency by the introduction of computers into them. You might not like that when it goes wrong, but we all appreciate not breathing in pre-computerized car engine exhaust.

And that’s the rub. While shoddily written software shoehorned into cheap consumer goods obviously degrades the experience, there are tons of places where well written software has massively improved the quality of the goods that they’re added into. Objectively car engines are just better for the addition of software both in design and in operation. They’re smaller, more powerful, more reliable, cleaner burning, and more efficient than they were before we computerized them.


Yes, but at the same time the automakers let another team make Electron apps for the dashboard. Engine ECUs are nice and decoupled. Maybe the hard real-time requirements are what save them and keep all the novel crap out?


And that’s why this is hard. There are cases where software drastically improves the objective quality of things (car engines), cases where well done software makes items significantly more enjoyable (some car infotainment), and cases where software makes things worse (everything in the kitchen). Separating them out is hard.


I have to disagree in the case of car engines. When I was a kid my dad and uncles spent hours each month fixing minor problems with their purely electro-mechanical cars. I don't miss carburetors or mechanical ignition timing one bit. Electronically controlled engine functions are much better both in terms of efficiency and consistency.


That's true. I don't have that much experience with car repair, so I might be wrong in perceiving the 90s and 2000s models as the optimum in terms of car reliability - the stuff mostly works without funky issues, parts are cheap, repairs can be made by anyone who has spent some time with a wrench, and you don't have to visit a shop with a license for poking around the car's computer over every minor issue. That is to say: advances in computer control don't have to go hand in hand with making cars expensive to service and not end-user repairable. But they do, because greed.


> Software has always sucked and had these issues.

I disagree, subjectively it had its ups and downs and we are in a down phase right now. YMMV.


Software has never been better; there are just more users to please.

And despite the issues lamented in this think piece, it is _Apple_ that the author should blame for setting technology expectations impossibly high.

No organization has been remotely as successful in understanding and releasing tech products that were truly great.

Everything since is just a comparison to expectations Apple set. Even when Apple fails, it is in comparison with an Apple that does not.


You also need management willing to prioritize engineering time to fixing that broken stuff, and engineers who actually know how to make non-broken stuff. My experience is that having all three of these prerequisites is pretty rare.


I'm going to restate something that I said a few days ago:

There are a number of hats developers are expected to wear today:

1. Developer of new features

2. Sustainer of prior code and features

3. Tester of all of this

4. Constant student (outside work because who'd pay their employees to learn?)

The priority for the business is (1), so 2-4 get neglected. This compounds over time to mean that old code isn't properly refactored or rewritten when it should be, and none of the code is tested as thoroughly as it should be, and none but the smartest or most dedicated are really going to be perpetual students (or they'll choose to study things that interest them but don't help at work, like me).

When the old code and poor tests create sufficient problems, you get a business failure or a total rewrite. Which strips out half (or more) of the features and the whole process gets restarted.


Don't forget:

5. The mentor of younger developers.

It's another thing some companies expect you to do, but don't allocate time for it, so at the point you finally know enough to not do a bad job, you're suddenly being pulled out of 1-4 and expected to do 5.


Oof, we had a guy retire over that. In his appraisal he was dinged for not doing enough. They counted his training of new hires as 1 "point" (or whatever, the system was weird). They neglected to consider that he was training 5+ new hires in that year. He'd had enough, put in one last year to wrap some stuff up and then he was out and free.


And also lots of devops likely these days:

- Infrastructure design

- Deployment

- Monitoring / log-based debugging and fault tracing / handling customer issues

In addition to wanting a "full stack" developer of course...


Right. It's the curse of the name. DevOps wasn't meant to be a role, but a philosophy of working and organization. But management (and practitioners) latched onto the idea of it as a role and hosed themselves. Now you've got a 20+ year veteran developer tasked with keeping dozens of servers properly configured, secured, and operating as well.

At some point we have to accept that specialization isn't just for insects. It's helpful to have a proper sysadmin or DBA or whatever (appropriate to your domain) within the team, and not just diffuse those roles amongst the developers themselves.


As someone who has been programming since 1984, I'd say software quality has greatly improved, not declined.

The level of complexity in modern day software is orders of magnitude greater than that of even a decade ago.

What has changed is our reliance on that software. We are now so deeply embedded into our software existence we see these flaws up close.


Great comment. People also used to use only a few pieces of software in any given day (or week, even!) e.g. email, web browser, and word processor.

Now, our computer ("phone") is with us everywhere we go and we'll use dozens of complex applications per day, connected by dozens of APIs, networks, protocols, and hardware features. It's a miracle any of it works sometimes! Thank you to everyone for making this stuff seem like magic; my twelve-year-old self would be amazed at how well it works.

I do feel like there are more UI bugs as we optimize for certain metrics over others. Timing updates has become far more complicated, so we get weird UI refreshes as new data comes in, stale caches, missed notifications, etc. Turning it off and on again often works, surprisingly (probably because devs start with clean environments often, so that's the functional baseline).

Lastly, it is probably far more lucrative for a technology based business to use their most valuable minds for the Next Thing, rather than iterating on the current thing. Incremental revenue improvements just don't cut it in a capital-driven world; everyone is trying to escape the local maxima to find billion/trillion dollar businesses.


Yeah, but then you'd have to raise the price to pay for the testing and the jokers a block down the street who YOLO'd their competing product into the marketplace without testing would grab all your sales. You'd be out of business and your competitors would be laughing their way to the bank while the customers still suffered constantly from a broken product.


If it were only for novel products, this would be a reasonable argument. See Netscape's rush to release their browser as an example of what you're talking about.

But the worst part is that this is an issue with established products that have secured their market, and will even receive money every year from their customers. They have both the money and the time to pace themselves and test things properly, but they don't.


> They have both the money and the time to pace themselves and test things properly, but they don't.

But not the motivation, because "it works" and improving UX would cost money and doesn't have an immediately visible return.


Can you name a few of those who really truly have time but don't take their time? Also it seems there's always an opportunity cost. So time is always "of the essence".


Right, you could be spending those dollars running ads, which the market is more likely to respond to :/


Microsoft, Apple, Google.

An MS bug in an enterprise piece of software (so they get paid every year for it): Skype for Business. Perhaps my org hasn't updated their installation, but if I drag a contact from the chat window (say you message me and aren't in my contacts yet) to the contact list in status view, it will reliably crash the program. If I drag it to groups view, it'll place the contact in a group. My guess is that when dropping onto status view there is no "default" group, so some kind of null pointer exception occurs (it tries to add the contact, but with the group given as either junk or null).
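
A purely hypothetical sketch of that guess (my own Python illustration, not the actual Skype for Business code): the drop handler looks up a default group for the target view, the status view has no entry, and the add path dereferences the missing value anyway.

    # Hypothetical reconstruction of the guessed bug, not real product code.
    DEFAULT_GROUPS = {"groups_view": "Other Contacts"}  # no entry for "status_view"

    def on_contact_drop(contact, target_view, contact_lists):
        group_name = DEFAULT_GROUPS.get(target_view)   # None when dropping on status view
        contact_lists[group_name].append(contact)      # KeyError here: the "null pointer" moment

    contact_lists = {"Other Contacts": []}
    on_contact_drop("alice@example.com", "groups_view", contact_lists)  # works
    on_contact_drop("alice@example.com", "status_view", contact_lists)  # crashes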

I used to see more bugs in Outlook, but it seems somewhat better with the last update so I haven't noticed them. Though it was fun when I had negative 2 billion messages for a week or so (I certainly didn't have enough to cause overflow so I have no idea how it wrapped around like that).
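
For what it's worth, "negative 2 billion" is suspiciously close to the smallest 32-bit signed integer, -2,147,483,648, which is what a 32-bit counter shows when its sign bit gets set. A quick sketch of that interpretation (my speculation, nothing known about Outlook's internals):

    def as_signed_32(bits):
        # Reinterpret a 32-bit pattern as a signed (two's-complement) integer.
        bits &= 0xFFFFFFFF
        return bits - 0x100000000 if bits >= 0x80000000 else bits

    print(as_signed_32(0x80000000))  # -2147483648: "negative 2 billion" messages
    print(as_signed_32(2**32 - 3))   # -3: an ordinary underflow stays small

An ordinary count-below-zero underflow would show up as a small negative number; landing near -2 billion suggests the top bit got flipped some other way, which fits "I have no idea how it wrapped around like that."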

Google's is an issue of usability of their webapps (IME), not strictly buggy but not sufficiently tested. Behind the proxy at work Maps is incredibly unreliable. It takes several reloads for it to actually start working "correctly", but don't change where you're looking too much (you can zoom in, but do not pan around). That's not the only unreliable one in this situation, but it is the most easily demonstrated.

Apple's Messages and Mail constantly tell me (they've been better the last few months, but it still happens) that I have unread messages; I'd search and search and never find them. Then I'd pull it up on a different device and finally see the unread message (which was both visible and marked as read on the original device).

Some of these may be shallow or seem petty, but it's an unpleasant experience that after so many years and with so much money should've been resolved for each of them. I'm willing to tolerate an indie game crashing on me. I'm not willing to tolerate an enterprise software solution (MS) crashing for a natural user behavior.

EDIT: I think it's resolved now, but Apple's iOS calculator bugs were annoying for such a simple program. Not strictly a bug, but Windows' calculator these days is an unusable mess in many ways. It shouldn't require so many system resources to add some numbers together (the same could be said of many small utilities that were rewritten for, I think, Windows 10 or Windows 8).

Gmail showing me email meant for <first><last>@gmail.com instead of <first>.<last>@gmail.com - the fact that they ignore the . in user names at all. The YouTube app on iOS, when I first installed it (and quickly uninstalled it) years ago, was an incredible annoyance: it wouldn't reliably show me the video I'd actually clicked on that caused the app to open.
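
Gmail does indeed treat dots in the local part of @gmail.com addresses as insignificant, so the dotted and dotless forms deliver to the same mailbox. A small illustrative normalizer (my own sketch, not Google's code):

    def canonical_gmail(address):
        # Gmail ignores dots in the local part of gmail.com addresses.
        local, _, domain = address.lower().partition("@")
        if domain == "gmail.com":
            local = local.replace(".", "")
        return f"{local}@{domain}"

    # Both normalize to the same mailbox, which is why misaddressed mail shows up:
    assert canonical_gmail("first.last@gmail.com") == canonical_gmail("firstlast@gmail.com")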

Adobe is an almost perfect example of holding their market captive. I've run into a number of Adobe Acrobat issues over the years, though fewer recently (but I use it less often now). But especially Acrobat Reader on Mac OS X was awful, I actually once had to reinstall the OS to get it to stop fucking up PDF display even after uninstalling the software (I never found out what it had done to the system, and gave up). I needed it because I couldn't find anything else (at the time) that supported digital signatures in PDFs on the Mac (in the sense that it actually worked, I think Preview let me do it but what it made wasn't usable by the people receiving the file).

EDIT2: Another Google one, with Chrome on macOS in full screen. Hiding the location bar means you straight up can't get to it; you have to re-enable it, versus a sane behavior like auto-hiding it and restoring it when you move the cursor to the top.


Skype for Business is just Lync [used to be Office Communicator, maybe?] renamed/reskinned, right?

I think that product is cursed. It was always a dumpster fire, like IBM's Lotus Notes. It's there so their clients don't accidentally try Slack or any of the sane alternatives.

All products of MS, Apple, Google are in a constant churn mode. New design. New integrations with whatever platform changes happened in the back, new features to match the competitors, new trends, new mobile apps, new browser features, new framework.

Those are the products that are not the real products and not real cash cows, so they get very limited attention.

I agree all of these are horrible. Gmail is still a slow piece of shit. I recently tried Thunderbird, and... it slowed down too. Wtf. Slack is slow too. Typing has become slow for some reason in a lot of "apps" - maybe too-fancy fonts?

Anyway, these companies have huge opportunity costs. Just look at Google. They try whatever crosses their mind, and nothing is good enough compared to "ads". And so they shut things down because of opportunity costs. (Not because of upkeep, but because then your attention is not on the next big thing, whatever that will be.)


> I agree all of these are horrible. GMail is still a slow piece of shit. I recently tried Thunderbird, and .. it slowed down too. Wtf. Slack is slow too. Typing has become slow for some reason in a lot of "apps", maybe too fancy fonts?

Possibly because more companies are moving desktop apps to Electron so they can run JS everywhere.


So more telemetry is no guarantee that bugs get fixed.


I agree with this sentiment in general. Of course, we have a ton more features than we used to have, but I think given the newness of software in years past, we were OK with bugs and issues because of the novelty of it all.

Now we are in 2020 and iPhone updates STILL cause battery issues. The iPhone has been out for 13 years...

Our expectations have changed. Tools like Excel should just work - and yet, when I try to save a file, sometimes it freezes and crashes. How is that acceptable now?


This, but also remember that we're living through a transition from SaaP (software as a product) to SaaS, where the "service" is actually data extraction for advertisers. For that, software must be minimally useful to a user, but the true aim of UX is not user satisfaction but data extraction or subscription lock-in.


I work somewhere that primarily relies on manual testing rather than automated testing. It definitely does not make the software more reliable here, to say the least. :-)


There're three kinds of manual testing:

1. Manual testing that should be manual (exploratory).

2. Manual tests that are new and haven't been automated yet (but will be).

3. Manual tests that should be automated.

(3) is the one many people see and suffer through (I know I have). They need to be automated to free up time for (1), which is where many issues are actually discovered. But if (3) dominates your time, you can never get to (1) and you'll constantly ship broken things (or more broken than they should be).


This is very insightful, thanks!

