Hacker News
People expect technology to suck because it sucks (tonsky.me)
447 points by ivanche on Sept 25, 2020 | 434 comments



For most people, technology is a haunted house riddled with unpleasant surprises. With no agency, they are at the mercy of other people's bad ideas that keep changing. Everything needs to be updated because everything else needs to be updated, because everything needs to be updated. Duh!

Software updates! Guess what! Here's a new UI for ya. We moved all the stuff! It's like someone threw you a surprise birthday party, but not on your birthday, on their birthday, and their idea of the best gift evar is to hire an interior designer (for free! lucky you!) who completely rearranges your house inside and out and springs it on you after you return from the grocery store. And there's no going back.

At first it was exciting--when I was 15--then slightly bothersome, then downright annoying, then infuriating, then just tiring. Your brain learns that there is no point in learning anything anymore because they're just going to scramble it again anyway. Learned helplessness. People age out, ending up feeling old and useless and jaded because their skillset is completely inapplicable after just a few years.

Yeah, I can understand why people hate tech.


I logged into Mailchimp yesterday and found that they moved the header navigation to the left side.

Instead of the previous menu labels like Campaigns or Audience, there were icons signifying each that I had to hover over to figure out what they might mean. Then when I went to my Reports, the CSS breakpoints seemed to be wonky, making that screen hard to read and use.

Half-jokingly, it almost feels like constantly confusing people is a trick to boost engagement temporarily while people are forced to figure things out.


Also half-jokingly, I feel like the exact same thing happens in grocery stores.


That they frequently change the layout of their stores? I've never noticed that at any of the stores I shop at.


You aren't wrong at all. Stores regularly re-organize, and what it says to customers is "your knowledge is worth nothing". The disregard of customer knowledge is an absolute anti-pattern.


Some stores do it, some don't. However, when they do, it's intentional, in order to force you to go through aisles you might not have walked through otherwise, thus exposing you to more advertising and chances for impulse buys in addition to what you actually planned to get.

Yes, it's a definite dark pattern, but not so much an antipattern.


> I can understand why people hate tech.

To add to that, now that I'm a retired lifelong techie I realize why "old folks" back in the day would hesitate to give up the old, outdated software that they knew how to use.

E.g. I'd prod older friends and family to give up WordPerfect - which they knew and loved - in order to progress to the feature-rich new MS Word.

Now I'm a Linux advocate, with its archaic terminal commands, and I can empathize with anyone who wants their laptop, phone, TV, microwave, etc. to stop evolving!!


Water under the bridge now, but I bet you did some of them a real disservice.

WordPerfect had "reveal codes", so when the WYSIWYG gave you something you didn't want, you could pop open the actual document representation and wrangle the tags until What You Want Is What You See.

MS Word has no such function, so when it screws you, and it does, you're good and screwed.


Well, since I was their go-to tech support I paid the price!

Re: Reveal codes - not being able to press Alt+F3 and clean up the formatting mess that MS Word would inevitably get into was torture!


Linux is also far from stable. There is the mess of Linux desktops like Gnome 2, Gnome 3, Unity (okay, this was only an Ubuntu escapade). The init system changed, and the result is that you have to think about things you usually don't want to. There are things like Snap and Flatpak, which pretend to make things easier, but ultimately lead to more complexity...


Hot damn is this the most concise description of how I feel.

I’ve always described it as “the design team justifying their own existence after the job is done.”

Let software get stable and boring.


> I’ve always described it as “the design team justifying their own existence after the job is done.”

I actually think that's really what is going on. Wish I had first hand evidence though.

I do know of a tangential phenomenon at a friend's work place. Her org has a dedicated build tools team. So every 6 months every project's build infrastructure needs to change to something entirely new, because the build tools team keeps having to justify its existence.

I don't know why a company would let this sort of thing happen. It's a massive waste of time for every team.


(Late to the party but) Yes, this, absolutely this. It's almost a rule now that, above some very low threshold, the more expertise and hours you throw at UX, the worse the UX is.

Some of the most annoying UX I've had is on Quora, Facebook, and the reddit redesign, which all spend a veritable fortune on it, while the best ones I've seen are something a non-specialist slapped together with bootstrap.


The thing is, I do not really hate tech, if UNIX, and UNIX alone (no GUI), is considered "tech". Most of the programs in the freely available open-source UNIX OS I use do NOT need to be updated. They just keep working and are quite reliable (at least compared to their GUI alternatives).

I do sometimes wish that there could be alternative (not "replacement") ways to do things we use "tech" to do today, where the alternatives only required UNIX (no GUI). This way if we get frustrated with a graphical UI, and myriad "updates", we can just do these things the "old-fashioned way", with comparatively smaller, simpler, command line UNIX programs.

To me, the people who would be very opposed to this idea are not users, they are developers. Having been raised on computers in the 1980s I can attest that computer users never cared about "UI" or "UX", they just did what they needed to do to use the computer. It is developers, especially contemporary ones, who actually care about "UI" and "UX", not computer users. In fact, some of them are passionate about these aspects of using a computer.


Adam Savage was talking about a scribing tool for machining which was very expensive, but which he likes very much [0].

Before recommending it, however, he felt it important to mention that for people who don't machine very much, far cheaper scribes work well, because unless it's your job, your tooling is less likely to be the bottleneck, and you have fewer resources. When you machine professionally, your tooling is likely your bottleneck and you have more resources.

I think this holds for tech and software. Think of resources here as "time spent learning APIs, bash, and remembering tar mnemonics".

At first, dragging and dropping folders isn't going to be your bottleneck. Need to move 1000s of folders scattered on the hard-drive? If you're not using a terminal, you'll be in trouble.
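(For what it's worth, that kind of job is a couple of lines in any shell. A rough sketch only: the paths and the 'project-*' naming pattern below are made up.)

    mkdir -p ~/archive
    # gather every matching folder into one place; -prune keeps find from
    # descending into the folders it has already matched
    find ~/Documents -type d -name 'project-*' -prune -exec mv {} ~/archive/ \;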

Everyone cares about UX, it's their experience when using tech. It's just that GUIs are better for some contexts than others.

[0] https://youtu.be/n5laGi3GO7M?t=356


Except with tar you don't even have to memorize anything, tar --help will tell you what you forgot.

   ~ $ tar --help
   BusyBox v1.31.1 (2020-03-26 00:59:22 -00) multi-call binary.

   Usage: tar c|x|t [-ZzJjahmvokO] [-f TARFILE] [-C DIR] [-T FILE] [-X FILE] [--exclude PATTERN]... [FILE]...

   Create, extract, or list files from a tar file

        c       Create
        x       Extract
        t       List
        -f FILE Name of TARFILE ('-' for stdin/out)
        -C DIR  Change to DIR before operation
        -v      Verbose
        -O      Extract to stdout
        -m      Don't restore mtime
        -o      Don't restore user:group
        -k      Don't replace existing files
        -Z      (De)compress using compress
        -z      (De)compress using gzip
        -J      (De)compress using xz
        -j      (De)compress using bzip2
        -a      (De)compress using lzma
        -h      Follow symlinks
        -T FILE File with names to include
        -X FILE File with glob patterns to exclude
        --exclude PATTERN       Glob pattern to exclude
   ~ $
And what's the recent surprise UI change? That xz decompression gets autodetected and doesn't need -J? Most software isn't even as friendly as that infamous command.
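(To make that concrete, here's roughly how it plays out, with a made-up archive name. GNU tar autodetects the compression when listing or extracting; whether a given BusyBox build does depends on its compile-time options.)

    tar cJf backup.tar.xz notes/   # creating still wants -J for xz
    tar tf backup.tar.xz           # listing: compression autodetected
    tar xf backup.tar.xz           # extracting: no -J needed either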


> To me, the people who would be very opposed to this idea are not users, they are developers. Having been raised on computers in the 1980s I can attest that computer users never cared about "UI" or "UX", they just did what they needed to do to use the computer. It is developers, especially contemporary ones, who actually care about "UI" and "UX", not computer users.

... what? Are you suggesting computer users in 2020 - which includes everyone from your nana on her iPhone to a toddler watching YouTube on a tablet - want to use CLIs, and are being forced by baddie developers into using apps?


Remember that "alternative" is not the same as "replacement". This is similar to the idea of "more than one way to do it" in computer languages. Users have freedom to choose. Here, one of the ways is without GUI, using UNIX. Only applies where the task does not inherently require graphics.


> For most people, technology is a haunted house riddled with unpleasant surprises.

I'd change that to: "For most people, corporate neoliberal technology is a haunted house riddled with unpleasant surprises."

Writing it that way recognizes that we live with the most un-free market of all time:

"We are in the middle of a global transformation. What that means is, we're seeing the painful construction of a global market economy. And over the past 30 years neoliberalism has fashioned this system. Markets have been opened, and yet intellectual property rights have ensured that a tiny minority of people are receiving most of the income." [1]

And:

"How can politicians look into TV cameras and say we have a free market system when patents guarantee monopoly incomes for twenty years, preventing anyone from competing? How can they claim there are free markets when copyright rules give a guaranteed income for seventy years after a person’s death? How can they claim free markets exist when one person or company is given a subsidy and not others, or when they sell off the commons that belong to all of us, at a discount, to a favoured individual or company, or when Uber, TaskRabbit and their ilk act as unregulated labour brokers, profiting from the labour of others?" [2]

[1] https://www.youtube.com/watch?v=nnYhZCUYOxs

[2] https://www.resilience.org/stories/2017-08-03/book-day-corru...


I have worked for software companies for over 25 years, mostly on teams building software, and I hate software. I find bugs in every piece of software I use (my freaking microwave oven control panel!). In addition to questionable quality, software is often downright hostile (lose all the data you typed into a web form if you accidentally backspace while not in a text field, because it navigates off the page). Ironically, software engineering tools (build systems, etc.) are some of the worst. I don’t know what has to happen for people to stop tolerating software as it is.


Doesn’t reality suck the same?

My gas car stinks, destroys the planet, needs yearly maintenance, crashes into everything the second I stop paying attention.

My house decays day after day. Floors need constant cleaning, walls have holes from small impacts, paint contains inedible fragments and disperses noxious gas.

Bees are building nests on my balcony and it’s definitely not what it was built for, nor where they should be.

How can we tolerate such a life?


I live in an old house, and routinely discover ugly hacks that were done by the previous owner, presumably due to laziness, cost or just lack of skill. For example, they buried tons of stuff (toys, furniture, water heater etc) in the backyard and built a terrace on top of the pile to cover it up, apparently because they were too lazy to take it to the dump. The terrace decayed, so I had to tear it down, but in doing so I had to clean up their mess so I could actually use the garden. I'm not annoyed at the planks for decaying, as that is to be expected, just like you are expected to e.g. keep up with third party dependencies that you have chosen to include. Discovering a mess like the one I found in my garden, however, evoked the same feelings in me as when I look at a badly written code base and just wonder how anyone could ship something of such low quality to paying customers.

I guess my point is that there is a difference between things sucking because of the laws of nature, and things sucking because of incompetence, laziness or indifference.


To be fair, the previous person didn't know you were going to try to plant vegetables in their landfill.


But those same owners failed to mention the landfill during the handover


Not to mention, it was almost certainly illegal.


Out of sight, out of mind!


to be fair ulrikrasmussen didn't know the previous owner was trying to plant scrap trees


A similar thing happened at a relative's house: a long-disused storage space under a deck needed to be cleaned, and whatever natural forces were at work had accumulated enough new dirt to actually bury the items stored under there (a similar array of items, since no one had played with the children's stuff and a few old chairs and such had been thrown there).

It's a lot of work to dig a hole large enough for a water heater, so I wouldn't be surprised if something similar happened. (I probably would have also checked inside the water heater, since if you wanted to bury something and keep it dry, a water heater tank is a possible container. Not sure it actually works, but it's a natural idea.)


When is the last time leaving your keys in the car caused your house to suddenly slide 10 feet southwest?

When is the last time you flipped a light switch, and suddenly your pool disappeared?

Have you ever had French doors appear in your dining room because of a "Windows Update" on Wednesday morning?

Have you ever had to wait for half an hour for your house to boot later on that same Wednesday?

When is the last time you closed a door, and were killed by a hailstorm of bowling balls?

At least with a light switch, you know it's very unlikely to cause structural issues, or plumbing issues, or drain your bank account. Computers are singularly horrible in the ways things can fail.


I agree with your underlying point, but it's also important to point out that computers are also singularly wonderful in that it's usually much faster and easier to reverse failures, and then to diagnose and debug in a non-impactful manner.

To take your second example - if I could then flip the light switch back, and the pool reappeared, then I'd be miffed but not particularly annoyed (assuming I was able to fix that obvious-bug either myself or with an update in a timely fashion). If the pool stayed gone, then yeah, I'd be pissed.

Of course, that whole argument goes out the window when the tech in question isn't controlled by you. Which is often the case.


Tell that to the 346 people who perished because of negligent and (in my opinion, malicious in terms of regulatory deception) undocumented, uncommunicated programming of the speed trim system of the 737 MAX.

Or the folks who perished because of badly programmed software interlocks on the THERAC-25 radiotherapy machine.

Just knowing or figuring out to flip that switch may be an insurmountable barrier depending on the circumstances when a failure state occurs. Especially when the implementation is intentionally hidden so as to facilitate continued market value extraction opportunities from the happy accident of information asymmetry.


I agree with the sentiment of the post and the replies.

Yet your examples hint at something more.

Those massive failures are by people not by tech. Mismanagement and incompetence and systems designed to obfuscate accountability.

Which happens aplenty in non tech fields.


In wiring a house, there is a built-in assumption that something could go wrong and disrupt the wiring. That's why we had fuses, and now circuit breakers, grounding, ground fault interrupters, metal conduit, etc. All of these serve to limit the side effects of faults.

When you turn on a switch... it's part of a circuit which is current limited, and in fact there are several limits on that current, all the way back to the source... each designed to protect a link in the chain. Each of those breakers limits the capability to source current further downstream.

When you run a task in any modern OS, it runs with the full privileges of the user id with which it was launched. This is like hooking a generating station directly up to your floor lamp in the living room with no breakers. If the process has a fault, there is nothing the Operating System will do to prevent it from being used to subvert other parts of the system, there is no limit to what it can do.

There are systems that require you to specify how many resources a given task is to be allowed to access. It turns out that such systems can be just as user friendly as the ones we're used to, but they do require things be re-written because the ground assumptions in the security model are different.

Capability Based Security (also known as "Multi-Level Security") was born out of a need to have both Sensitive and Secret information shared on a computer that scheduled air traffic during the Vietnam Conflict (if I remember the situation correctly). The flights themselves were sensitive, and the locations of the enemy radar were top secret (because people risked their lives spying to find them).

It was extremely important that information could not leak, and solutions were found, and work!

About 10 years ago, when I learned about this and considered the scope of work required to make it available in general-purpose Operating Systems, I estimated it would take 15 years until the need for Capability Based Security would be realized, and another 5 or so until it was ready. I think we're on track... in 2025 people will start adopting it, and by 2030 it will be the de facto way things are done.

Genode is a long-standing project to bring this new type of security to the masses... I'm still waiting until the point I get to play with it... and have been for a while.

Things will get better... these types of tools, along with "information hiding", getting rid of raw pointers and other clever but dangerous tricks will help as well.

[Edit: Re-arranged to clarify, and improve flow]


The problem with an increase in security is that it almost always comes with a tradeoff of higher complexity. Higher complexity means more difficulty tracing. It also means the state space of a general purpose machine ostensibly there to be configured to fulfill the user's goals is a priori heavily constrained.

Point being, I don't see a shift in the direction of security above usability or ease of mental modeling doing anything but worsening the problem. I could be wrong on that though, but the last 20 or so years of further encroachment by industry on users' prerogative to configure their machine as they like doesn't inspire great confidence in me.

I can say I'm totally reading up on that though. I hadn't heard of it before, and it sounds interesting.


Completely agree - hence why I said _usually_. Another example of irrevocable harm is when ML algorithms dictate some medical treatment or social program.

But, _usually_, it's easier to reverse some changed-data somewhere than it is to reverse an actual change-of-state in the physical world. At least, the inherent effort required to do so is less - but policies or obfuscation may make it harder.


I’d argue computer programs’ failure modes are often less gruesome than real life’s gas and electric failures.

As a kid we had a gas range, and it was pretty easy to turn on a burner and just leave it open without lighting it. Or just start cooking something and forget about it; depending on your situation, your house is gone.


Normally the gas has quite a distinctive odor just for these kinds of situations. Sucks if you leave your house and enter it again lighting a cigarette though.


> When is the last time you flipped a light switch, and suddenly your pool disappeared?

Or the pool just disappeared for no reason and you couldn't get it back unless you sold your house and rebought it?


When's the last time that you had a working car door, and it fell off when you opened it? (MVP, no tests) [I'm not talking about a worn-out car]


I don’t know where you got these examples, but they were fantastic.


Just trying to make analogies people can understand over the years.

The current state of computer security.... is like building a fort, out of cases of C-4 explosive.

How so? Almost every program has bugs, many of which can be exploited remotely. It is effectively impossible NOT to have a zero-day exploitable hole in any given computer. Thus, every single computer can be exploited... and then used to attack more computers.... in a chain reaction.... like a fort built out of C-4.


I think the difference is that the entire software/hardware stack is a world created entirely by humans, untouched by "reality" for the purposes of all these annoyances, so it feels like we should be able to wrangle it better after so many decades. It's entirely our own creation, and we decide every iota of it, and yet it bites us (justifiably or not - turns out thousands of people each creating different layers of a gigantic state machine is hard to perfectly coordinate, but we may have been able to do better by now if we had been more thoughtful and patient throughout).


I hear you, but feel like we are biased by what we accepted as normal in our formative years, and that filter doesn’t apply to what we are discovering now that we’re grown-up professionals.

For instance books have been with us for centuries, and honestly most of them suck. Paper pages are thin and sometimes cut your finger (how many times did you get cut by an ebook ?), most are weak to liquids yet our world is filled with liquids everywhere, sometimes coming down from the sky. Updates are painful and costly and non scalable. Font sizes can’t be changed, you have to use an external device to deal with it.

Not saying there are perfect alternatives or that the tradeoffs don’t make sense. Just that we learned very early that books have these limitations and we’ll need to live with them to be a member of society. And we can agree all of these aspects could be and sometimes are fixed, but most people are just ok with books ‘sucking’ in those ways.


We’ve also had centuries to improve the technology of books and I think that makes a difference.

Although the weaknesses you cite seem like problems in search of a solution. No one ever expected books to have variable font size ... why would they?

Lastly let’s recall the book five hundred years ago is dramatically different from the book of today. For example your point about liquids is now in many ways resolved by the cheapness and ubiquity of books. 500 years ago, not so much.


On book font size, there actually is a market solution for the issue: if enough sales are expected, the same book (same content) will be sold in different formats: pocket size, deluxe paperback, standard edition, etc.

Same for translations, with some books even printing two languages side by side.

I find it fascinating how the arrival of ebook readers made us rethink how we relate to books, and a lot of annoyances got surfaced only now because there was no comparison point before. My favorite is how you cannot ignore the length of a book while reading it: you can’t pretend not to realize there are only a dozen pages left and the story must come to an end.


While nobody expected books to have variable font sizes, the fact that ebooks do allow it to be adjusted means that people with deteriorating vision may still read them.


A magnifying glass was the original solution to this problem. They never run out of battery.


And you can use any glass with any book!


Yeah... and once you're no longer paying by the page, there's really little upside to using a small font you have to squint at or margins that are too narrow to easily scan the page. I have no idea how long the books I read are, but I probably read them a few hundred words per screen simply because it's way easier to keep my place. Average for a small paperback exceeds 300.


In the early days of the Gutenberg press, when most were illiterate, they would gather together and the one person who could read would read to the rest of the group. So, arguably, it was both easier and more inclusive for a blind person to read what there was to read then than now. At least they didn't have to rely on any special accommodations.


It's worth noting that it took 75 years after Gutenberg's press before some disgusted printer came up with the idea of page numbers. As the saying goes, all progress depends on the unreasonable man, who eventually becomes disgusted enough to make things change. Quality matters, pride in design and workmanship matters, and it's not at all bigoted to point out that China, which now manufactures most of the stuff in our 21st century world (or at least the components of it), has a culture of designing and producing absolute dung. We should not accept unacceptable quality just for apparently low prices.


Books are also capable of being copied before or when damaged, passed on trivially, and are not prone to sudden existential failure because a server on the other side of the world was deactivated.

They can't be stolen back by the publisher or seller, can be trivially transformed into different formats, can take annotations, can be rebound with wear, and even if paper has its faults, reading a well-maintained page in 1000 years is as easy as the day it was written, even if significant swathes of technological backslide occur, and is only prone to the challenge created by human cultural evolution as opposed to loss of the processor or software required to decode/convert/display it.

An HDCP protected chunk of media may as well not exist in 1000 years.


> I think the difference is that the entire software/hardware stack is a world created entirely by humans, untouched by "reality" for the purposes of all these annoyances, so it feels like we should be able to wrangle it better after so many decades.

Humans, as the makers of these systems, are part of that reality, which was not created by us. The reality is that we are great apes writing precise machine instructions with our general intelligence that was not purpose-built for being that precise but selected for survival. Our cognitive machinery cannot exhaustively predict all the possibilities of failure of what we write; if we are working in teams, we still have to transfer most of our technical ideas through natural language, in a combinatorially increasing manner as the team size increases, etc. None of this is user-hostile, it is just human fallibilities and limitations in play. And since we can't alter our cognitive capacity drastically, we can only make more machines against these (e.g. unit tests) with their own limitations. I think the scale of what we have been achieving despite these limitations is just fantastic.

If anything, users are becoming too egocentric, expecting the world to conform to their comfort, with a dash of construal level fallacy, underestimating from a mile away how hard it would be to write bug-free programs with perfect designs in the real world, by real people, with real budgets, etc.


> If anything, users are becoming too egocentric, expecting the world to conform to their comfort, with a dash of construal level fallacy, underestimating from a mile away how hard it would be to write bug-free programs with perfect designs in the real world, by real people, with real budgets, etc.

Selection bias. You only hear from users who want new features. You rarely hear from users who don't want new features and just want software to stop being buggy and acting like a goddamn haunted house.


I was talking about the “users who don't want new features and just want software to stop being buggy and acting like a goddamn haunted house.” so the selection bias is yours.

Most bugs are just annoying, consequences are not catastrophic if your favorite tuner app forgets your settings, your word processor messes up the formatting, your pdf reader crashes. You can recover with some frustration and wasted time. The perception of these being catastrophic failures shows the sense of entitlement users have, because they are used to a certain smoothness in their experience and expect everything to go their way. This doesn’t match the underlying realities of the task; it is very easy to construe a sense of a working program in one’s mind, but it is exponentially difficult to make the implementation actually bug-free, usable and functional the way the user wants.


> Most bugs are just annoying, consequences are not catastrophic if your favorite tuner app forgets your settings, your word processor messes up the formatting, your pdf reader crashes.

So what? Users get upset when your crap doesn't work. Stop being flippant and pushing back. Pushing back is not your (our) job. Complaining about how hard your job is, is not your job. Griping and moaning about irate users is also not your job. Delivering a product that does what it says it will do on the tin is actually your job. Believe it or not, you produce something people depend on!

Imagine your power steering goes out on left hand turns going north downhill. You take it into the mechanic and all you get is "That's just annoying, not catastrophic. You can recover with just some wasted time. It's exponentially more difficult to make the implementation actually bug free!"

Users quite rightly spot bullshit excuses. And we have none. Save the settings, fix the formatting, stop the crashing.


> Pushing back is not your (our) job

Please tell me more about my job, internet stranger.

You’re making the same arguments without adding substance, just emotional rhetoric and unnecessary personalizing.

> Imagine your power steering goes out on left hand turns going north downhill.

Imagine that steering wheel stopped working depending on the highway you’re driving on (software & hardware platform). Why didn’t they test this on every single highway? Because that would be combinatorially explosive.

I’m glad you’re making a physical world analogy. Comparable physical world projects have orders of magnitude less moving parts that need to interfit, and assembly gives immediate feedback whether they can fit. They also have orders of magnitude less inputs they are expected to operate on, which makes it easier to exhaustively test their function.

“Shut up and just make it work” might have been popularized by certain tech personas, but unless you have Steve Jobs levels of clout, pulling that stuff in most dev shops will quickly make you very unpopular, whether you’re an IC, a manager or a product manager.


> Imagine that steering wheel stopped working depending on the highway you’re driving on (software & hardware platform). Why didn’t they test this on every single highway? Because that would be combinatorially explosive....

Users guffaw at this point. They do not understand why your stuff is so complicated and broken. They think you suck at your job. Both you in the collective sense (you engineers) and you in the personal sense. They start plotting ways to stop using your stuff because you are so frustrating to deal with.

> They also have orders of magnitude less inputs they are expected to operate on, which makes it easier to exhaustively test their function.

I think you still do not understand my point. Users fundamentally do not care about it. Everything, to them, is a problem of your creation, and they'd quite rightly regard your long-winded explanations with complete skepticism. To you it always feels like it's someone else's fault, but to users it sounds like complete BS. No matter how right you are about it being someone else's fault. Someone else's fault is the very definition of a lame excuse from their perspective. They are still getting nowhere, and you are even more useless to them because you still can't fix anything and just confuse them and waste their time.

It's a very user-hostile attitude and makes for terrible customer relations. That attitude is also counter productive and helps no one. No wonder people hate tech.


Reality creeps into the tech utopia we're creating through "legacy" systems from a time when the internet, cryptography, multi-core architecture and full program isolation didn't exist yet.

Software has a nasty habit of iterating on yesterday's ideas instead of rewriting for tomorrow. Not that there's anything wrong with that; it seems to be the path of least resistance thus far.


I disagree - by and large, we don't need to "rewrite for tomorrow". Almost every significant problem in Computer Science and Software Engineering was solved (well) decades ago, often in the 1960s. Security is a bigger issue now, but it was largely solved years ago.

The problem is that we do engage in so much "rewriting", instead of leveraging known good, quality code (or at least fully-fleshed out algos, etc.) in our "new, modern, elegant, and trendy" software edifices of crap.

To me, this may be the one really good thing to come of the cloud (as opposed to the re-mainframe-ication of IT): the "OS" of the 21st century, allowing plumbing together proven scalable and reliable cloud/network services to build better software. (Again, not a new idea, this was behind the "Unix Philosophy" of pipes, filters, and making each program do one thing well. Eventually, it will be in fashion to realize this really is a better way...)
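(For a reminder of what that composition looks like in practice, here's the classic word-frequency pipeline, in the spirit of Doug McIlroy's famous six-command answer to Knuth's word-count program. Just a throwaway sketch; book.txt stands in for whatever text file you have around.)

    # split into words, lowercase them, then count and show the ten most frequent
    tr -cs 'A-Za-z' '\n' < book.txt | tr 'A-Z' 'a-z' | sort | uniq -c | sort -rn | head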

We need smaller, better software, not the latest trendy languages, insanely complex platforms that no one understands, and toolchains of staggering complexity that produce crap code so bloated that it requires computers thousands of times faster than the Crays of the 1990s just to do ordinary desktop stuff. (Seriously, folks, the Raspberry Pi 4 on the next table is a rough match for the Cray we had in Houston when I worked for Chevron in the early 90s! Think about that, and then think about how little data you really need to deal with to do the job, vs what you're actually shuffling around.)


You reminded me of this quote.

“Einstein repeatedly argued that there must be simplified explanations of nature, because God is not capricious or arbitrary. No such faith comforts the software engineer.”


We probably could make gas stink less just like we could fix that bug. The ROI just isn’t there.


Gas stinking is a feature, not a bug. It's a safety measure. Particularly when we're talking gas (and not gasoline), which is naturally odorless and made to stink on purpose.


> Doesn’t reality suck the same?

No. A hardware product like a car has predictable wear and tear governed mainly by the laws of physics. The fact that I can no longer use my smart speaker because the manufacturer decided to stop supporting it, went out of business, or got bought is not at all the same. My car will still work through all of those things in the exact same way. It also doesn't throw up random dialogs (or whatever a physical equivalent would be) that stop the product from working until I interact with it. Not the same at all.


A car has tons of parts that, sure, are "governed by physics", but in effect just randomly fail. I can theoretically understand that there's a clunk in the front end of my car because I've exceeded the MTTF of a suspension bushing. To almost everyone, though, it's essentially just a random event based on nothing they've perceived.


John Deere has entered the chat.

Also, car companies have been tinkering with "electronic repossession" - remote kill switches due to nonpayment.

So ... get ready for other things to suck as we attach technology to them.


> John Deere has entered the chat.

Thank you for bringing up this point. The actual problem is not the software itself, but its proprietary nature and infinite hunt for profit without any limits. Consider free software instead and you will see that it is improving year by year, albeit very slowly (which is logical, in the absence of infinite resources).

My Linux machine never forces me to reboot, shows any ads or suddenly changes its interface.


> It also doesn't throw up random dialogs (or whatever a physical equivalent would be) that stop the product from working until I interact with it.

I see you never had a (EU) Ford.


Sure there are things that don’t work, but it’s not nearly comparable. In my life I’ve never had a problem with a refrigerator, had maybe two problems with washer/dryer, my closet Just Works, books Just Work (with an ink smudge maybe every 50-100 reading hours) etc. I can expect each of those things to last decades. Looking at the goods I’ve bought on Amazon recently, digital electronics/software as a category doesn’t hold a candle to everything else I buy in terms of reliability.


> never had a problem with a refrigerator

These turns of phrase make me wonder what we are really expecting from software.

I can’t imagine you never slapped the door of your fridge and it didn’t properly close. You gave it a nudge when you realized it, and it was fine, but it must have happened. And your whole food supply would be rotten if you didn’t notice in time.

Or do we monitor energy consumption closely enough to realize it’s eating much more than what should be expected, the same way people complain about Chrome eating too much memory?

It can also get pretty noisy but I’d assume most people just think it’s normal.

And we put the blame on ourselves for a lot of issues (didn’t put the bottle in the right place, didn’t use the right amount of force to close the door, didn’t set the right temperature, forgot to leave space around the fridge for ventilation, etc.). But few users blame themselves for not having understood the software and worked around its weaknesses; we just call it broken.

That behavior is normal, but I’d take a lot of “my appliances just work” with a grain of salt.


> I can’t imagine you never slapped the door of your fridge and it didn’t properly close. You gave it a nudge when you realized it, and it was fine, but it must have happened. And your whole food supply would be rotten if you didn’t notice in time.

But if the fridge were software it would randomly turn off and ruin all your food. The light would sometimes stay on, except when you open the door. It would require you to decline an update before you could get the milk out for breakfast. During an update the fridge and freezer compartments would switch places and then give tips about efficient ways you could manually move everything. If you bought a new fridge, part of it would be locked shut until you paid more money, but the one in the store was already unlocked. And god forbid you lose your 2FA device used to set up the fridge -- it will destroy everything inside (including irreplaceable heirloom tomatoes) upon reset. It will then update to a subscription model where custom temperature settings will require a monthly fee, or you'll be limited in the number of items you can store in the fridge or the number of times you can open the door per day.


We saw a failure case like this with a microwave in the workplace kitchen. It somehow got into a mode where it only turned on when the door opened. Needless to say we threw it out shortly after that was discovered. We didn't bother debugging it, but my guess is it was a hardware problem because the interlock should have obviously made it work the opposite of how it was and you'd hope that a software glitch couldn't get it into a mode like that!


Oh crap. That could blind a person. I thought microwaves had to have a hardware interlock for that reason


Since this is critical health and safety stuff that can lead to serious injury, everyone I've ever seen uses hardware interlocks - generally, the door-closed switch is in series with the microwave power supply, so it's impossible for it to make microwaves with the door open. Only an idiot would put a safety interlock under software control, when a simple switch will do.


A lot of these look like pricing and marketing issues to me.

Fridges have been with us long enough in a ‘pay everything upfront’ setting that we’d battle to the bitter end if we had to do micro-payments or aggressive planned obsolescence.

To your point, I lived in long-stay apartments where you put coins in to have the fridge and air conditioning work, because they didn’t bother having pay-as-you-leave metered use. That’s super rare (I hope? I’d expect the same in some predatory situations towards low-income people), but it’s to say that alternatives exist.

Otherwise fridges randomly turning off is just a matter of time and/or build quality. Sooner or later it happens (or it stops turning on, which is arguably better, but you wouldn’t say it’s great)


> In my life I’ve never had a problem with a refrigerator, had maybe two problems with washer/dryer

I think blaming software for this is a little naive. Take a look at consumer reports for any modern fridge, stovetop/oven, washer/dryer, etc, and you will see complaints about fridge motors dying, touch panels going on the fritz, etc. -- none of which involve anything more than low level firmware.

If you want to put a tinfoil hat on, you can consider that it may be planned obsolescence, but to put the blame squarely on software, I would disagree with.


> If you want to put a tinfoil hat on, you can consider that it may be planned obsolescence, but to put the blame squarely on software, I would disagree with.

You don't need a tinfoil hat when facing the truth :).

Also, while things you mentioned aren't software-related, they're modern tech-related. Like, last 20 years. Fridge motors dry out because they're being made cheaper, with not enough lubricant put into them and no option of opening them up and pouring in the grease. Touch panels are going on the fritz because touch panels suck (that's often very much software), and they shouldn't be put on appliances in the first place. But it's cheaper, so they are.

Worth noting that there wasn't some big price reduction happening from solid appliances 20 years ago to crappy ones today. Price remained roughly fixed, but appliances started to suck more and more.


Right, but the move towards cheaper and lower-quality is more the fault of the current economic system and its incentives than it is the fault of software.


It's very instructive to look back to the 70's when electronics running a little bit of software had just come into being.

The big deal, at first, was really with memory. Your alarm clock could ring at the same time reliably. If you invested in a VCR, it could record at a programmed time. If you had a synthesizer, it could store and recall exact preprogrammed patches. Pinball machines could downsize in weight and keep truly accurate scores instead of relying on temperamental relays and score reels. And so on, with every category of gadgets getting the computerization treatment. Although not everything succeeded, there were lots of straightforward cost and quality improvements, with the main downside being that IC designs are less obviously repairable.

And then pretty much every year afterward, the push was towards cheaper with more software, with decreasing margins of real improvement, with the "smart" device representing an endpoint where the product is often price discounted because its networking capability lets it collect and sell data.

What comes to mind is the Rube Goldberg machines and their expression of a past era of exuberant invention, where physical machines were becoming increasingly intricate in ways not entirely practical. Our software is kind of like that.


"Kind of" like that?

Every other week I read about someone's entirely-too-roundabout way of doing X via an IoT device (requiring packets to probably circumnavigate the globe). Meanwhile I'm sitting here opening my garage door with a physically wired switch like a pleb.


Why do we have a touch panel on a fridge in the first place? The only thing we need to be able to specify is desired temperature...


... and at that, one could argue that even that isn't necessary.

I just checked my fridge, it has six buttons and an LCD panel, and in all my (4) years of home ownership, I haven't touched the buttons a single time.


> Worth noting that there wasn't some big price reduction happening from solid appliances 20 years ago to crappy ones today. Price remained roughly fixed, but appliances started to suck more and more.

First, the "solid appliances" weren't 20 years ago, but more like 25-30.

And though there wasn't a big price reduction in the interim:

- Refrigerators are more energy efficient.

- Refrigerators have larger internal volume for a given size.

Equivalent improvements have been made to other appliance types such as washers and dryers, but not stoves, as far as I know.

Those improvements are largely orthogonal to declining design and build quality, but it should be noted that there are at least some ways in which newer appliances have been getting better (that aren't just gimmicky features) while prices remained the same.


Right, but my $400 bose headphones have a broken integration with my $2400 mbp. Swiping the volume controls on the headphones also moves the balance.

Conveniently, because macos is ass, it's nondeterministic whether the balance controls display in Sound Preferences to fix the balance issue. You just have to open and close the settings panel in the hopes that it will display.

I'm a software engineer and I don't even know where to begin to debug this idiocy.

Duolingo regularly freezes audio in chrome. Once this happens, no audio will play in chrome until you restart or kill "Utility: Audio Service" with the chrome task manager.


This is the second time in as many weeks I've read a complaint about Bose headphones. The first was that their Bluetooth was so janky the audio itself was delayed multiple seconds and out of sync with video playing on the device.

That blew my mind, my $20 Amazon-purchased Bluetooth earphones just work™ with no delay.


I guess it depends how fussy you are. I've had several fridges/freezers/ovens that don't actually maintain the set temperature. for ovens this is merely annoying, but for fridges and especially freezers, this is a food safety issue. on fridges/freezers that have a dial instead of a digital temperature control, I've found that some just can't maintain a stable temperature, no matter how much I fiddle with them. after setting them "just right" with my own thermometer, I'll come back the next day to find an exploded bottle in my fridge or cold water in my ice tray.

books work really well until a pipe bursts in your attic. then you wake up and notice half your collection has been ruined (personal experience).


Well, digital services are much more complicated than a fridge, or a book. Not only that, but they also require a machine to run that is also orders of magnitude more complicated than a fridge or a closet.


Do you know how hard it is to produce a book from nothing? Make paper, all the same thickness, print it, and so on. Or just a steel tube... it can all be done without computers and software. The process behind every step is difficult. I work in industrial automation and I can tell you, right now, with this quality in software and "computers everywhere", we are building a super high tower on the softest sand.


Part of the complaint is that these things don’t have to be as complicated as they are. The fridge with the touchscreen is usually more frustrating than the fridge from thirty years ago. The smart TV is usually more frustrating for its smart features.

Things are getting more complicated, like you say, but they frequently aren’t getting enough better to justify the added complexity, especially given all the issues that come along with it.


> Doesn’t reality suck the same?

To me, software is as if when I open a book to read it, then, the book suddenly snaps itself shut, hurting my fingers.

Thereafter, the book gets wings, tries to fly away, but bumps into my coffee mug on my desk, so coffee spills on the floor. Then the book does fly out through the closed window — smashing the glass into pieces — and gets larger and larger wings, flying higher and higher until it disappears into the blue sky.

It's as if software was alive, and does random things I don't always agree with.

But actually — the bees building nests on the balcony: That feels pretty close to misbehaving software. Or the cat, bringing in a snake from outdoors. Or a squirrel chewing through a power cable, shutting down a city.


There is a difference between design trade-offs and flawed design or deviations from the design. Your car does what it’s supposed to do, within the predictable limits imposed by the fact that it’s a gas-powered car. Since macOS 10.15.4 or .5, my 16” MacBook Pro crashes waking up from sleep due to some glitch in waking up the GPU.

Of course, people perceive that software sucks because it’s more complicated than they realize. I forget what book said it, but an operating system has more separate components than an aircraft carrier and they’re more tightly coupled. (I’m not sure that’s true, but it conveys the idea.)


Houses, cars, etc are far more reliable and well designed than software. Think about all the extreme conditions cars continue to function in. How many people don't even follow basic maintenance schedules?

Another key difference is that in maintenance of your home, you have complete control. It's extremely easy to understand and act to improve or maintain it. When large software systems (like the IRS login) have problems, you are totally helpless.


> Houses, cars, etc are far more reliable and well designed than software

Buy software at the price of a house and you’ll be right to expect the same build level.

Then even at the price of your house you’ll have fun with mold growing inside the walls, soil that degrades in unexpected ways after heavy rain hits the hill you’re built on, and rooms that were fresh and bright enough on the hot summer day when you visited but whose overall orientation, you realize, makes them way darker and gloomier in winter than you expected. And you’ll pay for that house for your next 20 years.

Cars are the same at a lower level, and you see small issues creep up as you lower your budget (or go buy a fancy vintage italian car and you’re in for the wild ride).

> Another key difference is that in maintenance of your home, you have complete control.

In the good old days people had timers on their desks to remember to restart programs before they crashed. Also saving stuff, making backups, etc.

Of course online services are a different beast, but it’s more akin to fighting bureaucracy, which I see as a our society’s software in a way, with the shitty forms with not enough space for your name and other niceties.


This is a straw man argument.

Cars vary widely in their product quality. Houses vary widely in their product quality. Some things in life are inevitable facts of nature, but product quality is not. Quality is to a large extent determined by the time and care taken by the manufacturer.


>My gas car stinks, destroys the planet, needs yearly maintenance, crashes into everything the second I stop paying attention.

That's not a good example, nor is it parallel to the dynamic the article describes.

Your car stinks a lot less than cars did 10/30/50 years ago (emits less in the way of pollutants or CO2 per mile driven), is less likely to kill you in a crash involving the same size cars/velocities (despite weighing less!), needs less maintenance, lasts longer, and can notify you of potential collisions and sometimes avoid them.

It's probably only worse in terms of education needed to perform maintenance or nominal sticker price.


But devices have also gotten smaller, lighter and more efficient, and software can also do much more today than it could a long time ago. I think the analogy is fine.


When the analogy was built off of saying that cars are bad by metric X, when the claim was that software has gotten worse by metric X, and cars have actually gotten better by metric X, no, it's not a good analogy.

And yes, there are some ways in which hardware has improved. But the claim is that, judged by what you're using it for, most UX-apparent aspects have gotten worse. Is there a clear way this is wrong? If you look at most UX-apparent metrics, it hasn't. Latency from keystroke to character render has gotten worse. App start time has gotten worse. Lots of other latencies have gotten worse.

None of the nightmares described in the article were typical of software UX.

These would be arguably fine if the additional features you get were a necessary cost, but they're not.

I'm also not sure that devices have gotten more efficient in all respects. Each version of iOS gets more energy-intensive, for example.


> Latency from keystroke to character render has gotten worse. App start time has gotten worse. Lots of other latencies have gotten worse.

Do you have sources for this? I mean, I'm not sure there aren't rose-tinted glasses here.

> These would be arguably fine if the additional features you get were a necessary cost, but they're not.

> Each version of iOS gets more energy-intensive, for example.

I would argue that multitasking, camera postprocessing, widgets, background app refresh, and others are all features worthy of more resource usage. Many of those are things you can choose not to use if you want to save power.


Increasing keystroke latency was discussed on HN before: https://news.ycombinator.com/item?id=15485672

>I would argue that multitasking, camera postprocessing, widgets, background app refresh, and others are all features worthy of more resource usage. Many of those are things you can choose not to use if you want to save power.

For all those features turned off (before and after), the usage increases with each version.


Not the same; we expect the problems you mention. There are just some laws of nature that we get used to dealing with. Tech has the tendency to produce random problems. The one we have all dealt with is: everything was working fine and then suddenly stopped. You call tech support, and after an hour of troubleshooting it with them we get the "We've never seen this before. It must be caused by one of your other SUCKEE tech toys." Ahhhhhhhhhh...


I think in 20~50 years those random software issues will be what we've known all our lives, basically what reality is, and we’ll just give warm, patronizing looks to kids complaining that stuff doesn’t work.


software doesn't decay, it just sucks even if you preserve it


I know a product that can do more damage with an accidental backspace: the iMessage app for MacOS (Messages).

If you're any sort of power user, you likely know that you can backspace by the word instead of by the character, using Ctrl + BS on Linux or Cmd + BS on Mac.

In the Messages app, the shortcut to delete your _entire chat history_ is also Cmd + BS, and it works even if your caret is in the text box. So if you type five words and then Cmd + BS six times, you will be prompted to delete your entire chat history.

I do this almost every day. So far I've never compulsively hit return but I am dreading the day it happens.


Isn't alt-BS "delete word"? cmd-BS is "delete to beginning of line" for me.


This always annoyed me but it looks like this behavior is gone in Big Sur. You can ⌘⌫ to your heart's content.


It's funny how small a feature can make you want to upgrade.


Gnome Notes has similar behavior: whenever you use Ctrl + BS, the note you are currently writing gets put into the trash, even though you just wanted to delete a word quickly. You can recover the note, so it's not the worst possible behavior, but it still sucks.


Option-Delete is whole word delete.


Cocoa uses the Emacs-style GNU readline keys in all the text fields; just use C-w.


> lose all the data you typed into a web form if you accidentally backspace while not in a text field, because it navigates off the page

Nowadays, I would consider this a problem with the browser. How often does one navigate backwards with the backspace key?

Recently, I had some doubts over whether or not I should clobber the native browser behaviour for "ctrl-s", but then I realized that nobody anywhere EVER saves a web page to disk... and if they really needed to, the browser toolbar is right there.


I for one fully expect the backspace button to work if I do not currently have a text field focused. once you learn keybindings for an application, there's usually no way to perform that function as quickly/efficiently with a mouse. please do not break the conventional ones.

ctrl-s is probably fine to break though. even when it does "save" the page, it rarely does so in a useful way.


On Mac the other default for navigating backwards is Cmd-[, if that helps any! (It's also a default in many applications for navigating backwards in whatever sense the app may intend.)

I don't use Windows other than for gaming, so I'm afraid I don't know if there are other shortcuts other than backspace.


Alt-Left is back and Alt-Right is forward. Sadly, their equivalents on Mac are also for navigating to the beginning and end of the line so if you're in a text box you lose that control.

Your alternative is handy. I wonder if it also works on Linux.


Command-[/] for history navigation and command-{/} for switching tabs are macOS conventional keybindings that work almost anywhere there’s a concept of history or tabs.

they may work on Linux in an attempt to support Mac users.


This is backwards. Mac OS inherited Emacs keybindings. They work in dialogs, etc.

https://stackoverflow.com/questions/25275598/a-list-of-all-e...


I could be wrong, but I don’t think these are Emacs keybindings, although the Cocoa text input system definitely is heavily influenced by Emacs, both in design and in this sort of detail.

The Emacs-derived keybindings use Control on macOS; the Mac-native ones use Command.


fyi, you can get pretty far on the basic shortcuts subbing ctrl for cmd.


Oh, I fully expect CTRL-Q to close my browser window (with all the tabs) when I mistype CTRL-W. That doesn't make it any good.

Do you know at all times what element has the focus? An error there can have serious consequences. (Even though browsers do make an effort to refill forms on page forward, it doesn't always work.)

It is a very bad shortcut, and because it isn't always available (it can't work while a text field has focus), there has to be an alternative shortcut anyway.


Fwiw, on Chrome in Mac, you can configure it so that pressing cmd-q won't close the browser without holding it for a few seconds.


I’ve just tested on Chrome: backspace on its own does nothing, going to the previous page is bound to cmd-LeftArrow.


Chrome made this intentional change about four years ago. Prior to that, backspace had the described behavior.


I'm one of those holdout firefox users. every day I come across more websites that only work on chrome though :/


Edge, or at least the Chromium version of it, has changed the navigate back key to be ALT+Left Arrow instead of the backspace key. It was annoying at first because I have over two decades of muscle memory for hitting backspace to go back. After a couple of days I got used to it and now am happy I can backspace without accidentally navigating away because focus wasn't where I thought it was.


Alt+Left/Right navigating through history has been in IE since at least IE4. I don't recall if it worked before then, and trying to look up ~25 year old documentation is somewhat difficult (especially since it would have shipped in a WinHelp file with the software instead).


Since backspace worked in every IE version up until Edge/Chromium, I never knew the ALT+arrows combinations worked. It wasn't until backspace stopped working that I did a search to find what was going on and saw the ALT key combos. So they may have been there all along but I never had reason to find out about them until recently.


Pretty sure I've been using Alt-Left as my back shortcut in Firefox for over a decade also


This was a change that Chromium introduced in July 2016. I remember it being slightly controversial at the time and the issue from the tracker being posted to HN :)


> but then I realized that nobody anywhere EVER saves a web page to disk...

Some people do it all the time. I was emailed a saved page the other day.

I was responsible for a single page web app, and the error detection code was stored in a <script> tag within the page, so I got plenty of “errors” logged for people trying to access saved pages.


https://xkcd.com/1172/ :)

Literally murdering children over here. I just knew somebody was going to come along and prove me wrong!


Chrome hasn't supported backspace since 2016


> I find bugs in every software I use (my freaking microwave oven control panel!).

My dishwasher, which has only buttons to select what to do during the next wash cycle, has a firmware bug.

Sometimes when the door is closed, it will start one of the pumps. If I cycle "heated drying" on then off again, the pump will stop. I figured this out because, well, I've worked on firmware and I understand the how of how software can be stupid.

After I learned to recognize watchdog resets, I started seeing them more and more often, and became even more terrified of how bad software is.
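
For anyone who hasn't written firmware: the usual watchdog arrangement looks roughly like the sketch below (every name is invented, not from any particular vendor's SDK). A hardware timer reboots the chip unless the main loop keeps "feeding" it, so a wedged loop shows up as exactly the kind of silent reset described above.

    #include <stdbool.h>

    /* Stand-ins for whatever the real part's SDK provides. */
    static void wdt_enable(unsigned int timeout_ms) { (void)timeout_ms; }
    static void wdt_feed(void)                      { }
    static bool door_closed(void)                   { return true; }
    static void run_wash_state_machine(void)        { }

    int main(void)
    {
        wdt_enable(2000);                  /* hardware reboots the MCU if it isn't fed for 2 s */
        for (;;) {
            wdt_feed();                    /* proves the main loop is still alive */
            if (door_closed())
                run_wash_state_machine();  /* a hang anywhere in here ends in a silent reset */
        }
    }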


After I learned to recognize watchdog resets, I started seeing them more and more often

Yup, sounds like my TV. It's not even one of the smart ones; I was careful to avoid those. But once every few days, it stops responding to the remote control when performing some action (opening the EPG, switching channels). I then have to wait about ten seconds for the display to go dark and the TV to "reboot" itself, so I can continue channel surfing.


I prefer open-source tools because I know where the agency for pain lies: myself. With modern Rust and Go tooling you can download an application's source code, modify it, and compile it painlessly in half an hour.

So why do I tolerate bugs in software like that? Because I know I can fix them. And I also know I won't always. Small gods have handed me tools to remake the world as I would see fit and I do not use them. Are they at fault for not having made the world as I would prefer? Or am I at fault for not using the tools?

In any case, I've noticed a sort of dichotomy among users in their reaction to tools that fail. There are those who go "this tool sucks how can I do my work" and there are those who go "my work is what I want to do which tool can I use instead". The latter set get a lot more done. Having observed this, I have attempted to modify my behaviour to be like the latter, and have become more effective for it.


> Small gods have handed me tools to remake the world as I would see fit and I do not use them.

But they didn't give you the only tool you really needed: Time.

Having meaningful access to the source is very important, but its value is limited because even small improvements often take a large amount of time, especially in code you're unfamiliar with. Once you've made that improvement, maintaining it (or upstreaming it so someone else might maintain it) can take a tremendous amount of time.


Haha, yes. Time is a thing no one can make for me. But they cannot make it for themselves either. So I take what is offered gratefully and make what I can from it.


> "my work is what I want to do which tool can I use instead"

I can't wrap my head around this, could you explain further?


For amusement's sake I removed punctuation. The contrast is between people who find themselves unable to move when a tool sucks and people who just solve a problem. The latter just move on from tools that don't solve their problem or solve it poorly. If the problem is big enough, they solve it themselves.

They don't have to be engineers. They'll pay $100 / mo to solve it, or hire a guy on Upwork to solve it, or cobble something together on Zapier + Airtable. The thing is, the tool is insignificant. They don't spend an appreciable amount of time complaining about it because it's faster to stop using it.


I can easily see myself taking the latter course of action, but I do question how sustainable it is. I wouldn't personally enter a business just to make a bag of money and then quickly leave, so I'd mix approaches. How about you?


I think that sounds sound.


>I don’t know what has to happen for people to stop tolerating software as it is.

What we've got here is a question of cost and choice. If my choices are all equally bad, i.e. vendor one is not any worse than vendor two, then inertia or cost become the determining factors. In terms of consumer software, consumers have been conditioned to have low expectations, and the cost pressure is further reduced because the software is so often free or very low-cost. In regards to commercial-focused packages, we so often put up with it because the systems we're using are so complicated and specialized that the pool of options is limited, and/or the domain is so complicated that problems are inevitable.

So long as this is the landscape, few software producers have incentives to do the things necessary to improve, and/or they believe they can spread the cost of improvement over a long period, i.e. not make the investment until the pain is too great.


> I don’t know what has to happen for people to stop tolerating software as it is.

Deaths, a lot of deaths.

Software engineering needs a PE-type licensure and a union. We need a way to stand together to advocate for better working conditions, practices, and tools.


What is the evidence that professional licensing would help? Even the most skilled programmers produce bugs. Professional licensing would only raise the barrier to entry for a well-paying job.


I completely agree. But I fear that if this happens then the companies will look for ways to produce software with less friction compared to the now-stricter software development practice in general. And it would be "let's hire Indians for peanuts" all over again.

Really not sure what's the way out of this corner that we've all collectively painted ourselves into.


I am the same. I've had a motto for more than a decade, actually: the more software you add to something, the worse it gets.

I coined it while watching a robotic soda vending machine crash and reboot frequently.


Discipline and professional responsibility on the part of programmers, coupled with patience.

Everyone is in such an irrational hurry that it's been built into the "culture": rushing and making messes is acceptable. And by extension, customers expect things to be shit, so you don't get in much trouble for doing it.

It's a feedback loop that only stops if companies (and individual programmers) start taking pride in craft > careless speed and money like they used to do back in the 50s/60s.


> It's a feedback loop that only stops if companies (and individual programmers) start taking pride in craft

Most programmers, and many companies, want to produce something of quality, well crafted.

The drive for low-quality kibble comes directly from consumers, and from the difficulty and cost of judging value.

A consumer can’t be expected to be a UI expert, and a slightly better UI might not drive sales because other factors are more important. I try to buy hardware with good UI, but I often make compromises for other factors.


That's a cop-out for not doing the work. The choice is still in the hands of the people making the thing or offering the service.


> accidentally backspace while not in a text field

Thankfully that can be disabled, but I find it to be one of the most infuriating 'features' of Firefox. If it weren't something that could be turned off, it would be a deal killer all by itself.


Well, Chrome used to do this too until they removed it in 2016. Internet Explorer does this as well. It was more of a default than a 'feature of Firefox'.


Also, in most normal web pages—at least those not using hideously over-complicated scripting—just pressing your "forward" button (Alt+Right, for example) will navigate forward and restore the edits you had in the form.


I didn't even know this "feature" was available on Firefox (despite using it since version 2). Turns out, it is only enabled on Windows:

http://kb.mozillazine.org/Browser.backspace_action

Maybe it was copied from IE to keep "closer to the platform"?


It's definitely not limited to Windows. I am exclusively a MacOS user (for my workstation, at least) and I have had to disable it on any new Firefox install.


My Windows 10 Desktop runs very well.

My FreeNAS, Arch Linux and my Android phone as well.

I think we get paid because we are building new stuff and have to maintain shitty stuff. If my job were literally just designing it at a high level, clicking it together, and having it work, no one would need me.

Yes, it's frustrating sometimes.

I would like to cure cancer instead of debugging why this update broke our system.


My microwave will run for a split second if you hit the start button with no time entered. I suspect it’s running the cooking code with the time variable at 0 seconds: there’s no guard against 0, so it just runs until the countdown part of the code catches up.

I’ve microwaved a burrito by mashing the start button hundreds of times.
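
A minimal sketch of the kind of missing zero check that would produce exactly this behaviour; I obviously haven't seen the real firmware, and every name here is hypothetical:

    static unsigned int remaining_s = 0;      /* nothing was keyed in */

    static void magnetron_on(void)     { /* hardware-specific */ }
    static void magnetron_off(void)    { /* hardware-specific */ }
    static void sleep_one_second(void) { /* hardware-specific */ }

    static void start_pressed(void)
    {
        /* missing guard: if (remaining_s == 0) return; */
        magnetron_on();                       /* switched on before any check... */
        while (remaining_s > 0) {             /* loop body never entered when the timer is 0 */
            sleep_one_second();
            remaining_s--;
        }
        magnetron_off();                      /* ...so it was on for a split second */
    }

    int main(void) { start_pressed(); return 0; }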


The thing that drives me absolutely bonkers is incompatible "upgrades".


Software has up to 2^n paths a piece of code can take, depending on how it branches: each independent branch doubles the number of possible paths. Also, you cannot simulate the rest of the world (externalities).

If I design a bracket for a TV mount, do you blame the bracket when someone hangs a 3000 kg bookcase on it?

You expect software to be perfect, yet ignore the massive limits everything in the world has.

Not saying there aren’t quality issues with software. I’m saying software development is really difficult.
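
To put a toy number on the branching point: a minimal sketch (function and parameter names invented) of how quickly the paths multiply.

    /* Each independent if/else doubles the number of distinct execution
       paths, so n branches give up to 2^n paths. Three branches here
       already mean 2 * 2 * 2 = 8 paths to test. */
    void handle_request(int a, int b, int c)
    {
        if (a) { /* ... */ } else { /* ... */ }   /* 2 paths so far */
        if (b) { /* ... */ } else { /* ... */ }   /* 4 paths        */
        if (c) { /* ... */ } else { /* ... */ }   /* 8 paths        */
    }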


Cobbler’s children have no shoes.


Software is indeed downright hostile sometimes... it's gotten better with autosave... but your IDE, MS Word, or Photoshop crashing would sometimes cost you half a day of work, if not more. It was infuriating and even discouraging sometimes, until you learned the subconscious habit of pressing Ctrl-S all the time.


> my freaking microwave oven control panel

You mean where 99 is greater than 100?
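
(For anyone who hasn't run into this: most panels read the keyed digits as minutes and seconds, so 99 really does run longer than 100. A guess at the parse, not any vendor's actual code, names invented:)

    /* "99"  -> 0:99 = 99 seconds
       "100" -> 1:00 = 60 seconds, so 99 > 100 on the kitchen counter. */
    static unsigned int keyed_to_seconds(unsigned int keyed)
    {
        unsigned int minutes = keyed / 100;   /* "100" -> 1 */
        unsigned int seconds = keyed % 100;   /* "100" -> 0 */
        return minutes * 60 + seconds;        /* 60 s, vs. 99 s for "99" */
    }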


As an experiment, I logged every bug I encountered for a few days. I averaged 5 bugs a day, not counting dark patterns or bad design.

Everything is broken, and nobody is upset. [1]

Some of the software I use is so unreliable that I expect it to fail. I expect the Vodafone login page not to work properly. I expect one of my airpods not to connect on the first try. I expect my banking app to show random error messages, even though it works just fine. Most online stock brokers have issues at the worst possible times. My bookkeeping app is frequently wrong, per my tax advisor. Since everything is broken, the best I can do is to mentally assign all those apps a trustworthiness score, and avoid betting too much on them.

The worst part is that support for all that software has been largely automated. If you have a problem that can't be fixed by a chatbot or a crowdsourced support community, you are largely helpless. Google can wipe everything you love, and there's no one to punch in the face (to borrow from Grapes of Wrath).

So far, my only solution to this is to be a late adopter, and to favour simplicity over sophistication. I was recently considering going from paper notebooks to a tablet. That initiative stopped at the electronics store. The Surface Go wanted me to go through a setup wizard (after dismissing a few notifications). Two of the 4 iPads had working pencils. The ReMarkable reviews mention a host of issues. I never encountered any bugs with my Moleskine. It pairs flawlessly with any pencil I want, including older models.

[1] https://www.hanselman.com/blog/EverythingsBrokenAndNobodysUp...


Thank you for mentioning AirPods! I was so excited to buy them after a ton of reviews saying they just worked. They said the days of fighting with Bluetooth audio were gone. I believed them, then reality showed up.

My Mac sometimes unpairs them and, worse, it doesn’t find them at all. Sometimes while my baby is sleeping I put in my AirPods and play a loud video, only to realize they were not connected. My wife’s right-side AirPod just stopped working after one year of use...

If Apple is considered top tier in reliability, then technology in general really just sucks!


To provide an anecdotal counter-example, AirPods have worked seamlessly for me so far. Much quicker to connect and more reliable than Bluetooth. So their marketing isn't completely off-base :)


I was under the impression that AirPods work over Bluetooth; did Apple really invent their own wireless technology + protocol just for the AirPods?


They do (they can be used with Android devices); afaik only the pairing part is proprietary.


Shh nobody tell him AirPods are just bluetooth headphones :)

https://www.apple.com/airpods-pro/specs/


But they're not; they're one specific implementation of BT with some proprietary secret sauce for the multi-device pairing etc.


My manager had to buy new headphones: when WFH she used AirPods and an iPhone to connect to calls, and the AirPods constantly cycled, leaving garbage audio whenever she tried to speak.

I'm using a BT Jabra headset with noise cancelling that I got for about the same cost: 16+ hours of battery, easy pairing, a super useful phone app, great ANC, and solid audio quality, at least to a non-audiophile. My biggest complaint is that the closed-back design leaves my ears a bit irritated after 4+ hours of use. Not an AirPods competitor, but for the cost I am way happier.


I use my AirPods with multiple devices, and that process is also fraught with problems. Switching to another device takes an absurdly long time. About once per month, switching will make bluetoothaudiod peg a core and ultimately hard crash the entire computer. Yes, the kernel panics if it doesn't hear from this userspace process frequently enough.

Surprisingly, iCloud syncing works fine. If I pair my AirPods with one device, it always pairs with all of them.


I accidentally coughed whiskey through my nose when I read that.


They work well enough, but they don't work magically well.

The main issue is with the right pod not always turning on when I take it out of the case. The solution is to put it back in the case for 5 seconds and to try again.

The second most important issue is the airpods falling out of sync with each other. It seems like the signal from my Samsung S9 in my pocket is choppy. Looking left or right for too long will make the signal drop. Putting my hands in my pockets also will. If I put the phone in my backpack, it's okay.

This is still more pleasant than wired headphones, but it's far from a magical experience.


I really don't get it.

Personally, I hate ear buds and, as such, never bought any. Rather, I spent ~$20 on SoundBot bluetooth headphones starting some five years ago (long before AirPods, methinks) and haven't had problems with them at all.

I also have a seven year-old phone (HTC OneMax) running custom (unofficial/ported by a random hacker) Android[0], and it pretty much works.

Sure the battery life has degraded since 2014, but that's to be expected, no? I wish I could replace the battery (as I did with my 15+ year-old Panasonic cordless phones), but there really aren't too many mainstream mobile devices that allow that any more.

As for poor quality software/hardware, if you don't like it, vote with your feet and/or wallet.

If stuff doesn't work, why use it? Even more, if stuff doesn't work and you can't/won't fix it yourself, then don't use it.

Software devs and hardware manufacturers don't care about whiny blog posts or complaints on HN, they care about the bottom line. Impact the bottom line and you may have a chance at improvement.

Stuff that actually addresses the issue is useful. A great example is the lack of Android support after 4.4/KitKat on the HTC OneMax mentioned above, and its abandonment by Cyanogenmod/LineageOS in 2017: those impacted by this (myself included, although I'd never hacked on Android before and failed miserably -- thankfully someone else did not) took action to provide the latest Android on an old, unsupported, discontinued device.

If you're not taking positive action toward making things better (whether that's fixing the problems or voting with your feet/wallet), then you're not going to have any impact.

While whinging about it on your blog may be a way to relieve the stress you feel about whatever issue(s) you may have, it's not constructive or useful.

That is, unless your goal is to get lots of comments on HN where the Apple Fanbois sagely agree and lament that there's nothing to be done about it, because Apple is the pinnacle of tech, and since no one could possibly do anything better than Apple (or the apps that run on their gear), all technology therefore sucks.

And that's objectively false. There's lots of tech out there that's quite good. I suggest using that, and shunning the stuff that sucks rather than using it and then whinging about it.

[0] https://forum.xda-developers.com/htc-one-max/rom-lineageos-1...

Edit: Fixed typos/formatting issues.


> I was recently considering going from paper notebooks to a tablet. That initiative stopped at the electronics store.

Good, you dodged a bullet there.

I mean, I love my 2-in-1 Dell (a slightly cheaper but still high-end Surface-like device). The pen, as much as it's useful (I'm not even considering buying a touchscreen-enabled device without solid support of a pen anymore; it's so much better UX than fingers), still has lots of subtle and annoying bugs. Maybe in 20 years people will work out the kinks. More likely, the concept will be abandoned in favor of some new modality that will also never be perfected.


>Everything is broken, and nobody is upset.

Most software is still a net positive for productivity. As users, we tend to place more emphasis on failures.

Remember you're running millions of lines of code that talks to other computers running millions of lines of code that communicates over a network running millions of lines of code to deliver some information on the order of seconds to minutes -- and then something responds to that information and everything happens all over again.

All day, every day, trillions of packets of information get delivered just fine. Try doing that as a human, delivering letters. You probably won't even approach a million packets delivered in your lifetime. And people have the audacity to say, "oh my, some things didn't work, this is completely broken."

In only a single generation, we went from voice communicators to supercomputers in our pockets. The utility vastly, vastly, vastly overshadows the glitches that come with frenetic advancement. How long did it take humans to invent basic numbers?


Everyone loves to complain and take for granted what good they have!


I researched the ReMarkable 1 and 2. I ended up just getting a Rocketbook. It's a very simple concept: "paper" in the form of hard plastic. The pen it uses is the Pilot Frixion, which is an erasable pen. Hence you have a notebook to record notes for a while. If there's anything important I'll manually transfer it to my OneNote (I don't use Rocketbook's picture-taking app).

Most notes I take only need to exist for a few weeks and then I erase them... so transferring them to "long term storage" is rare.

I do have an iPad and note taking apps like Notability if I know something will need to go to "long term storage" but I find I use the Rocketbook more.


I was looking to replace the A6 sketchbook I carry everywhere, and the A5 notebook that always sits in front of my keyboard.

I thought it would be nice to access my notes when I don't have my notebook on me, and to have layers, zooming, undos etc. However, the more I look into it, the more absurd it seems.

I'm replacing a 15€ notebook and a 2€ mechanical pencil with a 400€ gadget that doesn't quite work. Why? So that I can spend my time organising notes in a digital space. Why? I don't really know.

It would be cool to have layers, zooming and an undo button. It would also be cool to have access to my notes even when I don't have my notebook. However, it would just be cool. It doesn't actually solve a serious problem.


The big draw for me for a Remarkable is being able to hand-sign and annotate PDFs without printing them out or using Adobe Reader's atrocious support for annotations. I'd love to be able to say that e-signing is the future but nobody really accepts it without question.

I'd also be replacing the piles and piles of legal pads I go through every year. Most of the time the notes are ephemeral except when I'm working across from someone in which case I really need them to exist digitally so I don't lose them.

I just wish I didn't have to wait 6 months for the second version.


> So far, my only solution to this is to be a late adopter, and to favour simplicity over sophistication. I was recently considering going from paper notebooks to a tablet. That initiative stopped at the electronics store. The Surface Go wanted me to go through a setup wizard (after dismissing a few notifications). Two of the 4 iPads had working pencils. The ReMarkable reviews mention a host of issues. I never encountered any bugs with my Moleskine. It pairs flawlessly with any pencil I want, including older models.

Not to mention that it:

- doesn't need charging

- never freezes or crashes

- is much cheaper than a laptop or tablet

- is distraction-free (no Internet, no apps, etc.)


All of those were on my list of cons, particularly the lack of distractions. I avoided the Surface completely because Windows is anything but quiet and maintenance-free.

The iPad seemed pretty solid, but I'd have to turn it on and unlock it to see my notes, unlike a notebook.

The Remarkable seemed nice, but there are lots of complaints that paper doesn't have.

The Supernote A6X was the most promising, but it was hard to get in Germany.


The problem is with the idea of "continuous delivery". Many people fail to understand that technological advances only increase productivity if the innovations, the great leaps forward, are relatively rare, with long periods of stability and refinement in between.

There's always an adjustment period, where people have to spend time learning a new technology, and any issues with the new technology need to be resolved. The gains in productivity happen mainly after the adjustment period. But we've eliminated the periods of stability and are constantly pushing for more "innovation", which means we're in constant periods of adjustment and resolving problems, where the promise of increased productivity is never fully met.

The worst idea ever in technology is regularly scheduled updates. Innovation has never and will never happen on a schedule. This is simply greed-driven, promotion-driven, pointy-haired-boss-driven development.

Produce something new and great... but then let us all enjoy the new thing for a while. Novelty for its own sake is not productive.


> The worst idea ever in technology is regularly scheduled updates. Innovation has never and will never happen on a schedule. This is simply greed-driven, promotion-driven, pointy-haired-boss-driven development.

this is sort of uncharitable. the development/maintenance cycle for software is incompatible with the traditional way of monetizing a product (ie, design it up front, manufacture at scale, and then the buyer gets what they get, barring severe safety defects). buyers of software expect the product to at least mostly work in the first place, but they also expect bugs to continue to be fixed after the sale, even if the bugs are introduced through unforeseeable interactions with other software.

imo, subscriptions are actually the ideal way of aligning incentives for products that involve ongoing maintenance. but buyers tend to consider this a ripoff if they don't actually see a stream of new features in development. while it introduces some unfortunate constraints in the dev cycle, bundling up features in a scheduled update is a good way to make it visible to users that their subscription dollars aren't just falling into a black hole. trickling out new features "when they're done" earns the respect of engineers, but results in the average user simply not noticing that progress is being made.


> buyers of software expect the product to at least mostly work in the first place, but they also expect bugs to continue to be fixed after the sale,

Contrast that to traditional physical goods, where buyers expect the product to work as advertised, right out of the box, or their money back. Software in the Internet era has it easy, because it gets to release shitty half-finished versions, and then keep charging money while never quite finishing the software.

> subscriptions are actually the ideal way of aligning incentives for products that involve ongoing maintenance. but buyers tend to consider this a ripoff if they don't actually see a stream of new features in development

Because software does not decay on its own (despite the misleading term "bitrot" being popular in tech circles). That's literally why digitizing data has taken the world by storm: digital data does not decay; as long as the physical medium is readable, you can make a perfect copy of the data it contains. As a buyer, I don't expect my software to need maintenance. I expect it to work out of the box (just like I expect every physical product to work out of the box), and once I find software that fulfills my needs, I expect it to work all the way until computing technology moves forward so much that it's no longer possible to run the software. Which, in the era of virtual machines, may take decades.

So yeah, there's a need to clearly justify to the customers why you're charging subscription, because software in its natural state does not need maintenance.


>As a buyer, I don't expect my software to need maintenance. I expect it to work out of the box (just like I expect every physical product to work out of the box)

Software is far more complex than most physical products. There are only so many failure modes for a screwdriver or a couch and they're all pretty foreseeable. The most complicated physical systems, like a car, house, or even a human body, do need maintenance.

I'm frequently frustrated with software bugs like everyone else (my building and apartment have this awful smartlock system that's riddled with bugs and which bricked me out of my own home due to a bad app update a few weeks ago--shoutout to Latch), but I'm not sure I'm on board with an anti-maintenance attitude. If there are bugs, I'd like them to be fixed!


> I'm not sure I'm on aboard with an anti-maintenance attitude. If there are bugs I'd like them to be fixed!

Me too! I'm not trying to be anti-maintenance (though I do wish technology would develop towards requiring less and less maintenance, but that's another topic). I'm pro-quality. The impression I'm getting is that the maintenance burden on software is being created in order to justify the subscription model - and that the ability to do post-release updates has made vendors and devs no longer care about delivering reliable, quality software (customers become the new QA, and bugfixes can always be added later, except they tend to be deprioritized in favor of new features).

Note I'm not postulating a conspiracy theory, just a spontaneous coordination of the entire industry due to market incentives. But the effect is still there, and I feel it needs to be countered.


> Because software does not decay on its own (despite the misleading term "bitrot" being popular in tech circles).

in theory yes, in practice no. I work on a product that targets windows and macos. on windows, yeah, a version of our software from 2015 probably works as well as it did the day it was released. apple deprecates stuff in their API every year that we have to go back and update. they also break a lot of stuff that isn't formally deprecated and we have to find workarounds for that too. "our software will work forever as long as you never update your OS" is not acceptable to most customers.


Sure, but until quite recently, OSes were not updated at such a breakneck pace. At this point we're hitting what can be seen as an industry-wide, self-justifying scam: software needs subscriptions because it's being continuously updated; it's being continuously updated in big part because every other piece of software is being continuously updated.

Still, as a Linux and Windows user, I've absolutely grown to expect my desktop software to work 10 years or more without updates. After that, I can always spin up a VM with an older Windows version.


I find that what causes SW to break most often is changes to the OS, either other app updates or OS updates.

I work at a hardware company, and for any important function I usually set up a dedicated computer, install the software, and then never touch it again.

This is how the more sophisticated oscilloscopes etc. work. They often have Windows XP installed if you buy them used. Simple, doesn't break, no internet. If it's mission-critical or expensive, it's worth a dedicated and frozen computer.


> they also expect bugs to continue to be fixed after the sale

I wasn't disagreeing with that. I mentioned "long periods of stability and refinement" — refinement including bug fixes — and "any issues with the new technology need to be resolved". But again, bug fixes don't magically happen on a schedule either. Maybe fixes are easy, maybe they're hard, you never know in advance.

> bundling up features in a scheduled update is a good way to make it visible to users that their subscription dollars aren't just falling into a black hole

This is exactly why it's not true that "subscriptions are actually the ideal way of aligning incentives for products that involve ongoing maintenance". Instead of maintenance, subscriptions incentivize continuous delivery of new features, and consequently continuous delivery of new bugs.


I suspect this is a reversal of cause and effect.

Did consumers demand subscription services? Or did vendors (led by Adobe) decide to change to subscriptions to get uniform cash flow?

At the agencies I have worked at, all the creatives I worked with would prefer to spend $200-400 and have a permanent software license. Perhaps this isn't a representative group.


I'm not saying consumers demanded subscriptions. vendors push them because it's a saner way of managing revenue for products that require maintenance post-sale anyway.

that said, I think consumers would prefer subscriptions if they understood how it aligned incentives. one way or another, a product will stop receiving support when the money stops flowing in. with a permanent license, it ends when people stop buying licenses. with subscriptions, it continues as long as enough people keep paying.


> I think consumers would prefer subscriptions if they understood how it aligned incentives.

Funny how consumers tend to despise them though.

The term "subscription" is itself typically a euphemism for "rental". There are a small number of companies who offer a year of updates that you get to keep forever (which makes consumers play the game of when exactly to buy to maximize the new features in that year), but most so-called subscriptions disable the software entirely if you stop paying. In other words, rental.

Long-term rental is almost always a bad deal for consumers. One of the few exceptions is housing, because many consumers can't afford to buy a house, and also houses are one of the least liquid assets you can own if you have to move (it and yourself). Otherwise, rental is going to cost you a lot more in the long run.

Financially, rental can work well for the seller, of course, but we end up with "subscription fatigue", where the market can't sustain as many sellers, and the few rich companies get richer (which is exactly why they were "pioneered" by BigCos such as Adobe, Microsoft, and Apple).


> Funny how consumers tend to despise them though.

sure, and as an individual I behave the same way. I always want to solve my problem in the cheapest possible way. still, I can't help but notice that products with stable ongoing revenue tend to get much better support.

I think the clearest example is with games. most games get released with a pile of bugs. a bunch get fixed in a release day patch and then there are a few more patches over the next few months (when most of the sales happen). once the initial wave of sales subsides, you tend to be stuck with whatever bugs remain. cs:source had several game breaking bugs for years (defuse kit over bomb blocking defuse, defusing through walls, etc.) despite being one of the most popular FPS titles of its time. AFAIK, most of these still exist fifteen years later. csgo, which is monetized through microtransactions, gets bugs fixed almost as fast as they can be posted to reddit/youtube. microtransactions aren't quite the same as subscriptions, of course, but they generate revenue proportional to the current userbase, rather than the rate that people buy the game for the first time (which will inevitably dry up).


> that said, I think consumers would prefer subscriptions if they understood how it aligned incentives.

Actually, I think subscriptions misalign incentives. With subscriptions, it becomes important for the vendors to keep releasing updates (so that the customers feel like they're getting value out of the subscription), which means having bug-free software is a terrible idea. You'd need to either release intentionally buggy software (so you can ship a follow-up version to fix it) or go on a feature treadmill (in which case trying to stabilize has rapidly diminishing returns and high opportunity cost).

As a consumer, software that was developed knowing it would never be fixed and had to be right the first try is much better (even if it still has bugs). Mario 64 had bugs (e.g. the backwards jump going really fast), but the bugs weren't really noticeable in normal gameplay, because they couldn't just ship an update most of the size of the whole game before you start to play.


No, subscriptions introduce the perverse incentive to release unfinished products and slowly drip-feed fixes. Having to test extensively before a one-and-only release may not drive nearly as much revenue, nor provide a running deliverable stream for a given engineer/product manager's CV, but it is clearly the better user experience.


you say this, but given a choice between a "finished" product and a competitor with more bugs and more wanted/needed features, users will almost always pick the latter unless it catastrophically affects their workflow. engineers care about quality; customers care about the fastest way of solving their problem.


> engineers care about quality; customers care about the fastest way of solving their problem

Almost all customers care about quality. The problem is that many customers have only very limited information about products, so they have a hard time judging quality vs. competitors before (or even after) purchase.

This reveals a general problem with the market: it doesn't select for quality. Otherwise we wouldn't be having this conversation. The market is really good at producing cheap crap. So the truth is, yes, engineers have to care about quality. The motivation for quality has to come from pride in your own work, not from outside market forces. If you care about quality, then you have to strive for that over quantity, and also charge sustainable prices instead of trying to lowball. You may not be the market leader, but there are many profitable niches. Some customers are definitely willing to pay for quality.


> Almost all customers care about quality.

yes, but not to the exclusion of features. I work on a B2B product where our customers bill their customers by the hour, so they tend to have a pretty good idea of how much time a feature saves them. if a competitor adds a feature that cuts the time needed for a project in half (not unrealistic) but crashes and forces them to start over a quarter of the time, the customers will still buy their product instead of ours. they'll complain incessantly on the competitor's forums about the crashes and threaten to switch back to our product, but they won't actually do it unless we come up with something new that saves them even more time.

customers care about saving time and/or money; they only care about quality to the extent that it furthers that fundamental goal. if there is some bug-ridden alternative that solves their problem faster, they will do their best to find it and purchase it.

edit: to be clear, I mean "reliability" when I say "quality"; there are many other "qualities" a product can have, one of them being "cheap".


There's a big difference between not caring and making a tradeoff. You can care about A and B but decide B is more important than A. But if you don't care about A, then there's no tradeoff, you just choose B no matter what.

The question is, why do we make consumers make that tradeoff? Why are we shipping junk at all? There shouldn't be a reliability tradeoff. All products should be reliable. It ought to be a bare minimum standard.


> There's a big difference between not caring and making a tradeoff. You can care about A and B but decide B is more important than A. But if you don't care about A, then there's no tradeoff, you just choose B no matter what.

fair enough, I probably overstated my point with some of the wording.

> The question is, why do we make consumers make that tradeoff? Why are we shipping junk at all? There shouldn't be a reliability tradeoff. All products should be reliable. It ought to be a bare minimum standard.

everything in life is a tradeoff. we could make software more reliable, but then we would have to spend less time adding new features, or we would have to hire more/better engineers and charge more. maybe we even get to pay down tech debt and gain the ability to add features faster in the long run. doesn't really matter if someone else dumps a bunch of buggy new features in the meantime, converts our customers, and forces us out of business. in the absence of some industry-wide gentleman's agreement or regulation, we have to observe the behavior of customers and do what their behavior (not words!) indicates they want.


> doesn't really matter if someone else dumps a bunch of buggy new features in the meantime, converts our customers, and forces us out of business

This is always presented as the doomsday scenario, but how often does it actually happen?

The story of Apple in the Tim Cook era is unrelenting annual releases, more and more "subscriptions", massive return of cash to AAPL shareholders, but decreasing product quality. Did Apple make that tradeoff because they were scared of going out of business? No, they were doing very well before Cook took over. Cook simply has lower standards than Jobs, there's no other reason. He's been great for investors, not so great for customers.


You state that "unless" as if it weren't generally the common state of "competitor with more bugs."

No, users pick a finished product.


Developers get paid every couple of weeks, so an agile business tries to track the value gained from that expense on a similar cadence. The process has become more about corporate governance than delivering customer value.


> Innovation has never and will never happen on a schedule. This is simply greed-driven, promotion-driven, pointy-haired-boss-driven development.

For consumer goods, it is the hype cycle.

New updates and releases get press. They also restore consumer confidence.

If Samsung announced that next year they weren't releasing a new Galaxy phone, the entire industry would freak out. Consumers would lose confidence in buying Samsung phones, journalists would write articles questioning if Samsung was pulling out of the market, a lot of bad things would happen.

Give it 18 months without a release and people would start to think of Samsung as "that company that used to make phones."

They would have to fight like heck to restore their image.

Software is the same way. In the Vista/7/8 era, Microsoft looked like they were falling behind because their competitors started releasing yearly, or even twice-yearly, feature updates.

Sure, every Android version up until 7 was kinda-sorta-terrible, but it kept Android in the news. Likewise, Apple got huge free press every time they announced a new revision of OS X (now macOS), and every time they came out and announced a new version of iOS.

The result? "The desktop is dying, phones are where the real innovation is at!" articles being published even faster than those software updates came out.

You can of course release too fast, rarely do Chrome or Firefox's releases get any press (unless there is a controversial UI change), but in general frequent updates are free advertising.

Tesla is also great at this, I'm nowhere near being in the market for a Tesla, but at least a couple times a year I still end up hearing about software features they are rolling out!


> If Samsung announced that next year they weren't releasing a new Galaxy phone, the entire industry would freak out.

The smartphone industry has only themselves to blame for setting up this expectation. But it is possible to get off the train. I remember when Apple announced they were dropping out of the annual MacWorld San Francisco conference, because they didn't want to constrain their product release cycle. Apple survived that just fine.

> In the Vista/7/8 era, Microsoft looked like they were falling behind because their competitors started releasing yearly, or even bi-yearly, feature updates.

Competitors? Windows and Mac had near 100% market share on desktop. There was only 1 competitor. Vista was released in January 2007, but Mac OS X 10.5 Leopard was infamously delayed until October 2007 because of iPhone, so this telling of history doesn't seem entirely accurate. Moreover, Mac OS X releases were already slowing. Here's a list of months since the previous major .0 release:

    10.1.0   6
    10.2.0  11
    10.3.0  14
    10.4.0  18
    10.5.0  18
    10.6.0  22
    10.7.0  23
    10.8.0  12
    10.9.0  15
    10.10.0 12
    10.11.0 11
    10.12.0 12
    10.13.0 12
    10.14.0 12
    10.15.0 12

Thus, major Mac OS releases were slowing down year by year — which is totally sensible — but then Steve Jobs died after 10.7 was released in 2011, and only then did they switch to a yearly schedule.


The competitor in the Windows 7 era wasn't more PCs, it was tablet mania. Fear of tablets replacing PCs led to Windows 8.

Of course that didn't come to pass, but everyone acted like it was the future and the market responded accordingly.


This criticism is based on the assumption that all updates are created equal. But they aren't.

One week, the update could be a relatively minor bug fix. The next week, a major feature upgrade that's been in the pipeline for months.

You also remove the ambiguity of "Is this worth pushing out? When should I push this out? Should I do some more fixes or push this one out first?". You got fixes, push them out in the next update.

Your criticism also assumes a small team. If you have a large enough team where you can split them into new feature development and current bug fixing, they're going to work at different rates and be ready at different times. If instead your entire team just works on "the product", then there is no effective difference between fixing issues and creating functionality.


I feel like the slow decline of software quality has been in lockstep with the gradual transition from (expensive and non-measurable) manual software/hardware testing and QA to automated frameworks and rollout-based quality assurance.

I constantly encounter broken functionality, buggy or unpleasant UIs, just as the author has. It feels like many of these problems could be avoided if you just had one person whose job it was to sit there and look for broken stuff. (I'm sure I'm biased as someone whose first job out of college was to sit there and look for broken stuff.)


I would tell a slightly different version of this story, focusing in on "rollout-based quality assurance".

I would say that effortless, automatic updates are to blame.

When you can always just push an update, the impact of a given bug goes way down. It's no longer mission-critical to exterminate flaws before shipping; a totally broken feature becomes a mere annoyance. So project prioritization shifts from polishing an artifact to outweighing the (presumed inevitable) constant stream of little annoyances with fixes and features. I think the shift towards automated testing is just a symptom; an attempt to bridge the gap in this brave new world.

For a clear-cut example of this phenomenon, look to the video game industry. Until around 2007, games received no updates. Ever. Once a game shipped, it was shipped. There wasn't even a mechanism for installing an update from physical media.

Right around that time, "glitches" went from very rare unicorns that people would spend lots of time actually seeking out, to nearly everyday occurrences. As long as it doesn't corrupt someone's save file, they mostly laugh it off and upload a clip to YouTube to show their friends. This is just how things are now.

(Edit: I should have scoped this to "console games")


> Until around 2007, games received no updates. Ever. Once a game shipped, it was shipped. There wasn't even a mechanism for installing an update from physical media.

Sure, but they still (sometimes) released (a few) extra revisions of a game. They were just targeted at people who bought physical copies after the revision date, rather than at existing customers.

Or said updates came on the 1.0 version of the game as shipped in markets that got the game later than others. (Just imagine — per-market release versioning. Every market effectively got its own fork of the codebase!)

Or said updates came in the form of a re-release port. There are patches made to the emulated downloadable app-store re-releases of some games, that never made it into any physical edition of the game.

Also, before home consoles, arcade game machines did receive bug-fix updates regularly. Arcade machines were essentially provided “as a service” from their manufacturers, with support contracts et al. Sort of like vending machines are today. If you reported a bug, they’d fix it and send you a new EEPROM chip to swap out in your cabinet. If there was a critical bug that affected all units, they’d send techs out to everybody’s machines to swap out the ROM for the newest revision. (For this reason, it’s actually kind of hard to do art-conservation / archiving of arcade games. The cabinets almost never have fully “original-release” components inside.)


I'm sure this happened occasionally, but it wasn't advertised. Nobody was buying a new copy to get an update. Reviewers weren't revising their reviews accordingly (something which does actually happen now). It was still absolutely mission-critical to get things as polished as possible the first time around.


I think you've both got a piece of it. I've programmed PC software, embedded software, and mobile software, and my gut feeling (without data) is that the shipped software quality is inversely proportional to the update frequency and ease of updates. Had nothing to do with how smart or skilled the developers and testers were. Had nothing to do with management's priorities. Update frequency and ease changed immensely once we could feasibly deliver patches over the Internet. Before easy updates, you'd actually quality check every corner of the application, you'd actually fix those P2s and P3s. You'd do exploratory testing off the test plan rails to find things. There was even a concept of "done" in software, as in, you eventually stop constantly jamming features in and tweaking the UI in maddening ways.

Now, it's just "LOL just ship it, users will just deal with it until the next release!" Now, it's "Do experiments on N% in prod and use end users for A/B testing. If something's broken we'll update!"

In several industries, it's actually totally expected that v1.0 of the application simply won't work at all. It's more important for these companies to ship non-working software than to miss the deadline and ship something that works! Because who cares? Users will bear the cost and update.


I agree that games used to have far fewer bugs when shipped, but it's not true that games never received updates back in the day. I distinctly remember queuing on file sharing sites as a kid in the early 2000s to download half-life updates and updates for other games.


I suppose my statements above should be mostly limited to games on consoles, not PC. That's what I had in mind.


I remember when DLC used to be called “a patch” and it was both bug fixes and also huge amounts of new content. I wish I could remember what games this pertained to.


absolutely this. Back in the early 2000s quality assurance was insane. A release I was working on (a AAA title from a major studio) was blocked because players' eyes were rendered incorrectly and you had to really zoom in to even see the glitch. And of course you had to find and fix all bugs before October, or else you wouldn't be able to hit Christmas sales.

Once internet updates became the norm, it all became pretty much like the rest of the software industry. (At least game companies still have QA departments; a lot of mainstream web companies have dispensed with those as well.)


Or, as I tell my wife, I am playing my favorite game: 'updating PlayStation'. I turn the thing on so rarely that by the time I do there are 1-2 GB of updates waiting. Glad I have a semi-decent internet connection these days...


I have witnessed this first hand comparing two systems.

One system has no patching, and updates incur some non-trivial amount of effort on the part of the installer. Releases are a few times a year, at most.

The other system has patching, updates are lighter weight, and as a result, the system has THOUSANDS of patches released over the last decade, north of 2 per work day.

Guess which system is higher quality? The former.

Much, much higher quality.


> Until around 2007, games received no updates.

Warcraft II, from 1995, received multiple patches. So did many other games from that era.

Do you perhaps mean console games when you say "games"?


Um, how long have you been in the industry?

Software has always sucked and had these issues. It has nothing to do with automated QA. The reason you see more issues is that 'way back in the day' your software did a very limited number of things, and in general it did not involve accessing a network or chugging down massive volumes of data from untrusted sources.

I work for a company that has a lot of individuals who test for QA issues; they have lists miles long of things to check and write reports on.

The problem is more of "It's much easier to write mountains of code than it is to ensure that it works in all cases"


I worked for almost 15 years in embedded and have 25YOE, and no, software has not always sucked as much as it does now.

I agree with your last point a lot, though I would modify it slightly: it's much easier to write mountains of code now than it was, and it's now more common and much easier to import external dependencies (especially at the system level) than ever, and those dependencies have tens of millions of lines of mediocre code all by themselves.


The one thing that has changed is that updates can be delivered easily. This takes some of the pressure off in terms of QA, because rolling out a fix to a centralised service delivered through the browser is quick and painless. The cost of pressing millions of CDs kept developers in check in the past.


That's really not the only thing that changed.

Games are also massively more complicated today. It's one thing for three people to get a 2.5 MB single-player DOS game reasonably bug-free. It's an entirely different thing to do the same for a 5 GB game made by a team of 100 or 1,000 people.


Most of the size difference is assets anyway, right? Games today use off-the-shelf engines. It should be a lot easier to make robust games when most of the technically hard parts are already done.


And, as you’ve alluded to, the scope of what software does for us on a daily basis has expanded by several orders of magnitude. Not only have a number of devices that used to be purely electro-mechanical been reworked to use microcontrollers (cars, everything in the kitchen), but the scope of activities that have migrated onto the web or our phones is truly massive.

30 years ago, software bugs might interfere with you professionally, but they wouldn’t stop your ability to get money from the bank, cook food, or do any other day to day tasks.


> Not only have a number of devices that used to be purely electro-mechanical been reworked to use microcontrollers (cars, everything in the kitchen),

Yup, and in most cases this not only did not improve them, but made them less useful and more fragile. Let's be honest: the software is there only because it can save on manufacturing costs, and sometimes can be used for extra marketing benefit. No attention is being given to providing value to the customer.


Depends on what you’re talking about.

Car engines are vastly improved in reliability, cleanliness, and efficiency by the introduction of computers into them. You might not like that when it goes wrong, but we all appreciate not breathing in pre-computerized car engine exhaust.

And that’s the rub. While shoddily written software shoehorned into cheap consumer goods obviously degrades the experience, there are tons of places where well written software has massively improved the quality of the goods that they’re added into. Objectively car engines are just better for the addition of software both in design and in operation. They’re smaller, more powerful, more reliable, cleaner burning, and more efficient than they were before we computerized them.


Yes, but at the same time the auto makers let another team make Electron apps for the dashboard. Engine ECUs are nice and decoupled. Maybe the hard real-time requirements are what save them and keep all the novel crap out?


And that’s why this is hard. There are cases where software drastically improves the objective quality of things (car engines), cases where well done software makes items significantly more enjoyable (some car infotainment), and cases where software makes things worse (everything in the kitchen). Separating them out is hard.


I have to disagree in the case of car engines. When I was a kid my dad and uncles spent hours each month fixing minor problems with their purely electro-mechanical cars. I don't miss carburetors or mechanical ignition timing one bit. Electronically controlled engine functions are much better both in terms of efficiency and consistency.


That's true. I don't have that much experience with car repair, so I might be wrong in perceiving the 90s and 2000s models as the optimum in terms of car reliability - the stuff mostly works without funky issues, parts are cheap, repairs can be made by anyone who has spent some time with a wrench, and you don't have to visit a shop with a license for poking around the car's computer over every minor issue. That is to say: advances in computer control don't have to go hand in hand with making cars expensive to service and not end-user repairable. But they do, because greed.


> Software has always sucked and had these issues.

I disagree, subjectively it had its ups and downs and we are in a down phase right now. YMMV.


Software has never been better; there are just more users to please.

And despite the issues lamented in this think piece, it is _Apple_ that the author should blame for setting technology expectations impossibly high.

No organization has been remotely as successful in understanding and releasing tech products that were truly great.

Everything since is just a comparison to expectations Apple set. Even when Apple fails, it is in comparison with an Apple that does not.


You also need management willing to prioritize engineering time to fixing that broken stuff, and engineers who actually know how to make non-broken stuff. My experience is that having all three of these prerequisites is pretty rare.


I'm going to restate something that I said a few days ago:

There are a number of hats developers are expected to wear today:

1. Developer of new features

2. Sustainer of prior code and features

3. Tester of all of this

4. Constant student (outside work because who'd pay their employees to learn?)

The priority for the business is (1), so 2-4 get neglected. This compounds over time to mean that old code isn't properly refactored or rewritten when it should be, and none of the code is tested as thoroughly as it should be, and none but the smartest or most dedicated are really going to be perpetual students (or they'll choose to study things that interest them but don't help at work, like me).

When the old code and poor tests create sufficient problems, you get a business failure or a total rewrite. Which strips out half (or more) of the features and the whole process gets restarted.


Don't forget:

5. The mentor of younger developers.

It's another thing some companies expect you to do, but don't allocate time for it, so at the point you finally know enough to not do a bad job, you're suddenly being pulled out of 1-4 and expected to do 5.


Oof, we had a guy retire over that. In his appraisal he was dinged for not doing enough. They counted his training of new hires as 1 "point" (or whatever, the system was weird). They neglected to consider that he was training 5+ new hires in that year. He'd had enough, put in one last year to wrap some stuff up and then he was out and free.


And also lots of devops likely these days:

- Infrastructure design

- Deployment

- Monitoring / log-based debugging and fault tracing / handling customer issues

In addition to wanting a "full stack" developer of course...


Right. It's the curse of the name. DevOps wasn't meant to be a role, but a philosophy of working and organization. But management (and practitioners) latched onto the idea of it as a role and hosed themselves. Now you've got a 20+ year veteran developer tasked with keeping dozens of servers properly configured, secured, and operating as well.

At some point we have to accept that specialization isn't just for insects. It's helpful to have a proper sysadmin or DBA or whatever (appropriate to your domain) within the team, and not just diffuse those roles amongst the developers themselves.


As someone who has been programming since 1984, software quality has greatly improved not declined.

The level of complexity in modern day software is orders of magnitude greater than that of even a decade ago.

What has changed is our reliance on that software. We are now so deeply embedded into our software existence we see these flaws up close.


Great comment. People also used to use only a few pieces of software in any given day (or week, even!) e.g. email, web browser, and word processor.

Now, our computer ("phone") is with us everywhere we go and we'll use dozens of complex applications per day, connected by dozens of APIs, networks, protocols, and hardware features. It's a miracle any of it works sometimes! Thank you to everyone for making this stuff seem like magic; my twelve-year-old self would be amazed at how well it works.

I do feel like there are more UI bugs as we optimize for certain metrics over others. Timing updates has become far more complicated, so we get weird UI refreshes as new data comes in, stale caches, missed notifications, etc. Turning it off and on again often works, surprisingly (probably because devs start with clean environments often, so that's the functional baseline).

Lastly, it is probably far more lucrative for a technology based business to use their most valuable minds for the Next Thing, rather than iterating on the current thing. Incremental revenue improvements just don't cut it in a capital-driven world; everyone is trying to escape the local maxima to find billion/trillion dollar businesses.


Yeah, but then you'd have to raise the price to pay for the testing and the jokers a block down the street who YOLO'd their competing product into the marketplace without testing would grab all your sales. You'd be out of business and your competitors would be laughing their way to the bank while the customers still suffered constantly from a broken product.


If it were only for novel products, this would be a reasonable argument. See Netscape's rush to release their browser as an example of what you're talking about.

But the worst part is that this is an issue with established products that have secured their market, and will even receive money every year from their customers. They have both the money and the time to pace themselves and test things properly, but they don't.


> They have both the money and the time to pace themselves and test things properly, but they don't.

But not the motivation, because "it works" and improving UX would cost money and doesn't have an immediately visible return.


Can you name a few of those who really truly have time but don't take their time? Also it seems there's always an opportunity cost. So time is always "of the essence".


Right, you could be spending those dollars running ads, which the market is more likely to respond to :/


Microsoft, Apple, Google.

An MS bug in an enterprise piece of software (so they get paid every year for it) is Skype for Business. Perhaps my org hasn't updated their installation, but if I drag a contact from the chat window (say you message me and aren't in my contacts yet) and then drag it to the contact list (in status view) it will reliably crash the program. If I drag it to groups view it'll place the contact in a group. My guess is that dragging it to status there is no "default" group and so there's some kind of null pointer exception occurring (it tries to add it but with group given as either junk or null).

I used to see more bugs in Outlook, but it seems somewhat better with the last update so I haven't noticed them. Though it was fun when I had negative 2 billion messages for a week or so (I certainly didn't have enough to cause overflow so I have no idea how it wrapped around like that).

Google's is an issue of usability of their webapps (IME), not strictly buggy but not sufficiently tested. Behind the proxy at work Maps is incredibly unreliable. It takes several reloads for it to actually start working "correctly", but don't change where you're looking too much (you can zoom in, but do not pan around). That's not the only unreliable one in this situation, but it is the most easily demonstrated.

Apple's Messages and Mail constantly tell me (they've been better the last few months, but it still happens) that I have unread messages; I'd search and search and never find them. Then I'd pull it up on a different device and finally see the unread message (which was both visible and marked as read on the original device).

Some of these may be shallow or seem petty, but it's an unpleasant experience that after so many years and with so much money should've been resolved for each of them. I'm willing to tolerate an indie game crashing on me. I'm not willing to tolerate an enterprise software solution (MS) crashing for a natural user behavior.

EDIT: I think resolved, but Apple's iOS calculator bugs were annoying for such a simple program. Not strictly a bug, but Windows' calculator, these days, is an unusable mess in many ways. It shouldn't require so many system resources to add some numbers together (the same could be said of many small utilities that were rewritten for, I think, Windows 10 or Windows 8).

Gmail showing me email meant for <first><last>@gmail.com instead of <first>.<last>@gmail.com. The fact that they ignore the . in user names at all. The YouTube app on iOS, which I installed and quickly uninstalled years ago, was an incredible annoyance: it wouldn't reliably show me the video I'd actually clicked on, the one that caused the app to open in the first place.

Adobe is an almost perfect example of holding their market captive. I've run into a number of Adobe Acrobat issues over the years, though fewer recently (but I use it less often now). Acrobat Reader on Mac OS X especially was awful; I actually once had to reinstall the OS to get it to stop fucking up PDF display even after uninstalling the software (I never found out what it had done to the system, and gave up). I needed it because I couldn't find anything else (at the time) that supported digital signatures in PDFs on the Mac in the sense that it actually worked; I think Preview let me do it, but what it made wasn't usable by the people receiving the file.

EDIT2: Another Google one, with Chrome on macOS in full screen. Hiding the location bar means you straight up can't get to it. You have to re-enable it, versus a sane behavior like auto-hiding, where moving the cursor to the top restores it.


Skype for Business is just Lync [used to be Office Communicator, maybe?] renamed/reskinned, right?

I think that product is cursed. It was always a dumpster fire, like IBM's Lotus Notes. It's there so their clients don't accidentally try Slack or any of the sane alternatives.

All products of MS, Apple, Google are in a constant churn mode. New design. New integrations with whatever platform changes happened in the back, new features to match the competitors, new trends, new mobile apps, new browser features, new framework.

Those are the products that are not the real products and not real cash cows, so they get very limited attention.

I agree all of these are horrible. GMail is still a slow piece of shit. I recently tried Thunderbird, and .. it slowed down too. Wtf. Slack is slow too. Typing has become slow for some reason in a lot of "apps", maybe too fancy fonts?

Anyway, these companies have huge opportunity costs. Just look at Google. They try whatever crosses their mind and nothing is good enough compared to "ads". And so they shut things down because of opportunity costs. (Not because of upkeep, but because then your attention is not on the next big thing, whatever that will be.)


> I agree all of these are horrible. GMail is still a slow piece of shit. I recently tried Thunderbird, and .. it slowed down too. Wtf. Slack is slow too. Typing has become slow for some reason in a lot of "apps", maybe too fancy fonts?

Possibly because more companies are moving desktop apps to Electron so they can run JS everywhere.


So more telemetry is no guarantee of bugs getting fixed.


I agree with this sentiment in general. Of course, we have a ton more features than we used to have, but I think given the newness of software in years past, we were OK with bugs and issues because of the novelty of it all.

Now we are in 2020 and iPhone updates STILL cause battery issues. The iPhone has been out for 13 years...

Our expectations have changed. Tools like Excel should just work - and yet, when I try to save a file, sometimes it freezes and crashes. How is that acceptable now?


This, but also remember that we're living through a transition from SaaP to SaaS, where the "service" is actually data extraction for advertisers. For that, software must be minimally useful to a user, but the true aim of UX is not user satisfaction but data extraction or subscription lock-in.


I work somewhere that primarily relies on manual testing rather than automated testing. It definitely does not make the software more reliable here, at the least. :-)


There're three kinds of manual testing:

1. Manual testing that should be manual (exploratory).

2. Manual tests that are new and haven't been automated yet (but will be).

3. Manual tests that should be automated.

(3) is the one many people see and suffer through (I know I have). They need to be automated to free up time for (1), which is where many issues are actually discovered. But if (3) dominates your time, you can never get to (1) and you'll constantly ship broken things (or more broken than they should be).


This is very insightful, thanks!


I am a life long Linux user and programmer well accustomed to searching for support and fixing issues with software.

Recently I got an RSI from programming and could not use a keyboard for 10 months. I could only use my Android phone. There were so many problems and I had so little insight. I _did_ search for answers and found contradictory or incorrect instructions. Often I had to resort to uninstalling apps and taking my chances on their competitors. This strategy had a very low rate of success. Many things I just gave up on. I essentially lost 10 months of productivity and became despondent.

One specific problem I had was that I was unable to control which PDF viewer would open automatically when I downloaded a PDF. For not particularly interesting reasons regarding my workflow, it was important to me that Adobe open instead of the viewer built in to the file navigation app. I followed the instructions online to rectify this. However when I navigated the menu to the appropriate place, the option simply didn't exist. I don't fully understand, I'm not and will never be an Android dev, but I believe it had something to do with the built in viewer not being a first class app but some sort of "applet".

Deep in this frustration it clicked with me; this is how most people see computers. Black boxes that work against you. Unfathomable and unserviceable.

PSA: Stretch your hands. Take care with the ergonomics of your keyboard, mouse, and desk. If typing hurts - stop.


Hope you're feeling better now!


Much better thanks!

Probably should've mentioned that!


Advancements in tech are nowadays driven only by profit, which in turn is driven by sales, which prioritise what people are likely to pay for. People expect technology to be affordable and they really don't care what the effect of this is. Put competition in the mix and you get a lot of stuff that sucks, because the field of battle is not quality, it's pricing and marketing.

And then stuff dies after a short time. And/or is just left forgotten because it's more broken/an annoyance than it is useful. Or, and this is key, you are stuck with it because for a mix of reasons there is no viable alternative.

One would expect that in the long run the ultimate victims of this (the users of technology) would realise that, rather than the ten thousand loads of tech crap they buy, they would be much better off with a bunch of stuff that actually works without annoyances, for a better quality of life. But people just don't. Because again, pricing and marketing.

It used to be _create value and make a profit from it_, now it's the other way around: _aim for profit and consider creating value as a cost of doing so_ (and therefore should be minimised, who gives a frack if the user is annoyed or the thing is broken as long as there is profit).

People have even been trained for decades by major products in the industry to accept that tech doesn't work/is broken/sucks. The good stuff we have exists because at some point, for some small window, quality became a competitive advantage; then other cycles of craploads piled on top of that.

We did this.


Expecting that billions of people will simultaneously act against their incentives is madness. You can't assign collective fault to "we" who did this by making bad choices individually. It's a system that has optimized itself to this end goal.

I feel pretty good about my choices in comparison. I use an older, stock Android device with simple wired headphones, so that particular issue with AirPods has never affected me. My laptop is a bulletproof old Thinkpad running Ubuntu LTS, which is likewise pretty reliable. Software that I have to use for work but that I expect to be crappy goes in a VM. I use this crappy software to the best of my ability to create other software and hardware that I expect to work reliably, and to do so for decades.

But none of those signals and "good choices" ever feed back into better choices by the titans of our industry. When an iPhone or AirPods device uses up all the recharges available in its battery, that shows up as profit in the numbers Apple uses to make decisions.

Don't blame the victims for the situation they're prey to.


I don't see your point; it's obvious that among billions _some_ made better choices, but collectively it's still our choices that led to this. And I think it's pointless to blame "titans of the industry" while almost literally everyone else buys into the same pattern too. Or do you mean to tell me that there is a _significant_ number of companies in the tech industry that ship with a "quality first" mindset? I do not mean to be sarcastic; if that's your point, please make a list, I would really appreciate that.

And yes, I blame the victims for their/our short-sightedness, it's not like anyone forced people to buy and re-buy broken products for decades.


> When an iPhone or AirPods device uses up all the recharges available in it's battery, that shows up as profit in the numbers Apple uses to make decisions.

You make it sound like only the batteries in Apple devices have finite charging cycles, which is obviously not true.


Although I agree quite wholeheartedly with your comment and the article, I still feel it's possible to find good or at least better software out there. I recently switched to Android because it is far less buggy than iOS. I honestly think iOS is the buggiest software I've ever used. I also similarly switched from Spotify to Amazon Music Unlimited for the same reasons. I have yet to find even a single bug in AMU. It's honestly quite impressive in today's software world.

I really hope more companies start to see quality and reliability as a competitive advantage. I know I'm an outlier, but I gladly pay more or switch when I find a product that is more reliable.


It’s interesting, there are a lot of little bugs in software today, but I think it’s easy to forget that common bugs used to be a lot worse.

In the 90s it was pretty common to have a kernel panic and hard crash, blue screen of death, etc. Data loss was common.

Folks who lived through that have such a strong compulsion to hit CMD-S every few minutes, that a lot of cloud software today offers a “save button” that actually doesn’t do anything because the document auto saves as you type anyway.
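For what it's worth, that "auto saves as you type" behavior is usually just a debounce wrapped around a save call. A minimal TypeScript sketch (saveDraft, api.save, and docId are illustrative names, not any real product's API):

    // Debounced auto-save: persist only after the user pauses typing for delayMs.
    function makeAutoSaver(saveDraft: (text: string) => Promise<void>, delayMs = 1000) {
      let timer: ReturnType<typeof setTimeout> | undefined;
      return (text: string) => {
        if (timer !== undefined) clearTimeout(timer);
        timer = setTimeout(() => { void saveDraft(text); }, delayMs);
      };
    }

    // Hypothetical usage from an editor's change handler:
    // const onEdit = makeAutoSaver(text => api.save(docId, text));
    // onEdit(currentText);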

To some degree I think this may be related. Bugs that crash the computer are a little bit easier to detect (If only because the users come screaming at you) compared to bugs where the app scrolls incorrectly once every million scroll events.

Overall I would say that software has become much more complex and much more reliable at the expense of being more inscrutable. People are able to operate in a degraded condition now because software is capable of working that way, whereas in the past it would just fail completely.

Is that better? Maybe not by much, but I think most normal users would say software is easier to use and more reliable today compared to decades past.


> Folks who lived through that have such a strong compulsion to hit CMD-S every few minutes, that a lot of cloud software today offers a “save button” that actually doesn’t do anything because the document auto saves as you type anyway.

This describes me.

And I often wonder if the presence of the "save button" is what causes the compulsion: in web apps that auto-save, I trust/assume they're auto-saving, and feel more free.

In web apps that have the save button, it's always occupying some part of my mental capacity to keep hitting it after significant edits.


As Douglas Adams famously said, "Technology is stuff that doesn’t work yet".


That is a quote I'll have to remember. Apparently it was Bran Ferren who said it, and he was quoted by Douglas Adams (source with broken https: https://www.douglasadams.com/dna/19990901-00-a.html)


When I find bugs or "features" that make me waste time, I blame developers. They clearly don't care anymore. So ordinary users have resorted to quiet desperation through learned helplessness. Like all those "redesigns" that remove features and turn what used to be a few memorized keystrokes into an agonizing series of scrollers, popups and "UX" animations.

I just think developers don't care anymore. They don't yell enough at their marketing teams when they push for obvious bullshit features. They make so much money that they have no interest in the users; in fact nobody does.

There's no way out of this until AI replaces developers. When users can talk to their apps and customize them to 100% fit their needs, without the need for any developers, the users will have got their revenge.


I disagree. The role of software engineers has become so commoditized that engineer input doesn't mean much to management, and for every engineer that goes against a decision, there are another dozen who will do what needs to be done without complaints.

I see this as a symptom of the lack of democracy in the workplace.


Maybe if they cared more there wouldn't be only one in a dozen who cares? It's my speculation, but I think the culture of jumping from company to company every 2 years to maximize "total compensation" does not allow one to jump from being a code monkey to being a programmer.


I think jumping from company to company is another symptom of the lack of workplace democracy. Why stick around at an employer that only sees you as a commodity and won't take your concerns into consideration over their bottom line?

Unionized workers report higher levels of happiness, have better benefits and higher compensation, and get a say in how their work is done. They also stick around longer because they have better job security.


The grass was greener when I was younger, and the sugar was sweeter.

Most software has always been crap. There never was a golden age, when developers 'cared'.

The truth is, some developers produced polished, well-working, well-made software. Some developers still do that.


I think what GP is saying is that there has been a paradigm shift away from unintuitive maximalist apps that are fast once learnt to intuitive minimalist apps that are slow as treacle because everything is in size 16pt font and designed for the first time user. Both have their downsides so in a way software has always been 'crap' but the ways in which it sucks have changed.

I've worked on 'revamp' or 'refresh' projects and it's all about stripping out features with the misguided idea that somehow one particular piece of functionality is a vastly more important value proposition to the user than all the other features.


> the misguided idea that somehow one particular piece of functionality is a vastly more important value proposition to the user than all the other features

But isn't that mostly true? Doesn't most software you use have a few core features, some you sometimes use, and some you never use? A good UX revamp orders these correctly to appeal to most people (but of course too many are too aggressive about removing things).


Actually, the sugar was sweeter when you were younger, because as you age your taste buds become less sensitive:

https://www.comfortkeepers.com/info-center/category/senior-h...

"Between the ages of 40 and 50, the number of taste buds decreases, and the rest begin to shrink, losing mass vital to their operation. After age 60, you may begin to lose the ability to distinguish the taste of sweet, salty, sour, and bitter foods. The sense of smell does not begin to fade until after the age of 70; its decrease exacerbates the loss of taste for those affected."


I mean, there are actual measurable cases where "sugar was sweeter": fewer chemicals, larger sizes, being produced closer to where you buy it and therefore being fresher, and smaller producers who delivered a polished, well-made product being squeezed out by large multi-national corporations.


Try using some old software, like Windows 7 or vim or emacs, or PHP or old reddit. Due to their age, they've accumulated a lot of fixes to their bugs. They may look weird and old, but they waste less of my time.


They waste less of your time, because they are either doing different things from what their new versions do, or you have more experience working around their problems. [1]

I'm glad someone mentioned reddit in this subthread. I exclusively use old reddit, because I use reddit to read, and comment on text forums. The new reddit design is hot garbage for this usecase.

But what it's not hot garbage for is scrolling through image-heavy/meme-heavy subreddits. It's much better than old reddit for that use case.

New reddit doesn't suck for my use case because the reddit developers hate their users. New reddit sucks for my use case because reddit wants to be a 'scroll through memes' feed, not a message board. Complaining about it is like complaining that Mercedes-Benz doesn't really want to make airplanes anymore.

Oh, and old reddit wasn't even good for my use case! I'd much rather use a real forum than reddit's opinionated-UX disaster, but since everyone I want to talk to is on reddit (because they spend most of their time scrolling through memes), I have to suffer through it.

[1] The parts of Windows that got overhauled visibly-to-the-user (mostly windows explorer + pre-installed programs) were always crap. The first thing I do on a new Windows installation is to install an orthodox file manager, because the OS is unusable for me without it.

The overhauls under the hood were largely invisible.

The new additions, like the search omnibar are, for the most part, decent.


I didn't say devs hate users, I said they don't care.

> because reddit wants to be

It used to be that devs would add features to please users, and apps didn't have an agenda about what they "wanted to be". And what prevented reddit from introducing a "meme feed mode" for subreddits that are full of memes? People seem to have forgotten that they can make incremental changes; not every update needs to be a full rewrite.

> new additions, like the search omnibar are,

See, this isn't even new; Windows 7 had a much more compact start menu without ads and with a search bar. They didn't add to that, or improve on that, they literally made it terrible. And my Photos app keeps crashing all the time, it's ridiculous. It didn't use to crash in Win7.

Another example is the old Skype; the new one is still missing features that worked in the old one.

But most of all, what's missing is snappiness, not because my computer is slow, but because the new apps use frameworks in which it's impossible to have low latency.


old.reddit.com is the best. Is there a plugin/extension or some script to force Chrome to always redirect to old.reddit.com for all reddit URLs?

edit: found https://chrome.google.com/webstore/detail/redirector/pajiege...

this one seems to be pretty popular (Requestly):

https://chrome.google.com/webstore/detail/requestly-redirect...

anyone have better solutions?
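
edit 2: failing an extension, a minimal userscript sketch seems to work too (assumes a userscript manager like Tampermonkey; the snippet below is just an illustration, not taken from any of those extensions):

    // ==UserScript==
    // @name      old-reddit-redirect (sketch)
    // @match     https://www.reddit.com/*
    // @run-at    document-start
    // ==/UserScript==
    // Rewrite any www.reddit.com URL to old.reddit.com before the page loads.
    if (location.hostname === "www.reddit.com") {
      location.replace(location.href.replace("://www.reddit.com/", "://old.reddit.com/"));
    }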



If you're logged in, you can just have your account stay on the old design.

preferences > beta options > uncheck "Use new Reddit as my default experience"


I can't help but feel like the agile-in-business mentality is somewhat to blame for this. Focusing on getting core functionality out sprint after sprint is great for adding new features, but there is never enough time to go back and fix annoyances that are minor.

I was looking for a new microwave recently, and since I'm a nerd I was looking into ones with smart features. So I read some reviews and guess what? The smart microwaves are buggy! Sure, the core functionality works, but the UI is terrible and prone to crashing.

Another example is the new combo washer dryer my parents bought. Dryers have had a beeper go off at the end of the cycle since forever. So this dryer has the electronic equivalent of the old turny knobs, and yes, core functions all work fine. It dries clothes. It beeps when it's done. But unlike every other dryer I've used before, it keeps beeping forever. You have to open the door to make it stop. Oh, but the washing machine in the same unit? It beeps 3 times then stops. Different development teams, I guess. I'm screaming inside just thinking about it. The only fix (and this was from the manufacturer support forum) is to cut the wire to the speaker.


Software sucked in this way long before Agile, and will continue long after Agile stops being a trend.

Of course, if it is truly a result of Agile, that is only because the team using Agile wasn't truly doing it right.


I’m sure you’re right. I’m probably projecting issues I see in my own organization since we switched to agile. Or attempted to. Or are continuously trying to attempt to.


It would greatly lower my shopping-anxiety if you shared the brands. Thanks!


For the microwave I was focused on Whirlpool, since they had gotten high reliability rankings from a well-known review magazine. However, I did look at other higher-end brands too and they all seem to have issues in the smart appliance space. I ended up buying a basic model instead.

The washer/dryer is a Frigidaire laundry center. (Stacked washer/dryer)


This is one of the strangest takes that I have ever read. I can remember when my dad's turn on the waiting list finally came up and he got to bring the HP calculator home for the weekend from the government lab where he worked. It was an absolute marvel. The whole family sat around the kitchen table for hours as he talked about RPN and showed us how the calculator worked. It was magic.

Almost 50 years later I have an incredibly powerful computer that I carry around in my pocket and I call it "my phone". It can solve all kinds of mathematical equations, display live video, and communicate in real time with the furthest corners of the globe. Saying that I "expect technology to suck" is just completely bizarre. I have witnessed amazing changes and have access to technology that were just Star Trek visions when I was young.


Being grateful for technology that didn't exist 20 years ago is not incompatible with criticizing real, large, and avoidable failures that infuse all the tech we use. Likewise, I can criticize a car model that is constantly breaking down because of owner-hostile choices made by the manufacturer, even though it's better to live now than 200 years ago when there were no cars.


Quoting from https://www.hpmuseum.org/hp35.htm:

2.02, ln, e^x resulted in 2 rather than 2.02. When the bug was discovered, HP had already sold 25,000 units which was a huge volume for the company. In a meeting, Dave Packard asked what they were going to do about the units already in the field and someone in the crowd said "Don't tell?" At this Packard's pencil snapped and he said: "Who said that? We're going to tell everyone and offer them a replacement. It would be better to never make a dime of profit than to have a product out there with a problem".


> Almost 50 years later I have an incredibly powerful computer that I carry around in my pocket and I call it "my phone". It can solve all kinds of mathematical equations, display live video, and communicate in real time with the furthest corners of the globe.

And you use it mostly to browse cat pictures :).

Perhaps it takes a technologist to notice, but for every actual technological improvement that happens, the UI glitter gets cycled 20 times. The majority of "change" in computing technologies isn't improving anything; it's only inciting you to throw away your old device and buy a new one.

Technology is amazing. But if you think of Star Trek visions, you realize that our technology could be 10x more ergonomic and better than it is, and could be solving 10x more problems than it does. That's what I read these complaints as being about - unrealized potential. But perhaps it's because I'm a techie and a trekkie that I see it this way.


Thank you, that was a very inspiring response.

I completely agree with you about the unrealized potential, but my greatest regret over the past 50 years is not about the technology, but about our ability to harness it. I remember helping to set up a computer show at our local high school when I was still very young. The local vendors had agreed to donate an Apple //+ to the school in return for having space to show off all of their wares. We were thrilled, and I can vividly remember my father saying that my generation would be the last one to learn programming as an add-on elective skill. He imagined a world where everyone could harness the power of digital technologies and where programming, building, and tinkering would be second nature to all of us.

We spend our time thinking about UI glitter because we gave up on that dream of having every student learn to program. We took the exercises out of math and science books where we asked students to write programs to use the Pythagorean theorem or convert Celsius to Fahrenheit. It became "enough" to learn how to enter a few numbers in a spreadsheet and perhaps add a trend line.

A few hard working people in a garage can still turn out wonderful things, but if we really want to see the promise of technology then we need to fix the bigger problem of building an educational system that empowers many more people.


"Technology sucks" in my eyes just means there is a mismatch between expectations and reality.

People expect their technology to solve their problems. But often their problems have nothing to do with their technology, and everything to do with outside aspects like how they use it.

E.g. during Corona, one boss at an institution I know bought an incredibly expensive video conferencing "thing" and expected it to run incredibly smoothly, which it didn't, because he was neither willing to invest in the building's bandwidth/latency nor did he make sure the people on the other side had that bandwidth/latency as well.

People like to believe in technology, like they would believe in gods or magic words: they don't want to understand how it fits into the world they are part of; they just want it to make their problems go away.


This is just the "there are children starving in Africa" argument, but across time instead of space.


I'll point out that many of these annoyances don't have equivalents on the command-line. And the annoyances I do have on the command line I can mostly work around with wrapping things in shell scripts, aliases, or functions.

It's nice to have a simple interface and high extensibility on top of it.


Well, then there is also the annoyance of having way too many wrappers to wrap your head around, and it's one that I personally encounter somewhat regularly. I still haven't figured out how to deal with that one.


Well, there is no problem that cannot be solved by adding a layer of indirection, except the problem of too many layers of indirection.


Use names that you can find easily via tab completion from what you were thinking about.

E.g., all my scripts for fixing transient problems are ~/bin/fix-*.sh


> I'll point out that many of these annoyances don't have equivalents on the command-line.

I'd argue the entire command line is nothing but annoyances. Everything is completely non-obvious and often requires reading man pages that can contain a hundred options that may or may not be in alphabetical order.

The fact people have to write scripts to really function on the command line is telling.

My personal nightmare is exiting vim. If I make a mistake typing, I enter recording mode instead of exiting. This happens to me several times per day. And, no it's not a matter of being more diligent, it's the result of a disability.

The command line is powerful and I absolutely love it but it is a pit of rusty razor blades people keep throwing ropes over. :)


The command line has suffered like everything else. Complexity creeps in and builds on itself. But scripts aren't the problem, scripts were part of the design as the intent was to glue together the many CLI utilities. Scripts bind them into performing a repeated task (versus one-offs).


Eh, it still happens. You'll find plenty of CLI utilities that can have small, but breaking changes between minor versions and require all kinds of hacks to get working correctly in a mixed environment.


meh. I’ve noticed recently that Bash for whatever reason seems to just randomly erase my entire history, which is infuriating (though obviously not enough for me to invest the time to investigate or fix it at all). Stuff breaks all over, CLI or otherwise.


rm -rf / bin/bar/bash unexpectedly bricked my system


"rm deleted my entire root filesystem because I told it to" seems along the same lines of unexpectedness as "i handed the homeless guy a $100 bill instead of $1 because i didn't look closely" :)


As soon as I clicked on this, I recognized the author's signature yellow background and font from their previous post about buying a new monitor (which I also found at the top of HN) https://tonsky.me/blog/monitors/

I enjoyed both posts. Great titles, well written, serious but witty. Props to the author; I'll be subscribing for more.


I had missed that one, thanks for pointing it out.


Don't get me started on my mother-in-law's tech problems. It never ends... almost anything she wants to do requires three things to work together that simply won't.

Even a simple issue like "which remote control do I use" is a disaster that keeps happening, pointlessly, again and again.

Yesterday we tried to get her iPhone to operate the Spectrum app on the Roku device connected to her Sharp TV - just so she could change channels by number. It can't work.

Can we all just get along?


My most trouble-free period was when I was using Xubuntu exclusively for several years with a single monitor. Now I've been using Windows again for several years and I've grown accustomed to things just not working, ever, with tons of visual lag, usually just a few hundred milliseconds but sometimes stretching into seconds.

On Xfce I would sometimes idly hold down win-t to create a bunch of terminals then alt-d to close them one by one. As I might idly bounce a pencil up and down back when I used pencils. On Windows/OSX/Gnome I would never dream of just idly opening a bunch of applications for fear of what might happen.


Woah, have you tried the night mode on the OP's blog? I have never seen anything like that before.


Day mode: With the power of 1000 suns I now declare you blind!

Night mode: I've played this video game before. A monster is going to jump out and get me at any moment.


You haven't seen anything like that before because it's awful and useless.


But the guy is a UI/UX consultant!


It really is night mode, not just some dark mode for a virtual world. It's even night in the wild mode; the author could add some cricket sounds, it would make everything better.

The author also seems to be one of those people that think that sunlight is yellow.


I kind of agree, but calling it technology is pretty broad (the article is mostly app related). Technology-wise, I also expect things ranging from a flashlight to a CT scan to not suck, and they usually do not.

Edit: flashlight instead of torchlight.


Consumer technology would be a better qualification. Or at least computers.

- The whole current USB ports fiasco
- Just look at any wireless headset and you'll find at least 2 major bugs, between connectivity, quality, or general usability
- Smart TVs suck terribly even at the high-ish end, in performance, image quality, and usability
- Touchscreen stovetops get triggered by food splatter

And so on.

Basically anything that has a microprocessor anywhere is bound to go haywire somehow at some point.


Yes, consumer technology is more appropriate. Better yet consumer software/electronics, I believe.


People literally have to get months of training to operate these machines because they are highly specialized, non-intuitive, and finicky. Using medical instruments as an example of good software isn't a great idea.


I don't get the point of your comment. Sorry. What about the flashlight?

Since the article talks about technology... Why is "technology = only software or apps or things related to a phone"?

Technology can be anything from fire, through the pencil, up to whatever application of scientific knowledge for practical purposes.

Edit: flashlight instead of torchlight.


Jeez man, what kind of torchlight did you buy?!


The sad thing is that it doesn't have to be this way. Business schools, in fear of phantom product development teams that waste huge R&D budgets developing beautiful products that cost too much for anybody to buy, spend all of their time preaching against quality in any form. Business school graduates are trained to be suspicious of any attention to detail and are explicitly incentivized to compress schedules for nothing more than the sake of compressing schedules.


I don't personally have an issue with this, as it makes the market perhaps slightly more friendly to those who do take (a little) extra time to put in some quality.


It seems we had it right around 2006-2010 or so, then decided it's too boring and perfection doesn't bring in money anyway.


What do you think we had "right" in the time period and what made it "right"?


Computers were good enough to perform basically any task a regular person would need, but simple enough that they couldn't fail in too many ways. User interfaces were honest about what was going on instead of giving blanket error messages with dead help links.

I really wish it was just nostalgia, but my parents somehow were able to fully use MSN Messenger's features without much hassle, yet if I so much as show them a screenshot of Discord, Slack, or even modern Skype, they might just shriek at the sight, let alone be able to use that software.

So I don't see how we've improved at large. Yes, UIs are prettier to look at and we have more animations and higher image quality and all that, but if anything functionality has very much stagnated, though I wouldn't be so bold as to say regressed.


Mac OS 10.6, Ubuntu 10.04, Firefox, a pluralistic web based on progressive enhancement, music downloads, even Windows 7, all came to work towards what people actually wanted around that time. Then came the iPhone (not bad per se, but creating the mobile web crisis), AWS, Google dominance, fscking big data


That's a bit like saying that flying intercontinental sucks because the low air pressure and humidity make the food tasteless. Of course it does, and of course it's annoying, yet almost no one travels on ocean liners anymore, because for all the rough edges that air flight has, it still beats spending weeks travelling.

People expect technology to have countless minor flaws, which it has, because techies spend their time pushing tech forward at a crazy speed instead of slowly polishing it.


But most of the time we're not actually pushing anything forward, we're just adding more features that we think people need (not to mention many developers enjoy creating new features, not polishing existing features).


Me: Agrees with title but decides to read the post before commenting

WOAH my eyes!

Aha, there's a dark-mode switch. Maybe if I click that I'll be able to read this blinding page

* CLICK *

Nope


I should add: I do appreciate the creativity of the page design (and especially of the switch). But on an article about software UX... well.


I dove into the inspector interface, and changed the background color. No change. Also had to delete a 1 pixel background image of the same color. W. T. F.


Bugs are not our biggest problem; our biggest problem is deliberate decisions that privilege vendor control, tradition, and ease of implementation instead of good architectural design.

All three major desktop operating systems (Windows, MacOS and Linux distributions †) are so bad that it's infuriating at times. We can't have a single operating system to rely on and say: "This system works well and won't give you unnecessary headaches."

How can we develop good end-user products when the very tools software developers use to create those end-user products are broken?

† The kernel may be decently reliable, but the actual distributions are an ungodly mess.


Ah yeah. I recently switched from ADSL to fibre. Speed-wise, it's pure joy. However, the now incredibly advanced modem/set-top-box/DVR is an utter mess.

There are two boxes: a network server/DVR/router and a TV set-top-box. The server part so far behaves reasonably well (apart from a couple of interface quirks). But the set-top-box is a nightmare.

It all started with networking. The system ships with PLC adapters. It turns out that nothing reliably works with these. I switched to WiFi. It's worse. I finally gave up and now have a 15 m ethernet cable going around my living room.

You have random error messages every time you turn on the set-top-box. When switching from the TV function to some other (like chromecast, or youtube), half the time the sound from the last TV channel you visited can still be heard. You have to hard reset the box to solve the problem.

Once the bluetooth remote stopped working. I tried everything (swapping batteries, resetting everything, etc.), no luck; finally I had to do a full factory reset of the set-top box and pair it again with the remote (which took about 30 to 40 minutes I probably could have enjoyed more doing anything else).

The box has an integrated chromecast. However it can't be seen on the network about half the time. I've found that by turning the box off and on, then going to some network app (like youtube), it's back.

Then half the time chromecast playback is choppy and unreliable. You have to turn it off and on, etc.

Then you have the same sort of weird behaviour with Youtube, Amazon Prime, Netflix, etc you name it. Not a single app works reliably (i. e. the first time, and most of the time).

I complained on the forum. People yelled at me because literally millions of people are using the same setup as I am, and don't complain, so it's probably that I'm too fussy. I'm not. It's really awful. Fortunately I don't watch TV anymore, but I must intervene at least once or twice a week to sort out bugs for my family.

What's surprising is actually that millions of people probably have a hard time now simply watching TV, because all of these "smart devices" are so broken.


Just judging by the symptoms, this looks to me like one of those classic "cobbled together" pieces of software, where they only wrote the glue code and, once it was OK, shipped it.

The number of intricate parts that have to work (read: that can go wrong) on a modern device is just baffling.


This made me realize that my default expectations for anything tech related are very low, and when something works the way it should, I'm amazed... I guess it should be the other way around. The amount of bullshit I have to deal with from all kinds of tech has made me pretty much numb to it; I have no hope of things getting better, so I just deal with it. What's even more sad is that in many aspects tech has become so much worse than it was in the past...


On the other hand I don't want to go back to the time where I had to re-install Windows every 6 months because the registry was falling over or watching macOS kernel panic because I unplugged a USB stick.


Sounds like this might have been prompted by Jonathan Blow's talk, Preventing the Collapse of Civilization: https://youtu.be/pW-SOdj4Kkk

It's a shame that there isn't usually enough incentive in mainstream software practice (or business reality) to polish and make things work flawlessly - day to day life could be a lot nicer if we dived deeper and thought longer-term.


There are all sorts of problems with technology but technology "sucking" isn't one of those problems:

The problem is people:

1.) Don't remember far back enough to when everything was literally more difficult to do. They don't actually understand how much easier their lives have become due to technology. If you grew up before cellphones became ubiquitous you would understand how much of people's lives were devoted to traveling long distances and haggling with people just to get small things done.

2.) They are young enough that they expect everything to happen instantaneously because they grew up post ubiquitous cell phone. They won't ever be able to understand how easy technology has made our lives because they haven't experienced anything else.

Every time someone writes articles like this they should be forced to live without any technology for a year. It's not perfect, but it's made life measurably easier for every single person in most countries.

With that said there is a legitimate problem with technology:

People don't know how to use it responsibly, many of them are addicted to their laptops, tvs and cellphones in ways that are frightening.


While I agree, quite strongly, about your general point regarding the short memory and ungracious expectations of people at large, I also disagree, strongly, with your dismissal of the problem addressed in this article.

By and large these problems are all man-made and have known remedies. It is possible to be both grateful for the great advances, conveniences, and benefits of modern software and simultaneously frustrated with the very real, preventable, and fixable problems in most modern software.


I dismissed the article because I refuse to click on link-bait titles just to generate ad impressions.


Understandable, but please don't comment if you cannot add to the discussion (because you haven't actually read the article being discussed).

In this case I think you've done yourself a disservice. I've generally found Nikita's blog worth reading, and he doesn't serve ads.


Dear Techies, Thanks for educating me on Tech...actually almost none of your comments had much meaning for me, a user. I’m not even sure how I got to this site. Not your issue.

The comparison between “reality” and Tech was revealing. If Tech mimicked reality there would be no fewer glitches but I think the nature of those glitches would be more understandable to the average user. Traffic slowing a page loading, moose on the roadway (download won’t actually do anything), no power to signal lights (display goes dark), those sorts of things.

I imagine the real world mimicking Tech would be more alarming. One might wake up and head for the bathroom when your closet slides in, blocks your path, and asks if you’d like to re-paint it. You decline and it forces you to respond with “We can talk later...” In the meantime, your small appliances clamor for you to perform maintenance routines. You wade through them into your bathroom when you realize there’s no longer a commode, there. You no longer have the administrator permission to use that device. Your phone rings, it’s the developer that built your house asking you to leave the doors unlocked tonight so their maintenance crew can rebuild some parts of your house.

Wait until you try to get to your mailbox!

I, user, don't really hate Tech. I just wish there was a rule in your industry that required products to work WHEN they're released. Not after endless bug fixes or security patches (Yes, I've read the standard BS notices) which make me think, "...this one fixes what they did last time..."

Tech is like having cats. Entertaining, sometimes inspiring, but requiring so much attention and resources that it leaves you wondering whether it’s really what you want.

NOTE! Cats won't warn you when your power company's Tech decides it's unsafe for you to have their services.


Y’all can’t even imagine how bad healthcare IT is. Multiply all these problems by the cost of human life.

If you think you're frustrated, imagine a heart surgeon trying to access a patient's file but their password keeps getting rejected.

Or they were forced to change their password monthly for sECurItY and they can't remember it. And the reset process involves calling the hospital's IT helpdesk.


The list of all annoyances in one day is an eye opener. I should try that just to see what I have been putting up with.

One of the main ones I have is using macOS. My work MacBook Pro (an expensive piece of equipment!) will just randomly have scratchy audio. There's absolutely nothing I've found that will make it go away; even restarting the core audio service appears to do nothing. But leave it for an hour and try playing audio again... and the problem fixes itself. Rebooting fixes it, but you just know it'll be back, so you learn to live with it, or use headphones if you absolutely must have audio for a meeting.

Another thing that's sort of in the same domain is websites being more user hostile. Popups, intrusive ads, auto-play videos, and so on. I'm at the stage where if I get too many of these I have to just close the app or web page and give up on the article I'm trying to read.


Have you tried disabling bluetooth?


Thanks for the tip. I have not tried that.

One thing that I will try as well is to plug the USB-C power cable into the right side of the laptop. I've read that surprisingly CPU utilization goes up when it's plugged in on the left.


I've just set up a Linux Mint installation, added a VM, and made lots of little tweaks here and there. Restart it, and the stupid OEM "feature" kicks in, undoing all my changes.

It's a miracle that we are not chased down and made to answer for our criminal wasting of other people's lives, with the same amount of time as punishment.

The abstraction saves.


- My MacBook keyboard sucks: some keys don't press, others get pressed double or triple. I deal with this by using a Bluetooth keyboard on top of the Mac keyboard, lol.

- WSL stopped working a few weeks ago after a Windows update. I haven't been able to fix it; my only option is to format and reinstall the OS at this point...

- My phone screen broke a little at first, then its top half stopped working. Since it's an iPhone X, I deal with this by using the little trick to reach the top parts of the screen with the thumb, and that's how I almost don't use my phone.

I can also get random stuff opening if I leave it unlocked, since its touchscreen gets activated like crazy by itself. Luckily I have an official cover which keeps it blocked when the lid is closed.

LOL, and I'm a power user who can fix things and google for solutions; we're just f


> In macOS context menu, “Tags” is in smaller font

If you think technology is broken, don't take a look at your physical stuff.

The baseboard in my hallway is a little wavy. The road in front of my house has lots of potholes. The tile in my bathroom is slightly off of parallel from the wall. The front door of my house sticks. My takeout order was wrong. My air conditioning in my car stopped working.

Technology is amazing. A friend sent me an animated gif, from hundreds of miles away and I was able to take that same animated gif, replace the face with my own, and send it back, in less than 10 seconds. All from a tiny brick that fits in my pocket. I'll deal with my scroll position being reset on instagram occasionally.


Tech is bad because our hypothesis of "nature will help us adapt to our disregard for foresight" is proving false. Across the board: economic, social, manufacturing, education, environmental -- everything is giving us plenty of evidence that we are headed in the wrong direction.

All meaningful measures of human wellbeing are going down at increasing rates. We are making mistakes at a rate that outpaces our ability to learn from them, and we are doubling down on the strategy that got us here in order to get us out.

The line of reasoning that says we can't know the future, so we should take the best guess we have and build more on it than our confidence in its correctness warrants, is illogical and clearly disordered.


Yes, it sucks, but the very nature that lets tech be so flexible is what gives us the bad tech experience. The things in our lives that are very stable are the ones that do one thing and do it very well; they do one thing and we know how to use them.

Tech, software, is so versatile that it's hard to know when to stop adding new functions, and every time a new function is added we add unintended consequences.

We can start with the ICs in our gadgets. Some contain billions of different parts. It's a miracle that they even work. Add other components and software and we can see why tech sucks. Getting all those parts to work in unison all the time is next to impossible.


How do we, users, makers and managers alike, move away from this awful state of affairs? Some random ideas:

- If anyone manages to build some truly fantastic software for massive numbers of end users, maybe those users will start expecting a higher bar for other software. Consider for example search engines pre/post Google. Naive search engines just don't exist anymore, because they were several orders of magnitude less efficient at getting useful responses. So that was a sea change. (Unfortunately monetization, SEO, sabotage and things like ignoring search keywords have resulted in search engines regressing in the last several years.)

- Long term design. Commercial software seems to be redesigned massively every few years, which is terrible UX. On the other hand, the vast majority of F/LOSS is designed once and either never changes again or just gets added to until it drowns in clutter. Even though the latter is caused at least in part by a lower budget, and often ends up being pretty bad UX-wise, I actually prefer it. One good balance is Amazon's web site. From a quick glance it looks like everything is where it was a decade ago, and overall it just looks familiar.

- UI convergence. Web browsers have converged so strongly that being able to do things like navigating and filling in a form in a completely unfamiliar browser is very likely to succeed. And browsers are so enormously popular that they have set expectations for how UIs work in other software.

- Quality assurance in all its myriad forms. A lot of energy is being spent touting the One True Silver Bullet, when in reality every approach has diminishing returns and shortfalls. If you use only manual tests every version is going to have more bugs than the last, because you don't have $MM to spend testing every single detail for every release. If you only use integration tests the suite is eventually going to run for longer than your release cycle and will be skipped and trimmed in ways which mean you'll miss bugs. If you only use unit tests you can only release fizzbuzz-size chunks reliably. And so on. Instead use every technique you know of until you hit similar diminishing returns for each of them.


I try to train my systems to behave just the way I like them. It's no different from coding: you are bending tech to your will. I don't see a difference between configuring a brand-new virtual machine and coding a new piece of software. For me, the definition of non-sucky technology is the ability to configure, tinker, and otherwise mess with systems to make them do your bidding.

This is why I love computers: no two computers are in the same state, and I always laugh using other people's computers, because you can be sure they have configured them in some weird, idiosyncratic way, entirely different from one's own configuration.


The definition of non-sucky technology is not having to waste time on any of that.

If you think tinkering and configuring are fun you should stay as far away as possible from software design, because you have no insight into what most users expect from tech.


> If you think tinkering and configuring are fun you should stay as far away as possible from software design

You overlook that not giving the user options to configure their systems is bad practice as it can force all manner of dark patterns on them, and they are left constrained in that software, unable to bend it to their will. Let them tweak, I say!


Let them tweak, yes, but they shouldn't have to tweak just to get things to work as expected.


Technology usually only sucks where there is no immediate consequence of this. For example I am currently working with some tools related to robot control and machine safety. Crashes are expensive and errors in safety can get you sued. That part of the software works perfectly. Similarly, your engine control unit, airplane avionics etc. all work well because if they did not, the producer would be punished monetarily.

User interface stuff is crappy because it does not matter, in the sense that a failure there can't immediately break expensive things. Unless people raise a stink to the point where producers are scared for their bottom line, this will not change.


A big part of the problem is how new everything is. We keep improving it and rebuilding it. You can see that many of his examples would have all their bugs removed if we heavily constrained the features added and didn't migrate to new platforms. Of course, real people also want new features! We can probably expect increased quality when the rate of change decreases and we either see more stability and incremental growth, or stagnation. But if new technology is to be invented (not just new applications or iterations), we are going to keep seeing issues. I'd rather just ride the tech-acceleration curve myself, but it does come at a cost.


As someone with a SaaS software startup, this is something I have to remind myself from time to time. Particularly because when you're so close to the product, you're intensely aware of the bugs and issues.

It's comforting to know people do have a tolerance for minor bugs and issues.

We were pleasantly surprised by how much positive feedback we'd get even when users encountered bugs, and I believe we overestimated how "perfect" it needed to be for the first release. From experience I've learned it's much better to ship it and improve it with incremental updates than to wait for "perfection".


For me, part of the "suck" is the sprawl of so many different technologies and specialization areas. Back in the 70's, there weren't so many technology choices, and there weren't so many different job titles and roles.

So, typically, a tech team had a pretty clear path on what technology to use. And, you could get 3 or 4 "tech lead" type people in a room, even for a huge application, and between them, they knew the system from top to bottom. And they all understood who was responsible for what.

I understand we can't go back to that. But it's hard to deliver quality with today's sprawl.


I have actually the opposite reaction. It is easy to be cynical and say "everything sucks". But if you look around, you'll actually see that many things work and do so surprisingly well. You can literally rent 1000 computers for an hour for a few bucks. It works! It's amazing. You can use a little device you carry around as a phone, for email, to look up information on machines running anywhere! There is a lot to marvel at. Sure, things could be better - so work on them and make them so. But don't pretend everything is terrible. It's not!


Upvoted for the headline before I even read the post; and in full agreement after reading. I'm somewhat inspired to keep a similar diary of techno-suck moments... but there would be so many!


> or if I couldn’t use my phone without hitting “Cancel” every five seconds .... [I'd just google it] .... That these people mostly just lived with it means that these problems couldn’t have been markedly worse than technology has already been for them historically.

Other people's iPhones and the nearby Wi-Fi notification! Hahaha.

iPhones are great for me because I fix annoyances so fast that I forget other people tolerate them, or judge iPhones by them.

The nearby Wi-Fi popup can be disabled, folks! Also, shift some of your budget around and get a data plan!


There is still great software being made. I would say that the interface in a Tesla is pretty great. There is just more software out there now, so the number of bad pieces of software is magnified that much more. But I think great software depends on how willing the developers are to make it great. Yes, they need support from management etc., but there is tons of great software out there; it's just that there is more and more mediocre and bad software being made as well.


Tesla's music player is pretty bad. The bug where it duplicates artist names for each album when you browse by artist has been there for years, and if you try to switch between specific albums and back, it frustratingly scrolls you back to the top each time. These are just my pet peeves; see Google for laundry lists and angry users.


It’s quantity * quality. If you use one piece of software that works 99% of the time it’s great.

But if you use 1000 programs a day that work 99% of the time then 10 of your programs are broken all the time.
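
Spelling out the arithmetic (a tiny sketch in Python; the 1000-programs and 99% figures are the numbers above, hypotheticals rather than measured data):

    programs = 1000
    works    = 0.99                                 # each program works 99% of the time

    broken_at_any_moment = programs * (1 - works)   # expected number broken right now
    chance_all_fine      = works ** programs        # chance that nothing is broken right now

    print(broken_at_any_moment)                     # 10.0
    print(f"{chance_all_fine:.1e}")                 # ~4.3e-05, i.e. essentially never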

We need to figure out how to increase quality and reliability somehow or it’s just going to keep getting worse.


I like to think about it like little Minecraft characters beating on a piece of code until it works right. It's like smithing: you have to put it through its paces and iterate as a developer, since thinking of all the complexity on large projects is infeasible. The only thing to do is to write it, test, redo it, test some more, test on a wider test bed, release, get complaints, add more tests, and finally you're back at the start.


I don’t expect software to fail generally, but I’m getting used to a recurring stimulus quickly, especially if I underestimate its importance. Perhaps it’s a kind of habituation.

I feel like it’s getting harder to estimate importance properly as the complexity of an abstract system grows. People tend to click away the same error message they don’t understand every day, because it hasn’t had any negative impact (so far).


Recently Firefox added its own five cents to that with a major update on Android. About 10 features I used and liked were removed!


It's interesting that he uses "Phone at 5% brightness" as an example of a problem that needs urgent fixing. My phone has a perfectly functioning brightness slider and the only time I feel a need to put it brighter than 10% is when I'm using it outside on a sunny day. >60% indoors on a cloudy day or evening is obnoxiously bright.


I've used many of the same technologies he's talking about and have experienced almost none of these issues...

Maybe I just got lucky, but it seems we have the bigger problem that these systems are so complex they've almost reached chaos-theory levels of total randomness of behavior from small changes to initial inputs.


See "The mess we're in" talk by Joe Armstrong (Erlang creator) - https://www.youtube.com/watch?v=lKXe3HUG2l4 - from 6 years ago.

Your computer with 1 TB of storage has ~2^(8 trillion) possible states it could be in.

Number of states you could count through if you burnt up the Sun trying: ~2^128 or so, tops.

Atoms in the universe: ~10^80.
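
For scale, a back-of-envelope comparison of those numbers in log10 terms (a rough sketch; the 1 TB, 2^128 and 10^80 figures are just the ones quoted above, nothing exact):

    from math import log10

    log10_drive_states = 8e12 * log10(2)   # 1 TB ≈ 8 trillion bits -> 2^(8 trillion) states
    log10_countable    = 128 * log10(2)    # ~2^128 states you could ever count through
    log10_atoms        = 80                # ~10^80 atoms in the observable universe

    print(f"10^{log10_drive_states:.3g}")  # ~10^2.41e+12
    print(f"10^{log10_countable:.1f}")     # ~10^38.5
    print(f"10^{log10_atoms}")             # 10^80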


I couldn’t agree more. For better or worse, the expectation of a perfect “it just works” solution (relative to the exact use case, of course) is at an all-time high (in my opinion anyway). I’m very interested in how incumbents respond (or don’t respond) to this dynamic in the coming few years...


I enjoyed this article after I manually turned off the background image and background color using Firefox's developer tools.

At the bottom it says he writes about UI design. That black-on-yellow was pretty hard on my eyes at the end of a long day. I just couldn't read it as-is.


The whole enterprise of writing software, from top to bottom, from its historical beginning to the present day, has been based on the fallacious, machismo idea that the programmer can perfectly execute the writing of correct forms. This is actually impossible, and thus our software is terrible.


I don't buy that excuse at all. Sure, perfection is impossible, but there's a very, very broad spectrum of quality that spans between "perfect" and "completely unusable". To say "well, we can't be perfect, so everything is terrible" seems absurdly black and white to me.


I think tech is other people. Tech doesn't suck because technology is bad: it sucks because life is complicated, and other people are prone to mistakes, or have priorities that don't align with yours, or have limited attention.

Case in point: I don't particularly like the design of this site, as I find it difficult to look at (even if I abstractly can see how the color choice is bold). The night mode toggle seems even harder to read; an uncharitable interpretation is the author thumbing their nose at people who want night mode. A charitable read is that it's a funny joke.

The author / designer of the blog has different priorities than I do. An uncharitable way of putting this is that the blog design is "bad," but I really dislike that outlook. I'd much rather that people look at things charitably, and try to understand what other people are going for, and be willing to say "this isn't for me." I think the article doesn't quite do this.


> I don't particularly like the design of this site, as I find it difficult to look at

As someone who didn't realize I had a permanent eye floater until browsing this site, I agree.


In the recent LTS version of Ubuntu, if I right-click something the context menu immediately disappears after popping up, so I have to right-click again until, for some unknown reason, it stays. I bet it's solvable, but I haven't bothered to solve it.


IOW the ROI on delivering quality is surprisingly (to some) low. Release early, release often.


The market encourages competition, not perfection.

If you want to see some high-quality but low-feature software, look in spaces where there is less market competition but the risk of failure is death (and the civil and criminal penalties attached for failure are high). You'll find programs that do only one or two things but definitely do them, as specified.


I know there are some people who won't like to hear this, but the reason technology sucks, generally, is because of capitalism. It is more profitable to release an avalanche of broken software than to spend the time to craft and maintain software that works well. A "software will always suck" mentality is only true if you hold capitalism as an invariant law of nature. But it is not.


my favorite quote... or way of thinking about software in general: "it's not a question of whether there is a bug in this code. it's a question of how impactful the bug is on the people using it."


Because companies treat developers like replaceable cogs... So developers don't feel invested in the products that they're creating, so they don't really care about the result and it shows.


> People expect technology to suck

What people?

Of the people I know not in the tech industry, most like technology and talk more about the benefits than about it sucking. Including my 87-year-old grandmother, who loves her iPad.


That's kinda the point. People expect technology to suck so they don't bother complaining about it because they're just used to it sucking.


I thought it was kind of a mental thing where you're so used to technology sucking you don't even know that it could be better.

There was another comment about a developer getting RSI, having to use an Android phone as their main device, not being familiar with the UI, and finding it restrictive. If that's your default computing mental state, and you don't expect computing devices to give you a lot more power, you just expect and know them to be restrictive devices that you typically consume media from, and you don't even have experiences to compare the suck to.

The suck is what it is, and if you expect it to suck, why would you even bring it up?

At least when you complain about the weather, we can both acknowledge that the weather changes; unless you're a technological power user, this weather doesn't change.


No product of a human is perfect. Only God can make perfect things, but even the things made by God don't seem perfect from our point of view - e.g. illness, brutal death.


I was about to click on the comments link for this article, but chrome on Android hung for > 10 seconds. I had to switch to another app and back for it to work.


I like the blog and all but WHY IS IT SO YELLOW?! It hurt my eyes enough for me to switch to Firefox's reader view which resets most of the page's CSS.


It "sucks" because we're constantly changing stuff and making new stuff. We do this because we want new stuff. There are plenty of industries where things don't change quickly and people complain about everything taking forever.

To use an analogy, you can have things work like NASA's SLS Rocket - expensive, slow, well-thought-out and reliable. Or you can have things work like SpaceX - impressive pace, bleeding edge, occasional explosion. There's no happy medium, so careful what you wish for. I'd rather deal with all the author's problems than deal with a Nokia phone.


Is the yellow background some sort of inside joke or satire? For me it is unreadable without changing it through the browser's dev tools.


Ironic how on his own blog the night mode feature doesn't work correctly (the embedded videos are not affected)


We're talking here about the Total Cost of Ownership of software. People are learning to evaluate for themselves how well stuff works, and buy/not buy or use/not use accordingly.

Someday there may be a market opportunity for software that is less broken than the usual.

Arguably Apple used to occupy some of that niche.


Any software application has two states: "doesn't work" and "barely works".


The "Move fast and break things" philosophy bears a lot of blame.


Oh man, I could rant about this forever. I think the trick is not to annoy or interrupt the user, and the only explanation I can come up with for why people do either, when it would be simple not to, is that the average UI designer or front-end developer is either malicious or incompetent ;)

1) Do not ever block the UI thread except for updating UI. It's so simple! Well, except maybe in case of GC, but for most software it's not even close to worrying about that. Unless the software is literally running a surgery or a nuclear plant, there's at least one thing I can do at any time - close it (without resorting to kill -9). More likely, I could be doing something else with it. So, learn threading, it's 2020. I swear when we come to power, if you write code that causes a network request to block UI, you're going to be disbarred and can never make money in software ever again. Go flip burgers. I am looking at you, Outlook and Evernote. The same goes for excessive Javascript to display some text effectively blocking the UI, although that is a more complex subject.

1a) If you cannot do it, then do not do it. Timeouts on everything. If I hit autocomplete on a word in Eclipse and it takes 5 seconds to return, I'd rather just type it out myself. But the UI is stuck, and/or it will paste the autocompletion after whatever I've typed for 5 seconds when it finally finishes. Shut this crap down after 200 ms. (A rough sketch of rules 1 and 1a follows after this list.)

2) If something can be done later, it can be done never. Literally make forced feature updates of paid software illegal. If companies claim to care for the users' security, they are allowed to have a separate security-patch-only line. If there's change to online APIs that are paid for, make it so that each version is supported for at least say 5 years, and/or mandate that a forced compat update should have a "no, instead cancel my subscription and switch to [competitor choice dialog]" button. That should align the incentives. Would force much better API design, too, if you are forced to support it for 5 years.

3) Never steal focus when the user is doing something. Actually, I feel it was a much bigger problem ~15 years ago (in legitimate software; I'm not talking about popups), so not much to say here. These days it usually happens when you break rule #1: something takes a long time, so the user starts typing an email to aunt Marge in a separate window and then boom, your "would you like to delete this folder" dialog steals the keyboard.

4) Everything, and I mean everything, should be configurable; in particular, it should be possible to turn every single feature off. Adding a feature? Add a flag to turn it off (see the second sketch after this list). It's dev 101 in backend software (nobody wants to redeploy if one small feature is broken; you'd rather just turn it off, at least to mitigate)... it should be in the frontend, too. It can be an obscure config file, but it has to be configurable. You are not smart; do not kid yourself (I certainly don't), you have no idea what users want.

4a) Wishlist - have a checkbox for software to act dumb. Especially the IDEs. Sometimes your autocomplete/auto-compile/etc keeps doing something wrong, or something slow (or, more often, both). It would be really neat to just have a checkbox in front that would cause all the clever logic everywhere to go away and for the software to only do literally what you tell it to do, and no more.

5) Stop messing with the UI out of proportion to messing with the feature set (i.e. a major redesign of the UI is only justified if you made a major redesign of the user interactions), there's nothing wrong with your old UI. If your UI PMs have nothing to do now that you only add minor features that fit into existing menus, and suggest a redesign, just fire them instead.
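
A minimal sketch of rules 1 and 1a in plain Python with concurrent.futures (no real GUI toolkit here; fetch_suggestions, the render callback and the 200 ms budget are made-up stand-ins, not anyone's actual API): slow work runs on a worker thread, and anything that hasn't answered in time is abandoned so the editor never freezes.

    from concurrent.futures import ThreadPoolExecutor, TimeoutError
    import time

    executor = ThreadPoolExecutor(max_workers=4)

    def fetch_suggestions(prefix):
        """Stand-in for the slow call (network, language server, indexer)."""
        time.sleep(0.5)                        # pretend the backend is slow today
        return [prefix + "_foo", prefix + "_bar"]

    def on_keystroke(prefix, render):
        # Rule 1: the slow call never runs on the UI thread.
        future = executor.submit(fetch_suggestions, prefix)
        try:
            # Rule 1a: hard 200 ms budget. (A real GUI would poll or attach a
            # callback rather than even this short blocking wait.)
            suggestions = future.result(timeout=0.2)
        except TimeoutError:
            future.cancel()                    # best effort; a late result is simply ignored
            return                             # the user just types the word themselves
        render(suggestions)

    on_keystroke("user", print)                # prints nothing: the 0.5 s "backend" lost the race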
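
And a toy version of rule 4: every clever feature ships with an off switch, read from a plain config file the user can edit (the ~/.myeditor path and the flag names are invented for the example).

    import json
    import pathlib

    DEFAULTS = {"smart_autocomplete": True, "auto_compile": True, "inline_hints": True}

    def load_flags(path="~/.myeditor/features.json"):
        flags = dict(DEFAULTS)
        p = pathlib.Path(path).expanduser()
        if p.exists():
            flags.update(json.loads(p.read_text()))   # user overrides always win
        return flags

    flags = load_flags()
    if flags["smart_autocomplete"]:
        pass   # only run the clever logic when the user hasn't turned it off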


Tech sucks for the same reason that Exxon buried their own climate science, or the tobacco industry hid their knowledge of polonium 210 in cigarettes. Capitalist corporations by and large have nothing but amoral contempt for their customers and the general public. Tech can be good, but not within this irrational and obviously decaying economic system.


Jonathan Blow recently talked about this in https://www.youtube.com/watch?v=pW-SOdj4Kkk. Maybe this is where the author got the idea of noting them down?

These discussions usually invite lots of hand-wavy complaints and oppositions without more concrete progress. Out of boredom and to further better dialogues, I've tried to address every bug in that list:

> iOS 14 discharged phone battery 80% to 20% during the night (no activity, much worse than iOS 13).

There's a new AI system since before 14 that monitors your battery usage and e.g. refrains from charging during certain times, among other features. It's likely that this system got tweaked (as opposed to sudden battery failure and recovery).

> YouTube.app randomly scrolled to the top.

Dunno about this one. Did you touch the status bar at the top of the screen?

> Instagram reset scroll position after locking/unlocking the phone.

Probably forgot to add that bookkeeping to the before-locking hook, and/or the hook before getting evicted from memory.

> Race condition in keyboard in DuoLingo during typing.

Don't use Duolingo anymore. Can't comment.

> AirPods just randomly reconnected during use.

Bluetooth?

> Shortcuts.app stopped reacting to touch for ~30 sec.

Hard to say. Undefined state/exception bugs maybe.

> Wondered why my apps were not up to date, found nine apps waiting for manual button click.

Push/pull model problem, battery conservation heuristics, server's notification scaling being best-effort, etc.

> Workflowy cursor obscured by Workflowy toolbar, typing happened behind keyboard

Workflowy's iOS app uses web technology. The keyboard + floating bar layout is a recurring problem with said tech.

> AirPods showed connected notification, but sound played from the speaker.

Bluetooth...?

> Passcode unlock worked for the third time only.

Never happened personally. Can't comment.

> Overcast widget disappeared while switching to another app.

That one's almost 100% in the animation system's bugs introduced in iOS 7. Tldr uninterruptibility + special thread/process causing extra undefined state + GPU.

> YouTube forgot video I was just watching after locking/unlocking the phone.

Same as the instagram diagnosis.

> YouTube forgot the resolution I chose for the video, keep resetting me to 360p on a 750p screen.

Dunno. No longer use the YouTube app. Network? Sometimes the UI can be misleading. The quality option might be just a best-effort option that it doesn't guarantee to respect. Someone else check this.

> 1 hour lost trying to connect 4k @ 120 Hz monitor to MacBook Pro.

Definitely not enough stress/integration testing, so it's unsurprising that anything might happen. Sometimes it's provably impossible to get right, since neither party controls the whole stack.

> Workflowy date autocomplete keeps offering me dates in 2021 instead of 2020.

See earlier. It's not using the native NSDate (workflowy uses Momentjs). Plenty of room for error. NSDate's api usually won't nudge toward things like off-by-ones (I think).

> In macOS context menu, “Tags” is in smaller font

Intentional.

> Transmission quit unexpectedly.

And slowly =P. Likely due to an exception / mishandling of memory.

> Magic Trackpad didn’t connect right away after boot, showed the “No Trackpad” window.

Lots of preemptive races possible here.

> Hammerspoon did not load profile on boot.

Dunno. I don't use it.

> Telegram stuck with one unread message counter.

A few other chat apps do that too. Often from the denormalization of unread messages count in DB. That or something about the notification system.

> Plugging iPhone for charging asks for a software update.

It's a feature, not a bug ™ =).

> Dragging an image from Firefox doesn’t work until I open it in a separate tab.

That dragging is reimplemented using cross-platform tech I believe.

> YouTube fullscreen is disabled in an embed.

Intentional. This is an option the embedder needs to opt into.

> Slack loaded, I started typing a response, then it reloaded.

Depends what you mean by "reloaded". Without further description, it might be either a crash + browser-driven page reload, or some long in-app rerender due to React, state and network.

> Twitter was cropping important parts of my image so I had to manually letterbox it.

Quite a few pieces of drama surrounding this recently. Won't comment.

> TVTime failed to mark an episode as watched.

Never used it.

> Infuse took 10 minutes to fetch ~100 file names from smb share.

No batched api + other shenanigans. Happens to Reminders too.



Sounds like Apple software and its ecosystem has gotten worse than Windows.


It has but no one else wants to say that.

I'm totally surprised that I had to scroll down this far to find your comment.


Simple solution: use GNOME 3.


Hi,

I never stopped to think about how much time I spend on this kind of request, just to get what I'm trying to obtain. So, if we are worried about using our time wisely, what should we do?


Am I the only one who completely disagrees with this premise and finds that software is not only doing more incredible things than ever before, but is much more reliable than it ever has been? And when I run into problems, they don't prevent me from doing my work, are at worst mildly annoying, and are exceedingly rare considering I now have a running computer in my pocket 100% of the time, 24/7, and do all of my work on a computer. I honestly don't get this grumpy attitude everyone has about software; are people expecting some kind of platonic ideal of software?


If Nikita stepped out into the real world, tried to get other humans to do their jobs right, and recorded how often that gets screwed up, he'd run right back to his computer, because at least with computers you can eventually get it right; with humans, you are always at the mercy of someone having a bad day, being born an idiot, being distracted, etc, etc...

It's just life and that's how it goes :)

When you measure everything in relation to how you would've handled it, you get posts like these. The more different you are from an average human being, the more frustrated/bewildered you will be, until you have the epiphany that others are not like you and cease expecting them to be.

One question to ask is, if everything is so broken, surely you can become rich through this massive opportunity to improve upon it and then proceed to solve a lot of the frustrations you have with money? Go on then :)


This might sound like a flippant question, but it's not.

Isn't the point of tech to tinker? When did problems become problems that you shouldn't try to solve, instead of a reason to learn about the tech?

I've really been down about technology lately, after working with 6-12th graders. They just don't get how anything works, if it doesn't immediately work for them. They don't want to take it apart and figure it out.

This blog piece just seems like more of that. Why bother with figuring out how things work when you can just throw them away?

Now get off my lawn, I guess.


> Isn't the point of tech to tinker?

What? No, extremely not. It's to... do things? Tinkering with tech is cool if that's your hobby, but it's not mine, and I'm trying to do shit, not pick up another hobby here.

It's like the people that argue that cars have lost something intrinsic in recent decades and now no one knows how to do work on their own vehicles. That standard was an artifact of unreliability, not something to be applauded.

Tech should get out of the way and let you do what you're trying to do. If "tinker" is that thing then sure have fun. But for most people that's not the point at all.


I think the op was close. I don't expect everyone to tinker with something. Not everyone needs to be interested in cars, or computers, and want to build and maintain them. But, I think people need to put in a little more effort to have a general understanding of the tools they entrust with their life and use constantly. If you interact with a computer all day, every day, you should dedicate some time to understanding more about how it works. That's not saying you're tinkering with it or it's your hobby. But, when it's a main part of your life, not knowing anything is just bad planning. When things go wrong you're now screwed. It's just survival planning, you don't need to be an expert, but knowing a little bit puts you ahead of 90% of users.


How much do you know about metallurgy, plastics, concrete engineering, hydrocarbon refining, etc etc etc all the thousands of extremely specific domains of expertise that modern society is built upon?

The fact that you can possibly get more out of some tech by knowing how it works is because most of it is very new, which is basically just an anomaly that is fading. Eventually how computers work will be like those other things: expert domains that we rely on but users don't need to understand.

It's already largely like that and that is good. I get that everyone here is a nerd and grew up tinkering with their computers and this makes them sad.


I'm not sure there is an agreed point. The industrial revolution is still young. Most people made a substantial portion of the goods they use historically. Even when trading for goods, it often was for intermediate inputs like cloth rather than fully finished items.

But even then, I would say no, tinkering was the point of hobbyist tech, or professional/craftsman tech. But the recent history of "consumer goods" hasn't been about tinkering ... I think the key difference in the past few decades is maintenance. Even in the era of consumer goods in the past hundred years or so, maintaining your possessions was expected. Some might pay a professional for specialized work, but since most things were maintained a large fraction of the work fell on the owner ... and this also leads to tinkering, not as an expectation, but an outcome.

Consider darning your socks, or sharpening your hatchet. Shining one's shoes or replacing a spark plug.


> Isn't the point of tech to tinker?

No.

We claim to be building tools, but nothing we build operates like a tool does. My hammers don't randomly lose their heads. My vise grips don't every so often fly apart into a collection of oddly shaped bits of metal, and require I painstakingly reassemble them by hand. My benchtop power supply doesn't require constant calibration. Even my oscilloscope, which actually is a computer that just happens to be built for signal processing, functions in a way that's predictable and reliable. And if any of these did misbehave in the ways I describe, I'd regard them as unfit for purpose and, while I would likely be able to repair them and put them back in service, the ideal outcome would be replacement with something that didn't require the same effort just to make it able to do the job it is sold as being able to do. Tools exist for the sake of making other tasks easier, not as a source of tasks in themselves.

There is definitely failure happening here. But it's not on the part of people who are upset to have been told what they're buying are tools, only to have them turn out not to be.


Nowadays most software doesn’t let you tinker with it. Want to tinker with your chat client? Too bad, everything (that any of your friends actually use) is closed source and proprietary now. Frustrated with social media? Unless you can convince all your friends to switch to mastodon, you’re stuck with whatever Facebook/Instagram forces upon you. Excited about the future of virtual reality? Hope you don’t mind playing by Facebook/Oculus‘s rules. Want to fix an annoying issue with how your email client categorizes email? Well there’s no way to integrate with that system, google barely supports standards like pop and imap anymore, and outlook is transitioning away from a powerful plug-in system to a silly new web-based model.

A very specific example: something that's been frustrating me for years is that Google Photos on iOS does not synchronize whether I have starred/favorited a photo in iOS's Photos app over to Google Photos. I tried fixing this myself with a helper program, but the Google Photos API does not let you set whether a photo is marked as a favorite, and you cannot manually add photos to the favorites album via the API. I've tried several hacky workarounds, but I simply cannot build the simple UX fix that seems obvious to me, nor can I get anyone from Google to acknowledge my many suggestions to implement this obvious feature. I looked into trying to use Plex or Microsoft OneDrive as an alternate photo backup system, but they are much worse than Google Photos in terms of reliability and UX. I've looked into open source solutions to replace this whole workflow, but the amount of time, effort, money and compromises required for a system like that simply isn't worth it. So I put up with the annoyance, since there's effectively nothing I can do about it.

I used to love programming because it made me omnipotent, I could change things as I wanted and tie together systems exactly the way I liked. But now I keep hitting brick walls, most of which are imposed because of business models and not for purely technology reasons. The frustration adds up, to the point where I no longer enjoy bothering to try at all. Now I am only involved in technology to the extent that I make money from it, and I’m in the process of transitioning my career away from tech altogether. So much potential squandered.


That's why I use Free Software and Open Source. Same bugs and same irritations, but at least I have the feeling that I could do something about it.

Instead of feeling helpless I get to make a calculation: OK, so to fix this issue I have to get the source version, identify the problem, figure out how to fix it, and submit the fix upstream.

Hmm, 5 hours of work? Do I have the time? Can I pay someone to do it? Is it worth it?

If the answer is 'no', then I've made a conscious decision to tolerate this issue, because I considered the tradeoff.

Because I have this choice, I feel empowered, even if I decide not to do anything about it.

With closed source software I don't have this choice, and any issue I run into makes me feel helpless.


People want to drive cars, not all of them want to be mechanics.

People want their phones and software to just work. Or at the very least not get bricked every friggin update.


I think that is a terrible analogy. A car for the most part has a single purpose; technology offers humans a lot more than just getting from point A to point B. If you are only capable of utilizing the bare minimum, it's going to make life a lot harder. Take Excel, for example: the people in the workforce who are able to reason about and write their own advanced Excel spreadsheets have a huge leg up on those who can only do what the UI lets them do on their phone. There is a lot more to computing than treating it like a hammer.


You can’t say that they have a leg up in general because It Depends. I’m an engineer and my job is computational fluid dynamics (CFD). My dissertation involved developing highly parallelized solution algorithms for a specific type of problem. As such, I am quite well versed in low level Fortran/C code and MPI because I wrote code to do that, “that” being to solve an equation. I know this set of equations like it was second nature because of how much time I’ve invested in it. Not only that, but I would consider myself somewhat of an “expert” in the underlying mechanisms to solve these equations. Those being finite element methods. I can talk to you about tensor spaces and discretization and derivation of the numerical algorithms to solve these problems. I’d consider the code that I wrote/extended to be tech.

My current job role is “just” an engineer working for a company where all I basically do is run and analyze models using commercial software. Even though I know the underlying low level code/algorithms of what I’m doing, I would be the first to tell you that a 21 year old mechanical engineering graduate could do my job as effectively and as completely as I could. Because none of the above paragraph actually matters in what I do. Maybe it gave me a leg up in the hiring process, but the knowledge I am applying for my job (which I enjoy) is what you’d learn in a 4 year degree with electives in eg numerical methods and fluid dynamics.


> A car for the most part has a single purpose. Technology offers humans a lot more than just getting from point A to point B.

Sure, a car is just one example of technology. Technology in general is responsible for everything we have and everything we do. Here are some other technologies:

- farming

- weaving

- plastic

- glass

- ceramics

- steel

- cooking

To the nearest 0.0001%, 100% of people need these things to work and don't -- and can't -- understand why or how they work. That's the point of technology.


Way too broad of a definition. I clearly meant computing and how a person interacts with their devices as that was the discussion at hand.


Why would this differ from any other technology?


I'm really surprised to hear you question how computing is different from any other technology? General purpose computing is extremely broad in scope. It's used in everything from finance, design, manufacturing, communication, engineering, medical, entertainment - pretty much every field or industry in the world has been impacted by it. More so than any other technology in existence. Even something like gasoline has had less impact than computing. Being able to take advantage of this powerful tool is a huge leg up on life. There are countless applications of computers in our daily lives. To just ignore that and use them for their bare minimum function is quite sad!


> It's used in everything from finance, design, manufacturing, communication, engineering, medical, entertainment - pretty much every field or industry in the world has been impacted by it. More so than any other technology in existence.

Nope. Farming has this beat hands down. So does writing. (I unlisted writing from my earlier comment because, unlike other technologies, anyone capable of using writing must also have a good understanding of how it works.)


.. sure, until you have to do something for realsies and the tech is the only way to do it. Then it's just fucking your life over.


Explain further, please. I don't think I'm following you.


Here's an example. I use Discord for keeping in touch with friends, running D&D sessions, etc. Discord on Linux has a random crash whenever it's in a voice call. I've followed up on threads, and others have this same issue, possibly related to glibc, but there aren't any definitive solutions yet. So, instead, I tinker. I make a quick two-line Python script to relaunch Discord every time it crashes. Discord reconnects to the voice chat on restart, and all is well. Usually, people don't even notice that I've dropped out for 1-2 seconds.
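
(For the curious, the restart loop is roughly the sketch below; the bare "discord" command is an assumption about how it's launched on a typical Linux install.)

    import subprocess

    while True:
        subprocess.run(["discord"])   # blocks until Discord exits or crashes, then relaunches it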

But Discord does more at startup than just reconnecting. It also checks for new updates, and refuses to start if there is an update waiting. So now I've dropped out of a call, I have self-opening popups letting me know that there is an update available, and I'm trying to download/install the .deb update before the D&D players get bored and start making puns. Nothing good happens when the players have started making puns.

I like tinkering. I really do. But I like tinkering in order to solve my own problems, and to make my life easier. I don't like tinkering on limited time in order to add yet another support to a house of cards that is gently swaying in the breeze while I'm trying to hold a D&D session in the top room of that card-house.


It's one thing to tinker with Linux to learn how the OS works and another thing to troubleshoot it just as you are about to give a presentation.


Got it. Thanks.


The point of technology is to enable us to do things that we either can’t or don’t want to do. This ranges from simple time savings (don’t want to do/make things easier) to things we literally can’t do (high precision timing for example).

Tinkering is a hobby, your hobby.


> That’s the world WE ALL are living in now.

No Nikita, this is the world you live in. I saw this happening five years ago and ripped Apple out of my life like a band-aid that's been there so long the skin's closing up around it. It sucked, but now it's a million times better.

There simply isn't enough manpower, capital, or willpower to create decent software when you also have to contend with the profit motive. The author of the article went down a massive list of annoyances that just shouldn't happen, and they all stem from the need to keep Apple's technology proprietary.

This need goes all the way back to the Apple II, when Jobs made the prescient observation that people want to buy full packages of hardware and software, not bits and pieces that they have to assemble themselves. To do that, you need to make a company to do it, then control the platform, hardware and software, to ensure a good user experience. This is the DNA of Apple and it's served them well and we got a lot of great tech out of it.

He was right, until he was wrong, and he didn't really start being wrong until technology grew complicated to the point where a profit-seeking enterprise stopped being able to keep up with market demand.

I'll tell the author or anyone else what they should do if they want to stop being constantly annoyed by technology. Use tech that is modifiable by the people that use them and not tech that is only modifiable by people you buy it from. If you do that, then you can lean on a community of people for support instead of a faceless corporation.

Apple is slowly morphing into 90s-era Microsoft as they keep holding on to the need for dominance over their platform.


Well, you're either an alien or you have grown a case of Stockholm syndrome for computing :). Absolutely everything is like this. Apple sucks. So do Windows PCs and laptops. Open source software stacks. And don't get me started on modern appliances with microcontrollers in them.

Personally, I can produce a daily list 3x as long as TFA on my Linux desktop, and probably 1-1.5x as long on my Windows sidearm. The only reason I don't run my desktop on Windows these days is because Emacs isn't a proper first-class citizen there. Otherwise, my life would be much better. As much as I love Open Source, the audio, video and GUI experiences on Linux are still pretty brittle.


Free software, not open source. The difference matters. Copyleft is important.

What OS are you running on that Linux desktop? I switched from Ubuntu to an Arch variant, Ubuntu was getting too corporatized for me what with snap and everything. Eventually I'm going to rip out the desktop environment and go window manager only.

When I said, "only use tech that the people that use them can modify" I'm not messing around. Any time I have an issue with tech, I go looking for where proprietary software invaded my life, rip it out, and go on with my day.


> Free software, not open source. The difference matters. Copyleft is important.

I very much agree! I've learned to strongly appreciate the FSF philosophy. And I too appreciate (and use, and advocate for) end-user-modifiable software.

However, when talking about software brittleness, user-facing Free (as in Libre) software is usually even worse than non-Free Open Source software.

I'm currently running nth version of Ubuntu, but in the past I've been running Debian, Red Hat, (briefly) Gentoo, Slackware, and further in the past some others which I don't recall now. In my experience, things are systematically getting better over time, but are still fragile compared to Windows (and Windows isn't exactly a paragon of stability either).

(Of course there are many places where proprietary = worse, UX-wise. In particular, just about any crapware preloaded by seller/manufacturer on your computer or phone, or stuff that gets bundled with printers, scanners and other peripherals.)

My point isn't that Free Software and Open Source software are bad things. Just that they suck too, and when talking OSes and popular tools, they suck on average a bit more than proprietary software.


Hmm, I recognize your personal journey with tech as very similar to my own. You're at that phase where you got sick of the churn and so just want to settle on something that's not going to change out from under you. You went through the "try everything" phase years ago. I understand the reluctance to go back to that life, I really do.

But it's better now, and Arch Linux is that better. I'm using the Openbox version of ArcoLinux, because it gives me a usable desktop right out of the box. Another significant source of instability is desktop environments; the number of times I've pulled my hair out over GNOME is many. It's hard to go against the grain when running Ubuntu, but it's really easy with Arch.

Eventually you'll hit that sweet spot of stable usability. But you have to build on a real foundation. Ubuntu can't be that anymore. Package management with Arco is stupidly easy, just 'yay anything' and it'll give you a list of stuff to install. You have to learn how to script it if that's what you want to do but it's worth it in the end.


But it's not just Apple. Apple has its own set of issues, but they are not really worse than others at this point, just not much better any more.

I have mostly old, 'dumb' appliances in my house. My car is more than 20 years old. I have become a total luddite with respect to things. As far as I'm concerned a lot of 'smart' technology is still in the unproven, experimental stage and I am too old now to deal with that shit.


Of course it isn't just Apple. But when you build the foundation of your computing life on Apple, you're asking for pain. I applaud your dumb appliances and car.


Technology sucks because it works.

Otherwise it would be in the junkyard and nobody would be complaining about constant annoyances.

All the annoyances cited come from having a pocket computer that is reliable enough that we use it day in, day out, charge it every night, and do important enough tasks on it to be annoyed when it doesn’t work.

Nobody complained that a mainframe’s batteries were depleting during the night.

In all seriousness, I think it’s important to acknowledge stuff that doesn’t work, and make efforts to improve it. But this needs to come from the perspective that the app/device has a purpose, and an understanding of how much of that purpose is being served.

Dealing with annoyances from something that fulfills 90% of its purpose is completely different from dealing with an app or device that is mostly polished but doesn’t help much. That perspective seems mostly missing from the rant.



