Software updates! Guess what! Here's a new UI for ya. We moved all the stuff! It's like someone threw you a surprise birthday party, but not on your birthday, on their birthday, and their idea of the best gift evar is to hire an interior designer (for free! lucky you!) who completely rearranges your house inside and out and springs it on you after you return from the grocery store. And there's no going back.
At first it was exciting (when I was 15), then slightly bothersome, then downright annoying, then infuriating, then just tiring. Your brain learns that there is no point in learning anything anymore because they're just going to scramble it again anyway. Learned helplessness. People age out, ending up feeling old and useless and jaded because their skillset is completely inapplicable, after just a few years.
Yeah, I can understand why people hate tech.
Instead of the previous menu options like Campaigns or Audience, there were icons signifying each that I had to hover over to figure out what they might mean. Then when I went to my Reports, the CSS breakpoints seemed wonky, making that screen hard to read and use.
Half-jokingly: it almost feels like constantly confusing people is a trick to temporarily boost engagement while people are forced to figure things out.
Yes, it's a definite dark pattern, but not so much an antipattern.
To add to that, now that I'm a retired lifelong techie I realize why "old folks" back in the day would hesitate to give up the old, outdated software that they knew how to use.
E.g. I'd prod older friends and family to give up WordPerfect - which they knew and loved - in order to progress to the feature-rich new MS Word.
Now I'm a linux advocate with its archaic terminal commands and I can empathize with anyone who wants their laptop, phone, TV, microwave, etc. to stop evolving!!
WordPerfect had "reveal codes", so when the WYSIWYG gave you something you didn't want, you could pop open the actual document representation and wrangle the tags until What You Want Is What You See.
MS Word has no such function, so when it screws you, and it does, you're good and screwed.
re: Reveal codes - not being able to press ALT F3 and clean up the formatting mess that MS Word would inevitably get into was torture!
I’ve always described it as “the design team justifying their own existence after the job is done.”
Let software get stable and boring.
I actually think that's really what is going on. Wish I had first hand evidence though.
I do know of a tangential phenomenon at a friend's work place. Her org has a dedicated build tools team. So every 6 months every project's build infrastructure needs to change to something entirely new, because the build tools team keeps having to justify its existence.
I don't know why a company would let this sort of thing happen. It's a massive waste of time for every team.
Some of the most annoying UX I've had is on Quora, Facebook, and the reddit redesign, which all spend a veritable fortune on it, while the best ones I've seen are something a non-specialist slapped together with bootstrap.
I do sometimes wish that there could be alternative (not "replacement") ways to do things we use "tech" to do today, where the alternatives only required UNIX (no GUI). This way if we get frustrated with a graphical UI, and myriad "updates", we can just do these things the "old-fashioned way", with comparatively smaller, simpler, command line UNIX programs.
To me, the people who would be very opposed to this idea are not users, they are developers. Having been raised on computers in the 1980s, I can attest that computer users never cared about "UI" or "UX"; they just did what they needed to do to use the computer. It is developers, especially contemporary ones, who actually care about "UI" and "UX", not computer users. In fact, some of them are passionate about these aspects of using a computer.
Before recommending it, however, he felt it important to mention that for people who don't machine very much, far cheaper scribes work well, because unless it's your job, your tooling is less likely to be the bottleneck, and you have fewer resources. When you machine professionally, your tooling is likely your bottleneck and you have more resources.
I think this holds for tech and software. Think of resources here as "time spent learning APIs, bash, and remembering tar mnemonics".
At first, dragging and dropping folders isn't going to be your bottleneck. But need to move 1000s of folders scattered across the hard drive? If you're not using a terminal, you'll be in trouble.
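For what it's worth, here's roughly what that looks like in practice: a minimal sketch, with a made-up directory layout standing in for the scattered folders.

```shell
# Hypothetical starting layout: photo folders scattered under ./src
mkdir -p src/trip-a/photos src/trip-b/photos
touch src/trip-a/photos/1.jpg

# Gather every directory named "photos" into ./archive, prefixing each
# with its parent's name so nothing collides. -depth makes find finish
# descending into a directory before printing it, so moving it is safe.
mkdir -p archive
find src -depth -type d -name photos | while read -r dir; do
  parent=$(basename "$(dirname "$dir")")
  mv "$dir" "archive/${parent}-photos"
done
```

A GUI file manager makes you locate and drag each folder by hand; the loop above handles ten or ten thousand the same way.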
Everyone cares about UX, it's their experience when using tech.
It's just that GUIs are better for some contexts than others.
~ $ tar --help
BusyBox v1.31.1 (2020-03-26 00:59:22 -00) multi-call binary.
Usage: tar c|x|t [-ZzJjahmvokO] [-f TARFILE] [-C DIR] [-T FILE] [-X FILE] [--exclude PATTERN]... [FILE]...
Create, extract, or list files from a tar file
-f FILE Name of TARFILE ('-' for stdin/out)
-C DIR Change to DIR before operation
-O Extract to stdout
-m Don't restore mtime
-o Don't restore user:group
-k Don't replace existing files
-Z (De)compress using compress
-z (De)compress using gzip
-J (De)compress using xz
-j (De)compress using bzip2
-a (De)compress using lzma
-h Follow symlinks
-T FILE File with names to include
-X FILE File with glob patterns to exclude
--exclude PATTERN Glob pattern to exclude
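To be fair to tar, the help above covers the everyday cases in a handful of flags. A quick sketch using only options BusyBox lists (the project layout here is made up):

```shell
# Set up a hypothetical project with a source file and a build artifact.
mkdir -p project
echo 'hello' > project/main.c
touch project/main.o

# c = create, z = gzip, f = archive file; skip build artifacts.
tar czf backup.tar.gz --exclude '*.o' project

# t = list contents without extracting.
tar tzf backup.tar.gz

# x = extract; -C changes into the target directory first.
mkdir -p restore
tar xzf backup.tar.gz -C restore
```

The mnemonics people joke about forgetting mostly reduce to "czf to pack, tzf to peek, xzf to unpack".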
... what? Are you suggesting computer users in 2020 - which includes everyone from your nana on her iPhone to a toddler watching YouTube on a tablet - want to use CLIs, and are being forced by baddie developers into using apps?
I'd change that to: "For most people, corporate neoliberal technology is a haunted house riddled with unpleasant surprises."
Writing that recognizes that we live with the most un-free market of all time:
"We are in the middle of a global transformation. What that means is, we're seeing the painful construction of a global market economy. And over the past 30 years neoliberalism has fashioned this system. Markets have been opened, and yet intellectual property rights have ensured that a tiny minority of people are receiving most of the income." 
"How can politicians look into TV cameras and say we have a free market system when patents guarantee monopoly incomes for twenty years, preventing anyone from competing? How can they claim there are free markets when copyright rules give a guaranteed income for seventy years after a person’s death? How can they claim free markets exist when one person or company is given a subsidy and not others, or when they sell off the commons that belong to all of us, at a discount, to a favoured individual or company, or when Uber, TaskRabbit and their ilk act as unregulated labour brokers, profiting from the labour of others?" 
My gas car stinks, destroys the planet, needs yearly maintenance, and crashes into everything the second I stop paying attention.
My house decays day after day. Floors need constant cleaning, walls have holes from small impacts, paint contains inedible fragments and gives off noxious gas.
Bees are building nests on my balcony and it’s definitely not what it was built for, nor where they should be.
How can we tolerate such a life?
I guess my point is that there is a difference between things sucking because of the laws of nature, and things sucking because of incompetence, laziness or indifference.
It's a lot of work to dig a hole large enough for a water heater, so I wouldn't be surprised if something similar happened. (I probably would have also checked inside the water heater, since if you wanted to bury something and keep it dry, a water heater tank is a natural candidate for a container. Not sure it actually works, but it's a natural idea.)
When is the last time you flipped a light switch, and suddenly your pool disappeared?
Have you ever had French doors appear in your dining room because of a "Windows Update" on Wednesday morning?
Have you ever had to wait for half an hour for your house to boot later on that same Wednesday?
When is the last time you closed a door, and were killed by a hailstorm of bowling balls?
At least with a light switch, you know it's very unlikely to cause structural issues, or plumbing issues, or drain your bank account. Computers are singularly horrible in the ways things can fail.
To take your second example - if I could then flip the light switch back, and the pool reappeared, then I'd be miffed but not particularly annoyed (assuming I was able to fix that obvious-bug either myself or with an update in a timely fashion). If the pool stayed gone, then yeah, I'd be pissed.
Of course, that whole argument goes out the window when the tech in question isn't controlled by you. Which is often the case.
Or the folks who perished because of badly programmed software interlocks on the THERAC-25 radiotherapy machine.
Just knowing or figuring out to flip that switch may be an insurmountable barrier depending on the circumstances when a failure state occurs. Especially when the implementation is intentionally hidden so as to facilitate continued market value extraction opportunities from the happy accident of information asymmetry.
Yet your examples hint at something more.
Those massive failures are by people not by tech. Mismanagement and incompetence and systems designed to obfuscate accountability.
Which happens aplenty in non tech fields.
When you turn on a switch... it's part of a circuit which is current limited, and in fact there are several limits on that current, all the way back to the source... each designed to protect a link in the chain. Each of those breakers limits the capability to source current further downstream.
When you run a task in any modern OS, it runs with the full privileges of the user id with which it was launched. This is like hooking a generating station directly up to your floor lamp in the living room with no breakers. If the process has a fault, there is nothing the Operating System will do to prevent it from being used to subvert other parts of the system, there is no limit to what it can do.
There are systems that require you to specify how many resources a given task is to be allowed to access. It turns out that such systems can be just as user friendly as the ones we're used to, but they do require things be re-written because the ground assumptions in the security model are different.
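This is not capability security, but ordinary shells already offer a crude taste of "declare what resources this task may use": POSIX ulimit applied inside a subshell. A Linux-flavored sketch (the limit kills the offending write, not the shell that launched it):

```shell
# Cap the maximum file size this task may create at 8 blocks of 512 bytes
# (4 KiB), then try to write 100 KiB. The kernel stops the write at the
# limit and kills dd with SIGXFSZ; the parent shell is unaffected.
( ulimit -f 8; dd if=/dev/zero of=capped.bin bs=1024 count=100 ) 2>/dev/null || true
ls -l capped.bin
```

Capability systems push this idea much further: instead of a handful of coarse process-wide rlimits, every resource a task touches has to be explicitly granted to it.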
Capability Based Security (also known as "Multi-Level Security") was born out of a need to have both Sensitive and Secret information shared on a computer that scheduled air traffic during the Vietnam Conflict (if I remember the situation correctly). The flights themselves were sensitive, and the locations of the enemy radar were top secret (because people risked their lives spying to find them).
It was extremely important that information could not leak, and solutions were found, and work!
About 10 years ago, when I learned about this and considered the scope of work required to make it available in general purpose Operating Systems, I estimated it would take 15 years until the need for Capability Based Security would be realized, and another 5 or so until it was ready. I think we're on track... 2025 people will start adopting it, and by 2030 it will be the de facto way things are done.
Genode is a long standing project to bring this new type of security to the masses... I'm still waiting until the point I get to play with it... and have been for a while.
Things will get better... these types of tools, along with "information hiding", getting rid of raw pointers and other clever but dangerous tricks will help as well.
[Edit: Re-arranged to clarify, and improve flow]
Point being, I don't see a shift in the direction of security above usability or ease of mental modeling doing anything but worsening the problem. I could be wrong on that, but the last 20 or so years of further encroachment by industry on users' prerogative to configure their machines as they like doesn't inspire great confidence in me.
I can say I'm totally reading up on that though. I hadn't heard of it before, and it sounds interesting.
But, _usually_, it's easier to reverse some changed-data somewhere than it is to reverse an actual change-of-state in the physical world. At least, the inherent effort required to do so is less - but policies or obfuscation may make it harder.
As a kid we had a gas range, and it was pretty easy to turn on a burner and just leave it open without lighting it. Or just start cooking something and forget about it; depending on your situation, your house is gone.
Or the pool just disappeared for no reason and you couldn't get it back unless you sold your house and rebought it?
The current state of computer security.... is like building a fort, out of cases of C-4 explosive.
How so? Almost every program has bugs, many of which can be exploited remotely. It is effectively impossible NOT to have a zero-day exploitable hole in any given computer. Thus, every single computer can be exploited... and then used to attack more computers.... in a chain reaction.... like a fort built out of C-4.
For instance, books have been with us for centuries, and honestly most of them suck. Paper pages are thin and sometimes cut your finger (how many times did you get cut by an ebook?), most are vulnerable to liquids yet our world is filled with liquids everywhere, sometimes coming down from the sky. Updates are painful, costly, and non-scalable. Font sizes can't be changed; you have to use an external device to deal with it.
Not saying there are perfect alternatives or that the tradeoffs don’t make sense. Just that we learned very early that books have these limitations and we’ll need to live with them to be a member of society. And we can agree all of these aspects could be and sometimes are fixed, but most people are just ok with books ‘sucking’ in those ways.
Although the weaknesses you cite seem like problems in search of a solution. No one ever expected books to have variable font size ... why would they?
Lastly, let's recall that the book of five hundred years ago is dramatically different from the book of today. For example, your point about liquids is now in many ways resolved by the cheapness and ubiquity of books. 500 years ago, not so much.
Same for translations, with even books with dual languages side by side.
I find it fascinating how the arrival of ebook readers made us rethink how we relate to books; a lot of annoyances got surfaced only now because there was no comparison point before. My favorite is how you cannot ignore the length of a book while reading it: you can't pretend not to realize there are only a dozen pages left and the story must come to an end.
They can't be stolen back by the publisher or seller, can be trivially transformed into different formats, can take annotations, and can be rebound with wear. Even if paper has its faults, reading a well-maintained page in 1000 years is as easy as the day it was written, even if significant swathes of technological backslide occur, and it is only prone to the challenge of human cultural evolution, as opposed to loss of the processor or software required to decode/convert/display it.
An HDCP protected chunk of media may as well not exist in 1000 years.
Humans, as the makers of these systems, are part of that reality, which was not created by us. The reality is that we are great apes writing precise machine instructions with a general intelligence that was not purpose-built for being that precise but selected for survival. Our cognitive machinery cannot exhaustively predict all the possible failures of what we write; if we are working in teams, we still have to transfer most of our technical ideas through natural language, in a combinatorially increasing manner as the team size increases, etc. None of this is user-hostile, it is just human fallibility and limitation in play. And since we can't alter our cognitive capacity drastically, we can only make more machines to guard against these (e.g. unit tests), with their own limitations. I think the scale of what we have been achieving despite these limitations is just fantastic.
If anything users are becoming too egocentric, expecting the world to conform to their comfort, with a dash of construal level fallacy, underestimating from a mile away how easy it would be to write bug free programs with perfect designs in a real world, by real people, with real budgets etc.
Selection bias. You only hear from users who want new features. You rarely hear from users who don't want new features and just want software to stop being buggy and acting like a goddamn haunted house.
Most bugs are just annoying; the consequences are not catastrophic if your favorite tuner app forgets your settings, your word processor messes up the formatting, or your pdf reader crashes. You can recover with some frustration and wasted time. The perception of these as catastrophic failures shows the sense of entitlement users have: they are used to a certain smoothness in their experience and expect everything to go their way. This doesn't match the underlying realities of the task; it is very easy to construe a sense of a working program in one's mind, but it is exponentially more difficult to make the implementation actually bug-free, usable, and functional the way the user wants.
So what? Users get upset when your crap doesn't work. Stop being flippant and pushing back. Pushing back is not your (our) job. Complaining about how hard your job is, is not your job. Griping and moaning about irate users is also not your job. Delivering a product that does what it says on the tin is actually your job. Believe it or not, you produce something people depend on!
Imagine your power steering goes out on left hand turns going north downhill. You take it into the mechanic and all you get is "That's just annoying, not catastrophic. You can recover with just some wasted time. It's exponentially more difficult to make the implementation actually bug free!"
Users quite rightly spot bullshit excuses. And we have none. Save the settings, fix the formatting, stop the crashing.
Please tell me more about my job internet stranger.
You’re making the same arguments without adding substance, just emotional rhetoric and unnecessary personalizing.
> Imagine your power steering goes out on left hand turns going north downhill.
Imagine that steering wheel stopped working depending on the highway you're driving on (software & hardware platform). Why didn't they test this on every single highway? Because that would be combinatorially explosive.
I'm glad you're making a physical world analogy. Comparable physical world projects have orders of magnitude fewer moving parts that need to interfit, and assembly gives immediate feedback on whether they fit. They also have orders of magnitude fewer inputs they are expected to operate on, which makes it easier to exhaustively test their function.
“Shut up and just make it work” might have been popularized by certain tech personas, but unless you have Steve Jobs levels of clout, pulling that stuff in most dev shops will quickly make you very unpopular whether you're an IC, a manager or a product manager.
Users guffaw at this point. They do not understand why your stuff is so complicated and broken. They think you suck at your job. Both you in the collective sense (you engineers) and you in the personal sense. They start plotting ways to stop using your stuff because you are so frustrating to deal with.
> They also have orders of magnitude fewer inputs they are expected to operate on, which makes it easier to exhaustively test their function.
I think you still do not understand my point. Users fundamentally do not care about it. Everything, to them, is a problem of your creation and they'd quite rightly regard your long-winded explanations with complete skepticism. To you it always feels likes it's someone else's fault, but to users it sounds like complete BS. No matter how right you are about it being someone else's fault. Someone else's fault is the very definition of a lame excuse from their perspective. They are still getting nowhere and you are even more useless to them because you still can't fix anything and just confuse them and waste their time.
It's a very user-hostile attitude and makes for terrible customer relations. That attitude is also counter productive and helps no one. No wonder people hate tech.
Software has a nasty habit of iterating on yesterday's ideas instead of rewriting for tomorrow. Not that there's anything wrong with that; it seems to have been the path of least resistance thus far.
The problem is that we do engage in so much "rewriting", instead of leveraging known good, quality code (or at least fully-fleshed out algos, etc.) in our "new, modern, elegant, and trendy" software edifices of crap.
To me, this may be the one really good thing to come of the cloud (as opposed to the re-mainframe-ication of IT): the "OS" of the 21st century, allowing plumbing together proven scalable and reliable cloud/network services to build better software. (Again, not a new idea, this was behind the "Unix Philosophy" of pipes, filters, and making each program do one thing well. Eventually, it will be in fashion to realize this really is a better way...)
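The pipes-and-filters composition being praised here still works as advertised. A sketch of the classic word-frequency pipeline, with a made-up input file; each small program does one thing and the pipe does the plumbing:

```shell
# A small sample input to stand in for a real document.
printf 'the cat sat on the mat\nThe dog sat too\n' > input.txt

# Split into one word per line, lowercase, sort, count duplicates,
# rank by count, keep the top three.
tr -cs '[:alpha:]' '\n' < input.txt |
  tr '[:upper:]' '[:lower:]' |
  sort | uniq -c | sort -rn | head -3
```

On this sample, "the" tops the list with three occurrences, and none of the five programs involved knows anything about the others.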
We need smaller, better software, not the latest trendy languages, insanely complex platforms that no one understands, and toolchains of staggering complexity that produce crap code so bloated that it requires computers thousands of times faster than the Crays of the 1990s just to do ordinary desktop stuff. (Seriously, folks, the Raspberry Pi 4 on the next table is a rough match for the Cray we had in Houston when I worked for Chevron in the early 90s! Think about that, and then think about how little data you really need to deal with to do the job, vs what you're actually shuffling around.)
“Einstein repeatedly argued that there must be simplified explanations of nature, because God is not capricious or arbitrary. No such faith comforts the software engineer.”
No. A hardware product like a car has predictable wear and tear governed mainly by the laws of physics. The fact that I can no longer use my smart speaker because the manufacturer decided to stop supporting it, went out of business, or got bought is not at all the same. My car will still work through all of those things in the exact same way. It also doesn't throw up random dialogs (or whatever a physical equivalent would be) that stop the product from working until I interact with it. Not the same at all.
Also, car companies have been tinkering with "electronic repossession" - remote kill switches due to nonpayment.
So ... get ready for other things to suck as we attach technology to them.
Thank you for bringing up this point. The actual problem is not the software itself, but its proprietary nature and the infinite hunt for profit without any limits. Consider free software instead and you will see that it is improving year by year, albeit very slowly (which is logical, in the absence of infinite resources).
My Linux machine never forces me to reboot, shows any ads or suddenly changes its interface.
I see you never had a (EU) Ford.
These turns of phrase make me wonder what we are really expecting from software.
I can't imagine you've never slammed your fridge door and had it fail to close properly. You gave it a nudge when you realized, and it was fine, but it must have happened. And your whole food supply would have rotted if you hadn't noticed in time.
Or do we monitor energy consumption closely enough to realize it's eating much more than expected, the same way people complain about Chrome eating too much memory?
It can also get pretty noisy, but I'd assume most people just think that's normal.
And we put the blame on ourselves for a lot of issues (didn't put the bottle in the right place, didn't use the right amount of force to close the door, didn't set the right temperature, forgot to leave space around the fridge for ventilation, etc.). But few users blame themselves for not having understood software and worked around its weaknesses; we just call it broken.
That behavior is normal, but I'd take a lot of "my appliances just work" with a grain of salt.
But if the fridge were software, it would randomly turn off and ruin all your food. The light would sometimes stay on, except when you open the door. It would require you to decline an update before you could get the milk out for breakfast. During an update the fridge and freezer compartments would switch places and then give tips about efficient ways you could manually move everything. If you bought a new fridge, part of it would be locked shut until you paid more money, but the one in the store was already unlocked. And god forbid you lose the 2FA device used to set up the fridge -- it will destroy everything inside (including irreplaceable heirloom tomatoes) upon reset. It will then update to a subscription model where custom temperature settings require a monthly fee, or you'll be limited in the number of items you can store in the fridge or the number of times you can open the door per day.
Fridges have been with us long enough in a ‘pay everything upfront’ setting that we‘d battle to the bitter end if we had to do micro-payments or aggressive planned obsolescence.
To your point, I lived in long-stay apartments where you put coins in to have the fridge and air conditioning work, because they didn't bother with pay-as-you-leave metered use. That's super rare (I hope? I'd expect the same in some situations predatory towards low-income people), but it's to say that alternatives exist.
Otherwise fridges randomly turning off is just a matter of time and/or build quality. Sooner or later it happens (or it stops turning on, which is arguably better, but you wouldn’t say it’s great)
I think blaming software for this is a little naive. Take a look at consumer reports for any modern fridge, stovetop/oven, washer/dryer, etc, and you will see complaints about fridge motors dying, touch panels going on the fritz, etc. -- none of which involve anything more than low level firmware.
If you want to put a tinfoil hat on, you can consider that it may be planned obsolescence, but to put the blame squarely on software, I would disagree with.
You don't need tin foil hat when facing the truth :).
Also, while things you mentioned aren't software-related, they're modern tech-related. Like, last 20 years. Fridge motors dry out because they're being made cheaper, with not enough lubricant put into them and no option of opening them up and pouring in the grease. Touch panels are going on the fritz because touch panels suck (that's often very much software), and they shouldn't be put on appliances in the first place. But it's cheaper, so they are.
Worth noting that there wasn't some big price reduction happening from solid appliances 20 years ago to crappy ones today. Price remained roughly fixed, but appliances started to suck more and more.
The big deal, at first, was really memory. Your alarm clock could ring at the same time reliably. If you invested in a VCR, it could record at a programmed time. If you had a synthesizer, it could store and recall exact preprogrammed patches. Pinball machines could downsize in weight and keep truly accurate scores instead of relying on temperamental relays and score reels. And so on, with every category of gadgets getting the computerization treatment. Although not everything succeeded, there were lots of straightforward cost and quality improvements, with the main downside being that IC designs are less obviously repairable.
And then pretty much every year afterward, the push was towards cheaper with more software, with decreasing margins of real improvement, with the "smart" device representing an endpoint where the product is often price discounted because its networking capability lets it collect and sell data.
What comes to mind is the Rube Goldberg machines and their expression of a past era of exuberant invention, where physical machines were becoming increasingly intricate in ways not entirely practical. Our software is kind of like that.
Every other week I read about someone's entirely-too-roundabout way of doing X via an IoT device (requiring packets to probably circumnavigate the globe). Meanwhile I'm sitting here opening my garage door with a physically wired switch like a pleb.
I just checked my fridge, it has six buttons and an LCD panel, and in all my (4) years of home ownership, I haven't touched the buttons a single time.
First, the "solid appliances" weren't 20 years ago, but more like 25-30.
And though there wasn't a big price reduction in the interim:
- Refrigerators are more energy efficient.
- Refrigerators have larger internal volume for a given size.
Equivalent improvements have been made to other appliance types such as washers and dryers, but not stoves, as far as I know.
Those improvements are largely orthogonal to declining design and build quality, but it should be noted that there are at least some ways in which newer appliances have been getting better (that aren't just gimmicky features) while prices remained the same.
Conveniently, because macos is ass, it's nondeterministic whether the balance controls display in Sound Preferences to fix the balance issue. You just have to open and close the settings panel in the hopes that it will display.
I'm a software engineer and I don't even know where to begin to debug this idiocy.
Duolingo regularly freezes audio in chrome. Once this happens, no audio will play in chrome until you restart or kill "Utility: Audio Service" with the chrome task manager.
That blew my mind, my $20 Amazon-purchased Bluetooth earphones just work™ with no delay.
books work really well until a pipe bursts in your attic. then you wake up and notice half your collection has been ruined (personal experience).
Things are getting more complicated, like you say, but they frequently aren’t getting enough better to justify the added complexity, especially given all the issues that come along with it.
To me, software is as if when I open a book to read it, then, the book suddenly snaps itself shut, hurting my fingers.
Thereafter, the book gets wings, tries to fly away, but bumps into my coffee mug on my desk, so coffee spills on the floor. Then the book does fly out through the closed window — smashing the glass into pieces — and gets larger and larger wings, flying higher and higher until it disappears into the blue sky.
It's as if software was alive, and does random things I don't always agree with.
But actually — the bees building nests on the balcony: that feels pretty close to misbehaving software. Or the cat bringing in a snake from outdoors. Or a squirrel chewing through a power cable, shutting down a city.
Of course, people perceive that software sucks because it's more complicated than they realize. I forget what book said it, but an operating system has more separate components than an aircraft carrier, and they're more tightly coupled. (I'm not sure that's true, but it conveys the idea.)
Another key difference is that in maintenance of your home, you have complete control. It's extremely easy to understand and act to improve or maintain it. When large software systems (like the IRS login) have problems, you are totally helpless.
Buy software at the price of a house and you'll be right to expect the same build quality.
Then again, even at the price of your house you'll have fun with mold growing inside the walls, soil that degrades in unexpected ways after heavy rain hits the hill you're built on, and rooms that were fresh and bright enough on a hot summer day when you visited but turn out way darker and gloomier in winter than you expected. And you'll pay for that house for your next 20 years.
Cars are the same at a lower level, and you see small issues creep up as you lower your budget (or go buy a fancy vintage italian car and you’re in for the wild ride).
> Another key difference is that in maintenance of your home, you have complete control.
In the good old days people had timers on their desks to remind them to restart programs before they crashed. Also to save their work, make backups, etc.
Of course online services are a different beast, but it’s more akin to fighting bureaucracy, which I see as our society’s software in a way, with the shitty forms that don't have enough space for your name and other niceties.
Cars vary widely in their product quality. Houses vary widely in their product quality. Some things in life are inevitable facts of nature, but product quality is not. Quality is to a large extent determined by the time and care taken by the manufacturer.
That's not a good example, nor is it parallel to the dynamic the article describes.
Your car stinks a lot less than cars did 10/30/50 years ago (emits less in the way of pollutants or CO2 per mile driven), is less likely to kill you in a crash involving the same size cars/velocities (despite weighing less!), needs less maintenance, lasts longer, and can notify you of potential collisions and sometimes avoid them.
It's probably only worse in terms of education needed to perform maintenance or nominal sticker price.
And yes, there are some ways in which hardware has improved. But the claim is that, judged by what you're using it for, most UX-apparent aspects have gotten worse. Is there a clear way that's wrong? By most UX-apparent metrics, there isn't: latency from keystroke to character render has gotten worse, app start time has gotten worse, and lots of other latencies have gotten worse.
None of the nightmares described in the article were typical of software UX.
These would be arguably fine if the additional features you get were a necessary cost, but they're not.
I'm also not sure that devices have gotten more efficient in all respects. Each version of iOS gets more energy-intensive, for example.
Do you have sources for this? I mean, I'm not sure there aren't rose-tinted glasses here.
> These would be arguably fine if the additional features you get were a necessary cost, but they're not.
> Each version of iOS gets more energy-intensive, for example.
I would argue that multitasking, camera postprocessing, widgets, background app refresh, and others are all features worthy of more resource usage. Many of those are things you can choose not to use if you want to save power.
>I would argue that multitasking, camera postprocessing, widgets, background app refresh, and others are all features worthy of more resource usage. Many of those are things you can choose not to use if you want to save power.
With all those features turned off (before and after), the usage still increases with each version.
If you're any sort of power user, you likely know that you can backspace by the word instead of by the character, using Ctrl + BS on Linux or Cmd + BS on Mac.
In the Messages app, the shortcut to delete your _entire chat history_ is also Cmd + BS, and it works even if your caret is in the text box. So if you type five words and then hit Cmd + BS six times, you will be prompted to delete your entire chat history.
I do this almost every day. So far I've never reflexively hit return afterwards, but I am dreading the day it happens.
Nowadays, I would consider this a problem with the browser. How often does one navigate backwards with the backspace key?
Recently, I had some doubts over whether or not I should clobber the native browser behaviour for "ctrl-s", but then I realized that nobody anywhere EVER saves a web page to disk... and if they really needed to, the browser toolbar is right there.
ctrl-s is probably fine to break though. even when it does "save" the page, it rarely does so in a useful way.
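for reference, the clobbering itself is only a few lines; here's a minimal sketch in plain JavaScript (the `saveDocument()` handler is hypothetical, and I've factored the key check into a pure function so the intent is clear):

```javascript
// Decide whether a keydown event is the save chord (Ctrl-S or Cmd-S).
// Pure function: no DOM access needed, easy to reason about.
function isSaveChord(event) {
  return (event.ctrlKey || event.metaKey) && event.key.toLowerCase() === "s";
}

// Browser wire-up: suppress the native "save page to disk" dialog
// and run the app's own save instead. saveDocument() stands in for
// whatever your app does; it's not a real API.
if (typeof document !== "undefined") {
  document.addEventListener("keydown", (e) => {
    if (isSaveChord(e)) {
      e.preventDefault();
      saveDocument();
    }
  });
}
```

note that preventDefault() only works for chords the browser lets pages intercept; a few (ctrl-w in most browsers, for instance) can't be overridden at all.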
I don't use Windows other than for gaming, so I'm afraid I don't know if there are shortcuts other than backspace.
Your alternative is handy. I wonder if it also works on Linux.
they may work on Linux in an attempt to support Mac users.
The emacs-derived keybindings use Control on macOS, the Mac ones use Command
Do you know at all times what element has the focus? An error there can be of high consequence. (Even though browsers do make an effort to refill forms on page forward, it doesn't always work.)
It is a very bad shortcut, and because it's not always available, there's always an alternative one anyway.
Some people do it all the time. I was emailed a saved page the other day.
I was responsible for a single page web app, and the error detection code was stored in a <script> tag within the page, so I got plenty of “errors” logged for people trying to access saved pages.
Literally murdering children over here. I just knew somebody was going to come along and prove me wrong!
My dishwasher, which has only buttons to select what to do during the next wash cycle, has a firmware bug.
Sometimes when the door is closed, it will start one of the pumps. If I cycle "heated drying" on then off again, the pump will stop. I figured this out because, well, I've worked on firmware and I understand how software can be stupid.
After I learned to recognize watchdog resets, I started seeing them more and more often, and became even more terrified of how bad software is.
Yup, sounds like my TV. It's not even one of the smart ones, I was careful to avoid those. But once every few days, it stops responding to the remote control when performing some action (opening the EPG, switching channels). I then have to wait about ten seconds for the display to go dark and the TV to "reboot" itself, so I can continue channel surfing.
So why do I tolerate bugs in software like that? Because I know I can fix them. And I also know I won't always. Small gods have handed me tools to remake the world as I would see fit and I do not use them. Are they at fault for not having made the world as I would prefer? Or am I at fault for not using the tools?
In any case, I've noticed a sort of dichotomy among users in their reaction to tools that fail. There are those who go "this tool sucks, how can I do my work" and there are those who go "my work is what I want to do, which tool can I use instead". The latter set get a lot more done. Since observing this I have attempted to modify my behaviour to be like the latter, and I have become more effective.
But they didn't give you the only tool you really needed: Time.
Having meaningful access to the source is very important, but its value is limited because even small improvements often take a large amount of time, especially in code you're unfamiliar with. Once you've made that improvement, maintaining it (or upstreaming it so someone else might maintain it) can take a tremendous amount of time.
I can't wrap my head around this, could you explain further?
They don't have to be engineers. They'll pay $100 / mo to solve it or hire a guy on Upwork to solve it or cobble something together on Zapier + Airtable. The thing is, the tool is insignificant. They don't really spend an appreciable amount of time on complaining about it because it's faster to stop using it.
What we've got here is a question of cost and choice. If my choices are all equally bad, i.e., vendor one is not any worse than vendor two, then inertia or cost become the determining factors. In terms of consumer software, consumers have been conditioned to have low expectations, and the costs of tolerating bad software are further reduced because it's so often free or very low-cost. In regards to commercial-focused packages, again, we so often put up with it because the systems we're using are so complicated and specialized that the pool of options is limited and/or the domain is so complicated that problems are inevitable.
So long as this is the landscape, few software producers have incentives to do the things necessary to improve, and/or they believe they can spread the cost of improvement over a long period, i.e., not make the investment until the pain is too great.
Deaths, a lot of deaths.
Software Engineering needs a PE type licensure and a union. We need a way to stand together to advocate for better working conditions, practices, and tools.
Really not sure what's the way out of this corner that we've all collectively painted ourselves into.
I coined it watching the robotic soda vending machine crash and reboot frequently.
Everyone is in such an irrational hurry, it's been built into the "culture" such that rushing and making messes is acceptable. And by extension, customers expect things to be shit so you don't get in much trouble for doing it.
It's a feedback loop that only stops if companies (and individual programmers) start taking pride in craft > careless speed and money like they used to do back in the 50s/60s.
Most programmers, and many companies, want to produce something of quality, well crafted.
The drive for low quality kibble comes directly from consumers, and the inability/cost of judging value.
A consumer can’t be expected to be a UI expert, and a slightly better UI might not drive sales because other factors are more important. I try to buy hardware with good UI, but I often make compromises for other factors.
Thankfully that can be disabled, but I find it to be one of the most infuriating 'features' of Firefox. If it weren't something that could be turned off, it would be a deal killer all by itself.
Maybe it was copied from IE to keep "closer to the platform"?
My FreeNAS, Arch Linux and my Android phone as well.
I think we get paid because we are building new stuff and have to maintain shitty stuff. If my job were literally just designing it at a high level, clicking it together, and then it works, no one would need me.
Yes, it's frustrating sometimes.
I would like to cure cancer instead of debugging why this update broke our system.
I’ve microwaved a burrito by mashing the start button hundreds of times.
If I design a bracket for a TV mount, do you blame the bracket when someone hangs a 3000 kg bookcase on it?
You expect software to be perfect, yet ignore the massive limits everything in the world has.
Not saying there aren’t quality issues with software. I’m saying software development is really difficult.
You mean where 99 is greater than 100?
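For anyone who hasn't met this quirk: most microwave keypads treat the last two digits you type as seconds and anything before them as minutes, so an entry of "99" actually runs longer than "100". A sketch of that encoding (the function name is my own invention, not any real firmware):

```javascript
// Convert a microwave keypad entry into a run time in seconds,
// using the common "last two digits are seconds" scheme.
function microwaveSeconds(entry) {
  const n = parseInt(entry, 10);
  const minutes = Math.floor(n / 100); // digits before the last two
  const seconds = n % 100;             // the last two digits
  return minutes * 60 + seconds;
}
```

So `microwaveSeconds("99")` is 99 seconds, while `microwaveSeconds("100")` is only 60: hence 99 > 100.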
Everything is broken, and nobody is upset. 
Some of the software I use is so unreliable that I expect it to fail. I expect the Vodafone login page not to work properly. I expect one of my airpods not to connect on the first try. I expect my banking app to show random error messages, even though it works just fine. Most online stock brokers have issues at the worst possible times. My bookkeeping app is frequently wrong, per my tax advisor. Since everything is broken, the best I can do is to mentally assign all those apps a trustworthiness score, and avoid betting too much on them.
The worst part is that support for all that software has been largely automated. If you have a problem that can't be fixed by a chatbot or a crowdsourced support community, you are largely helpless. Google can wipe everything you love, and there's no one to punch in the face (to borrow from The Grapes of Wrath).
So far, my only solution to this is to be a late adopter, and to favour simplicity over sophistication. I was recently considering going from paper notebooks to a tablet. That initiative stopped at the electronics store. The Surface Go wanted me to go through a setup wizard (after dismissing a few notifications). Only two of the four iPads had working pencils. The ReMarkable reviews mention a host of issues. I never encountered any bugs with my Moleskine. It pairs flawlessly with any pencil I want, including older models.
My Mac sometimes unpairs them, and worse, it doesn’t find them. Sometimes while my baby is sleeping I put my AirPods in and play a loud video, only to realize they were not connected. My wife’s right AirPod just stopped working after one year of use...
If apple is considered top tier in reliability, then technology in general really just sucks!
I'm using a BT Jabra headset with noise cancelling that I got for about the same cost: 16+ hours of battery, easy pairing, a super useful phone app, great ANC, and solid audio quality, at least to a non-audiophile. My biggest complaint is that the closed-back design leaves my ears a bit irritated after 4+ hours of use. Not an AirPods competitor, but for the cost I am way happier.
Surprisingly, iCloud syncing works fine. If I pair my AirPods with one device, it always pairs with all of them.
The main issue is with the right pod not always turning on when I take it out of the case. The solution is to put it back in the case for 5 seconds and to try again.
The second most important issue is the airpods falling out of sync with each other. It seems like the signal from my Samsung S9 in my pocket is choppy. Looking left or right for too long will make the signal drop. Putting my hands in my pockets also will. If I put the phone in my backpack, it's okay.
This is still more pleasant than wired headphones, but it's far from a magical experience.
Personally, I hate ear buds and, as such, never bought ear buds. Rather, I spent ~$20 on SoundBot bluetooth headphones starting some five years ago (long before air pods, methinks) and haven't had problems with them at all.
I also have a seven year-old phone (HTC OneMax) running custom (unofficial/ported by a random hacker) Android, and it pretty much works.
Sure the battery life has degraded since 2014, but that's to be expected, no? I wish I could replace the battery (as I did with my 15+ year-old Panasonic cordless phones), but there really aren't too many mainstream mobile devices that allow that any more.
As for poor quality software/hardware, if you don't like it, vote with your feet and/or wallet.
If stuff doesn't work, why use it? Even more, if stuff doesn't work and you can't/won't fix it yourself, then don't use it.
Software devs and hardware manufacturers don't care about whiny blog posts or complaints on HN, they care about the bottom line. Impact the bottom line and you may have a chance at improvement.
Stuff that actually addresses the issue is useful. A great example is the lack of Android support after 4.4/KitKat on the HTC OneMax mentioned above, and its abandonment by Cyanogenmod/LineageOS in 2017, where those impacted (myself included, although I'd never hacked on Android before and failed miserably at it; thankfully someone else did not) took action to provide the latest Android on an old, unsupported, discontinued device.
If you're not taking positive action toward making things better (whether that's fixing the problems or voting with your feet/wallet), then you're not going to have any impact.
While whinging about it on your blog may be a way to relieve the stress you feel about whatever issue(s) you may have, it's not constructive or useful.
That is, unless your goal is to get lots of comments on HN where the Apple fanbois sagely agree and lament that there's nothing to be done about it, because Apple is the pinnacle of tech, and since no one could possibly do anything better than Apple (or the apps that run on their gear), therefore all technology sucks.
And that's objectively false. There's lots of tech out there that's quite good. I suggest using that, and shunning (rather than using, then whinging about) the stuff that sucks.
Edit: Fixed typos/formatting issues.
Good, you dodged a bullet there.
I mean, I love my 2-in-1 Dell (a slightly cheaper but still high-end Surface-like device). The pen, as much as it's useful (I'm not even considering buying a touchscreen-enabled device without solid support of a pen anymore; it's so much better UX than fingers), still has lots of subtle and annoying bugs. Maybe in 20 years people will work out the kinks. More likely, the concept will be abandoned in favor of some new modality that will also never be perfected.
Most software is still net positive in productivity. We tend to place more emphasis on failures as users.
Remember you're running millions of lines of code that talks to other computers running millions of lines of code that communicates over a network running millions of lines of code to deliver some information on the order of seconds to minutes -- and then something responds to that information and everything happens all over again.
All day, every day, trillions of packets of information get delivered just fine. Try doing that as a human, delivering letters. You probably won't even approach a million packets delivered in your life time. And people have the audacity to say, "oh my, some things didn't work, this is completely broken"
In only a single generation, we went from voice communicators to super computers in our pockets. The utility vastly, vastly, vastly overshadows the glitches that come with frenetic advancement. How long did it take humans to invent basic numbers?
Not to mention that it:
- doesn't need charging
- never freezes or crashes
- is much cheaper than a laptop or tablet
- distraction-free (no Internet, no apps, etc.)
The iPad seemed pretty solid, but I'd have to turn it on and unlock it to see my notes, unlike a notebook.
The ReMarkable seemed nice, but there are lots of complaints that paper doesn't have.
The Supernote A6X was the most promising, but it was hard to get in Germany.
Most notes I take only need to exist for a few weeks and then I erase...so transferring it to "long term storage" is rare.
I do have an iPad and note taking apps like Notability if I know something will need to go to "long term storage" but I find I use the Rocketbook more.
I thought it would be nice to access my notes when I don't have my notebook on me, and to have layers, zooming, undos etc. However, the more I look into it, the more absurd it seems.
I'm replacing a 15€ notebook and a 2€ mechanical pencil with a 400€ gadget that doesn't quite work. Why? So that I can spend my time organising notes in a digital space. Why? I don't really know.
It would be cool to have layers, zooming and an undo button. It would also be cool to have access to my notes even when I don't have my notebook. However, it would just be cool. It doesn't actually solve a serious problem.
I'd also be replacing the piles and piles of legal pads I go through every year. Most of the time the notes are ephemeral except when I'm working across from someone in which case I really need them to exist digitally so I don't lose them.
I just wish I didn't have to wait 6 months for the second version.
There's always an adjustment period, where people have to spend time learning a new technology, and any issues with the new technology need to be resolved. The gains in productivity happen mainly after the adjustment period. But we've eliminated the periods of stability and are constantly pushing for more "innovation", which means we're in constant periods of adjustment and resolving problems, where the promise of increased productivity is never fully met.
The worst idea ever in technology is regularly scheduled updates. Innovation has never and will never happen on a schedule. This is simply greed-driven, promotion-driven, pointy-haired-boss-driven development.
Produce something new and great... but then let us all enjoy the new thing for a while. Novelty for its own sake is not productive.
this is sort of uncharitable. the development/maintenance cycle for software is incompatible with the traditional way of monetizing a product (ie, design it up front, manufacture at scale, and then the buyer gets what they get, barring severe safety defects). buyers of software expect the product to at least mostly work in the first place, but they also expect bugs to continue to be fixed after the sale, even if the bugs are introduced through unforeseeable interactions with other software.
imo, subscriptions are actually the ideal way of aligning incentives for products that involve ongoing maintenance. but buyers tend to consider this a ripoff if they don't actually see a stream of new features in development. while it introduces some unfortunate constraints in the dev cycle, bundling up features in a scheduled update is a good way to make it visible to users that their subscription dollars aren't just falling into a black hole. trickling out new features "when they're done" earns the respect of engineers, but results in the average user simply not noticing that progress is being made.
Contrast that to traditional physical goods, where buyers expect the product to work as advertised, right out of the box, or their money back. Software in the Internet era has it easy, because it gets to release shitty half-finished versions, and then keep charging money while never quite finishing the software.
> subscriptions are actually the ideal way of aligning incentives for products that involve ongoing maintenance. but buyers tend to consider this a ripoff if they don't actually see a stream of new features in development
Because software does not decay on its own (despite the misleading term "bitrot" being popular in tech circles). That's literally why digitizing data has taken the world by storm: digital data does not decay; as long as the physical medium is readable, you can make a perfect copy of the data it contains. As a buyer, I don't expect my software to need maintenance. I expect it to work out of the box (just like I expect every physical product to work out of the box), and once I find software that fulfills my needs, I expect it to work all the way until computing technology moves forward so much that it's no longer possible to run the software. Which, in the era of virtual machines, may take decades.
So yeah, there's a need to clearly justify to the customers why you're charging subscription, because software in its natural state does not need maintenance.
Software is far more complex than most physical products. There are only so many failure modes for a screwdriver or a couch and they're all pretty foreseeable. The most complicated physical systems, like a car, house, or even a human body, do need maintenance.
I'm frequently frustrated with software bugs like everyone else (my building and apartment have this awful smartlock system that's riddled with bugs and which bricked me out of my own home due to a bad app update a few weeks ago--shoutout to Latch), but I'm not sure I'm on board with an anti-maintenance attitude. If there are bugs, I'd like them to be fixed!
Me too! I'm not trying to be anti-maintenance (though I do wish technology would develop towards less and less required maintenance, but that's another topic). I'm pro-quality. The impression I'm having is that a maintenance burden is being created in software in order to justify the subscription model, and that the ability to do post-release updates has made vendors and devs no longer care about delivering reliable, quality software (customers become the new QA; bugfixes can always be added later, except they tend to be deprioritized in favor of new features).
Note I'm not postulating a conspiracy theory, just a spontaneous coordination of the entire industry due to market incentives. But the effect is still there, and I feel it needs to be countered.
in theory yes, in practice no. I work on a product that targets windows and macos. on windows yeah, a version of our software from 2015 probably works as well as it did the day it was released. apple deprecates stuff in their API every year that we have to go back and update. they also break a lot of stuff that isn't formally deprecated and we have to find workarounds for that too. "our software will work forever as long as you never update your OS" is not acceptable to most customers.
Still, as a Linux and Windows user, I've absolutely grown to expect my desktop software to work 10 years or more without updates. After that, I can always spin up a VM with an older Windows version.
I work in a hardware company, and for any important function I usually set up a dedicated computer, install the software, and then never touch it ever
This is how the more sophisticated oscilloscopes etc. work. They often have Windows XP installed if you buy them used. Simple, doesn't break, no internet; if it's mission-critical or expensive, it's worth a dedicated and frozen computer.
I wasn't disagreeing with that. I mentioned "long periods of stability and refinement" — refinement including bug fixes — and "any issues with the new technology need to be resolved". But again, bug fixes don't magically happen on a schedule either. Maybe fixes are easy, maybe they're hard, you never know in advance.
> bundling up features in a scheduled update is a good way to make it visible to users that their subscription dollars aren't just falling into a black hole
This is exactly why it's not true that "subscriptions are actually the ideal way of aligning incentives for products that involve ongoing maintenance". Instead of maintenance, subscriptions incentivize continuous delivery of new features, and consequently continuous delivery of new bugs.
Did consumers demand subscription services? Or did vendors (led by Adobe) decide to change to subscriptions to get uniform cash flow?
At the agencies I have worked at, all the creatives I worked with would prefer to spend $200-400 and have a permanent software license. Perhaps this isn't a representative group.
that said, I think consumers would prefer subscriptions if they understood how they align incentives. one way or another, a product will stop receiving support when the money stops flowing in. with a permanent license, it ends when people stop buying licenses. with subscriptions, it continues as long as enough people keep paying.
Funny how consumers tend to despise them though.
The term "subscription" is itself typically a euphemism for "rental". There are a small number of companies who offer a year of updates that you get to keep forever (which makes consumers play the game of when exactly to buy to maximize the new features in that year), but most so-called subscriptions disable the software entirely if you stop paying. In other words, rental.
Long-term rental is almost always a bad deal for consumers. One of the few exceptions is housing, because many consumers can't afford to buy a house, and also houses are among the least liquid assets you can own if you ever have to move. Otherwise, rental is going to cost you a lot more in the long run.
Financially, rental can work well for the seller, of course, but we end up with "subscription fatigue", where the market can't sustain as many sellers, and the few rich companies get richer (which is exactly why they were "pioneered" by BigCos such as Adobe, Microsoft, and Apple).
sure, and as an individual I behave the same way. I always want to solve my problem in the cheapest possible way. still, I can't help but notice that products with stable ongoing revenue tend to get much better support.
I think the clearest example is with games. most games get released with a pile of bugs. a bunch get fixed in a release day patch and then there are a few more patches over the next few months (when most of the sales happen). once the initial wave of sales subsides, you tend to be stuck with whatever bugs remain. cs:source had several game breaking bugs for years (defuse kit over bomb blocking defuse, defusing through walls, etc.) despite being one of the most popular FPS titles of its time. AFAIK, most of these still exist fifteen years later. csgo, which is monetized through microtransactions, gets bugs fixed almost as fast as they can be posted to reddit/youtube. microtransactions aren't quite the same as subscriptions, of course, but they generate revenue proportional to the current userbase, rather than the rate that people buy the game for the first time (which will inevitably dry up).
Actually, I think subscriptions misaligns incentives. With subscriptions, it becomes important for the vendors to keep releasing updates (so that the customers feel like they're getting value out of the subscription), which means having bug-free software is a terrible idea. You'd need to either release intentionally buggy software (so you can ship a follow-up version to fix it) or go on a feature treadmill (in which case trying to stabilize has rapidly diminishing returns and high opportunity cost).
As a consumer, software that was developed knowing it would never be fixed and had to be right on the first try is much better (even if it still has bugs). Mario 64 had bugs (e.g. the backwards jump that let you go really fast), but the bugs weren't really noticeable in normal gameplay, because the developers couldn't just ship an update nearly the size of the whole game before you start to play.
Almost all customers care about quality. The problem is that many customers have only very limited information about products, so they have a hard time judging quality vs. competitors before (or even after) purchase.
This reveals a general problem with the market: it doesn't select for quality. Otherwise we wouldn't be having this conversation. The market is really good at producing cheap crap. So the truth is, yes, engineers have to care about quality. The motivation for quality has to come from pride in your own work, not from outside market forces. If you care about quality, then you have to strive for that over quantity, and also charge sustainable prices instead of trying to lowball. You may not be the market leader, but there are many profitable niches. Some customers are definitely willing to pay for quality.
yes, but not to the exclusion of features. I work on a B2B product where our customers bill their customers by the hour, so they tend to have a pretty good idea of how much time a feature saves them. if a competitor adds a feature that cuts the time needed for a project in half (not unrealistic) but crashes and forces them to start over a quarter of the time, the customers will still buy their product instead of ours. they'll complain incessantly on the competitor's forums about the crashes and threaten to switch back to our product, but they won't actually do it unless we come up with something new that saves them even more time.
customers care about saving time and/or money; they only care about quality to the extent that it furthers that fundamental goal. if there is some bug-ridden alternative that solves their problem faster, they will do their best to find it and purchase it.
edit: to be clear, I mean "reliability" when I say "quality"; there are many other "qualities" a product can have, one of them being "cheap".
The question is, why do we make consumers make that tradeoff? Why are we shipping junk at all? There shouldn't be a reliability tradeoff. All products should be reliable. It ought to be a bare minimum standard.
fair enough, I probably overstated my point with some of the wording.
> The question is, why do we make consumers make that tradeoff? Why are we shipping junk at all? There shouldn't be a reliability tradeoff. All products should be reliable. It ought to be a bare minimum standard.
everything in life is a tradeoff. we could make software more reliable, but then we would have to spend less time adding new features, or we would have to hire more/better engineers and charge more. maybe we even get to pay down tech debt and gain the ability to add features faster in the long run. doesn't really matter if someone else dumps a bunch of buggy new features in the meantime, converts our customers, and forces us out of business. in the absence of some industry-wide gentleman's agreement or regulation, we have to observe the behavior of customers and do what their behavior (not words!) indicates they want.
This is always presented as the doomsday scenario, but how often does it actually happen?
The story of Apple in the Tim Cook era is unrelenting annual releases, more and more "subscriptions", massive return of cash to AAPL shareholders, but decreasing product quality. Did Apple make that tradeoff because they were scared of going out of business? No, they were doing very well before Cook took over. Cook simply has lower standards than Jobs, there's no other reason. He's been great for investors, not so great for customers.
No, users pick a finished product.
For consumer goods, it is the hype cycle.
New updates and releases get press. They also restore consumer confidence.
If Samsung announced that next year they weren't releasing a new Galaxy phone, the entire industry would freak out. Consumers would lose confidence in buying Samsung phones, journalists would write articles questioning if Samsung was pulling out of the market, a lot of bad things would happen.
Give it 18 months without a release and people would start to think of Samsung as "that company that used to make phones."
They would have to fight like heck to restore their image.
Software is the same way. In the Vista/7/8 era, Microsoft looked like they were falling behind because their competitors started releasing yearly, or even twice-yearly, feature updates.
Sure, every Android version up until 7 was kinda-sorta-terrible, but it kept Android in the news. Likewise, Apple got huge free press every time they announced a new revision of OS X (now MacOS), and every time they came out and announced a new version of iOS.
The result? "The desktop is dying, phones are where the real innovation is at!" articles being published even faster than those software updates came out.
You can of course release too fast: Chrome and Firefox releases rarely get any press (unless there is a controversial UI change). But in general, frequent updates are free advertising.
Tesla is also great at this. I'm nowhere near being in the market for a Tesla, but at least a couple of times a year I still end up hearing about software features they're rolling out!
The smartphone industry has only themselves to blame for setting up this expectation. But it is possible to get off the train. I remember when Apple announced they were dropping out of the annual MacWorld San Francisco conference, because they didn't want to constrain their product release cycle. Apple survived that just fine.
> In the Vista/7/8 era, Microsoft looked like they were falling behind because their competitors started releasing yearly, or even bi-yearly, feature updates.
Competitors? Windows and Mac had near 100% market share on the desktop. There was only one competitor. Vista was released in January 2007, but Mac OS X 10.5 Leopard was infamously delayed until October 2007 because of the iPhone, so this telling of history doesn't seem entirely accurate. Moreover, Mac OS X releases were already slowing. Here's a list of months since the previous major .0 release:

10.1: 6, 10.2: 11, 10.3: 14, 10.4: 18, 10.5: 30, 10.6: 22, 10.7: 23
Thus, major Mac OS releases were slowing down year by year — which is totally sensible — but then Steve Jobs died after 10.7 was released in 2011, and only then did they switch to a yearly schedule.
Of course that didn't come to pass, but everyone acted like it was the future and the market responded accordingly.
One week, the update could be a relatively minor bug fix. The next week, a major feature upgrade that's been in the pipeline for months.
You also remove the ambiguity of "Is this worth pushing out? When should I push this out? Should I do some more fixes or push this one out first?". You've got fixes? Push them out in the next update.
Your criticism also assumes a small team. If you have a large enough team where you can split them into new feature development and current bug fixing, they're going to work at different rates and be ready at different times. If instead your entire team just works on "the product", then there is no effective difference between fixing issues and creating functionality.
I constantly encounter broken functionality, buggy or unpleasant UIs, just as the author has. It feels like many of these problems could be avoided if you just had one person whose job it was to sit there and look for broken stuff. (I'm sure I'm biased as someone whose first job out of college was to sit there and look for broken stuff.)
I would say that effortless, automatic updates are to blame.
When you can always just push an update, the impact of a given bug goes way down. It's no longer mission-critical to exterminate flaws before shipping; a totally broken feature becomes a mere annoyance. So project prioritization shifts from polishing an artifact to outweighing the (presumed inevitable) constant stream of little annoyances with fixes and features. I think the shift towards automated testing is just a symptom; an attempt to bridge the gap in this brave new world.
For a clear-cut example of this phenomenon, look to the video game industry. Until around 2007, games received no updates. Ever. Once a game shipped, it was shipped. There wasn't even a mechanism for installing an update from physical media.
Right around that time, "glitches" went from very rare unicorns that people would spend lots of time actually seeking out, to nearly everyday occurrences. As long as it doesn't corrupt someone's save file, they mostly laugh it off and upload a clip to YouTube to show their friends. This is just how things are now.
(Edit: I should have scoped this to "console games")
Sure, but they still (sometimes) released (a few) extra revisions of a game. They were just targeted at people who bought physical copies after the revision date, rather than at existing customers.
Or said updates came on the 1.0 version of the game as shipped in markets that got the game later than others. (Just imagine — per-market release versioning. Every market effectively got its own fork of the codebase!)
Or said updates came in the form of a re-release port. There are patches made to the emulated downloadable app-store re-releases of some games, that never made it into any physical edition of the game.
Also, before home consoles, arcade game machines did receive bug-fix updates regularly. Arcade machines were essentially provided “as a service” from their manufacturers, with support contracts et al. Sort of like vending machines are today. If you reported a bug, they’d fix it and send you a new EEPROM chip to swap out in your cabinet. If there was a critical bug that affected all units, they’d send techs out to everybody’s machines to swap out the ROM for the newest revision. (For this reason, it’s actually kind of hard to do art-conservation / archiving of arcade games. The cabinets almost never have fully “original-release” components inside.)
Now, it's just "LOL just ship it, users will just deal with it until the next release!" Now, it's "Do experiments on N% in prod and use end users for A/B testing. If something's broken we'll update!"
In several industries, it's actually totally expected that v1.0 of the application simply won't work at all. It's more important for these companies to ship non-working software than to miss the deadline and ship something that works! Because who cares? Users will bear the cost and update.
Once internet updates became the norm, it all became pretty much like the rest of software industry. (At least game companies still have QA departments, a lot of mainstream web companies have dispensed with those as well.)
One system has no patching, and updates incur some non-trivial amount of effort on the part of the installer. Releases are a few times a year, at most.
The other system has patching, updates are lighter weight, and as a result, the system has THOUSANDS of patches released over the last decade, north of 2 per work day.
Guess which system is higher quality? The former.
Much, much higher quality.
Warcraft II, from 1995 received multiple patches. So did many other games from that era.
Do you perhaps mean console games when you say "games"?
Software has always sucked and had these issues. It has nothing to do with automated QA. The reason you see more issues is 'way back in the day' your software had a very limited number of things that it did, and in general it did not involve accessing a network or chugging down massive volumes of data from untrusted sources.
I work for a company that has a lot of individuals who test for QA issues; they have lists miles long of things to check and write reports on.
The problem is more of "It's much easier to write mountains of code than it is to ensure that it works in all cases"
I agree with your last point a lot, though I would modify it slightly: it's much easier to write mountains of code now than it was, and it's now common and much easier to import external dependencies (especially at system level) than ever, and those dependencies have tens of millions of lines of mediocre code all by themselves.
Games are also massively more complicated today. It's one thing for three people to get a 2.5 MB single-player DOS game reasonably bug-free. It's an entirely different thing to do the same for a 5 GB game made by a team of 100 or 1,000 people.
30 years ago, software bugs might interfere with you professionally, but they wouldn’t stop your ability to get money from the bank, cook food, or do any other day to day tasks.
Yup, and in most cases this not only did not improve them, but made them less useful and more fragile. Let's be honest: the software is there only because it can save on manufacturing costs, and sometimes can be used for extra marketing benefit. No attention is being given to providing value to the customer.
Car engines are vastly improved in reliability, cleanliness, and efficiency by the introduction of computers into them. You might not like that when it goes wrong, but we all appreciate not breathing in pre-computerized car engine exhaust.
And that’s the rub. While shoddily written software shoehorned into cheap consumer goods obviously degrades the experience, there are tons of places where well written software has massively improved the quality of the goods that they’re added into. Objectively car engines are just better for the addition of software both in design and in operation. They’re smaller, more powerful, more reliable, cleaner burning, and more efficient than they were before we computerized them.
I disagree, subjectively it had its ups and downs and we are in a down phase right now. YMMV.
And despite the issues lamented in this think piece, it is _Apple_ that the author should blame for setting technology expectations impossibly high.
No organization has been remotely as successful in understanding and releasing tech products that were truly great.
Everything since is just a comparison to expectations Apple set. Even when Apple fails, it is in comparison with an Apple that does not.
There are a number of hats developers are expected to wear today:
1. Developer of new features
2. Sustainer of prior code and features
3. Tester of all of this
4. Constant student (outside work because who'd pay their employees to learn?)
The priority for the business is (1), so 2-4 get neglected. This compounds over time to mean that old code isn't properly refactored or rewritten when it should be, and none of the code is tested as thoroughly as it should be, and none but the smartest or most dedicated are really going to be perpetual students (or they'll choose to study things that interest them but don't help at work, like me).
When the old code and poor tests create sufficient problems, you get a business failure or a total rewrite. Which strips out half (or more) of the features and the whole process gets restarted.
5. The mentor of younger developers.
It's another thing some companies expect you to do, but don't allocate time for it, so at the point you finally know enough to not do a bad job, you're suddenly being pulled out of 1-4 and expected to do 5.
- Infrastructure design
- Monitoring / log-based debugging and fault tracing / handling customer issues
In addition to wanting a "full stack" developer of course...
At some point we have to accept that specialization isn't just for insects. It's helpful to have a proper sysadmin or DBA or whatever (appropriate to your domain) within the team, and not just diffuse those roles amongst the developers themselves.
The level of complexity in modern day software is orders of magnitude greater than that of even a decade ago.
What has changed is our reliance on that software. We are now so deeply embedded into our software existence we see these flaws up close.
Now, our computer ("phone") is with us everywhere we go and we'll use dozens of complex applications per day, connected by dozens of APIs, networks, protocols, and hardware features. It's a miracle any of it works sometimes! Thank you to everyone for making this stuff seem like magic; my twelve-year-old self would be amazed at how well it works.
I do feel like there are more UI bugs as we optimize for certain metrics over others. Timing updates has become far more complicated, so we get weird UI refreshes as new data comes in, stale caches, missed notifications, etc. Turning it off and on again often works, surprisingly (probably because devs start with clean environments often, so that's the functional baseline).
Lastly, it is probably far more lucrative for a technology based business to use their most valuable minds for the Next Thing, rather than iterating on the current thing. Incremental revenue improvements just don't cut it in a capital-driven world; everyone is trying to escape the local maxima to find billion/trillion dollar businesses.
But the worst part is that this is an issue with established products that have secured their market, and will even receive money every year from their customers. They have both the money and the time to pace themselves and test things properly, but they don't.
But not the motivation, because "it works" and improving UX would cost money and doesn't have an immediately visible return.
An MS bug in an enterprise piece of software (so they get paid every year for it) is Skype for Business. Perhaps my org hasn't updated their installation, but if I drag a contact from the chat window (say you message me and aren't in my contacts yet) to the contact list (in status view), it will reliably crash the program. If I drag it to groups view, it'll place the contact in a group. My guess is that when dragging to status view there is no "default" group, so some kind of null pointer exception occurs (it tries to add the contact, but with the group given as either junk or null).
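To make that guess concrete, here's a minimal sketch (in Python, purely illustrative; none of these names come from Skype's actual code) of the kind of unguarded lookup that crashes in one view but not the other:

```python
# Hypothetical sketch of the kind of bug I'm guessing at -- NOT Skype's
# actual code. Dropping a contact onto a view that has no notion of
# groups leaves the "target group" unset, and unguarded code crashes.

class Group:
    def __init__(self, name):
        self.name = name
        self.members = []

def group_under_cursor(view, position):
    """In 'groups' view this finds a real group; in 'status' view
    there are no groups, so it returns None."""
    if view == "groups":
        return Group("Coworkers")
    return None  # status view: nothing to fall back on

def on_contact_dropped(view, position, contact):
    target = group_under_cursor(view, position)
    # Buggy version: no None check, so this line blows up in status view
    # with the Python equivalent of a null pointer exception.
    target.members.append(contact)

on_contact_dropped("groups", (10, 20), "alice")    # works fine
# on_contact_dropped("status", (10, 20), "alice")  # raises AttributeError
```

The fix is the boring one-liner: bail out, or fall back to a default group, when `group_under_cursor` returns `None`.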
I used to see more bugs in Outlook, but it seems somewhat better with the last update so I haven't noticed them. Though it was fun when I had negative 2 billion messages for a week or so (I certainly didn't have enough to cause overflow so I have no idea how it wrapped around like that).
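For what it's worth, "negative 2 billion" is suspiciously close to the minimum of a signed 32-bit integer (-2,147,483,648), which hints that the counter's high bit got set somehow, rather than 2 billion messages actually being counted. A small Python illustration of that reinterpretation (my speculation about the mechanism, not a diagnosis of Outlook):

```python
# Illustration of 32-bit wraparound -- my guess at the mechanism, not a
# diagnosis of Outlook. A count stored as a signed 32-bit int whose high
# bit gets set lands near -2,147,483,648 ("negative 2 billion"), and a
# count decremented past zero becomes -1.
import struct

def as_int32(n: int) -> int:
    """Reinterpret an arbitrary integer as a signed 32-bit value."""
    return struct.unpack("<i", struct.pack("<I", n & 0xFFFFFFFF))[0]

print(as_int32(2**31))   # -2147483648: high bit set
print(as_int32(0 - 1))   # -1: a count decremented below zero
```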
Google's is an issue of usability of their webapps (IME), not strictly buggy but not sufficiently tested. Behind the proxy at work Maps is incredibly unreliable. It takes several reloads for it to actually start working "correctly", but don't change where you're looking too much (you can zoom in, but do not pan around). That's not the only unreliable one in this situation, but it is the most easily demonstrated.
Apple's Messages and Mail constantly tell me (they've been better the last few months, but it still happens) that I have unread messages, I'd search and search and never find them. Then I'd pull it up on a different device and finally see the unread message (which was both visible and marked as read on the original device).
Some of these may be shallow or seem petty, but it's an unpleasant experience that after so many years and with so much money should've been resolved for each of them. I'm willing to tolerate an indie game crashing on me. I'm not willing to tolerate an enterprise software solution (MS) crashing for a natural user behavior.
EDIT: I think they're resolved now, but Apple's iOS calculator bugs were annoying for such a simple program. Not strictly a bug, but Windows' calculator, these days, is an unusable mess in many ways. It shouldn't require so many system resources to add some numbers together (the same could be said of many small utilities that were rewritten for, I think, Windows 10 or Windows 8).
Gmail showing me email meant for <first><last>@gmail.com instead of <first>.<last>@gmail.com. The fact that they ignored the . at all in the user names. The Youtube app on iOS when I first installed it and quickly uninstalled it years ago was an incredible annoyance. It wouldn't reliably show me the video I'd actually clicked on that caused the app to open.
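The dot behavior is deliberate on Gmail's side: dots in the local part of a `@gmail.com` address aren't significant, so `first.last` and `firstlast` deliver to the same mailbox. A rough sketch of that canonicalization (my own illustration, not Google's code; the `+tag` stripping is the companion convention):

```python
# Sketch of Gmail-style address canonicalization -- illustrative, not
# Google's implementation. Dots in the local part are ignored and
# anything after "+" is a routing tag; both apply only to Gmail domains.

def canonical_gmail(address: str) -> str:
    local, _, domain = address.partition("@")
    if domain.lower() in ("gmail.com", "googlemail.com"):
        local = local.replace(".", "")    # dots are not significant
        local = local.split("+", 1)[0]    # strip "+tag" routing hints
    return f"{local.lower()}@{domain.lower()}"

print(canonical_gmail("First.Last+news@gmail.com"))  # firstlast@gmail.com
```

Which is exactly why mail meant for `<first><last>@gmail.com` lands in `<first>.<last>@gmail.com`'s inbox: from Gmail's point of view, they're the same address.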
Adobe is an almost perfect example of holding their market captive. I've run into a number of Adobe Acrobat issues over the years, though fewer recently (but I use it less often now). But especially Acrobat Reader on Mac OS X was awful, I actually once had to reinstall the OS to get it to stop fucking up PDF display even after uninstalling the software (I never found out what it had done to the system, and gave up). I needed it because I couldn't find anything else (at the time) that supported digital signatures in PDFs on the Mac (in the sense that it actually worked, I think Preview let me do it but what it made wasn't usable by the people receiving the file).
EDIT2: Another Google one, with Chrome on macOS in full screen. Hiding the location bar means you straight up can't get to it. You have to reenable it, versus a sane behavior like autohiding and moving the cursor to the top restoring it.
I think that product is cursed. It was always a dumpster fire, like IBM's Lotus Notes. It's there so their clients don't accidentally try Slack or any of the sane alternatives.
All products of MS, Apple, Google are in a constant churn mode. New design. New integrations with whatever platform changes happened in the back, new features to match the competitors, new trends, new mobile apps, new browser features, new framework.
Those products are not the real products and not the real cash cows, so they get very limited attention.
I agree all of these are horrible. GMail is still a slow piece of shit. I recently tried Thunderbird, and .. it slowed down too. Wtf. Slack is slow too. Typing has become slow for some reason in a lot of "apps", maybe too fancy fonts?
Anyway, these companies have huge opportunity costs. Just look at Google. They try whatever crosses their mind and nothing is good enough compared to "ads". And so they shut things down because of opportunity costs. (Not because of upkeep, but because then your attention is not on the next big thing, whatever that will be.)
Possibly because more companies are moving desktop apps to Electron so they can run JS everywhere.
Now we are in 2020 and iPhone updates STILL cause battery issues. The iPhone has been out for 13 years...
Our expectations have changed. Tools like Excel should just work - and yet, when I try to save a file, sometimes it freezes and crashes. How is that acceptable now?
1. Manual testing that should be manual (exploratory).
2. Manual tests that are new and haven't been automated yet (but will be).
3. Manual tests that should be automated.
(3) is the one many people see and suffer through (I know I have). They need to be automated to free up time for (1), which is where many issues are actually discovered. But if (3) dominates your time, you can never get to (1) and you'll constantly ship broken things (or more broken than they should be).