It doesn't matter!
The funny thing about human-computer interfaces is that they are used by humans. Humans are averse to change; ergo, a constantly-changing interface is a bad interface.
When you change something—be it a GUI or an API—you need to take into account not just whether the change is better, but also whether it's worth the very high inherent cost of change. I don't think the modern tech industry does this anywhere near enough.
My computer is a workstation, not a playground, and I don't need things randomly rearranged for me.
I resisted for a long time, because Linux seemed like a "hassle", but once I'd had to reconfigure everything several times after upgrades, I realized that I was essentially going through the same process, only on someone else's schedule and not by choice.
After trying out a couple of WMs and choosing one to become familiar with, I've NEVER AGAIN had that experience. Even if a distro has a different default setup, it's a matter of copying the config file I already have saved from a previous install.
What a load off my shoulders!
For what? Linux is even worse in that respect.
First there's the OS GUI (desktop manager, file manager, launcher, and config apps): GNOME, KDE, Canonical, and others all have a tendency to remake it (usually for the worse) every few years and with every major release. Then there's the fact that key features keep going missing in action (like image preview in the file selector in GNOME, IIRC).
And that's just the core desktop (launcher + config apps). It gets worse with regular apps: some are GTK-based and others KDE/Qt-based, some use bizarre or legacy frameworks, some ship custom cross-platform UIs that only roughly match the look of GTK apps, and so on.
Sure, as you did, you can always go with a minimal setup (just some terminals and a very simple window manager), and that will stay stable, but that's really retreating from UI, not a holy grail of stable UI. And it doesn't include your UI apps (which suffer from what I described above).
I've also tried Ubuntu, Debian, and a few others, and had to abandon them for the same reasons.
Eventually, I settled on [redacted], which is pretty raw and immature, but once I get it tweaked out, I can move the config file to another distro and have it work roughly the same.
Basically, I doubled down and committed to cutting out all the software which pulls that kind of shit, which means I no longer use Chromium or Firefox (except for testing), Gnome, etc.
I found a lightweight WM which seems to be relatively stable in "poor emulation of Windows 95 plus a few additions" mode, which is available across several different distros, and I keep a repo with my config for it so that I can readjust it from whatever the distro defaults it to.
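That config-repo workflow can be sketched in a few lines of shell. Note the `mywm` name, the `~/dotfiles` path, and the config keys below are invented placeholders for illustration, not any particular WM's real settings:

```shell
# Sketch of the "config repo" approach, assuming a hypothetical WM
# whose config lives in ~/.config/mywm -- substitute your own paths.
set -eu

DOTFILES="$HOME/dotfiles"      # hypothetical repo checkout
CONFIG="$HOME/.config/mywm"    # hypothetical WM config directory

# Keep the canonical config inside the repo...
mkdir -p "$DOTFILES/mywm"
printf 'border_width 2\nfocus_follows_mouse on\n' > "$DOTFILES/mywm/rc"

# ...and on a fresh install, replace the distro default with a symlink
# back into the repo, so every machine picks up the same settings.
mkdir -p "$HOME/.config"
rm -rf "$CONFIG"
ln -s "$DOTFILES/mywm" "$CONFIG"

echo "active config: $(head -n1 "$CONFIG/rc")"
```

Versioning `~/dotfiles` with ordinary git (not shown) then makes recovering from a distro's defaults a matter of re-running the symlink step.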
It's not a day's worth of work, and not even a year's, but now I'm fairly confident that when I sit down at my WORKSTATION, nothing has randomly changed in a way I can't get back to how it was before.
In fact, I've been able to get this experience while not buying or using any new devices, since I only use secondhand hardware for ethical/moral reasons.
To be fair, I don't think Windows has had this advantage over Linux for over 20 years. Maybe macOS has, but in Windows, almost every application is built with a weird custom proprietary toolkit. Even Office and Visual Studio, Microsoft's flagship Windows programs, are. That desktop Linux only has two major toolkits is a blessing. With the rise of UWP and official Microsoft Electron apps like VS Code and Teams, it's not even clear what the look and feel of a native Windows program is supposed to be anymore.
You're not, but then you're left behind on feature and security updates as well (which you do want), not just on useless UI changes (which you don't).
The trouble with that epistemology is that it makes it impossible to discern (or at least pointless to discuss) any qualities of a user interface except how popular the interface is or how much humans enjoy using the interface.
> When you change something—be it a GUI or an API—you need to take into account not just whether the change is better, but also whether it's worth the very high inherent cost of change.
This sounds dangerously close to the awful epistemology that is the precautionary principle. Both are saying that if there is some proposal for which we don't have any good knowledge about its potential downsides, we ought to reject the proposal. Your variation seems to be that we should assign the proposal some baseline cost (and thus reject the proposal unless we have knowledge that the benefit exceeds the cost). I simply don't think there is any way of establishing that baseline cost.
But I believe that UI updates, when they happen, should be infrequent, well-considered, and purposeful, with a bias towards leaving things as is. What really bothers me is when feedback is considered invalid because it's assumed to reflect some natural aversion to change. Maybe it does—but that pain your users are feeling is real!
I almost wish software updates still had to be distributed on discs, so that vendors would be required to think and plan instead of essentially performing perpetual experiments on their customers.
In this regard, I find the X Window System and Linux WMs strike a good balance. They allow me to choose Xfce, or even i3, and they let most of the bad bits of modern computing (notifications, desktop sounds, panels, popups, window-fiddling, desktop icons, shortcuts, ads, whatever they'll come up with next) just go away and not torment me anymore.
Has anyone ever counted the number of times that Firefox has changed its UI? I would bet they have people working on a new one right now.
What do we call it when people keep "fixing" things that are not broken? When people misuse the word "broken" to signify their dislike for something?
If we dig down deep enough in the source code of things, I am sure we can find some interfaces that do not change much.
Because it duplicates options, separates the users in two camps, adds 2 (or more if you keep doing it) interfaces for the dev team to keep supporting, and so on...
IME, observing other users, it is the forced change that is offensive to some. They do not complain about "lack of support" (many want to be self-sufficient and do not want to keep asking for "support"); they complain about changes that they see as unnecessary and that will just require more effort to learn.
That's not an "ergo." Initially not liking something doesn't make it bad.
You seem to be extremely overestimating the cost of change. A few people can complain loudly for 24 hours, and then everyone's practically forgotten what the old UX was even like.
Kids can be averse to eating their vegetables; that doesn't mean we stop serving them. Similarly, when consumers actually benefit from changes, it's not always immediately apparent, so they complain until they realize that, son of a gun, the app is actually better/faster/easier now. And then they rarely go back and talk about how great the new way is; they just use it without comment.
Using "humans are averse to change" as an argument to not change is a pretty terrible way to go about things.
I'm not saying change for change's sake is good... but that people are actually happy for positive change even when it's small, even if they gripe about it at first. Remember, actions (continued usage) are louder than words.
(If people are leaving your app, on the other hand, it's probably not because of change in itself -- it's because the change was bad. We all make mistakes. Just fix it!)
How do you know how many have forgotten the old UX vs. simply given up after being dismissed or ignored? How do you distinguish between someone who has learned to like the change, and someone who is quietly enduring it or has left the conversation to migrate away from the problematic software?
I don't know a way to measure those two groups better than simply talking to them, but I can say with certainty that the latter exists. I have felt the impact of short-sighted, change-happy, dismissive UI development more than a few times.
This bit from the article resonates with me:
> What matters to [most regular people] is continuity and reliability. Again, this isn’t being change-averse. Regular users typically welcome change if it brings something interesting to the table and, most of all, if it improves functionality in meaningful ways.
I like continual improvement as much as the next creative mind, but I also have great respect for the users of my software and understand that disruption is frustrating and costly to them. I therefore avoid releasing workflow-breaking changes until I am fairly sure they make things objectively better and I can also provide backward compatibility and/or a smooth migration path to folks who need it. If I can't, then I seriously consider redesigning or discarding the change I had in mind.
To put it a different way: Once I have encouraged people to use my software, I feel a responsibility to them. It no longer belongs exclusively to me, and I am not necessarily the most qualified judge of what is "better".
I think you're projecting/generalizing here. I for one love change; I get bored of interfaces if they stay the same for too long, and stuff starts to "feel old". I install OS updates as soon as possible and I love to use the refreshed designs. The world around us changes constantly; it's not static. Fashion evolves, aesthetics evolve, and so do hardware and software design.
Now whether your preference for no change or my preference for change is more common in the population is a different question.
A change in UI must generally be so good as to completely overshadow the costs of change. Hence, most changes will NOT pass that high bar, and are thus bad changes.
Unless some UI/UX feature you're considering received overwhelming support from your users (both regular and power-users), it's probably not worth doing and you should be directing your efforts anywhere else.
And as far as dumbed-down tools, it really feels like literally no one inside Apple even uses their tools (or they have access to better stuff and don’t care). For instance, “Console” - once the simplest imaginable concept with an implementation that everyone would expect - is now a pointless mess. An app that should have obvious behavior and usability is now just confusing every time I open it, leaving me feeling defeated because I cannot even find what I need. They messed it up and moved on, like many other things.
Would you be willing to elaborate, or link to articles supporting this? I'm sure that they have them, but since I swap between windows and mac daily I might just not be noticing them.
A collection of comments on reformatted alert messages: .
Comment on icons, translucency and other quirks: .
Not to mention posts on Reddit, Twitter, blogs, etc. about lots of these UI quirks.
I so intensely hate this design trend of hiding UI elements until you move the mouse cursor into a “magic” area that makes them appear. I work a lot with tools and websites that do this and it’s simply maddening.
I am not too sure about that decision. I increasingly think that those who are stuck with, or comfortable with, a PC or Mac are a very different group of users from those who use an iPad and iPhone. Most of the latter only use a PC when they are forced to, such as for work.
But then a lot of the UX design experts left when Jony became CDO, and there were clashes during the iOS 7 era. And it is the same story with the current design chief, who comes from an industrial design background with little to no software UX expertise.
It seems it's a trend to make everything require more clicks for 'cleanliness'. In the end, it just makes everything frustrating. You end up having to add search to make anything usable. Windows 10 and Android settings are both great examples of this.
Eventually the problem did go away because of larger monitors with higher resolutions. But, well, nobody would ever think about going back to those ugly interfaces!
Oh, and count me as another previous Namecheap customer that isn't anymore mostly because of their interface. "Mostly" because I had one domain that needed DNSSEC when they didn't support it, and that's what started my migration, but what made me move everything was the interface.
Some of us would like to get our toolbars back.
The old ugly interfaces were perfectly good for simple software. We could use something better on the more complex kinds of software, but what we have today isn't it.
The #1 rule I used to follow is:
"I am on this control; what can I do here, and is it clear?"
The #2 rule is:
"What is my next step from rule #1, and is it easy to find where I currently am?"
Break either of those and the GUI will be confusing. Disappearing controls make you think something is wrong. I would typically go with a disabled control and a clear indicator of the something else that needs to be filled out. If that was not achievable, a tooltip on the disabled control to tell the user what to do.
The way to achieve it is to pretend you are on 2 hours of sleep and not very computer savvy.
Now, you can go too far and make the whole thing a busy mess, and then it is unclear where to even begin. A good rule of thumb there is left to right, top to bottom, and keep scrolling to a minimum. Most people get that. One page I was using just yesterday jumped around, and the 'finish' control was in the top left, scrolled off the page.
Namecheap's old UI was perfect: info-dense, and you could do everything. Now they try to hide everything useful from me in the name of looking like every other hipster-inspired UI. It's not bad enough that I've bothered moving off them, but I've stopped using Namecheap for new names simply because of how much I hate their new UI.
All they had to do was literally nothing and I'd still be a happy customer.
This is such an ignorant and oversimplified argument. The only reason one could assume that it’s the designers at Apple who “need to show impact” instead of PR project management shenanigans is if you’ve never worked with design at a large scale. I’m pretty sure some random designer at Apple has already redesigned something like Mission Control (which has been pretty much unchanged in the last few OS iterations) but it just isn’t the right time to launch said upgrade so it just sits on the back burner for years.
Had GNUStep or something like it been able to marshal the resources that have been poured into multiple rewrites of Gnome & KDE, I think we’d have the basis for the low-churn system we wish Apple had provided us.
Maybe if someone like this had come along 20 years ago things would be different https://github.com/trunkmaster/nextspace
If that's the case, then OS developers should spend less time making it fit some designer's aesthetic of beauty and instead make it super malleable, so users can make it look pleasing to themselves instead.
Not surprisingly, 90s OSes did that better too.
As designers and software engineers, we should strive to deliver a good out-of-the-box experience.
But we should also try to make it simple to reuse our work: stick to standards, provide APIs, and make sure theming and extension are as simple as possible.
Yes, this leads to complexity, but if people could do it in 1995 surely we can do it in 2021 as well?
The author of the article is on to something with regards to Apple focusing on what demos well rather than what is good to use.
Apple has always been focused on superficial interface changes. Every single release of OS X has had minor, sometimes major, revisions to the interface.
Remember when Sheets were gonna be a thing? And Drawers? And Metal (the UI, not the GL)?
Show me an OS release and I'll show you things Apple moved around.
Go back further, and you'll see it's always been the case.
System 7 was a massive change from System 6. Mac OS 8 was a HUGE departure from System 7. Mac OS 9 had substantial changes from Mac OS 8.
This whole "remember the good ole days when nothing changed?" schtick people pull to make bad points is insanely frustrating if you actually pay attention to things as they change.
That's exactly how I feel about the crappy design changes that Windows and macOS have been increasingly forcing on us. We get used to it not because we like it, but often because there is no other way.
As a long-time Windows user who migrated to macOS, I find this more true of macOS now (it really pisses me off that you still can't maximise a window consistently on macOS). Both get a lot of things right, but I feel Windows UI still has an edge (if you choose to ignore Windows 8). (For the Apple lovers: I say this confidently after I deliberately chose not to customize macOS to be like Windows, and really embraced the whole "Apple way of doing things" to experience whether it is better. I can honestly say no, it isn't.)
But just as the Americans stubbornly decided to drive on the right when the British drove on the left, Windows and macOS too deliberately choose different UI paradigms and approaches more for business reasons than functional ones.
It's often irritating because sometimes it feels as if the changes in UI are just a way to try to differentiate the products without any improvement in usability - it took me nearly a year and half to get as comfortable with macOS as I was with Windows. But I still feel Windows UI is more productive, and less irritating, than macOS. (The only thing I like on the macOS desktop environment is the unified Menubar on top and Spotlight.)
I guess that’s 15% to 20% of the world’s population.
> First, we’re going to have a single OS strategy at Apple. We’re not going to have a dual or a triple or a quadruple OS strategy like some others. We’re going to have one OS, and that’s very important to us.
And twenty years later, Apple still doesn't have a single OS.
PLUS there were multiple versions of each of those.
There were actually rumors and concerns that, moving forward, Apple would do the same with the NeXT stuff. Because they did do the same with the NeXT stuff.
For a year, Apple had Mac OS 9 (Classic) and Mac OS X Server (which was based on NeXTStep).
Amazingly, saying, "We're just gonna ship one thing to everyone, and change the features depending on the distribution" was actually novel.
That same basic strategy persists today, since macOS, iOS, iPadOS and tvOS are all ostensibly the same operating system with different interfaces.
That seems a stretch. macOS, iOS, iPadOS, and tvOS seem like the Windows ME / Windows 2000 problem, but 100 times worse.
For instance, my mother searched for a COVID tracking/check-in app on the iPad App Store, because she only has an iPad, and it isn't even there despite running fine on iOS; the developer is expected to put in extra work to specifically support iPads, so they never bother.
Maybe there is some secret option somewhere to show iPhone apps, but she certainly can't find it, or even begin to understand why my Dad's iPhone can have the app but not her iPad.
That an application doesn't support iPadOS, but does support iOS is a developer choice, and has nothing to do with the underlying operating system.
All in all, I wholeheartedly agree with what's being said here. After Mojave was released, I got the feeling that Apple started to panic and began some unnecessary design regressions that really drove people like me away. They started gutting 32-bit apps and libraries, bloating one of the most space-efficient UIs on the market, and undoing a lot of the personalization options that I loved macOS for. One day, I hope that Apple can put this all aside and make a genuinely great computer. They're holding a lot of the cards, but I still can't recommend the M1 to others yet, much less integrate it into my own professional workflow.