Hacker News
Habits, UI changes, and OS stagnation (morrick.me)
87 points by naetius 15 days ago | 58 comments

> "Is this really bad UI, or is it just you who are averse to change?"

It doesn't matter!

The funny thing about human-computer interfaces is that they are used by humans. Humans are averse to change; ergo, a constantly-changing interface is also a bad interface.

When you change something—be it a GUI or an API—you need to take into account not just whether the change is better, but also whether it's worth the very high inherent cost of change. I don't think the modern tech industry does this anywhere near enough.

This is why I abandoned Mac and Windows...

My computer is a workstation, not a playground, and I don't need things randomly rearranged for me.

I resisted for a long time, because Linux seemed like a "hassle", but once I'd had to reconfigure everything several times after upgrades, I realized that I was essentially going through the same process, only on someone else's schedule and not by choice.

After trying out a couple of WMs and choosing one to become familiar with, I've NEVER AGAIN had that experience. Even if a distro has a different default setup, it's a matter of copying the config file I already have saved from a previous install.

What a load off my shoulders!

>This is why I abandoned Mac and Windows...

For what? Linux is even worse in that respect.

First, for the OS GUI (desktop manager, finder, launcher, and config apps): Gnome, KDE, Canonical, and others all tend to remake it (usually for the worse) every few years and every major release. Then there's the fact that key things keep going missing in action (like image preview in the file selector in Gnome, iirc).

And that's for the core desktop (launcher + config apps). It gets worse with regular apps. Some are GTK based and others KDE/Qt based, some use bizarro or legacy frameworks, some have custom cross-platform UIs that usually only sort of match the look of GTK apps, and so on.

Sure, as you did, you can always go with a minimal setup (just some terminals and a very simple window manager), and that will stay stable, but that's really retreating from UI, not a holy grail of stable UI. And it doesn't include your UI apps (which suffer from what I described above).

You're right, Linux distros are not immune to this, and I've omitted a portion of my journey in my summary.

I've also tried Ubuntu, Debian, and a few others, and had to abandon them for the same reasons.

Eventually, I settled on [redacted], which is pretty raw and immature, but once I get it tweaked out, I can move the config file to another distro and have it work roughly the same.

Basically, I doubled down and committed to cutting out all the software which pulls that kind of shit, which means I no longer use Chromium or Firefox (except for testing), Gnome, etc.

I found a lightweight WM which seems to be relatively stable in "poor emulation of Windows 95 plus a few additions" mode, which is available across several different distros, and I keep a repo with my config for it so that I can readjust it from whatever the distro defaults it to.
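The "repo with my config" workflow described above can be sketched roughly like this. This is a hypothetical example under assumptions: "mywm" is a made-up WM name, and the repo location and paths are placeholders, not anything the commenter actually named.

```shell
#!/bin/sh
# Sketch: after a fresh install, replace whatever the distro shipped with
# the WM config saved in a personal dotfiles repo. "mywm" and all paths
# here are invented examples.
set -eu

DOTFILES="${DOTFILES:-$HOME/dotfiles}"   # local clone of the config repo
CONFIG="$HOME/.config/mywm"              # where the (hypothetical) WM looks

# Make sure both sides exist (the repo would normally be git-cloned first).
mkdir -p "$DOTFILES/mywm" "$HOME/.config"

# Keep the distro's default around as a backup, then symlink in the
# saved config so the same file travels between installs.
if [ -e "$CONFIG" ] && [ ! -L "$CONFIG" ]; then
    mv "$CONFIG" "$CONFIG.distro-default"
fi
ln -sfn "$DOTFILES/mywm" "$CONFIG"

echo "mywm config -> $(readlink "$CONFIG")"
```

With the config living in one repo, "readjusting from whatever the distro defaults it to" is a single re-run of this script rather than a manual reconfiguration.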

It took more than a day's worth of work, though not a year's, but now I'm fairly confident that I can sit down at my WORKSTATION without finding that something has randomly changed and can't be put back to how it was before.

In fact, I've been able to get this experience while not buying or using any new devices, since I only use secondhand hardware for ethical/moral reasons.

> And that's for the core desktop (launcher + config apps). It gets worse with regular apps. Some are GTK based and others KDE/Qt based, some use bizarro or legacy frameworks, some have custom cross-platform UIs that usually only sort of match the look of GTK apps, and so on.

To be fair, I don't think Windows has had this advantage over Linux for over 20 years. Maybe macOS has, but in Windows, almost every application is built with a weird custom proprietary toolkit. Even Office and Visual Studio, Microsoft's flagship Windows programs, are. That desktop Linux only has two major toolkits is a blessing. With the rise of UWP and official Microsoft Electron apps like VS Code and Teams, it's not even clear what the look and feel of a native Windows program is supposed to be anymore.

Do you miss out on a lot, or are you otherwise forced to upgrade the OS GUI or WM? As far as I can tell, there are a ton of forks of Gnome 2 specifically because of dislike of Gnome 3.

>Do you miss out on a lot, or are you otherwise forced to upgrade the OS GUI or WM?

You're not, but then you're left behind on security and functionality updates as well (which you do want), not just on useless UI changes (which you don't).

> The funny thing about human-computer interfaces is that they are used by humans. Humans are averse to change; ergo, a constantly-changing interface is also a bad interface.

The trouble with that epistemology is that it makes it impossible to discern (or at least pointless to discuss) any qualities of a user interface except how popular the interface is or how much humans enjoy using the interface.

> When you change something—be it a GUI or an API—you need to take into account not just whether the change is better, but also whether it's worth the very high inherent cost of change.

This sounds dangerously close to the awful epistemology that is the precautionary principle. Both are saying that if there is some proposal for which we don't have any good knowledge about its potential downsides, we ought to reject the proposal. Your variation seems to be that we should assign the proposal some baseline cost (and thus reject the proposal unless we have knowledge that the benefit exceeds the cost). I simply don't think there is any way of establishing that baseline cost.

Those are actually great points! I certainly don't want to hand-wave away any and all UI discussions (particularly as I personally enjoy discussing UI design).

But I believe that UI updates, when they happen, should be infrequent, well-considered, and purposeful, with a bias towards leaving things as is. What really bothers me is when feedback is considered invalid because it's assumed to reflect some natural aversion to change. Maybe it does, but the pain your users are feeling is real!

I almost wish software updates still had to be distributed on discs, so that vendors would be required to think and plan instead of essentially performing perpetual experiments on their customers.

This is exactly right. Producers get hyper-optimized on change, while users want either no change or obvious improvements. The incentives and goals of producers rarely align with those of every individual user.

In this regard, I find Linux X11 and WMs to strike a good balance. It allows me to choose Xfce, or even i3. It lets most of the bad bits of modern computing (notifications, desktop sounds, panels, popups, window-fiddling, desktop icons, shortcuts, ads, whatever they will come up with next) just go away and not torment me anymore.

Why not, instead of "change", just make a new interface and give users the option to switch if they want? This approach seems too rare.

Has anyone ever counted the number of times that Firefox has changed its UI? I would bet they have people working on a new one right now.

What do we call it when people keep "fixing" things that are not broken? When people misuse the word "broken" to signify their dislike for something?

If we dig down deep enough in the source code of things, I am sure we can find some interfaces that do not change much.

>Why not, instead of "change", just make a new interface and give users the option to switch if they want? This approach seems too rare.

Because it duplicates options, splits the users into two camps, adds two (or more, if you keep doing it) interfaces for the dev team to keep supporting, and so on...

Give users the option to keep using the "familiar" interface, but without any "support". I suspect many users actually require "support" because of interface changes.

IME, observing other users, it is the forced change that is offensive to some. They do not complain about "lack of support" (many want to be self-sufficient and do not want to keep asking for "support"); they complain about changes that they see as unnecessary and that will just require more effort to learn.

The UX staff have to do *something* while they’re employed. If their training is best suited to the development of brand-new products over maintaining established ones, and if they want to enjoy both job security and career advancement, the UI is in a constant state of flux. On the business side, I think there might be the belief that these updates could have some competitive or PR value.

> Humans are averse to change; ergo, a constantly-changing interface is also a bad interface.

That's not an "ergo." Initially not liking something doesn't make it bad.

You seem to be extremely overestimating the cost of change. A few people can complain loudly for 24 hours, and then everyone's practically forgotten what the old UX was even like.

Kids can be averse to eating their vegetables, it doesn't mean we stop giving them to them. Similarly, when consumers actually benefit from the changes, it's not always immediately apparent, so they complain until they realize that son of a gun, the app is actually better/faster/easier now -- and then they rarely go back and talk about how great the new way is, they just use it without comment.

Using "humans are adverse to change" as an argument to not change is a pretty terrible way to go about things.

I'm not saying change for change's sake is good... but that people are actually happy for positive change even when it's small, even if they gripe about it at first. Remember, actions (continued usage) are louder than words.

(If people are leaving your app, on the other hand, it's probably not because of change in itself -- it's because the change was bad. We all make mistakes. Just fix it!)

> A few people can complain loudly for 24 hours, and then everyone's practically forgotten what the old UX was even like.

How do you know how many have forgotten the old UX vs. simply given up after being dismissed or ignored? How do you distinguish between someone who has learned to like the change, and someone who is quietly enduring it or has left the conversation to migrate away from the problematic software?

I don't know a way to measure those two groups better than simply talking to them, but I can say with certainty that the latter exists. I have felt the impact of short-sighted, change-happy, dismissive UI development more than a few times.

This bit from the article resonates with me:

> What matters to [most regular people] is continuity and reliability. Again, this isn’t being change-averse. Regular users typically welcome change if it brings something interesting to the table and, most of all, if it improves functionality in meaningful ways.

I like continual improvement as much as the next creative mind, but I also have great respect for the users of my software and understand that disruption is frustrating and costly to them. I therefore avoid releasing workflow-breaking changes until I am fairly sure they make things objectively better and I can also provide backward compatibility and/or a smooth migration path to folks who need it. If I can't, then I seriously consider redesigning or discarding the change I had in mind.

To put it a different way: Once I have encouraged people to use my software, I feel a responsibility to them. It no longer belongs exclusively to me, and I am not necessarily the most qualified judge of what is "better".

And it's something Microsoft has completely lost sight of with Windows 11. I guess they don't do any UI/UX testing with power users anymore. Explorer now has TWO context menus, because? I don't even.

> Humans are averse to change;

I think you're projecting/generalizing here. I for one love change, I get bored of interfaces if they stay the same for too long, stuff starts to "feel old". I install OS updates as soon as possible and I love to use the refreshed designs. The world around us changes constantly, it's not static. Fashion evolves, aesthetics evolve, and hardware and software design evolves too.

Now whether your preference for no change or my preference for change is more common in the population is a different question.

Great, then there's no reason to ask the original question! Just take feedback as feedback. If your users say they're annoyed, they probably are.

I'm very glad to see this line of reasoning being the top comment.

A change in UI must generally be so good, as to completely overshadow the costs of change. Hence, most changes will NOT pass that high bar, and are thus bad changes.

Unless some UI/UX feature you're considering received overwhelming support from your users (both regular and power-users), it's probably not worth doing and you should be directing your efforts anywhere else.

To be blunt, while the macOS 11 UI is not all bad, it contains very serious design mistakes, ruining things that everyone has no choice but to use (like notifications and alerts). Somehow Apple doesn't get how terrible this trend is, and that it is damaging their entire brand. I can only assume the hierarchy at Apple is broken to the point where the people in charge are either unreachable or unresponsive to feedback, while having entirely too much control.

And as for dumbed-down tools, it really feels like literally no one inside Apple even uses their own tools (or they have access to better stuff and don't care). For instance, "Console" - once the simplest imaginable concept with an implementation that everyone would expect - is now a pointless mess. An app that should have obvious behavior and usability is now just confusing every time I open it, leaving me feeling defeated because I cannot even find what I need. They messed it up and moved on, like many other things.

> macOS ... contains very serious design mistakes

Would you be willing to elaborate, or link to articles supporting this? I'm sure that they have them, but since I swap between windows and mac daily I might just not be noticing them.

For notifications, here[1] is a pretty good overview of the problem.

A collection of comments on reformatted alert messages: [2].

Comment on icons, translucency and other quirks: [3].

[1] https://tyler.io/240-invisible-pixels/

[2] https://mjtsai.com/blog/2020/07/03/big-surs-narrow-alerts/

[3] https://medium.com/macoclock/why-apple-has-broken-some-basic...

Not to mention posts on Reddit, Twitter, blogs, etc. about lots of these UI quirks.

> Worse, though, Big Sur hides the (X) until you mouse over the literal bounds of the banner - not even the area where the hidden (X) will appear is initially valid.

I so intensely hate this design trend of hiding UI elements until you move the mouse cursor into a “magic” area that makes them appear. I work a lot with tools and websites that do this and it’s simply maddening.

I don't think they are design mistakes; rather, it's clear the whole macOS 11 UI was designed with the billion-plus iOS/iPadOS users in mind, not regular macOS users: the possibility of them buying into the Mac ecosystem, from consumers to enterprise.

I am not too sure about that decision. I am increasingly thinking that those who are stuck with, or comfortable on, a PC or Mac are a very different group of users from those who use iPads and iPhones. Most of them only use a PC when they are forced to, such as for work.

But then a lot of the UX design experts left when Jony became CDO, and there were clashes during the iOS 7 era. And similarly, the current design chief is from an industrial design background, with little to no software UX expertise.

>But making previous features or UI elements less discoverable because you want them to appear only when needed (and who decides when I need something out of the way? Maybe I like to see it all the time) — that’s not progress.

It seems it's a trend to make everything require more clicks for 'cleanliness'. In the end, it just makes everything frustrating. You end up having to add search to make anything usable. Windows 10 and Android settings are both great examples of this.

It's a reaction to the Word-like screen-filling toolboxes of the late 90's. And while the toolbox not taking your entire screen would be a real improvement, what we got was a smaller set of larger buttons with plenty of negative space to make them look nice, taking approximately the same area as before, but with added clicks and discoverability problems.

Eventually the problem did go away because of larger monitors at higher resolutions. But well, nobody would ever think about going back into those ugly interfaces!

Oh, and count me as another former Namecheap customer who left mostly because of their interface. "Mostly" because I had one domain that needed DNSSEC when they didn't support it, and that's what started my migration, but what made me move everything was the interface.

> But well, nobody would ever think about going back into those ugly interfaces!

Some of us would like to get our toolbars back.

Well, that part was sarcastic... The new space-wasting interfaces are worse in many ways, and better in none.

The old ugly interfaces were perfectly good for simple software. We could use something better for the more complex kinds of software, but what we have today isn't it.

Well said.

The #1 rule I used to follow is: I am on this control; what can I do here, and is it clear? The #2 rule is: what is my next step from rule #1, and is it easy to tell where I currently am?

Break either of those and the GUI will be confusing. Disappearing controls make you think something is wrong. I would typically go with a disabled control and a clear indicator that something else needs to be filled out. If that wasn't achievable, a tooltip on the disabled control telling the user what to do.

The way to achieve it is to pretend you are on 2 hours of sleep and not very computer savvy.

Now you can go too far and make the whole thing a busy mess, and then it is unclear where to even begin. A good rule of thumb for that is left to right, top to bottom, and keep scrolling to a minimum. Most people get that. One form I was using just yesterday jumped around the page, and the 'finish' control was in the top left and had scrolled off the page.

"More clicks for cleanliness" nails it exactly.

Namecheap's old UI was perfect: info dense, and you could do everything. Now they try to hide everything useful from me in the name of looking like every other hipster-inspired UI. It's not bad enough that I've bothered moving off of them, but I've stopped using Namecheap for new names simply because of how much I hate their new UI.

All they had to do was literally nothing and I'd still be a happy customer.

> Now you have overpaid “““designers””” that need to show “““impact””” every year, so they have to reinvent the wheel over and over.

This is such an ignorant and oversimplified argument. The only reason one could assume that it’s the designers at Apple who “need to show impact” instead of PR project management shenanigans is if you’ve never worked with design at a large scale. I’m pretty sure some random designer at Apple has already redesigned something like Mission Control (which has been pretty much unchanged in the last few OS iterations) but it just isn’t the right time to launch said upgrade so it just sits on the back burner for years.

As a software engineer, who hasn't felt like "I definitely know how to make everything better by rewriting the whole thing!"? UI/UX designers feel the same way. It's never the answer, but it's not the end of the world if someone does it; after some time and feedback the product becomes usable again.

I feel like Apple has generally done a good job containing churn to the superficial parts of the OS. That said, I wish the Linux/BSD mindshare had chosen to invest its desktop efforts into creating a rock solid OpenStep implementation.

Had GNUstep or something like it been able to marshal the resources that have been poured into multiple rewrites of Gnome & KDE, I think we'd have the basis for the low-churn system we wish Apple had provided us.

Maybe if someone like this had come along 20 years ago things would be different https://github.com/trunkmaster/nextspace

Ironically, the thing that first got me really into Linux (which I consider a functional and pragmatic system at its core) was having lots of window managers, some of which did frivolous things like having waves lapping on my desktop. I think there are worse things than operating systems looking pretty. Your computer might be a tool, mine’s the spaceship I spend most of my life inside.

> Your computer might be a tool, mine’s the spaceship I spend most of my life inside.

If that's the case, then OS developers should spend less time making it fit some designer's aesthetic of beauty and instead make it super malleable, so users can make it look pleasing to them instead.

Not surprisingly, 90s OSes did that better too.

I'm reminded that Windows 3.1 had dark mode in the early 1990s, and not simply inverting colors. Theming was so easy compared to today that it's embarrassing.

Heck, the main reason I originally figured out how to root my high school's 3.1 setup was so I could enable the "black leather jacket" theme.

But the designers really, really want that aesthetic: https://stopthemingmy.app/

IMO: Perfect example of designers thinking they are something they are not:

As designers and software engineers, we should strive to deliver a good out-of-the-box experience.

But we should also try to make it simple to reuse our work: stick to standards, provide APIs, and make sure theming and extension are as simple as possible.

Yes, this leads to complexity, but if people could do it in 1995 surely we can do it in 2021 as well?

I felt a bit queasy reading that.

Absolutely. I love that about Linux: it offers the user choice. That's why I use i3, no frivolous waves for me!

At work I mostly use Linux, and at home mostly a 2017 MacBook Air. I do have a Windows PC that I put together in ~2012. I recently used the PC again because there was some Windows-only software I needed to use, and it was shocking going from Mac to that. Windows gets a lot of shit, but it felt way more responsive than the Mac. I think due to the lack of animations on everything.

The author of the article is on to something with regards to Apple focusing on what demos well rather than what is good to use.

It's like fashion. When is skeuomorphism coming back?

I do not understand what this author is talking about.

Apple has always been focused on superficial interface changes. Every single release of OS X has had minor, sometimes major, revisions to the interface.

Remember when Sheets were gonna be a thing? And Drawers? And Metal (the UI, not the GL)?

Show me an OS release and I'll show you things Apple moved around.

Go back further, and you'll see it's always been the case.

System 7 was a massive change from System 6. Mac OS 8 was a HUGE departure from System 7. Mac OS 9 had substantial changes from Mac OS 8.

This whole "remember the good ole days when nothing changed?" schtick people pull to make bad points is insanely frustrating if you actually pay attention to things as they change.

My friend was trying to pitch anal sex to me, and one of the things he said was - "Well, the anus might adjust and accommodate to you inserting stuff in it, but it doesn't mean it likes things shoved into it."

That's exactly how I feel about the crappy design changes that Windows and macOS have been increasingly forcing on us. We get used to it not because we like it, but often because there is no other way.

As a long-time Windows user who migrated to macOS, I find this more true of macOS now (it really pisses me off that you still can't maximise a window consistently on macOS). Both get a lot of things right, but I feel Windows UI still has an edge (if you choose to ignore Windows 8). (For the Apple lovers - I say this confidently after I deliberately chose not to customize macOS to be like Windows, and really embraced the whole "Apple way of doing things" to experience whether it is better. I can honestly say no, it isn't.)

But just as the Americans stubbornly decided to drive on the right when the rest of the world drove on the left, Windows and macOS too deliberately choose different UI paradigms and approaches more for business reasons than functional ones.

It's often irritating because sometimes it feels as if the changes in UI are just a way to try to differentiate the products without any improvement in usability - it took me nearly a year and a half to get as comfortable with macOS as I was with Windows. But I still feel Windows UI is more productive, and less irritating, than macOS. (The only things I like in the macOS desktop environment are the unified menubar on top and Spotlight.)

Option-click the zoom button in the top left of the window to maximise without going into full screen. Everywhere except Britain and Japan drives on the right.

We're way off from the subject at hand, but driving on the left is not that rare. A number of former British colonies drive on the left too (e.g. Australia, India, and South Africa), plus a few countries such as Indonesia, Suriname, and Thailand (https://en.wikipedia.org/wiki/Left-_and_right-hand_traffic).

I guess that’s 15% to 20% of the world’s population.

My point was that sometimes these decisions are made for reasons other than safety or practicality, just as Windows keeps its window controls in the top-right corner while macOS keeps them in the top-left. Driving on the left or right seems to be more of a political decision (dating to when the Americans fought the British), just as the UI choices on Windows and macOS sometimes seem to exist only to differentiate the products and make it difficult for users to easily migrate to the other platform.

India alone is ~15% of the world's population, and we drive on the left. Although the number of drivers may be significantly less as a percentage.

Try it with Safari - it doesn't work. Safari only maximises the window vertically and not horizontally. That is why I said there is no consistent way of maximising windows on macOS.

Bit odd to see that the author of this article, who seems so invested in all things Apple, still spells macOS incorrectly.

> First, we’re going to have a single OS strategy at Apple. We’re not going to have a dual or a triple or a quadruple OS strategy like some others. We’re going to have one OS, and that’s very important to us.

And twenty years later, Apple still doesn't have a single OS.

Jobs was specifically targeting Microsoft there. Back then, Microsoft had two distinct operating systems … Windows 2000 (based on the NT kernel) and Windows ME (based on Windows 9x and MS-DOS). Windows 2000 was the "professional" operating system and ME was the "home" version.

PLUS there were multiple versions of each of those.

There were actually rumors and concerns that, moving forward, Apple would do the same with the NeXT stuff. Because they did do the same with the NeXT stuff.

For a year, Apple had Mac OS 9 (Classic) and Mac OS X Server (which was based on NeXTStep).

Amazingly, saying, "We're just gonna ship one thing to everyone, and change the features depending on the distribution" was actually novel.

That same basic strategy persists today, since macOS, iOS, iPadOS and tvOS are all ostensibly the same operating system with different interfaces.

>That same basic strategy persists today, since macOS, iOS, iPadOS and tvOS are all ostensibly the same operating system with different interfaces.

That seems a stretch; macOS, iOS, iPadOS, and tvOS seem like the Windows ME / Windows 2000 problem, but 100 times worse.

For instance, my mother searches for a covid tracking/check-in app on the iPad App Store, because she only has an iPad, and it isn't even there despite running fine on iOS; the developer is expected to put in extra work to specifically support iPads, so they never bother.

Maybe there is some secret option somewhere to show iPhone apps, but she certainly can't find it, or even begin to understand why my Dad's iPhone can have the app but not her iPad.

Windows ME and Windows 2000 were fundamentally different operating systems, as they were built on different kernels.

That an application doesn't support iPadOS, but does support iOS is a developer choice, and has nothing to do with the underlying operating system.

It's pretty ironic to review Jobs' goals to "Make the next great personal computer operating system", since they precisely highlight some of the biggest issues with MacOS today. Apple is no longer shipping a "single OS", they're maintaining a series of progressive LTS releases that seems to be a middle-ground nobody can appreciate. MacOS's metaphorical "plumbing" is far from state-of-the-art, too: the networking APIs are continuing to be gimped, while the userspace continues to be overhauled in confusing, highly abstracted ways. Oh, and killer graphics? MacOS has the least-supported, most esoteric graphics interface available today. I've heard people say that it's easier to write graphics code for the Nintendo Switch, because at least that doesn't force you to use Metal.

All in all, I wholeheartedly agree with what's being said here. After Mojave was released, I got the feeling that Apple started to panic and began making unnecessary design regressions that really drove people like me away. They started gutting 32-bit apps and libraries, bloating one of the most space-efficient UIs on the market, and undoing a lot of the personalization options that I loved macOS for. One day, I hope that Apple can put this all aside and make a genuinely great computer. They're holding a lot of the cards, but I still can't recommend the M1 to others yet, much less integrate it into my own professional workflow.

Apple is still killing it in the hardware space, but I think the writing was on the wall for Apple software when Forstall was forced out.

... oh damn, that does match up to when I felt iOS abruptly lost its way and headed down the road to becoming a complicated, full, multitasking GUI OS, rather than a light & simple thing-becomer.

