
The Decline of Usability - arexxbifs
https://datagubbe.se/decusab/
======
Animats
Ubuntu got worse at 18.04. Logging in on desktop now requires "swiping up"
with the mouse to get the password box. The "swiping" thing is to avoid
problems with unwanted activation when the device is in your pocket. It's
totally inappropriate to desktops.

Then there's icon mania. I've recently converted from Blender 2.79 to Blender
2.82. Lots of new icons. They dim, they change color, they disappear as modes
change, and there are at least seven toolbars of icons. Some are resizable.
Many icons were moved as part of a redesign of rendering. _You can't Google
an icon._ "Where did ??? go" is a big part of using Blender 2.82. Blender
fanboys argue for using keyboard shortcuts instead. The keyboard shortcut
guide is 13 pages.

Recently I was using "The Gimp", the GNU replacement for Photoshop, and I
couldn't find most of the icons in the toolbar. Turns out that if the toolbar
is in the main window (which is optional), and it's too tall to fit, you can't
get at the remaining icons. You have to resize the toolbar to give it more
space. Can't scroll. There's no visual indication of overflow. It just looks
like half the icons are missing.

(I have no idea what's happening in Windows 10 land. I have one remaining
Windows machine, running Windows 7.)

~~~
kristopolous
Gimp is a great example. If you look up screenshots from the late 90s of gimp
1.0 you think

"Hey wow, that looks pretty great! I know where the buttons are, I can quickly
scan them and it's clear what they do! It isn't a grey on grey tiny soup, they
are distinct and clear, this is great. When is this version shipping? It fixes
everything!"

Apparently almost everyone agrees, but somehow we're still going the wrong
way. What's going on here? Why aren't we in control of this?

~~~
loopz
Krita matured nicely over the years and last time I found it quite easy to
use.

UI is hard. It got replaced by "UX", but nobody agrees on what that really is,
so it boils down to whatever impracticality designers dream up. When UI was
easy, there was real research and data backing up claims of improvement, and
rules were laid down to enforce some consistency. That became "unfashionable"
and was abandoned.

~~~
xg15
My impression is that modern UX is data-driven alright, it just follows
radically different paradigms and goals.

It's no longer about presenting consistent mental models at all; it's solely
about the ease or difficulty with which particular isolated tasks can be
performed.

It's also not automatically the goal to make all tasks as easy as possible.
Instead, discoverability and "friction" are often deliberately tuned to
optimize some higher-level goals, such as retention or conversion rates.

This is why we have dialogs where the highlighted default choice is neither
the safe one nor the one expected by the user, but the one the company
would like the user to take (e.g. "select all" buttons in GDPR prompts, or "go
back" buttons when I want to cancel a subscription).

You can see this quite often in browsers as well, often even with good
intentions: Chrome, for a time, still allowed installing unsigned
extensions but made the process deliberately obscure, and in both Chrome and
Firefox, options are deliberately placed in easy- or hard-to-discover
locations (e.g. a toggle on the browser chrome, vs. the "settings" screen, vs.
"about:config", vs. group policies).

~~~
kristopolous
Data-driven UX seems to put all users in a single bucket.

I will readily admit in collective number of clicks and screentime, 37 year
old men with advanced degrees in computer science are a super small minority.

But who is the majority then? Who spends the most time on say Reddit and
YouTube? Children! Yes, people who we know are dramatically cognitively
different than adults.

Why does YouTube keep recommending videos I've watched? That's what a child
wants! Why does Reddit's redesign look like Nickelodeon?

There isn't one user and one interface that's right for everyone when we're
talking about 5 year olds, 50 year olds, and 95 year olds.

We can make interfaces adapt to the screen; we should also work to make them
adapt, at fundamental interaction levels, to the person using the screen.

And not in a clever way, but in a dumb one.

For instance, here's how you could ask YouTube: "We have a few interfaces.
Please tell us what you like to watch:

* Cartoons and video games

* Lectures and tutorials

* Other "

And that's it. No more "learning", that's all you need to set the interface
and algorithms.
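Mechanically, that's nothing more than a lookup table. A sketch in Python, with made-up profile names and settings (not anything YouTube actually exposes):

```python
# Hypothetical sketch of the "one dumb question" approach: a single
# self-reported preference selects an interface profile outright,
# with no behavioural "learning" involved.
PROFILES = {
    "cartoons_and_games":     {"autoplay": True,  "recommend_rewatches": True},
    "lectures_and_tutorials": {"autoplay": False, "recommend_rewatches": False},
    "other":                  {"autoplay": False, "recommend_rewatches": False},
}

def interface_for(choice: str) -> dict:
    # Unknown answers fall back to the generic profile
    # instead of guessing from watch history.
    return PROFILES.get(choice, PROFILES["other"])

print(interface_for("lectures_and_tutorials")["autoplay"])  # False
```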

Take Wikipedia: it could be broken up into children's, general-public, and
scholarly versions. Some articles, I'm sure, are correct but way too wonky and
academic for me to understand, and that's OK. There's nothing to fix; I'm sure
they're a great tool for professionals. But there should also be a general-
public version.

~~~
majewsky
> here's how you could ask YouTube: "We have a few interfaces. Please tell us
> what you like to watch: [...]

This proposal quickly falls apart because your categories are ill-defined
based on your preconceptions. I watch a ton of lectures about video games on
Youtube (e.g. speed run breakdowns or game lore theories). Do I choose the
"Cartoons and video games" bucket or the "Lectures and tutorials" bucket?

~~~
kristopolous
Yeah, it was off the cuff. If you ask a 9-year-old online whether they're an
adult, some will say "yes". I guess it's their loss. Maybe a more direct
approach is better.

"We've found adults and teens like different parts of youtube and use it
differently. We want to make it the best for you. You can switch at any time,
but tell us what best describes you:

* I'm an adult

* I'm not an adult.

"

YouTube has this "for kids" app, which came out after I first started pointing
out this difference in earnest around 2013
([https://play.google.com/store/apps/details?id=com.google.and...](https://play.google.com/store/apps/details?id=com.google.android.apps.youtube.kids&hl=en_US)),
but it's not right, and they clearly still cater their main interface to the
habits of children, who watch the same video hundreds of times - the insane
repetition is part of learning nuance and subtlety in the context of content
they don't have to actually pay attention to. It's all about learning the
meta, super important. They know what happens; it's the silence in between
that they're excited about - that's the nature of play.

This app instead silos the kids into a Playskool interface, great for people
under 7 or so, but, like our playground reform, it's completely unappealing to
the 8-to-22-or-so demographic. (When I was a kid and there were ziplines into
a bank of tires, you can bet there were 20-year-olds lining up to have a good
time on those. We all have a need for play: freedom to err wrapped in relative
safety.)

Instead, it's data-driven UX for adults and data-driven UX for children - it's
about separating the data, not about a PTA-acceptable UX for overprotective
parents.

~~~
wolco
The best thing a parent could do is download a set of approved videos and use
a local playlist.

The easiest thing to do is just allow them on youtube no filter.

The middle ground is the kids app. Weird stuff sometimes gets through, but
usually it's just someone dressed as a pretend princess. The good thing is
that it's never a murder scene or something equally horrible (which could pop
up on youtube.com).

What would you do as a parent?

I would avoid YouTube, unless you set up the videos yourself, until age 7 or
11. After that it depends on the child.

------
red_admiral
It's not only that every app has a different style these days; some of
them change their style or add new features via auto-update every few weeks.
Even Office 365 (desktop version) does this.

It's not just a usability nightmare, it's an accessibility one too (although
the two go hand in hand most of the time). Imagine teaching some elderly
neighbour how to write a word document, and after weeks of practice they get
it into their muscle memory that the thing they use a lot is the 5th button
from the left... then Microsoft adds another one in the next update, so it's
now the 6th.

This would be one place where free software could really shine. You could
convert a lot of people with "every application works the same, and we promise
we won't change the UI more than once every two years unless we really have
to."

~~~
wwweston
If you think resume-driven development is bad for developers (and it is),
consider the career incentives for UI and product people. If there's a
"maintain" incentive, I'm not sure what it is. "Didn't change anything about
functional and satisfactory interfaces" may be a real value-add in some cases,
but it's not a sizzling narrative for selling yourself on the job market.

~~~
Ididntdothis
Very true. Most people get measured by how much churn they create - the more
the better. Even if it's 100% correct for the business, you're digging your
own grave if you leave things the way they are.

~~~
userbinator
Most, not all; if you have the right skills, you can find a comfortable job
maintaining legacy code and (actually) improving it. Then again, most
developers seem more interested in chasing new and shiny rather than polishing
a stable system.

~~~
Ididntdothis
"comfortable job maintaining legacy code"

That's a very dangerous career path, though. If that legacy system gets
replaced, you're usually out of a job, and a job search is hard with outdated
skills.

~~~
userbinator
To give you an idea of how reluctant they are to replace things, some
equipment in the manufacturing industry is over a century old and still in
continuous use. They will certainly be extremely unwilling to get on the
"upgrade treadmill" that seems to be getting faster these days.

Also, problem solving and creative thinking are never outdated skills. ;-)

------
oftenwrong
It is quite sad how often I end up having to help older relatives with their
computers on account of unintuitive UI. One memorable, recent example was my
mother, who could not figure out how to get her GMail sidebar to un-collapse
itself.

Here's a screenshot of a collapsed sidebar:

[https://storage.googleapis.com/support-forums-
api/attachment...](https://storage.googleapis.com/support-forums-
api/attachment/thread-2548880-18054002412364674965.png)

and a screenshot of an un-collapsed sidebar:

[https://techcrunch.com/wp-content/uploads/2019/02/RC-
Convers...](https://techcrunch.com/wp-content/uploads/2019/02/RC-Conversation-
View-On.png?resize=1536,960)

It took me some time to realise that it is the hamburger-menu-like icon in
the upper-left corner. It has a tooltip that says "Main Menu", but it is not a
menu: it toggles collapsing of the sidebar. Confusingly, it is positioned in
the top panel, separated by a line that suggests it is unrelated to the
sidebar and closer in affinity to the logo, search box, etc.

~~~
FridgeSeal
I just started a new job and they use Gmail for email; it has probably been
well over a year since I logged into Gmail on the web.

That sidebar hamburger button throws me for a loop every. Single. Time. For
some reason I keep thinking it'll bring up the other G Suite apps, but instead
the whole page shifts awkwardly and the sidebar disappears. "That was not
what I was expecting" is my reaction every time.

~~~
roboyoshi
You may want to try this one: [https://simpl.fyi/](https://simpl.fyi/)

~~~
saagarjha
The white-background screenshots give me borderline anxiety. It looks like
it's perpetually half-loaded.

------
EEMac
Part of this is because the sensors are winning.

Sensors (about 70% of the population) use an application by mapping: a click
_here_ does _this_. Based on literature and my experience with my husband,
maps are made separately for each application no matter what the similarities
are.

Most computer programmers are intuitives: we want things to work the same way
in one application as they work in another. That makes it easier for us to
learn new things.

But we're only 30% of the population. Blame whatever trend you want: phones,
touch screens, microcomputers, Eternal September . . . we've been increasingly
outnumbered and out-spent by the 70% as time goes on.

~~~
wbl
Please explain dd then.
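For anyone who hasn't met it: dd takes key=value operands (reputedly a nod to IBM JCL's DD statement) instead of normal Unix flags, so even a trivial copy reads like a different language:

```shell
# No -o/--output here: dd's operands are key=value pairs that
# no other common Unix tool uses.
dd if=/dev/zero of=/tmp/dd_demo bs=1 count=4 2>/dev/null

# The file now contains exactly four zero bytes.
wc -c < /tmp/dd_demo
```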

~~~
LargoLasskhyfv
Datenduplikator, data duplicate, duplicate data, dump data, data dump?

~~~
quickthrower2
Due diligence, Danny DeVito, double dare, direct debit?

------
Pmop
I'm young (25), but my first (family) computer ran Windows 98 (that's what we
could afford). I recall it well: UIs had one "meta-language": a menu bar
(File, View, ..., Help), a toolbar, and if you hovered your mouse pointer
over a widget, a tooltip would show up explaining what the widget does along
with its keyboard shortcut. Once you learnt how to use one application, say
Paint, you'd probably pick up any other quickly. There was also the always-
helpful Help window (F1) with rich explanations of everything you'd want to
know.

I find modern UIs awkward to use. Many applications have their own way of
doing the same thing other applications do, and oftentimes that way is badly
documented (tooltips are too ugly, I guess), so now you have to search the web
for help; the help you find is full of useless text, ads and browser-locking
JavaScript; and soon enough you find yourself longing for the Win 98-era UI.

~~~
mopsi
It's all still there in the Win32 microcosm, available for anyone to use. I'm
currently working on a pure Win32 application, and Microsoft's tips are
delightfully helpful for establishing a consistent design language across the
whole platform. They call it "powerful and simple":
[https://docs.microsoft.com/en-us/windows/win32/uxguide/how-t...](https://docs.microsoft.com/en-us/windows/win32/uxguide/how-to-design-desktop-ux)
The Win32 guidelines are also very illuminating:
[https://docs.microsoft.com/en-us/windows/win32/uxguide/guide...](https://docs.microsoft.com/en-us/windows/win32/uxguide/guidelines)

My favourite aspect of Win32 is built-in behaviour. Things like keyboard-based
navigation and screen readers work without needing any dedicated effort on my
part. Visual elements are drawn by the operating system and remain consistent
with system theme colors, font size, contrast and other user preferences. I
always think about that when I see "Now supports dark mode" in a changelog:
why don't they just use standard controls and leave all the finer details for
the OS to deal with? Windows has supported system-wide theming for decades.

~~~
userbinator
_Windows has supported system-wide theming for decades._

...and more recently has been castrating that ability greatly, seemingly in
favour of the horrible bland-and-flat trend, which is unfortunate because the
ability to customise is highly desirable for usability/accessibility.

(Long-time Win32 programmer here, I agree that the built-in behaviour of the
standard controls is highly consistent and also very usable.)

------
bob1029
For me the worst isn't even that the UI composition/presentation is bad. It's
that the performance of these interfaces is getting monotonically worse over
time. No one wants to take the time to learn the hard native UI development
processes, so we just wind up throwing some shitty Angular SPA into Electron
and calling it a day.

Those of us with non-sloth-like reflexes now have to experience torture as
every keypress or mouse click takes an extra 50-100 ms to register. Microsoft
even figured out a way to make things that didn't use to run like shit run
like total shit: of all things, mspaint is now somehow a little bit slower and
more annoying to work with. I don't know how they managed that one.

~~~
austincheney
The problems with reliance on a giant framework are many. Yes, a noticeable
decay of performance is one of those.

The biggest problem I see, though, is lost imagination. Usability concerns
seem all but forgotten unless a given framework deliberately provides a
dedicated convention for a specific concern.

Worse still, many developers reliant upon a giant framework absolutely cannot
imagine developing anything that is not the exact same SPA as their last SPA,
regardless of usability concerns or business requirements. It's the when-
you're-a-hammer-everything-looks-like-a-nail mentality combined with the most
myopic tunnel vision.

I used to have great disdain for large frameworks because they result in
degraded application performance, limited capabilities and substantial bloat.
Now I primarily loathe them because they have become the primary excuse for
weak, underdeveloped talent to never progress or mature. That weakness shows
in how any negative mention of their favorite framework draws contrived
hostility, often expressed by calling the target of that hostility _arrogant_,
without any consideration of the technical concerns raised.

~~~
ativzzz
Bad UI design has nothing to do with your coding framework of choice; it has
everything to do with the design. Most programmers are not good designers. I
cringed a bit at the background color of the OP's blog.

There's a reason most people are unimaginative: just like coding, design is
hard. Designing for multiple platforms (the whole point of using UI
frameworks is easier multi-platform development) and multiple screen sizes is
hard, and takes a long time both to design and to implement.

~~~
austincheney
Creativity and originality have tremendous amounts to do with solving UI
problems. Here is an example of developers literally lost without a framework
to tell them what to build:

[https://news.ycombinator.com/item?id=22470179](https://news.ycombinator.com/item?id=22470179)

I wish I were making that up.

There are many aspects of design that are challenging at first. I just watched
a video about inventing a new basketball backboard, and it took a lot of work:
[https://news.ycombinator.com/item?id=22898653](https://news.ycombinator.com/item?id=22898653)
There will always be some minimal thought-work in creating and testing any
creative or original concept, but with practice the effort involved shrinks
considerably. That some minimal effort is required (as with any task) is not
an excuse to avoid effort entirely. At some point it is just mental laziness.

~~~
ativzzz
Again, your choice of JS front end framework has nothing to do with the design
of your UI. You can code the same design with React, Angular, vanilla JS, QT,
etc. From what I've seen, most programmers are awful designers who
overestimate their ability to design "usable" software. In particular, we
forget that the vast majority of the time, we are not creating software for
programmers.

------
jl6
I find the entire Windows/Mac/Linux desktop experience has regressed terribly,
with inconsistency the primary offender.

I suspect this is because usability testing is only ever (a) app-specific and
(b) short term. Nobody is studying the collective desktop experience across
multiple applications, so every vendor thinks they have nailed it, but never
notices that their version of “nailed it” is different to everybody else’s.

The commercial nature of most desktop software would seem to render this
problem insoluble as there is no incentive for vendors to cooperate and every
incentive for them to churn their UIs to push new versions out.

~~~
Frost1x
Lack of transference is a feature for businesses. The more locked in you are
to a workflow or infrastructure, the less likely you are to switch.

It's a new world of lock-in. It's not in a business's interest to encourage
you to jump to competitors; it is, however, to their advantage if the
transition is difficult in any way. Consumers buy into it, and a lot of new
developers wanting to make their mark or do something new/different enable it.
It's not sexy to implement UI concepts that have worked for the past 20 years;
better to reinvent the wheel.

It's perfectly fine to improve the wheel, or reinvent it, if you can deliver
increased productivity. Instead, I now have so many UIs that getting through
what should be (and once were) simple workflows is like working through a box
of puzzles.

------
pcurve
We have an army of UX professionals these days. But I think engineers do
better UX than most so-called UX professionals, and I say this as a UX
professional myself.

We've invented a colossal industry with good intentions, but over the past 10
years I've seen it degenerate into a desperate bid for relevancy through
constantly introducing new things. (Things aren't much better in SE.)

Can we get back to just doing things? It is extremely frustrating working in
software design space, from start to finish. Everything sucks. Process. Tools.
Speed. Complexity. Politics. Pretending.

Is it just me?

~~~
omniscient_oce
I don't get these fake persona things I see in portfolios, on Behance, and so
on. Are they a legit thing, or just something taught in school that students
do and put in their portfolios but that isn't used in practice? I get the
basic idea, but a lot of the time it comes off as quite pretentious, or as
filler work.

~~~
pcurve
Not just personas. There's journey mapping, story mapping, design thinking,
StoryBrand exercises, DesignOps, formative/summative/generative research,
ethnographic study, service design, heuristics, lean UX, contextual inquiry.
I'm not saying these are all useless, but many are just old things
re-packaged. And I've noticed that UX people are rarely challenged by other
stakeholders, because no one can possibly keep up with what any of these mean.
It's part FUD and part obfuscation.

If you have a good handle on who your customer base is (current and target),
then no, you don't need personas; just use real data. Where I become
dumbfounded is when large companies with mountains of customer data and
complicated segmentation and profiles continue to rely on the same 5-6
personas.

~~~
danielscrubs
I worked at a company where they decided to have a consulting design firm do
the designs: 20 designers and 3 coders to do the actual work.

The money dried up really fast (they were extremely expensive) and the design
just sucked, but boy did they have meetings like nobody's business.

------
Semaphor
I had some reflexive reaction of wanting to disagree because there are also a
lot of things that got better. But inconsistency? Hell yes. It feels like
every company tries to run their own experiment, getting more and more
erratic, and apparently all getting great feedback from their users (not sure
if all the feedback systems are broken or something else is going on). Of
course, Microsoft, which in recent years started following the "roll a die for
which UI style we'll use today" paradigm, is one of the worst offenders.

~~~
outworlder
> But inconsistency? Hell yes. It feels like every company tries to run their
> own experiment

Get off my lawn!

More seriously, old apps were way worse, especially on Windows: as soon as
APIs for creating non-rectangular windows became available, everyone wanted
to use them, never mind that performance was horrible.

Even widely acclaimed apps had zero consistency with the OS. Remember Winamp?
[https://repository-
images.githubusercontent.com/26149893/956...](https://repository-
images.githubusercontent.com/26149893/956fcc80-612f-11e9-9c6a-cd120bc50de1)

Trillian? Microsoft's own MSN? [https://static.makeuseof.com/wp-
content/uploads/2009/10/Tril...](https://static.makeuseof.com/wp-
content/uploads/2009/10/Trillian-Views.jpg)

And frankly, every single printer, scanner, or motherboard utility. Some are
bizarre to this day.

We can't even say that Microsoft's own apps followed the rules. They were
among the first to break with the paradigms, mainly because they could ship
their own version of Microsoft's common controls library. This is how
detachable button bars came about, and the infamous ribbon.

~~~
bsdubernerd
I absolutely do remember this. Horrible. I think, during that time, Windows
was the absolute worst offender.

I always used windows/mac and linux together during that time.

Early versions of OS X on PPC were pretty consistent. I didn't particularly
like some of the "candy" design styles, but the UI guidelines seemed like a
breath of fresh air. Note that Apple themselves started to destroy the
consistency by introducing questionable things like brushed-metal windows and
sheets, and abusing all of them in iTunes first. Consistency went down pretty
fast.

Looking back, GTK2 for me represented the pinnacle of consistency under Linux.
As a toolkit it enforced resizable UIs (at a time when both OS X and Windows
used fixed-size dialogs all the time) and decent components, not to mention
that it supported system-wide themes to a degree never seen before. You could
even set Qt4 to render GTK2-style widgets.

I have to absolutely laugh when I see that apps today "support a dark mode",
where you could (and partly still can) switch THE ENTIRE UI to a dark theme in
seconds in gtk.
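For reference, a sketch of how little that took - the theme names here are just common examples; the rc line assumes plain GTK2 and the gsettings line assumes a GNOME-ish GTK3 desktop:

```shell
# GTK2: one line in ~/.gtkrc-2.0 re-themes every GTK2 app on next start.
echo 'gtk-theme-name = "Clearlooks"' >> ~/.gtkrc-2.0

# GNOME/GTK3: switch the whole desktop to a dark theme, applied live.
gsettings set org.gnome.desktop.interface gtk-theme 'Adwaita-dark'
```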

But I don't want to defend Linux either: this has regressed in GTK3 and Qt5 as
well. The internal support for skinnability with CSS has caused most UIs to
override the system theme irreparably. Many UIs ship with hard-coded themes
that you simply cannot change anymore, or that break horribly when switching
to a non-default theme. There are a ton of widgets with incredibly poor
consistency, often bringing in UI paradigms from phones that have _no_ reason
to exist on the desktop. Qt5's QML widgets are so bad I cannot even describe
how frustrated I am every time I see a good UI converted to downright crap for
"reasons".

Ubuntu keeps following the latest fads with absolutely zero consideration for
UI customization, consistency _and_ performance. We have LXDE, but it too
inherits all the inconsistency of the programs running on top of it, and since
it too builds on GTK, there's no escape in the long run.

Still, Android easily beats the crap out of all three.

It seems like nobody is even trying anymore, when even developer tools get
rewritten as Electron UIs with appalling performance and glaring bugs, yet
they receive praise (and excuses).

------
ridiculous_fish
The web puts us in a usability death-spiral. It's easy to use an onClick div
to make a beautiful pop-up menu, but harder to support much more than clicking
on an item. This in turn trains users to only click, which further erodes the
case for any sort of richer interaction.
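The gap is easy to see in code. A hedged sketch (invented names, with a plain object standing in for a DOM element): the click handler is the trivial part, and the keyboard activation a native button would give for free is exactly the part that gets skipped:

```javascript
// A styled div needs explicit keyboard wiring to match what a native
// <button> provides out of the box (Enter/Space activation, focus, etc.).
function makeMenuTrigger(onOpen) {
  const handlers = { click: onOpen }; // the easy, click-only path
  // The extra work usually left out of the onClick-div approach:
  handlers.keydown = (event) => {
    if (event.key === "Enter" || event.key === " ") onOpen();
  };
  return handlers;
}

// Simulate both interaction paths.
let opened = 0;
const trigger = makeMenuTrigger(() => { opened += 1; });
trigger.click();                   // mouse user
trigger.keydown({ key: "Enter" }); // keyboard user
console.log(opened); // 2
```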

This is bleeding into basic browser functions. Find and scroll bars are
routinely broken by the infinite scroll paradigm. Undo/cut/copy/paste are
broken in customized rich text editing. Eventually these features will atrophy
and fall off.

~~~
floren
If you want a vision of the future, imagine a finger scrolling on a
touchscreen -- forever.

~~~
entropicdrifter
How is that the future and not just a description of current social media apps
on mobile/tablet devices?

~~~
hyperdimension
Just in case someone doesn't get the parent post's quote, it's a riff on a
famous George Orwell quote: "If you want a vision of the future, imagine a
boot stamping on a human face - forever."

------
shureluck
I love this post. I have been screaming at my monitor now for several years.
Something shifted about 5 years ago in UX and everything has definitely gotten
worse since.

When Google Docs started to bring back a traditional menu display in the top
bar, I was so excited. Everything felt normal and it was easy to find what I
was looking for.

But the rest of Google Drive is a disaster. Sometimes buttons are in the upper
left, sometimes the lower left. Recently they moved the "Add new document"
button to the lower right, and I spent forever trying to find it. It is
infuriating that there is no intelligent person at a company of that size who
can put a stop to this crap.

I really hate to say this, but I think a lot of UX designers are trying to
justify their existence by reinventing things that have already been solved.

The reality is once you decide how a dropdown or a text input works on
desktop, there is very little reason to reinvent it. Ever.

Stop reinventing things, UX engineers: your small usability study with 3 of
your friends who got confused for 5 seconds is not a sign that you should
reinvent how to select things in a list.

/endrant

------
Wowfunhappy
I agree with 90% of this article. However, I differ on one point: as far as
I'm concerned, the "File, Edit, View" categories are anachronisms from another
era. They make sense in Microsoft Office†, but fail to cover the breadth of
software in use today.

I'm currently using Firefox (on OS X, where it still has a menu bar). The
first three options under "File" are "New Tab", "New Window", and "New Private
Window". Does it really make sense for any of those to be under "File"? I
understand, historically, why they ended up there—each document used to
correspond to a new file—but tabs fundamentally _are not files_.

I'll switch over to OS X's Messages app‡. The first two options under "File"
are "New Message" and "Open". The former starts a new conversation, and the
latter lets you attach a document to the current conversation. Those actions
aren't related at all, except in that they kind of relate to the concept of
the word "File", depending on which metaphor you're following.

So, I don't think there's anything wrong with mpv grouping its menus
differently from evince. They're doing different things and shouldn't have to
follow the same categories.

\---

† Which is definitely (sarcasm) why Microsoft Office decided to replace the
traditional menu with a ribbon. Again, I agree with most of this article.

‡ I'm running OS X 10.9; Apple may have made changes in newer OS's.

~~~
doubleunplussed
Tabs and conversations aren't files, but it is nice to have a grouping of
actions that create/restore/manage whatever primary context or data structure
the app deals with, as opposed to making changes to it (Edit) or modifying how
it's displayed (View). They're very logical categories; it's just that "File"
is no longer a general enough name.

------
dijit
I'm typically a person who would agree with an article like this; I think we
_lost_ something with the modern age, even if we gained a lot (especially in
terms of developer "velocity" - I hate that word).

However, I really feel like context is important. Computers today have a
context built up over time; users don't need so much hand-holding these days
because the expected paradigms have ever so subtly changed. New entrants to
computing understand these new paradigms innately because they are already
surrounded by the new context.

It's only when we look back we think how much usability has suffered.

Language is a good example of what I mean. Travel back 100 years and the
linguistic choices made then would not only be slightly alien to us; ours
would be absolutely muddy to them.

I think you can make a case that a lot of the new paradigms like electron do
not promote usage of native UI styles and accessibility.

But the title bar being an overloaded UI element is, in today's context,
generally OK, I think.

~~~
bloomberg2020
Agreed.

Personally I’ve given up on mouse GUIs

Why?

Photoshop pros use macros

Unix pros use text editing macros

Why teach new users single point-and-click methods of computing when the pros
think it's a waste of time?

It's from an era when computers couldn't multitask and were largely business-
focused data-entry terminals.

Photo manipulation can be automated from a terminal and results displayed in
real-time now

Why care about file menus? That’s just a set of keyboard macros unrealized.

The desktop metaphor is finally dying. Let it

~~~
jcelerier
> Photoshop pros use macros

> Unix pros use text editing macros

pros are, like, 0.01%

~~~
msla
And they're the ones who use your tool most of the time.

Do you want to make it "easy" for someone who's never used your thing before,
or do you want to make it easy for someone who uses it a lot?

(Hint: A new UI is _never_ easy.)

~~~
timw4mail
A new UI is much easier if it follows convention.

------
bhauer
Bravo to this article.

The phenomenon is exemplified by Slack and the open platforms that follow its
design lead, such as the Riot client for Matrix. Legacy chat clients from the
1990s and 2000s could fit a hundred rows of chat text on a normal 30-inch
monitor. Slack and Riot can display perhaps as many as 30 lines.

The reason is a bunch of excess padding between lines, the injection of
handles/names in the vertical space (because width is precious on small
devices?), unreasonably large circular avatar images, and a host of other
"mobile first" design quirks. Taken all together, we have a user interface
that squanders screen real-estate with abandon.

While a legacy chat user might have chat in a small side window, a Slack or
Riot user will more often than not have their chat application fully maximized
or using a substantial portion of a side monitor. It's a regrettable pattern
but I don't see much momentum on the reverse course.

~~~
Aeolun
Probably because a lot of us actually like this. I cannot deal with things
like IRC where the whole screen is just a big blob of text. I need something
to visually distinguish who the message is from.

~~~
bhauer
Which is fine as a user preference. But modern software no longer allows for
controlling information density. Even "compact" layouts, when available in
modern software, are typically large and sparse by historic standards.

------
mixmastamyk
Today I was using Win 10 in a VM and wanted to put a shortcut in the startup
folder. Used to right-click on the start button. Doesn’t work any more. Can’t
right click on menu icons either, needed to change one shortcut to minimized,
can’t be done. Those were a few of the tiny GUI features that made Windows
superior to every floss desktop, gone.

After ten mins of googling I find that you have to open a separate Explorer
window and type the secret “shell:startup” into the address line to get there
now.

Between that and the control panel mess, what the fuck is going on in Redmond?
(and everywhere else)

I gave up on Ubuntu after Lucid I think and eventually settled on Ubuntu Mate
because it keeps nonsense to a minimum.

It’s as if a bunch of man-bun wearing devs never heard of Chesterton’s fence.
:D

Edit: From the article I realized the fascination with overloading window
titlebars was driven by 16:9 screens, making vertical space _precious._

------
postalrat
[https://new.reddit.com](https://new.reddit.com) vs
[https://old.reddit.com](https://old.reddit.com)

~~~
StartupTree
New Reddit is so so bad, I feel like I'm in a hallucinogenic nightmare when I
accidentally click into it. Kudos that they kept the sane old option for
people who want to use the website.

~~~
b3kart
Question is, for how long? I am dreading the day when I can no longer switch
to old.reddit.com whenever my laptop is begging for mercy.

------
munk-a
Oh ye gods, Slack in particular got a lot worse with the most recent update,
but the site I think takes the cake is one we Canadians use for food delivery:
skipthedishes.com. Gods, this site breaks so many things. Options to update
things are hidden among menus that switch the current page while not being
ctrl-clickable into a new tab (which can trash your current in-cart order), and
the whole site loves "minimalism", i.e. let's put soft grey text on a white
background to drive contrast into the ground.

Given that this service's value proposition is basically "As a company we have
a highly usable website and a fleet of delivery drivers" the fact that their
website is trash is super annoying.

(The site is blocked behind address entry, if you'd like to try it out may I
suggest 1 West Georgia Street Vancouver BC)

~~~
lisper
Wow, you are not kidding. I went to this site to see for myself, and the _very
first_ thing I see is re-Captcha. And it doesn't even work! I have to pick out
photos of tractors before I even get to see what is on the site!

Holy cow, why does anyone use this? It's a hot mess.

~~~
exprez135
Didn't even get the chance for a re-Captcha with me. It outright blocks me
because I'm using a proxy.

~~~
userbinator
I wonder if that's because they had far too many pranksters ordering food for
someone else.

------
RedShift1
Ironically, this page only uses 70% of my screen's width on mobile, the font
size is uncomfortable, and sentences are broken into lines of 6-7 words, which
is really annoying. Chrome suggests "show simplified view" and that definitely
makes it better.

~~~
nxc18
Double tap to zoom to paragraph has been a standard touch gesture for the last
~10 years at least.

~~~
mceachen
Only on iOS. On Android and all desktop OSes, double-tap selects a word.

~~~
majewsky
Huh? Not on Firefox for Android at least. I don't recall much from my pre-
Firefox days on Android, but double-tap to zoom in on a paragraph has been in
my repertoire for a long time.

~~~
mceachen
Wow, we're both right.

Double tap on Android 10 in both Chrome and Firefox will select a word in
mobile-friendly websites (I tried on
[https://PhotoStructure.com](https://PhotoStructure.com) ) and will do the
iOS-zoom-to-fit-bounding-box on websites that aren't (like TFA).

This certainly speaks to an increase in complexity/inconsistency.

------
pragmatic
The designers took over. They finally gained power and with it, unfortunately,
users lost out.

In my unified theory of organizations, the organization is eventually taken
over by the employees/"servants" and instead of serving the constituents it
becomes about serving themselves.

Think Congress, teacher's unions, USPS, IT depts.

In programming it's resume-driven development: playing with new toys
constantly instead of delivering solutions.

~~~
MattGaiser
All of that lies at the feet of management (defined broadly).

The people re-elect Congress consistently. Employers reward resume-driven
development on the open market. UX people win prizes for how their UIs perform
on high-resolution large monitors even when they will be used on mobile.

I think it would be more correct to say that management exerted greater
control and deferred to the UX designers who were focused on keeping
management happy as management doesn't ever go back and see whether the pretty
mockup works.

~~~
Aeolun
I love how the pretty mockup always assumes that text or images fit perfectly.

It’s probably the single biggest cause of issues we have.

It’s interesting, because the one time a designer asked for my feedback before
handing the design to management (the best one I’ve met, but they left the
org), they hadn’t even considered it, but were all too happy to take it into
account.

------
rpm91
I'll give the author the complaints about modern scroll bars, which drive me
up a wall, but the complaints about Firefox and Chrome really feel like
grasping at straws to find something to dislike.

"This new take on the [Firefox] URL bar pops up when you least expect it, is
very hard to get rid of and, as a bonus, covers the tried, tested and highly
usable bookmarks toolbar below it."

What? It pops up when you click on it, or when you hit Ctrl+L. That seems
perfectly expected, and is how URL bars have worked for as long as I can remember.
And it only covers the bookmarks toolbar (should you choose to enable it) when
you're using the address bar and the dropdown is visible. This is like
complaining that the Save File dialog covers the tried-and-true document
editor in Microsoft Word, but even more ridiculous, because you can, in fact,
access your bookmarks through typing into the megabar. It's an excellent UI
that provides keyboard access to a wide variety of browser features, all in
one location - search, URLs, bookmarks, and even a fuzzy search of open tabs.

And the Chrome tooltips...behave exactly like normal tooltips, but with
slightly different styling to make them more useful for showing the title and
domain name. How is this a "crime[] against basic concepts of desktop UI
design"?

(disclosure: I work for the company that makes one of those applications)

~~~
Drdrdrq
I don't know about Chrome (I rarely use it), but I _hate_ the megabar in
Firefox. It covers my bookmarks and is completely useless. Why would this input
be any different from any other input? I wonder who ever thought that covering
other UI elements is a good idea.

Hopefully now that I know the keyword ("megabar") I will be able to find a way
to disable it. I have searched for an option before but found nothing.

EDIT: just noticed:

> And it only covers the bookmarks toolbar (should you choose to enable it)
> when you're using the address bar and the dropdown is visible.

No, that's the problem - it (partially) covers bookmarks even when the URL bar
is empty and there is no dropdown.

------
b0rsuk
The whole "desktop metaphor", as usually implemented, is trash. I'm a happy
user of i3 window manager (a tiling window manager). It's not the first and
probably not the last, but it's the first time I can quickly and efficently
arrange application windows on my screen. I think this will become the default
eventually, tiling WM are the way. The way it uses the screen is beyond
anything. They will just make it more intuitive and comfortable for first-time
users. i3 requires you to memorize, but preferably _define_ your own hotkeys.

Meanwhile, applications like Skype, other instant messengers, Slack, and music
players have grown and are now fullscreen by default. Non-blog websites are
usually large and can't be displayed in a simple window. People complain about
the 80-character rule for code and go to 120 characters and beyond, which again
means you can fit fewer windows on a screen. I think web browsers and websites
are largely to blame. Because that's how most users interact with computers
today, that's what they expect, and they don't know it can be any other way.

Every single application wants to be THE fullscreen application. I think it's
an admission of defeat! Over the decades, they've tried - and failed - to make
smaller application windows that people consider useful. And it's not the
fault of application makers - it's the broken "desktop metaphor" where you're
supposed to move windows like physical objects. It works on a desk because you
have two hands and 10 fingers. Imagine working at a desk (no computer) using
only one finger! That's how it feels using a mouse. The default window managers
are crap at actually managing windows and arranging them usefully. Dragging
corners, window borders, moving windows feels miserable in the long run, and
when you close one of your windows you need to repeat it when you want another
app window to fit into your layout. So many people just don't bother, get a
bunch of fullscreen windows and alt-tab through them.

And applications with tabs are a symptom of the disease, too. Web browsers,
the blue Microsoft Word, IDEs, and so on. It's alt-tab fullscreen windows in
sheep's clothing. Nothing particularly wrong with the alt-tab method, but it
doesn't scale to the large number of windows we have nowadays.

~~~
Paianni
Tiling window managers are a pain without a keyboard.

~~~
ohazi
What is this wacky computer that you and apparently all of the Gnome
developers are using that doesn't have a keyboard?

PCs have keyboards! It's the best part of the computer.

~~~
Paianni
Mainly PoS machines.

------
jacobsenscott
In the old days you had no choice but to use the widgets provided by the OS,
unless you wanted to build everything from scratch, which was very hard. So
everyone used the widgets provided by the OS and things were good. Those
widgets were built by experienced HCI experts, so it was hard to go wrong.

Now you have no choice but to build all the widgets from scratch: a web browser
doesn't provide any beyond a few form controls. So it is bad. Every application
must rebuild all the widgets from scratch, usually by designers with limited
experience and skill.

------
intrepidhero
Relevant quote from my reading yesterday: "Time itself is on the side of
increasing the perceived need for usability since the software market seems to
be shifting away from the features war of earlier years. User interface design
and customer service will probably generate more added value for computer
companies than hardware manufacturing, and user interfaces are a major way to
differentiate products in a market dominated by an otherwise homogenizing
trend towards open systems." - Jakob Nielsen, Usability Engineering, 1993.

I think he partly got the prediction right: usability did become a big
differentiator. Apple and MS over the following years had big efforts focused
on consistency in their interfaces, and we had what many consider a golden age
of UI usability, at least from a consistency standpoint. I think what happened
next is that two things came along and basically reset the state of UI design
back to zero: mobile and the web.

Both platforms were so radically different that Apple and MS UI guidelines
were useless. We got a horde of web and mobile designers experimenting with
all sorts of novel interfaces. Experimentation is a great thing but
consistency has definitely suffered. I've long thought there was big money to
be made by somebody wrapping up a complete Linux distro with a set of common
applications (LibreOffice et al.) but putting in the (very significant) effort
to standardize _every_ interface, write good manuals, and provide customer
support. Sort of like the service that Red Hat provides for servers, but with a
desktop focus. Maybe they couldn't eat MS's lunch, but if they could
demonstrate big productivity gains for businesses, maybe they could.

In the last decade I think we've seen the (much needed) injection of artistic
talent into the UI design space. UIs today are much more beautiful than in
1995. That's because businesses realized that users value beauty and hardware
improved to the point where more visual effects could be provided without
sacrificing performance. In the next decade I think we'll see a resurgence of
focus on accessibility and usability centered around design guidelines that
coalesce out of consensus in disparate communities rather than corporate
policy. I think especially that as Moore's law continues to flatten out, and
network connection speeds start to plateau, we're going to see a renewed focus
on responsive UI design and application performance. I am excited about these
trends and feel optimistic about software design going forward.

Too bad Nielsen was totally wrong about customer service though. :-(

~~~
hamaluik
You might be interested in Elementary OS?
[https://elementary.io/](https://elementary.io/)

~~~
chacha2
That has most of the issues that this article complains about. It gets rid of
both the minimise and maximise buttons, leaving only close, because that's how
the iPhone does it.

~~~
saagarjha
Their rationale is that apps should just be closed or opened; there shouldn't
be a need for intermediate states.

------
MattGaiser
I work on a project meant to sell a service and meant to manage the service
being sold to the consumer. However, the button to actually buy the service is
hidden by a scroll bar on all but the widest of screens. Unless you scroll the
widget or have a 22 inch monitor, you will not see the purchase button.

Why? The UI was designed on a wide screen and we developers are just the
implementers of the picture. UI is quite often taken from a drawing and little
else. It looks great in a mockup, but it isn't all that practical.

~~~
franga2000
> UI is quite often taken from a drawing and little else

The way I've solved this when working with people who aren't familiar with
responsive design (older/inexperienced designers, or clients directly) is to
print them an A3 page with outlines of a vertical phone, a vertical tablet,
and a horizontal desktop screen, vaguely to scale and all including a "below
the fold" area, and tell them to draw on that. This almost always gives me
enough information to implement a properly responsive design.

~~~
MattGaiser
This is a good idea actually. Now, I just need to be in the room before they
draw stuff...

------
jiehong
On desktops, I wish we could just stop wondering “where is that icon?” or
“what does this icon do?” entirely.

We should be able to search for any feature within an application in a
standardised way instead. Maybe something like what Ctrl-P does in Sublime
Text / Sublime Merge.

If this shortcut would work for any application within the same OS, we could
get rid of most icons by default, while adding more consistency at the same
time!

~~~
lvillani
On macOS you can search menu items via Shift–Command–QuestionMark (or by
opening the Help menu). Most toolbar actions are also exposed as menu items,
so this lets you essentially search for almost every function of every
application that plugs into standard macOS frameworks.

Some applications have features that extend beyond what can be surfaced
through the standard menu bar but the infrastructure is there for "normal"
apps.

Ubuntu used to have something similar in earlier versions of Unity. It would
surface Gtk and Qt menu trees in a searchable interface.

~~~
bbx
> Shift–Command–QuestionMark

Oh my! I knew you could search through the menus. I didn't know there was a
dedicated keyboard shortcut for it! I've been using MacOS for 10 years and
never knew this… Thanks.

~~~
balladeer
I also figured it out just now, after 8 years of Mac usage. But I already knew
that I don't know/use most of the dedicated Mac shortcuts (though I think I
should).

------
Kaibeezy
I finally switched from Win7 to 10 about a month ago. Couldn’t put it off any
longer once lockdown started. I’m tech management, so it’s office, mail,
browser and graphics mainly. Illustrator 4 runs fine on an X220.

Win10. I just absolutely hate it. Every day I have to relearn something
obvious. I can’t find the corners of windows to grab them, and when I do, it’s
one damn pixel wide and I get jitters. Why is Candy Crush on the Start list
when I never use it, but where the hell is Notepad? Bla bla bla.

Would someone please make a Win7 skin so I can get back to work?

~~~
chacha2
[https://github.com/Open-Shell/Open-Shell-Menu](https://github.com/Open-Shell/Open-Shell-Menu)

There you go. Brings back the Windows 7 Start menu.

~~~
LargoLasskhyfv
Nay, i'd rather [https://cairoshell.com/](https://cairoshell.com/)

------
l0b0
This was always one of the biggest failings of open source software. Most
communities in my experience absolutely explode when anyone suggests a UI
change, even if it's to bring the application in line with well-known
usability, accessibility or design standards. The only two outliers are the
GNU coreutils, which have at least a semblance of consistency in their command
structure, and the corresponding BSD tools, which unfortunately have opted for
a completely _different_ UI standard.

I'm afraid there's only one way around this: pressure from above. Pressure
from the community keeps failing every day. Newbies try something out, rant
about the bonkers UI in a forum or bug tracker, and the fans shut them down
with what amounts to "it's how we've always done it!" Whoever decided on the
UI of many of these has clearly got too big an ego to see that they are
hurting users by "differentiating" themselves.

------
mmphosis
There was a time (roughly between 1982 and 1993) when very few could sit down
in front of a GUI. I do feel like I am returning to that time. Here are some
interfaces I could do without, except that I can't:

→ The command line. In 2020, I need to do a lot of things at a command line
because there is no other way. For example, starting and stopping _sshd_ needs
to be a checkbox.

→ Tabs. Tabs. And more layers of tabs: boot tabs; workspace tabs
(workspaces/virtual machines/containers/emulators); apps (⌘+Tab); windows
(⌘+~); the sad return of "Multiple Document Interface" in the form of tabs and
hierarchies of tabs within those tabs; tabs within the page and hierarchies of
tabs within those tabs; views within the page with tabs within those views and
hierarchies of tabs within those tabs; and so on, recurring, possibly forever.

→ You deserve better than this: window snapping. And so-called "tiled" window
managers, which are little more than poor versions of 1980s window splitting.

→ Right clicking and yet another menu/sub-menus pops up of things I don't
want.

→ JavaScript. Advertising. "Block pop up windows" has been enabled by default
for a long time, but what about blocking pop ups within a page? An ad blocker
for now, I guess.

→ The hamburger menu. Or for that matter, any menu with sub-menus and any menu
with more than 8 to 9 menu items.

Here are some interfaces that have improved:

→ No modes.

→ The ability to go full screen when needed without compromise. And, being
able to, fairly easily, get out of full screen.

→ UTF-8

→ more guides: the translucent lines or boxes that help align UI elements in
flexible ways

What is missing:

→ pop ups/menus used extremely sparingly.

→ Tools that float, in the sidelines — not on top of content, only in the
context of when you need them. For examples, see game interfaces, or excellent
graphics applications.

→ What you deserve is "Zoom to fit", which, when done well, is great.

~~~
majewsky
> starting and stopping sshd needs to be a checkbox

That's how you get something like this:
[https://www.jensroesner.com/wgetgui/](https://www.jensroesner.com/wgetgui/)

~~~
aspenmayer
It’s funny you mention that, because if you keep scrolling to the section “for
the haters,” you’ll find a pretty close approximation of my views, and a
worthy response to those who scorn another’s preferences because they are
against their own, when they aren’t mutually exclusive. It’s okay to have more
than one way. It seems like some kind of appeal to correctness or directness
or purity of task completion that I don’t understand, but which is very common
in computers and software especially.

A poor craftsman blames their tools, as the saying goes. A worse one curses
their tools. A good craftsman appreciates the shortcomings and limitations of
their tools and adapts their tool usage, tool choice, and their very tools
themselves if need be. What kind of craftsman criticizes the tools another
chooses on matters upon which reasonable people could disagree? Is such a tool
unreasonable, or is it the craftsman who criticizes another exercising a
preference and doing their own thing their own way?

Not every tool is for every job, nor is every tool for every tool user.
Preferences are normal and vary. So should expectations. There’s always
another tool. Try not to be the tool but rather the tool user.

------
adamc
Good piece, with accurate criticisms. I've lost count of the number of times
designers insisted on re-styling links in some unintuitive way.

I suspect (but cannot prove) that people struggle to channel their natural
inclination toward creativity into constructive channels. A big part of the
job of a UI is to be familiar and hence easily understood. The best UIs often
don't stand out -- they just let you get your work done effortlessly.

------
pknopf
The single thing I hate the most?

Showing search results immediately, while the real search is happening in the
background. I see what I want, click it, and it immediately changes to be
something else.

I get the need to give users instant feedback, but not at the expense of user
experience.

~~~
jhhh
Somehow Google, even with all their money and rockstar developers, still does
this to me in search results. There's no built-in option I've found to remove
the 'People also search for' popup that pushes down all the content I was just
about to click on after navigating back to the search results page. I had to
make a filter rule instead.

~~~
nc30
Care to explain how to apply that filter rule?

~~~
jhhh
In uBlock Origin, add a line under "My Filters":

www.google.com##[id^=ed_]

I only created this today, so I'm unsure if the ids change over time or if
people get different ids for sections, but it's been solid so far.

~~~
nc30
Oh this is cool! Have been using ublock for a while and didn't know you could
use it to block particularly annoying things like this. Thanks a bunch!

------
cirno
I was so disappointed when Xfce, long my last bastion of consistent UI design,
finally gave up the ghost and announced their move to client-side window
decorations. It seems the days of the title bar + menubar + optional toolbar
are numbered :(

~~~
iso-8859-1
There is still Trinity [0]. I think it will be very hard for them to switch to
client-side decorations, even if they wanted to ;)

[0]:
[https://en.wikipedia.org/wiki/Trinity_Desktop_Environment](https://en.wikipedia.org/wiki/Trinity_Desktop_Environment)

------
anticristi
I would be harsher than the OP: this inconsistency is deliberate and a natural
consequence of the attention economy. Every PM's dream is for their app to
become the new homepage/home screen.

Want to switch out of Slack to send an email to your boss? It SHOULD feel
painful, so that next time you do it in Slack.

Want to set yourself a reminder? Well Slack can do that for you. No need to
deal with the "foreign" Google Calendar UI.

VSCode is also somewhat weird, in that it married an IDE with a Terminal, a
File Manager, a VCS, and a tiling window manager.

One day there will be the "great merger", Slack + VSCode + GitHub + Spotify,
the Dev Desktop.

~~~
bitwize
> One day there will be the "great merger", Slack + VSCode + GitHub + Spotify,
> the Dev Desktop.

So... yet another way in which VSCode is catching up to where Emacs was
decades ago :)

------
tempestn
Definitely agree on the Firefox megabar. As soon as that came out I had to use
userChrome.css to make it stop covering the bookmarks toolbar when opening a
new tab. When I open a new tab, I'm _either_ going to type something in the
address bar, or open a bookmark. Having one cover half of the other makes no
sense. It's not like the regular address bar size is insufficient.
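
For anyone after the same fix: a userChrome.css sketch along the lines of the
widely shared workaround. The `#urlbar` selectors and `breakout` attributes
are Firefox internals, version-dependent, and may need adjusting for your
build.

```css
/* Keep the Firefox ~75 megabar from expanding over the bookmarks
   toolbar. These selectors target internal browser chrome and are
   not a stable API; verify against your Firefox version. */
#urlbar[breakout][breakout-extend] {
  top: 0 !important;
  left: 0 !important;
  width: 100% !important;
}
```

Note that userChrome.css must be enabled via
`toolkit.legacyUserProfileCustomizations.stylesheets` in recent versions.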

~~~
a1369209993
Completely orthogonal to the megabar being shit, you can[0] middle-click on a
bookmark entry to open it in a new tab immediately.

0: could at some point? I revert and stop updating software that sends
downgrades and regressions as updates, so I'm not sure what all firefox-latest
is doing.

~~~
tempestn
Yeah, but the order of events in my brain is: 1. I want to go somewhere new.
_Opens new tab._ 2. I have a bookmark for where I'm going!

~~~
a1369209993
Fair enough - I think of opening a bookmark (what do you mean "in a new tab"?
That's how you open a bookmark.) as a single action, so it irritates me
immensely when I have to make a space and _then_ put stuff into the space; and
having to do that simply because I didn't know about middle-clicking, rather
than because the application is deficient, would be even worse.

------
verall
Aww, he had to go after the Gnome stuff :/

Many a time I have spent over a minute decoding a Gnome GUI for an incredibly
simple application. Is that a clickable icon? Why is that icon/menu option
greyed out? Toggles, icons, buttons, and toolbars thrown together with a rare
tooltip. I fully agree they border on parody.

And yet I still use the gnome tools frequently, because they are useful. So in
a way I do feel bad for complaining because I am certainly not stepping up to
the plate to improve these tools.

~~~
cirno
Problem is, would that even help? What I want is what Gnome 2 already was.
Their changes indicate they themselves don't want that anymore. Any attempts
at voicing my concerns are met with disdain. What more can I even do as a
developer?

~~~
imhoguy
You can fork! /s

~~~
rleigh
Only if you want to maintain a private copy in perpetuity. It's always more
constructive to work with an upstream.

Though in the case of GNOME, that's rarely productive. I had one of my
contributions sit around (with a tested patch ready to apply) for 10 years,
despite comments saying it was good to apply. That entire component ended up
dying through lack of maintenance, even when people were happy to contribute
effort.

GNOME's problems lie squarely with its terrible choices.

------
zwaps
It feels like there are many unrelated terrible reasons for "UX" being a
clusterfudge. "Webified" UX and "data-driven" UX are of course terrible. Data
scientists in these fields don't come from the right background to know or
care about causal analysis, so whatever is measured is not UX efficiency.

But there's other issues, unrelated to that.

Computers are complex and there are certain tasks we do not have to do very
often. Back in the day, the consistency allowed me to know how to do most
things.

I find this most notable in Win10. Back in the day, I could intuit many
things, for example how to get something in and around my task bar to work as
I want. Another example is to fiddle with audio settings, bluetooth or other
settings stuff in general.

Today, I have to "try around" instead. There are two control panels, plus that
task bar control panel. Sometimes you need to right click somewhere and choose
an inconsistently named item to get to the right setting screen, sometimes
it's better to search. Sometimes the setting is available in the "task slider"
screen.

I always just thought it's me getting old. Because that's how old people use a
computer.

Perhaps it's rather the UX disaster.

------
toohotatopic
Speaking of the Firefox version 75 bar: why has the way selection works been
changed? Now it takes three well-timed clicks to select an entire URL. Is this
an improvement?

~~~
majewsky
I'm currently on Firefox 75 on Linux. Clicking in the address bar when not
focused consistently selects the entire URL.

~~~
toohotatopic
Right. I forgot to mention that the problem comes with X11 and the clipboard.

The first selection doesn't and shouldn't copy to the clipboard, because
entering a new address would focus the bar and replace the existing clipboard
contents; it wouldn't be possible to copy any selection into the address bar.

So using a URL from the address bar requires selecting and then re-selecting
the entire address. That second selection has become far more difficult.

~~~
majewsky
You want right-click > "Paste & go".

~~~
toohotatopic
Actually the other way round: how to get a URL out of the address bar. It
used to be that a double click was enough, since the X Window System copies
selections automatically to the clipboard.

------
sytelus
The main problem is affordances. In the old days, users were expected to be
newbies who needed to get productive fast. So menus explicitly featured
keyboard shortcuts, apps had a status bar telling you what was happening, and
there were even main menus that told you everything you could do with the app.

Then UX designers thought they should be clever and make this a game for the
user, so they _eventually_ figure out how to do X. Some such initial games
produced "aha!" moments, and UX folks took that as a signal for doubling down
in the name of minimalism. Now apps rarely have menus or status bars or even
toolbars. Users are expected to struggle a bit to get their "aha!" moment.
Sometimes even critical functionality is kept hidden under weird actions like
a triple-finger squeeze. It's horrible for users, but apparently UX guys think
they are doing cool shit.

------
PeterStuer
Absolutely true.

It's mainly the mobile "experience" seeping into the desktop. Undiscoverable
UI incantations and improv galore. All to get rid of the healthy PC ecosystem
and usher in the brave new world of walled-garden platforms with juicy store
taxes.

------
firefoxd
That's because we have a new tool in our UI kit. Google.

If you are trying to underline text in your fancy text editor, you don't try
to figure out what the icons mean. Instead, you hop on Google and type: how do
I underline text in fancy editor. Google is part of our UI now.

------
askafriend
Little of what he says is a problem on MacOS fwiw. I can resize my Slack by
dragging anywhere on the toolbar, for example.

~~~
ogre_codes
The fact that MacOS shows the menu at the top of the display all the time used
to bother me but I've long since come around. As more and more cross platform
Electron apps take over the desktop, I'm even more thankful that it's there,
keeping a lot of this nonsense at bay.

Microsoft has been going downhill for a loooong time, the stupid Ribbon Bar
drove me off of MS Office 15+ years ago. The control panel in Windows XP was a
mess and it's only gotten worse as far as I can tell.

~~~
hf8665
The ribbon bar was when I started noticing this mess. I liked the extended-tab
concept, but the UI was inconsistent: some things were accessible through the
tabs, while for others you had to go through the button, in a way that has
never made sense to me.

I do blame monopolies for this in part. When a single thing dominates (office
suite, web browser, whatever), the comparator shifts. It's no longer "how do
these 6 things compare"; it's "is what's new here entertaining enough, and not
so inconvenient that it's worth the pain of abandoning an entrenched tool?" So
users convince themselves the changes are progress, because the implicit
comparison is always against the struggle of switching, not against a real
field of alternatives.

That's not all of it by any means, but I do think it created a context for
what's deemed acceptable ux-wise.

MacOS isn't all it's cracked up to be, though, and is maybe another historical
source of this mess. I use it daily, and it's much, much, much too easy to
lose track of open windows. Dialogs open and you don't even know they're
there, and you find windows open that you thought you'd closed weeks ago.

This doesn't happen with a lot of other OSs, or at least it didn't use to.

My favorite UX has always been KDE's, although I haven't been able to use it
as much as I used to, for work reasons.

------
austinjp
I'm genuinely curious how it came about that user-experience is now "try
clicking, tapping, swiping, hitting a key" until something happens.

------
agigao
Let's not put all the weight on Microsoft's shoulders. I tried to use W10 a
couple of times in the last few years for work (SE), and it doesn't work for
me, though for other, bigger reasons.

I use Manjaro i3 on the desktop and an MBP 15 on the go.

But overall, UI/UX is caught up in a hype of _newness_: standing out among
competitors at the expense of functionality, while also making everything
"accessible" to as wide an audience as possible. FF's address-bar zooming
"feature" makes me feel like a damn moron :))

1\. The job is still done on desktop/laptop computers, and touch screens
don't really change that much, I think. I wish all the major companies
incorporated that point of view into their planning. F*ck shareholders, you
gotta think about your users first!

Apple has hit another dead end here: so many colors, it squeezes the energy
out of my brain! Mavericks was the last sane macOS with sane colors.

2\. Regression to the mean. That's exactly what happened to the educational
system, and to textbooks as well: make everyone comfortable with themselves
instead of pushing kids to actually learn.

Dumb down everything!

------
Stierlitz
> All of these title bars denote active windows. The top one, Outlook, looks
> exactly the same when inactive ..

Yes, when did active clickable elements go out of fashion, with everything
becoming the same faded shade of cold blue brushed aluminum?

~~~
ridiculous_fish
If you have a giant full-screen window with tabs, inactive controls are never
visible, so there is no reason to make an inactive appearance.

~~~
saagarjha
I was going to make a comment about how this might be a mobile thing, where
focus is less of an issue because everything is fullscreen, but now iPad does
multiple windows and has the same focus problem…

------
lugermorph
One of the most annoying UI design features to me is multiple ways of doing
the same thing on the screen at once.

Windows 10 does this endlessly. You can use the taskbar to get to your files,
you can use the Start menu to get to your files, you can use the tiles in the
Start menu to get to your files. This actually makes it far more confusing
when trying to quickly get to your files. At least on macOS there is one
single button to get to your files (the Finder icon).

If you've ever used Maya you'll know it's the same thing. The layout is
incredibly overwhelming and when you want to quickly and effortlessly switch
to a different tool, you have to think about what button to press. I switched
to Cinema4D as their design is simple and very intuitive.

Good UI design to me is about having labeled buttons that have a depth of
interaction, rather than putting all the buttons on the same screen at the
same time. Obviously you don't need to abstract away menus all the time (like
the proliferation of pointless hamburger menus), but at least cleaning up
buttons makes UIs more usable.

------
fm4d
Without experimentation there can be no progress. It's nice that you are
satisfied with the "old"-era UIs based on dropdown menus and predictable title
bars, and it would be nice to have some decoupling of functionality and UI so
that you could style your apps to adhere to this paradigm.

What I don't see in your article is any reasoning about WHY we should build
our UIs this way, and even if you gave one, I suppose I would disagree. I hate
dropdown menus; I've hated them since Windows 95 and never stopped. There are
many other approaches: the string-based "tell me what you want to do" approach
of Emacs, the context-based morphic approach of Smalltalk systems, and so on.
Each of them is interesting, each brings something new, and each works for
certain applications.

It seems to me that instead of ranting about how UIs these days are not what
you want them to be, you could rant that you are unable to mold UIs to your
liking; that would have greater utility.

------
pteraspidomorph
> The newly released Firefox version 75 features what has become known as "the
> megabar"

Oh, thank god someone is saying this. This change baffled me. Who thought this
was necessary or a good idea?

------
indymike
A lot of the decline has to do with the desktop importing web-based UIs:
documents with a hodge-podge of interactive widgets that glue them into a
cohesive app. Every app can be different and can be optimized for its
function. New desktop apps are often built with either web-tech or some new UI
framework that tries to give developers and designers that same level of
differentiation and task-specific optimization you get in a browser.

Honestly, it isn't surprising that web style UIs are so popular: they take a
lot less time to build, and you spend more effort building functionality than
integrating the standard actions your app is supposed to support (think the
stuff in the File, Edit and Help menu). Also, most users do pretty well with
webish apps. Better than they should and most of the time better than they do
with a GUI app. As they say, Good Enough beats perfect almost perfectly.

------
SubiculumCode
I wish they would use more words in menus instead of icons. I've built up this
wonderful capacity to read quickly, and I can scan a list of words fast. Icons
require interpretation, and some of them are really obscure or vague.

And if I have to click more than once to find a scroll bar because it was so
small I missed it, the GUI is doing it wrong.

------
JohnJamesRambo
I often wonder what Jakob Nielsen would think of modern UI haha. Every
decision goes against what he believes in. I wish someone would show him the
inscrutable Snapchat UI.

------
SomeoneFromCA
Some of the worst offenders are 16:9 monitors. It's great that Apple did not
follow that strange fad and stuck with 16:10.

~~~
ehonda
My main desktop monitor and laptop are 16:10; both are 10 years old. I can't
get over how terrible 16:9 is for computer use; it's a travesty that this
became the standard.

~~~
kps
Using TV screens for computer output is an idea that should have died with the
Commodore 64.

------
ChicagoDave
I read this and I can’t tell if the OP is joking or what. Information
Architecture and User Experience Design have dramatically improved application
development in the last ten years.

This isn’t to say crappy apps aren’t built. They are. But when time and talent
are leveraged properly, good things happen.

~~~
tiborsaas
I'm also tired of articles that are like: "Here's some bad examples, therefore
everything is bad".

It's missing the constructive part and probably finds the wrong audience.

~~~
mcswell
I'm sure people would be happy to provide more bad examples--indeed, they have
done just that (and I contributed my own gripes about Adobe Acrobat).

I can think of bad UI choices in nearly every application I use. I think one
over-arching issue being raised here is the lack of consistency among
applications, even on a single platform.

But if you don't think everything is bad, perhaps you could provide examples
of software with good UIs?

------
necovek
Ah, the times of the Sun contributed and led HIG — Human Interface Guidelines
— for Gnome that actually made sense!

Even that (Gnome 2.0) "simplification of the UI" got a lot of flak from the
community, but it was based on sound principles and actual user validation (as
was already brought up in this thread).

What I like to think is that we are in a period of chaos at the moment, where
experimentation is running wild. At some point we'll realise again that we
can't have good UX for both large and small screens, for keyboards/mice and
the lack of them, all in one codebase, and we'll stop trying, and all will be
good. For the next 5 years, at least.

------
meerita
Whatever they say, as a Mac OS 8 and 9.1 user I don't think there will be
anything similar for a long time: macOS today is a usability disaster, a user
interface mined with options for every case. Gosh, it's a pain in the ass to
work with macOS these days, with so many options. It used to be simpler, but
things got done effectively without limiting the user's ability to work
comfortably and customize their desktop. And the Finder is a disaster; for me,
the pinnacle of file management will always be Norton Commander.

------
z3t4
I'm guilty of many of these. The reason is screen real estate: you want the
most essential stuff on the screen, not in hidden menus or popup windows.

And your app needs to adapt to many screen sizes, from phones and tablets to
notebooks and desktop monitors.

Then again, it's much more efficient to use the keyboard than to reach for the
mouse to click on icons whose meaning you can only guess at, or to pull on
scrollbars.

The only problem is that if you design your app like Vim, you will have to put
a lot of time into teaching users to use it properly.

------
adeel_siddiqui
Frontend development and UI/UX design have gotten stuck in a weird loop. Many
companies/startups set about redesigning their UI every 6 months just because
some competitor has a new design and a few people like it, or worse, because
the designer likes the new "slick" look. It's bad because it's no longer
concerns about accessibility or usability driving those constant redesigns,
but a self-created pressure to keep up in a rat race.

------
bambataa
I’ve been helping family get set up with Zoom etc. via remote desktop (I
recommend Chrome Remote Desktop; easy enough to set up). Watching them do
things makes me realise how absolutely unintuitive computers have become.

Click to download something. Where has the thing gone? They don’t know that
the little icon in the top right with the down arrow means “downloads”, and
they don’t see that it has just turned blue. One click or double click? I
don’t even know any more.

------
KingOfCoders
Recently my MacBook Pro stopped working with an external LG monitor. Until
then the laptop had used the highest resolution, spanning the whole wide
monitor.

Then it suddenly stopped and only displayed a 4:3 image in the middle.
Preferences didn't show anything.

I've since learned that you need to option-click in the settings to see all
the possible hidden resolutions. There was the wide resolution again. I'm not
sure how one is supposed to use this preferences panel, usability-wise.

------
lucabenazzi
Some very good points here. I haven't used Slack in a while, and I can hardly
believe they could come up with something like that! What a nightmare!

I can't fully agree with the rant about scroll bars: they're so easy to reveal
by just scrolling (easier than a mouse click), and much better aesthetically.
A compromise, of course, but an acceptable one. I would rather desktop
scrollbars were larger: they're not visible most of the time anyway, so why
make them such small targets that they're difficult to click on?

The author seems critical of flat design; what are the arguments there? The
emotional impact of design is not something that can be disregarded so easily,
and it's legitimate to expect a modern UI to be efficient and slick at the
same time. Aesthetics are paramount, for the same reasons typography matters.
Users nowadays expect to deal with a clean, lightweight UI rather than one
that looks like it was designed in the Pleistocene.

------
veeti
> The Decline of Usability

Featuring a 600px wide container for text.

~~~
a1369209993
inspect element ; div.content ; .content,.priv ; max-width ; delete

Compared to any 'modern' website, this is a fucking pinnacle of usability.
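
The dev-tools recipe above can also be made permanent as a user style; a sketch, assuming the site's `.content` and `.priv` selectors named in the recipe (applied via userContent.css or an extension such as Stylus):

```css
/* Lift the 600px cap on the article's text column. */
.content, .priv {
  max-width: none !important;
}
```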

~~~
imhoguy
Thank goodness Firefox provides the Reader Mode usability feature.

------
Krasnol
We've already seen this pest in games: PC games designed with consoles in mind
have worse UIs. It starts with the menus and ends with the in-game controls.

This is the same thing we see in Windows 10 now. It looks like it must be
built in a way that you could use it drunk, with your nose on the screen, even
though more precision is not only possible, it's the norm.

------
marvinblum
I don't agree with every point made in the article, but it was pleasant to
read.

The point about keyboard focus especially resonated with me, as I really like
to use my keyboard for as much as possible. It's far superior to the mouse
unless you really need the mouse (for image manipulation, for example). We are
rebuilding our UI for Emvi [1] right now in the browser, "keyboard first", and
I must say it feels quite right. Of course you still need to support mouse and
touch devices, but it's so much more fun to start out with a clear image of
what you're trying to achieve. The handling is much faster. You can read about
it here [2] if you're interested.

[1] [https://emvi.com/](https://emvi.com/)

[2] [https://emvi.com/blog/a-new-experimental-user-interface-
QMZg...](https://emvi.com/blog/a-new-experimental-user-interface-QMZgmZG1L5)

------
amluto
Hah, what a delightful version of Evince the author makes fun of. Newer
versions removed the “Open...” option entirely, AFAICT.

------
lpilot
I might be a little late to the party here, but I don't entirely agree with
him.

What I do agree with: UIs have gotten worse on the usability front; this is
undeniable. Each application does its own thing when it comes to layout, and
each app is also subject to change, so not only do new users struggle to find
where a button is, but experienced users have to relearn the UI. I also agree
that a lot of the bad changes to the desktop come from mobile. Something that
works on mobile because of the small screen, limited horizontal space, and
inaccurate input method will inherently not be the most efficient on a usually
large wide-screen device with very precise input methods. There is also a
large element of sacrificing usability for aesthetics, which doesn't help.

What I don't agree with: I don't think we had ever reached a stage where we
nailed usability. The "golden age" he talks about, with the file, edit and
view menus, wasn't all that great. I remember using Word 2003 and having to
click through each menu, reading each entry one by one to find the option I
needed, because it wasn't obvious where it should live. The one advantage that
system had was that every app used it, so everyone understood the paradigm,
which is arguably a bigger factor in usability than the actual design.

He also makes it seem like (although I don't think he argues this point) every
new UI innovation past a certain point was bad. But he also gives a
counter-example: the bookmark bar. Older web browsers didn't have this
feature; it came later. Some whiz kid implemented it one day and the feature
happened to stick. We haven't solved UIs yet, so we have to try new things in
order to figure out what works and what doesn't.

Finally, I don't think the most important thing in many UIs is how easy they
are for a new user to understand. Most people would agree that Vim has a great
interface; it just takes a while to learn. The same goes for a lot of
specialist and professional applications. I'd prefer them to be designed to be
useful for the veteran user, not the newbie, but with good documentation to
make the learning as easy as possible.

------
oneplane
This is also why Windows Phone and Windows 8 were dead on arrival: there is no
discoverability besides randomly clicking and swiping to see if something
happens. And if nothing happens, you still don't know whether there simply is
nothing more, or you just haven't used the magic swipe/click yet.

It seems that many 'new' interfaces try to do their own thing, and try to go
for looks rather than functionality. If you have no consistency and no known
functionality, the UI is basically worthless.

There are some ways to get a 'better' UI without breaking so much, like the
macOS menu bar, which is context-dependent. You save some space in each
window, yet because you can only click one menu at a time anyway, you don't
lose any functionality or consistency. The downside is the extra step of first
giving focus to the application you want to use (but in most window managers
you had to focus the window before using a window-embedded menu bar anyway).

It's not that a UI can't be made better; it's just hard to sell to people with
no knowledge of, or affinity with, basic HIG principles. Such a disconnect
between reality and 'design' could be seen a while ago, when feature phones
and some laptops came with round trackpads or round screens. That's a neat
artistic thing, but completely useless in a reality where everything is
rectangular, including both the things you interact with in the world and
digital content.

The same goes for touch-centric UI components getting ported to desktop
operating systems, assuming lots of swiping and touching where there is none.
It is possible to execute this reasonably well: I think recent Windows
versions have a separate 'touch mode' you can switch into, instead of a mess
of both worlds in one. Apple does something similar, in that the input device
shipped with the computer matches the input capabilities of the OS, with a
first-use explanation of the UI when you power up for the first time. And if
you don't use it, you're not losing anything, because the traditional buttons
and menus are still there, visible, and not in some hidden swipeable area.

------
mceachen
I'm having a real hard time squaring a rant about usability that expressly
states that everyone should be held accountable to good design with a website
design that went out of its way to be unreadable on mobile.

------
generalpass
One thing I've noticed is how many more mouseovers there are.

I've noticed I have an always-on internal process where I'm hunting for mouse
cursor dead zones. I don't recall it being that way before.

------
dade_
I am using LibreOffice more than Word and Excel these days; besides being far
too slow, the UI of those apps keeps changing for the worse. I give my
feedback, but now there is a search field in the title bar. The MS people
claim this is for discoverability. Then let’s put the whole damned app in the
title bar. This idea that apps need to be made for stupid and lazy people is
why there was Office and Works. This is a catastrophe. I need professional
tools, not a Farmville edition of a word processor. Done griping, and I wholly
agree!

------
Paianni
GNOME MPV (now Celluloid) seems like a bad example since it's not part of the
GNOME project.

GNOME Videos gets away with the same UI choice as that app: media is opened
from the lists, not the menu.

------
DominikD
Honest question: is there a single desktop manager that conducts usability
studies and isn't guided by hunches and conventions? Or more broadly: any open
source software that does?

------
dzonga
What the post forgot to mention is that even though the hardware in a Win95
machine was slow, the interactivity felt fast. No slow loading bars; software
was snappy, and there were fewer bugs. Compare that to the present day: with
fast A-series chips in iPhones and Zen processors on the desktop, almost every
piece of software feels slow, from 'native' desktop apps to web apps running
boatloads of JS in the browser. Let's not even mention the endless A/B testing.

------
pupdogg
UI/UX is just one side of the problem; talk about app sizes, specifically
those Electron ones. I’m not kidding: the other day I saw a modern note-taking
app promoting itself as a stellar solution which comes “feature packed in a
single install file, just shy of 100MB“, and I thought to myself, oh gosh! Am
I the only one who thinks WinAmp, an approx. 2.5MB install, is still the most
kick-ass app to date?!

~~~
Nextgrid
Nowadays you need 100MB to _really_ whip the llama’s ass.

------
sunpazed
Ornamentation vs function is as old an argument as when humans began crafting
objects. I would argue not a decline but a phenomenon that ebbs and flows over
time — check out this NYT article from over a decade ago
[https://www.nytimes.com/2009/06/01/arts/01iht-
DESIGN1.html](https://www.nytimes.com/2009/06/01/arts/01iht-DESIGN1.html) —

------
poink
> Putting things in the title bar saves screen real estate! This is true to
> some extent, but screen real estate in general isn't much of a problem
> anymore.

Doubt. I may have 3x27" monitors on my desk now, but I work on a 15" laptop
way more than I ever have, too. Screen real estate is at a bigger premium for
me _now_ than it has been since like 1997.

(You can't just scale up resolution past a certain point. I'm also getting
old.)

~~~
int_19h
The way the "screen space" argument is used in practice, it's almost always
misleading, because the same apps that insist on shoving their UI elements
into the title bar are usually also guilty of copious whitespace in the
content area. The article even points that out.

I think what can work reasonably well is an _option_ to make the title bar
auto-hide (but pop up if hovered) while the window is maximized. This
shouldn't be the default, because new users should be able to find the window-
related commands easily - especially Close.

MATE can be configured in a somewhat similar fashion - it has auto-hide, but
it doesn't expand on hover - it's just completely hidden. This works
reasonably well in practice anyway, because there is a shortcut to close, and
un-maximizing can be done via the task bar in the very rare case where it's
needed.

------
soneca
Having recently moved from Fedora to a Mac to use my employer's computer, one
thing that's still very annoying to me after 6 months is the menu bar that the
Mac keeps fixed at the top of the monitor, even when the program's window is
not occupying the full screen.

I have a 4K twenty-something-inch monitor, so I keep around 4 different
windows open. Going from a window in the bottom right corner to the File menu
in the top left is a long trip.

~~~
californical
Personally, I really like the top menu. It prevents apps from trying to style
their own version of it, and it's always in the same place. Developers know
that users always look there, so almost no apps skip making one. The worst, to
me, was on Windows/Linux, when programs would make their own crappy,
inconsistent versions of a top menu.

I use a mix of 27-inch 4K monitors and a 21:9 34" monitor, with many things
open all over the screen, and have never felt it's too far away. You might try
increasing your mouse speed a bit (you'll get used to it quickly). If you can
get from the bottom right to the top left corner without picking up your
mouse, you're probably at a good speed.

------
citrin_ru
For my personal system I use a tiling WM (i3) and actively use hotkeys, so I'm
less affected by the GUI degradation described in the article (though the
Firefox megabar is really annoying).

But for the average user, IMHO the best GUI design was in Windows 2000. Almost
all changes made later have made GUIs look more modern (whatever that means),
but less usable. And Linux GUIs (GTK/Qt) unfortunately follow the same
pattern.

------
q92z8oeif
While some are egregious usability disasters, like hiding the scroll bar,
other points are just a sign of the times moving on, like the ≡ for menu.

~~~
the_af
> _others points are just a sign of times moving on, like the ≡ for menu_

The article addresses this at the end. In software, the "times" do not move on
like an unsteerable force. We (users and designers) decide how the times
should move on. For non-mobile, the ≡ for menu seems like a step backwards and
we shouldn't accept it without complaint.

Applying mobile patterns which are suboptimal on the desktop seems like a UX
antipattern to me. I think a huge part of the problem is that software -- both
free and nonfree -- needs to show "change" in order to signal it's still alive
and maintained, but change for change's sake can be a bad thing, especially if
you already had a pretty good (or consistent) UI.

~~~
pdonis
_> change for change's sake can be a bad thing, especially if you already had
a pretty good (or consistent) UI._

I run Trinity Desktop on Linux for exactly this reason: it's basically what
KDE 3 looked like 10 or 15 years ago. I run it so I don't have to re-learn all
my workflows every time somebody comes up with some new eye candy.

------
Havoc
I find the OS interfaces to be OK.

The internet, on the other hand, is a train wreck. Oh, you want to read a
website? Not before you click away a million popups, cookie banners and
newsletter things. Scroll down? Haha, no. You're in for some sort of
sideways-scrolling pamphlet thing. Or my personal favorite: alternating down
and sideways by page number (yeah, I couldn't believe it either).

------
auggierose
Well. It's a generation of programmers brought up on Javascript. They are in
charge now, and we are paying the price for that.

------
jordache
what about the infamous iOS keyboard states? is white active or inactive?

------
crazygringo
I fundamentally disagree with the entire premise of this article.

Yes, usability has become less uniformly consistent within a single platform.
But that's because the _feature set_ of desktops, laptops, tablets and phones
has increased _exponentially_ , beyond what the classic desktop GUI could
handle.

Not only can you access a huge percentage of the world's knowledge within a
few seconds, but new UX paradigms such as search boxes and recommendation
engines have completely changed the game.

Now when you're trying to build an app that has a modicum of consistency
across sizes from phone to desktop, whether you're using a mouse, trackpad or
touchscreen, whether you've got a hardware keyboard or a software one, or are
using dictation, and so on...

...then you have to make tradeoffs. Yes, the purely desktop experience has
become less consistent. But at the same time, an app can be _more_ consistent
across platforms, which is what many users want when they're switching between
platforms multiple times a day.

And as an industry, apps really do seem to fairly quickly standardize on UX
conventions like tabs, hamburger menus, autocomplete, drag-to-refresh, and so
on, which aren't any less intuitive than right-clicks, keyboard shortcuts,
minimizing, or drag-to-trash-to-eject (remember that?).

So _relative to functionality_ I don't see any decline at all. Young children
can pick up an iPad and learn to use it without instruction. I don't remember
young children doing that with a Mac Classic or Windows 3.1.

~~~
Jtsummers
> Now when you're trying to build an app that has a modicum of consistency
> across sizes from phone to desktop, whether you're using a mouse, trackpad
> or touchscreen, whether you've got a hardware keyboard or a software one, or
> are using dictation, and so on...

Maybe we shouldn't strive for uniformity across many different interfaces.

My office has switched to MS Teams, which is an abomination on the desktop but
would be perfectly fine on a tablet. I can't have multiple chats open
simultaneously. If I open a shared document from a chat or team and then go
back to the chat, I have to click several places to get back to the document
(rather than having it, you know, open in a separate window). A desire to
"streamline" the experience, or some other such bullshit, has produced one of
the worst productivity/collaboration tools I've seen in 25+ years of using
networked computers.

I wouldn't expect a remote server to present the same interface as a desktop
as a tablet as a phone as a watch. It's absurd; acknowledge the distinctions
and design for the system the software executes on. I'd rather MS Teams on
Windows be like, well, a desktop application:

Contact list, chat window(s), documents opened in the application that can
edit them, all with multiple windows taking advantage of the actual
capabilities of my system. I have two monitors at work, but with MS Teams I
may as well just have one.

And that's just one of the more egregious examples, many others are like it
and it's the result of laziness or hubris or ignorance on the parts of the
designers/developers.

~~~
anthk
Microsoft even had MSN Messenger. The former commenter has no excuse, just
laziness and a great lack of understanding of the 90s/00s computing
environment. The W9x-w2k/KDE paradigm was the best ever for DE-based
multitasking.

This is a PC. Why does the parent commenter want to _downgrade_ its user
experience to that of a mobile user?

~~~
Jtsummers
I wouldn't even go so far as to say that the mobile experience is a downgrade.
The issue is one of respect for the medium the program is executing on.

On a tablet, MS Teams would be (mostly) adequate. The tabbed interface is
somewhat natural:

    
    
      +----------------------------------+
      |      |search/command field|      |
      +---------+--------+---------------+
      |Activity |Person1 |Chat Files Org |
      |<Chat>   |Person2 +---------------+
      |Teams    |        |Some stuff     |
      |Calendar |        |               |
      +---------+------------------------+
    

Select the main thing you want in the left column, select a sub-activity in
the next column, then another, or interact with files/chat/whatever.

It actually works well as a model for how to do chat/collaboration with a
tablet. It makes good use of space (the first two columns take up almost as
little space as needed). It's touch-friendly. If it launched files into their
own editors, it'd be a perfectly reasonable tablet application.

But on the desktop, I'd really like to chat with more than one person at a
time. I'd like to view the files _and_ chat with whoever sent them. The
desktop computer offers one of the most flexible GUI systems we have; don't
force users into a singular mode of use unless you're running on a kiosk.

\---

I'm picking on MS Teams, but look at Slack and Discord as the things it's
imitating (or they're all imitating each other): restrictive single-window
applications neglecting what makes the desktop so flexible, capable and
distinct from tablets, which their present UIs are better suited for.

~~~
anthk
A touch interface for a chat application is the worst thing that ever happened
to usability in IT.

A physical keyboard cover should be a mandatory item shipped with every tablet.

A tablet is a consumer/broadcasting device, like the smartphone. You are not
producing with it. You are either consuming media or sharing it. It's the
perfect ad device. And zoomers are blindly embracing it as if it were better
and more "modern". They are not right.

~~~
Jtsummers
> A touch interface for a chat application is the worst thing on usability
> happened on IT ever.

That's a strong opinion, and not one I'd agree with (except for how touch-
device interfaces have spilled into desktop interfaces).

------
photawe
I love the article! Couldn't agree more! One of the trends was set by, who
else, Microsoft. They went mobile first, failed miserably, but still kept the
concepts.

It's prevalent in their UWP apps - especially Settings in Win10, one of the
worst UIs I've ever seen.

I can't explain how much I hate this.

~~~
julienreszka
Not everybody agrees with you. I believe going Mobile First was the best
decision they made in decades.

~~~
Nextgrid
Why? Windows is a desktop OS primarily used to get _work_ done. “Mobile first”
doesn’t make sense here and just gets in the way (it personally made me give
up on the platform and switch to Mac instead).

~~~
julienreszka
Having a UI that adapts to screen width is important to me because I do a lot
of things at once; the window manager on Mac is the worst for this.

~~~
photawe
There's a difference between adapting to every screen and being mobile first.

The point is: no matter what you do, to have an app that works on both mobile
and desktop, you'll need two UIs (otherwise, you'll get a crappy UI on both
platforms). So, if I use UWP on desktop, it should "bend" to desktop (all the
controls should be desktop-friendly). If I use UWP on mobile, it should "bend"
to mobile.
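
The "two UIs over one shared core" idea can be sketched in a few lines of
Python. This is a toy illustration, not any real UWP API; all names (functions,
control sets, target sizes) are hypothetical, chosen only to show the shape of
the argument: the core features are shared, but the control layer "bends" to
each platform instead of one compromise layout serving both.

```python
def build_ui(platform: str) -> dict:
    """Return a platform-appropriate control set (all names illustrative)."""
    # Shared core: the features every platform exposes.
    core = {"open", "save", "search"}

    if platform == "desktop":
        # Dense and pointer-friendly: menu bar, multiple windows,
        # small hit targets are fine for a mouse cursor.
        return {
            "controls": core | {"menu_bar", "toolbars", "multi_window"},
            "hit_target_px": 16,
        }
    if platform == "mobile":
        # Sparse and touch-friendly: one window, big finger-sized targets.
        return {
            "controls": core | {"hamburger_menu"},
            "hit_target_px": 44,
        }
    raise ValueError(f"unknown platform: {platform}")
```

The point of the sketch is that neither branch is a "scaled" version of the
other; each is designed for its input device, which is exactly what a single
mobile-first UI on the desktop fails to do.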

------
dredmorbius
I date all this to Jakob Nielsen retiring the ever-so-wonderful
[https://useit.com](https://useit.com) for the utterly forgettable
[https://nngroup.com](https://nngroup.com)

(Fortunately the redirect still works.)

------
ilamont
The OECD published the results of a massive survey of member countries some
years ago, titled "Skills Matter"
([https://www.oecd-ilibrary.org/education/skills-matter_9789264258051-en](https://www.oecd-ilibrary.org/education/skills-matter_9789264258051-en)).
The researchers defined 4 levels of technology proficiency, based on the types
of tasks users can complete successfully. There was a very good summary
published here
([https://www.nngroup.com/articles/computer-skill-levels/](https://www.nngroup.com/articles/computer-skill-levels/))
and excerpted below.

For each level, here’s the percentage of the population (averaged across the
OECD countries) who performed at that level, as well as the report’s
definition of the ability of people within that level:

 _“Below Level 1” = 14% of Adult Population

Being too polite to use a term like “level zero,” the OECD researchers refer
to the lowest skill level as “below level 1.”

This is what people below level 1 can do: “Tasks are based on well-defined
problems involving the use of only one function within a generic interface to
meet one explicit criterion without any categorical or inferential reasoning,
or transforming of information. Few steps are required and no sub-goal has to
be generated.”

An example of task at this level is “Delete this email message” in an email
app.

Level 1 = 29% of Adult Population

This is what level-1 people can do: “Tasks typically require the use of widely
available and familiar technology applications, such as email software or a
web browser. There is little or no navigation required to access the
information or commands required to solve the problem. The problem may be
solved regardless of the respondent’s awareness and use of specific tools and
functions (e.g. a sort function). The tasks involve few steps and a minimal
number of operators. At the cognitive level, the respondent can readily infer
the goal from the task statement; problem resolution requires the respondent
to apply explicit criteria; and there are few monitoring demands (e.g. the
respondent does not have to check whether he or she has used the appropriate
procedure or made progress towards the solution). Identifying content and
operators can be done through simple match. Only simple forms of reasoning,
such as assigning items to categories, are required; there is no need to
contrast or integrate information.”

The reply-to-all task described above requires level-1 skills. Another example
of level-1 task is “Find all emails from John Smith.”

Level 2 = 26% of Adult Population

This is what level-2 people can do: “At this level, tasks typically require
the use of both generic and more specific technology applications. For
instance, the respondent may have to make use of a novel online form. Some
navigation across pages and applications is required to solve the problem. The
use of tools (e.g. a sort function) can facilitate the resolution of the
problem. The task may involve multiple steps and operators. The goal of the
problem may have to be defined by the respondent, though the criteria to be
met are explicit. There are higher monitoring demands. Some unexpected
outcomes or impasses may appear. The task may require evaluating the relevance
of a set of items to discard distractors. Some integration and inferential
reasoning may be needed.”

An example of level-2 task is “You want to find a sustainability-related
document that was sent to you by John Smith in October last year.”

Level 3 = 5% of Adult Population

This is what this most-skilled group of people can do: “At this level, tasks
typically require the use of both generic and more specific technology
applications. Some navigation across pages and applications is required to
solve the problem. The use of tools (e.g. a sort function) is required to make
progress towards the solution. The task may involve multiple steps and
operators. The goal of the problem may have to be defined by the respondent,
and the criteria to be met may or may not be explicit. There are typically
high monitoring demands. Unexpected outcomes and impasses are likely to occur.
The task may require evaluating the relevance and reliability of information
in order to discard distractors. Integration and inferential reasoning may be
needed to a large extent.”

The meeting room task described above requires level-3 skills. Another example
of level-3 task is “You want to know what percentage of the emails sent by
John Smith last month were about sustainability.”

Can’t Use Computers = 26% of Adult Population

The numbers for the 4 skill levels don’t sum to 100% because a large
proportion of the respondents never attempted the tasks, being unable to use
computers. In total, across the OECD countries, 26% of adults were unable to
use a computer.

That one quarter of the population can’t use a computer at all is the most
serious element of the digital divide. To a great extent, this problem is
caused by computers still being much too complicated for many people._

Let that phrase sink in: _across the OECD countries, 26% of adults were unable
to use a computer._ In some countries like Japan, the number is even higher
(about 1/3 of Japan's population can't use computers, which may reflect the
aging population, poor interface design, or some other factor.)

These data were based on surveys from 2011 through 2015, and if TFA is correct
about the usability trends, surely it's gotten worse.

~~~
dredmorbius
That OECD study, and its implications, were a major inspiration for my essay
"The Tyranny of the Minimum Viable User"

[https://old.reddit.com/r/dredmorbius/comments/69wk8y/the_tyr...](https://old.reddit.com/r/dredmorbius/comments/69wk8y/the_tyranny_of_the_minimum_viable_user/)

The problem is that we're stuck between a rock and a hard place. People --
the general population -- need to have useful devices and interfaces. The
market will fill that need. But _even very modestly advanced users_ , to say
nothing of elite ones, _those who make technology happen_ , are left out.

From the essay:

Let's assume, with good reason, that the Minimum Viable User wants and needs a
simple, largely pushbutton, heavily GUI, systems interface.

What does this cost us?

The answer is in the list of Unix Philosophy Violating Tasks:

\- Dealing with scale

\- Dealing with complexity

\- Iteratively building out tools and systems

\- Rapid response

\- Adapting to changing circumstance

\- Considered thought

\- Scalability of understanding, comprehension, or control

\- Integrating numerous other systems

\- Especially nonuniform ones

------
Causality1
It's all about fashion now. The less technical the target user, the less
capable the system. Windows 10 drops distinct window borders? Fashion.
Smartphones drop 3.5mm jacks? Fashion. Microsoft puts a fucking phone
lockscreen on a desktop OS? Fashion.

------
wegs
This brings back the good ol' days of early Linux. Anyone remember
Enlightenment and its crazy theming?

I think what's changed is tech use and literacy. Most people spend hours on
digital devices today, and are a lot like the tech nerds of the nineties.

------
Tepix
Another rather infuriating example:

On iOS Mail you now have to hit the "reply" icon to move a received email to
another folder, mark it or print it.

Very poor discoverability. They should at least have changed the icon when
they turned it into a multi-purpose thing.

~~~
balladeer
A lot of things in iOS are counterintuitive. Everything happens via a "Share"
option. You want to do something, so you tap "Share" and hope the option you
need is there.

By the way, on my 7 at least, there's a "Folder" icon right before the "Reply"
icon that can also be used to move the message, and you can add a move option
to one of the swipe gestures as well.

------
virgil_disgr4ce
1) The examples cited are valid UX/UI design criticisms.

2) The author makes quite a few important points about UI problems (I
especially appreciate the point about the importance of maintaining high
standards for free software).

3) Concluding that "usability" is "in decline" from a handful of anecdata is
an irritating, insincere, clickbaity absurdity that serves only to make the
author and those who agree feel more important, that they're Older and Wiser™
for having grown up with CLIs, while the Children Today™ are ignorant fools
who ought to Get Off My Lawn™. I'm so, so tired of this attitude getting in
the way of sincere design critique. If the author had instead titled this
"Some Problems With Various Software UI Design" I wouldn't have a problem. But
then no one would click on it, I guess. (The author anticipates some of these
and the following objections but doesn't actually make any satisfying argument
against them.)

4) Design (among many, many other things, like art and language) is an output
of cultures. Cultures evolve unstoppably. Any argument suggesting that
cultures should just "stop changing" is arguing for the impossible.

5) Cultures CAN be steered deliberately, but generally only with massive
efforts, such as civil rights in the 20th century (and even then.... :/). But
saying "It's people like you and me who decide to change UI design" is
completely insufficient. I understand and very much appreciate the idea, of
course—be the change you want to see in the world—but insinuating that new
ideas are dumb and useless is itself useless.

6) Cultural change is absolutely critical to continued survival of the
culture. Many new ideas will fail. Many people will fail to learn from
history. But some people will, and some new ideas will succeed wildly.
Stagnating in a perpetual, rose-tinted dream of everything running on a
command line doesn't help anything.

~~~
the_af
I don't think he is arguing against new ideas. He is arguing against a trend
in UI design. In some cases, he's arguing against novelty for novelty's sake.

As for cultural change being a positive force: I agree. However, for a lot of
people computers are mainly tools to achieve a goal, not a goal in themselves.
Just like you would be annoyed if your screwdriver was deprecated, and instead
all that was supported was a power screwdriver -- yes, it's useful sometimes,
but don't test your newfangled ideas on me when all I needed was an old
fashioned screwdriver.

My metaphor is flawed because physical screwdrivers don't deprecate themselves
out of existence, but you get the idea: for most people, computers are just
tools. Change to see "what sticks" is annoying and they don't want to become
guinea pigs.

 _Particularly_ irritating is when the screwdriver manufacturer tells you that
a- manual screwdrivers are no longer supported, and b- you were unscrewing
screws the "wrong" way -- like desktop environment developers sometimes tell
their users: "it's wrong to want icons on your desktop" ("but that's what I
like and always did!" "Well, you're wrong, feature removed!")

------
rileymat2
I may not be using the software as intended, but I like to move windows around
by the title bar. Chrome, with its small title bar, was where I first noticed
hostility toward this, but all browsers have followed.

------
Vektorweg
I feel that "everything is a fullscreen application" is not such a bad idea,
as I can't concentrate on more than one thing at a time regardless. The rest
is just tabbing.

~~~
asiachick
I could get very little work done if I couldn't reference something while
working on something else. Often I need to reference more than one thing. I
may also have a result I need to see separately from the source of that
result, which I want to edit.

~~~
Vektorweg
Maybe the ability to display references should be a feature in your tool?

As for my experience with Blender, you either use its UI to display the
reference where you want it or put the references directly into the scene.
It's also common for 2D artists to just use layers on the canvas for keeping
references.

------
jason0597
Simply put: if it ain't broke, don't fix it.

I feel like modern development has devolved into "features for the sake of
features" rather than actual improvement.

~~~
Nextgrid
But developers, designers, devops, marketers and everyone in between need to
justify their jobs so they need things to do.

Maybe this is a result of companies hiring way more full-time people than they
need in the long-term instead of using short-term contracts to develop the
initial product and now they need to keep all those people busy.

------
kzrdude
The missing menu bars are really the most egregious, resulting in total
mystery meat navigation (props if you know what that's a reference to!).

------
0x445442
Suggestion to author. When you write an article on the decline of usability,
you might want to consider centering the text.

------
bsdubernerd
During the Amiga times, users would scold you for writing a dialog prompt with
non-standard placement of buttons.

------
LargoLasskhyfv
Relax.

Try [http://eaglemode.sourceforge.net](http://eaglemode.sourceforge.net)

------
longtermd
I truly feel like Apple and macOS have next to none of the problems mentioned
here :)

------
julienreszka
Ironic that a web page with such poor usability carries this kind of title.

------
aj7
Ha! And I simply hated Adobe apps.

------
ptrenko
Personally loved win 7 the most!!

------
flowerlad
iOS/Mac also experienced a significant decline in usability after Steve Jobs
died. Here's a comparison before vs after:
[https://uxcritique.tumblr.com/](https://uxcritique.tumblr.com/)

~~~
galad87
iOS 7 was a rushed redesign. It took 5 years to fix many of those issues.

~~~
interlocutor
It is still unusable compared to iOS6

------
virtualritz
A friend of mine nailed the problem. His theory: for most apps the user is the
commodity, not the software!

This is the number one reason why usability of software is not improving and
often even declining. The other is feature creep. I'll get to that later.
Caveat: I will be generalizing a lot. :)

For the first point —

In the case of commercial software this is obvious. You get paid an hourly
rate and the company buys software that you then use. In other words: you are
not the one deciding which software to buy. How much you enjoy the experience
of using it, and how productive you are while doing so, is thus not really
important to the vendor.

In the case of open source software you commonly do not pay for the software
either. The software is developed by people for various reasons. They may use
it themselves or they just like working on it. Again – if you have an issue,
they do not really have much reason to care.

Feature creep —

One of the things that makes a developer very happy is adding a feature to a
piece of software and exposing it to the user. As a developer myself I know
the feeling. It's wholesome, warm, fuzzy.

But when you expose a feature you need to add a user interface for it. This is
the most difficult part. The set of parameters driving a feature can be called
a 'parameter vector'. The more publicly exposed dimensions such a vector has,
the more difficult the feature is to use.

A feature that has ten parameters may be useful to 99% of users if only three
of these ten are exposed. The rest can have magic numbers in the code. Adding
another seven dimensions to the publicly exposed parameter vector of a feature
to cover 100% of use cases is a bad idea.
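
A toy Python sketch of that "expose three of ten" idea. Everything here is
hypothetical (the feature, its parameter names, and the default values are
invented for illustration): a ten-dimensional parameter vector where seven
dimensions stay as magic numbers in the code and only three reach the UI.

```python
# Internal defaults for the hidden seven dimensions -- tuned once by the
# developer, never exposed in the UI.
_KERNEL_SIZE = 5
_EDGE_FALLOFF = 0.8
_NOISE_FLOOR = 0.02
_GAMMA = 2.2
_ITERATIONS = 3
_CLAMP_LOW = 0.0
_CLAMP_HIGH = 1.0

def sharpen(image, amount=1.0, radius=2.0, threshold=0.05):
    """A hypothetical sharpen feature: only three of its ten parameters
    are public. Returns the full parameter vector it would run with
    (a stand-in for the actual image processing)."""
    return {
        # The three publicly exposed dimensions.
        "amount": amount,
        "radius": radius,
        "threshold": threshold,
        # The seven hidden dimensions, filled from the magic numbers.
        "kernel_size": _KERNEL_SIZE,
        "edge_falloff": _EDGE_FALLOFF,
        "noise_floor": _NOISE_FLOOR,
        "gamma": _GAMMA,
        "iterations": _ITERATIONS,
        "clamp": (_CLAMP_LOW, _CLAMP_HIGH),
    }
```

The user-facing surface is `sharpen(image, amount, radius, threshold)`; the
other seven knobs exist, but surfacing them would only make the feature harder
to use for the 99%.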

Deciding these things requires an intricate understanding of the problem space
from the user's side. Most developers are not good at this. And exposing more
parameters in a UI somehow feels better to most people, even though they
understand that this can be counterproductive.

So my friend had this analogy: imagine if you got paid to use a mobile phone.
Imagine your government bought phones from Apple, Samsung; whoever. And then
you got paid for using them. Do you think we would have something like an
iPhone or modern smartphones? Unlikely. We would have crazy awful phones from
the pre-smartphone era. Probably with much worse UX than some Nokia or
Ericsson phones had at the time just before the iPhone appeared.

This is the situation we have with most of the common closed- and lots of open
source software. Again: I am generalizing here.

On the bright side: software that needs to fight for its user base – whether
open or closed source – often has better usability.

And certainly: for everything I said above there are countless
counterexamples. But the overall trend seems obvious to me. I agree with the
author of the article 100%.

Well, maybe I'm just grumpy and old too. :]

------
coleifer
UX careerists are clowns who chase trends and aesthetics over quality and
function.

~~~
noisem4ker
Add marketing-oriented metrics to the mix.

