

My Brain Has No Space For Your User Interface - achalkley
http://joshtimonen.com/post/79486778102/my-brain-has-no-space-for-your-user-interface

======
taeric
I thrive almost precisely _because_ I'm willing to explore and learn
interfaces. The day I give up the desire to figure out how to use rather
mundane objects is the day I officially give up being a tinkerer.

Does it take some figuring out when I jump from one Android to another? Or to
an iPhone? Or an old-style phone? Sure, but so what? Do I want to do it daily?
Not really. However, doing so every now and then is _hugely_ beneficial to
seeing how others see the world.

This is no different than why you should try different programming languages.
Sometimes, it just helps to see things differently. Even if it is the same old
todo list you have seen countless times. Maybe something will "click" this
time.

Seriously, consider that when you are using a new UI, you are seeing something
as envisioned by someone else. It may be that the someone else is a committee
that couldn't agree on anything. In this case the learning experience will be
mostly frustration.

However, do you really think that things that you know and use daily are truly
easy? Watch a child try to operate a door sometime. Any kind of door. It does
not matter.

Maybe I'm just projecting, and hoping to instill a sense of "keep trying" and
"learn what you can do; don't get upset with what you can't" in my kids.
Surely I am not alone in this, though.

And this is far from new. The devices we have to learn nowadays are downright
easy compared to the stuff from years past. Have you seen some of the heavy
machinery that people operate? There are more levers on a standard worksite
tractor than I can really make sense of.

------
Turing_Machine
IMO, it's more likely that your brain doesn't have space for _bad_ user
interfaces. I don't think we actually have any _good_ ones yet.

Consider the physical world. Cars, toilets, old-school dial telephones, doors,
blenders, paper books, hammers, bulldozers, can openers... have vastly
disparate user interfaces, but no one complains about having to learn them
all, nor do they expect some kind of chimeric "common user interface" and
complain when the hammer and the table saw don't work the same way.

Hammers that required you to fiddle with on-screen sliders to set the force
and angle of the blow would _suck_. Hammers and table saws that were given
some sort of half-assed "common user interface" would _suck bad_.

So, yeah, there are a lot of bad UIs. That doesn't make (current favorite UI)
a good one, though. The way that this gets better in the long term is for
people to keep making new UIs (most of which will inevitably be bad). Trying
to stop that process just means that it'll take that much longer before we get
good ones.

~~~
henrikschroder
> That doesn't make (current favorite UI) a good one, though.

There are plenty of things with great user interfaces, and the way you can
recognize them is if the interface "disappears" when you use the thing. If you
can use the thing without thinking, without having to read a sign, without
having a cheat-sheet, when you're very sleepy, and you still do the right
thing, then the interface is good.

My pet peeve is household appliances that are just in the way, that do the
wrong thing, that make you do unnecessary steps to get what you want.

I still miss my old microwave with its huge digital knob for setting time and
power. I developed muscle-memory for using that thing, because it just felt
right, my body _knew_ exactly how much to turn the knob to get a certain time.

My current one is atrocious, it has a number pad. No.

And it doesn't have to be that way. I'm always conscious about the user interfaces
of things I buy, and go out of my way to buy stuff that has good interfaces. I
hope that by supporting companies that put in the effort, they'll win in the
end. (It's a slim hope, though.)

~~~
thaumasiotes
> If you can use the thing without thinking, without having to read a sign,
> without having a cheat-sheet, when you're very sleepy, and you still do the
> right thing, then the interface is good.

I'm not sure any of this applies to table saws. You might be setting the
standard a little high.

~~~
buddhirons
"Muscle memory" I worked years ago as a welders assistant and the physical act
of controlling the trigger on a burning torch became ingrained and reflexive.
It really does apply to everything even table saws.

------
userbinator
I think a lot of it has to do with discoverability (or lack thereof) - e.g.
buttons, which used to _actually look like things you could depress_, have
turned into little icons that often don't even have a border. It makes it
harder to know whether something is a button, an indicator, or purely
decorative. The trend seems to be to hide everything away in (at times
multiple) layers of submenus, require various gestures (with no hint that you
can perform them), and offer little in the way of context-sensitive help.

~~~
joshtimonen
I also dislike the trend away from bordered buttons in iOS 7. Users shouldn't
have to guess whether something is a label or a button. Trading off that
usability for just a little visual cleanliness seems like a bad deal.

~~~
lstamour
You can turn on visual buttons in accessibility. Which to me just becomes one
more thing I need to test for as a dev ;-)

~~~
userbinator
Funny they put it in Accessibility - the category of options for
"disadvantaged" users, as if "normal" users wouldn't need a more accessible
UI?

~~~
lstamour
I don't know if I'd say that. I can't tell you how often I employed VoiceOver
pre-Siri as a way to dim the screen while playing YouTube audio, or to read me
all my emails while doing something else. In fact, about a year before Siri
was announced, I referenced that and got a nudge-nudge, wink-wink that audio
was one of the next big things.

In some ways I think Accessibility is a misnomer -- it's the way Apple says,
hey, you 20% of pro users, here's something you can try. Hence Accessibility
on Mac getting a security center so apps can use those hooks to resize and
read the screen the way "most" shouldn't.

I treat it as a toy box, and frankly I still miss most of its settings on
Android. Have a look at the WWDC videos on accessibility: unlike the
counterparts from Google and Microsoft, which often require special extensions
or apps to enable, Apple delivers the whole thing, and it really works.

Why else would Apple offer 8-9 different font sizes in roughly four font
weights (normal, bold, and the overridden versions of each)? Yes, some
options could be highlighted during the welcome screen, in the same way
VoiceOver is on a Mac at first launch. But like right-click, some things you
have to turn on yourself until they sync over the cloud. But that's life. At
least their settings apply across the board to all apps.

------
robobro
The author seems to present two main points in this essay:

1. I have a lot of really cool brand-new stuff.

2. Instead of having 100 different UIs, let's have fewer.

Is there any value in either? Half bragging and half common sense does not an
engaging essay make. And while I appreciate what I understand to be his second
point, I don't think it's necessarily right.

Sometimes it makes sense to have more or fewer buttons on something, depending
on its functions. A TV remote with anything less than 20 buttons is probably
useless. A UI which does not take advantage of the wide-ranging input is
equally useless. A car's GPS can't really work well with just 2 or 3 physical
buttons, and vanilla Android is obviously ill-fitting, even with a touch
screen. The author, a serious Mac user, shouldn't even be satisfied with OSX.
Has he read the UNIX-Haters Handbook? It criticizes OSX's underlying
operating system with the same points he makes against everything but his
beloved iToys.

I have a simple solution to the problem of too many UIs. Stop buying so much
shit. Keep the growth of technology in check by voting with your wallet, and
keep your mind clear by not filling it up with anything other than what is
absolutely necessary. Use your phone as your GPS and TV remote. Watch TV on
your computer. You can really reduce a lot of your bulk, both in terms of
hardware and software, by getting more out of what you already have.

[Obligatory comment against software patents.]

------
lam
Good start in pointing out those disparate UIs. As you have pointed out,
though, devs/manufacturers feel they have an incentive to create something
different. From a user's POV what's needed is just one UI that he's familiar
with. Maybe one solution is creating a UI (HW/SW/etc) that knows how to map
functionalities automatically for different devices/apps.

~~~
SolarNet
Definitely, the main thing is that we need a unified UI system.

The web isn't a _current_ solution, although it has the potential to become one.

And there aren't any good desktop solutions because there are so many UI
patterns and elements that are required.

~~~
sizzle
What about the ability to create user-friendly, customizable shortcuts in the
UI of certain products? I really enjoy customizing my android phone with
shortcuts that are intuitive to me, such as double tap back arrow --> jump
between last open app, and long press menu button --> pull down notification
bar from top.

I wish I could define or remap buttons on physical appliances, granted it's an
advanced mode common users would not be burdened with.

~~~
SolarNet
By being the one that controls the UI's look and feel, the ability for power
users to create arbitrary UI controls would simply materialize!

------
bitkit
Well, I think things will get more focused. Like Google's watch UI announced
the other day, it only does a few things. Also, as voice commands start really
taking off, visual UIs will fade in importance. Sure, we'll always have
visuals, but they won't be abused as badly as the article describes. Lastly,
good UX is happening now: you see really smart people like Alan Cooper who
design for goals, not just a single task. This way of thinking about an
interface even before it's made will cut down on some of the UI foolishness.

------
pistle
First world problems. Get an AeroPress or French press for coffee. Get rid of
your TV (that will take out another 3-4 devices most likely). Get rid of your
watch.

There... now you are UI-defragged and you make better coffee.

~~~
aashishkoirala
Annoying hipster alert.

~~~
ludston
Is that really necessary?

------
lvturner
How many interfaces are you expected to deal with now? Zero.

This sounds as much about having too much stuff in your life as it does about
user interfaces.

If you are complaining about your car's touch-screen, you should perhaps be
considering whether you really need a car with a touch-screen in it at all.

And for the record, my coffee machine's user interface is roughly as
complicated as the user interface of a light bulb.

------
rando289
Today I opened one of many blogs I will skim, with a unique design. I first
looked for the date; it took me a good 20 seconds to find because it was in
tiny greyed-out text hidden at the bottom of the page. Then I saw this was a
blog about inconsistent user interfaces, and I'd say it was partly the pot
calling the kettle black.

------
AnimalMuppet
Absolutely.

And for this reason, an app has to be _significantly_ better for it to be
worth the bother to switch.

------
nnq
> They’re trying too hard, and making a mess in the process.

I can sincerely say this about _all_ the _graphic designers_ and _UI/UX
designers_ I've ever worked with!

------
nej
"In 10 years, this UI list may look laughably small. We’ll probably be
discussing the operating systems on our tube socks and dust pans. What can be
done?" really makes you think.

~~~
RobotCaleb
How's that?

------
mnorton3
Guess it's time to crack out the Chemex for a simple pour over coffee, then
hop on to our bikes! At least that will help with some of the brain fatigue.

------
Pacabel
This really wasn't an issue in the 1990s and even into the early 2000s. Back
then, most applications running on Windows, Mac OS and even the various
commercial Unixes followed rather clear and uniform UI conventions.

There were differences between applications running within the same ecosystem,
of course, and not as wide of a range of device types. But because of the
greater consistency, it was generally much easier to learn how to use new
applications. A lot of existing knowledge carried over, and was immediately
applicable when using a new application.

This uniformity has unfortunately never really developed well on the various
mobile platforms used today. It has deteriorated quite badly under Windows,
with Microsoft themselves being somewhat responsible for this. The situation
is perhaps the best under OS X. But even here we've still seen applications
like Chrome and Firefox use non-standard UIs.

~~~
jacobolus
> _This really wasn't an issue in the 1990s and even into the early 2000s.
> Back then, most applications [..] followed rather clear and uniform UI
> conventions._

Hah. Haha. Hahahahaha.

If you pulled similar samples of UIs from the 90s and from today, I bet you’d
find about the same proportion of fucked up designs in both (and that
proportion would be shockingly high).

Some applications (especially on Mac OS) tried to stick to Apple’s human
interface guidelines, and some of those were successful, but even on the Mac
there have always been heaps and heaps of horrible broken user interfaces. And
don’t get me started on Windows applications from the 1990s. What a steaming
pile...

I think what’s changed today is that (a) with ubiquitous internet and a very
easy install process, people end up interacting with more applications (at
least to download, play with once or twice, and then forget about),

~~~
Pacabel
The consistency of a UI is quite distinct from the quality or the usability of
a UI. You're incorrectly blurring these very different concepts.

I never claimed that all UIs back then were "perfect", or that there weren't
"badly designed" UIs. I was very clearly talking about consistency between
different applications running on the same system.

When using applications from that period of time, one could expect to find a
menu bar and menus that were quite similar to those of other applications. Of
course there were application-specific menus and items, but there was at least
a common subset shared by nearly all applications, especially any that were
designed with even the minimal level of care.

The same held true for toolbars, keyboard shortcuts, common dialogs, and so
forth. There were also general conventions for how the application-specific
parts of UIs looked and behaved. While not everyone followed these
conventions, of course, most of the major and seriously-designed applications
did.

We have lost much of that today. It's particularly bad when it comes to mobile
apps, and getting worse when it comes to desktop applications. Yes, Microsoft
and Apple themselves are to blame, to some extent. But it goes much beyond
that these days. Inconsistency is the norm, and that does lead to
inefficiency.

