
We Have Always Been at War with UI - douche
http://eev.ee/blog/2016/02/10/we-have-always-been-at-war-with-ui/
======
lazaroclapp
> Twitter recently changed “favorites” to “likes” and swapped out the star for
> a heart. I’m pretty used to this from Tumblr, so I was surprised by the
> amount of pushback. Until I saw someone make a brilliant point (which I
> neglected to save a link to): these tiny changes bother us because they
> remind us that even our very personal spaces are owned by someone else.

This is, in so many words, the main argument against the cloud and against web
apps in general.

Although, with regard to things changing under your feet, even with
traditional self-hosted software a pretty big issue is that you must either
upgrade eventually, suffer security bugs once something is dropped from
support, or (if you use only open-source software) become a maintainer of
every single piece of software you care about...

~~~
agumonkey
Something the desktop era didn't have to bother with. You owned the device,
the software, the storage. The web is sucking the world into its cloud for
reasons that seem less and less relevant.

~~~
lazaroclapp
The desktop era still required you to upgrade your software often enough to
patch security issues and stay within support. I suppose the main advantage
is that you still roughly got to pick _when_ to upgrade. Also, the time
between major UI shifts was long enough that by the time you absolutely
needed to move off your current version, you were buying a new desktop
anyway, rather than having the one you'd already been using randomly
"mutate" from under you...

~~~
creshal
> The desktop era still required you to upgrade your software often enough
> to patch security issues and stay within support.

And sometimes people find UI issues bad enough to take the risk.

Just look at, e.g., all the how-tos for downgrading Skype to version 6 or
even 4, purely to avoid the horrible current UI.

Or the migration from Office 2003 to the newer versions with the ribbon UI.
If I gave my users the choice, they'd all vote to downgrade.

~~~
Drdrdrq
Actually, that was the reason I sold one user on OpenOffice, back then. Happy
user now. :)

------
jasonallen
Good writing is like good UI in that you have to relate to your audience.
I'm a fan of this topic and appreciate your thoughts on it. However, I found
the article too long for my taste and attention span. Your complaints get
lost in the sheer length of it all.

~~~
Mithaldu
He's writing for exactly the right audience: the people who appreciate this
kind of long-form piece and support him via Patreon, i.e. with actual money.

Your post would've been better spent suggesting a TL;DR paragraph at the
top.

~~~
pandatigox
Just as an aside, I checked out his Patreon page, and it looks like he earns
around $850 a month for blogging. With that kind of money rolling in, you'd
expect some pretty long essays. Or high-quality short ones.

~~~
cwyers
That kind of money? That's not enough money to live in a cardboard box and eat
ramen noodles three meals a day.

~~~
exolymph
Au contraire! Cardboard box: free (salvageable). 90 packs of generic ramen:
$90 at most.

~~~
justinjlynn
Malnutrition and exposure leading to chronic ill health and reduced lifespan:
priceless.

------
proc0
Aside from simple web app UIs, something that is highly overlooked in UI is
customizability (the ability to customize; yeah, that word sounds better
than it looks). Only a handful of apps get this right. I've been doing UI
for a while now, so I notice it as I use different apps. Adobe apps are
pretty good, with icons that are consistent yet distinguishable and a highly
customizable UI that you can move around and reposition. On the other side,
you have the Microsoft Office UI, which is just shit. I want to headbutt my
monitor when I spend more time looking for a command in Word than in
Eclipse, when Word is supposed to be a fucking word processor! It has no
consistency in how commands are categorized, in what seems to be random
grouping, very little customization, and they take the liberty of changing
drastically from one version to the next with no option to go back. I
imagine the UX designers at MS like the Comcast guy from South Park,
massaging his nipples.

Just give the user the ability to customize a lot, and don't pay teams of
UX designers to design a specific UI that will change again and again.

~~~
oliv__
On the other hand: I don't think there would be a need to customize anything
if the UI was actually well thought-out and executed.

Customizing, unless it is simply about making things feel more personal
(like changing the background, for example), amounts to fixing the UI to
what it should have been in the first place: patching the designer's botched
job.

In other words, it _shouldn't_ be needed.

> _Just give the user the ability to customize a lot, and don't pay teams of
> UX designers to design a specific UI that will change again and again._

No. That's just offloading the complexity onto the user. Not everyone wants to
have to redesign their software.

~~~
leppr
The thing this reasoning overlooks is that _apps_ are complex tools that
people use in very different ways, and it doesn't make sense to force people
to reinvent the wheel every time they want a feature that your use-case
studies didn't predict.

How I think it should always work is:

(1) back-end layer > (2) very customizable GUI layer > (3) highly tuned
sensible defaults and presets.

The great thing about this approach is that if you include analytics, the
users do the UI testing for you: if you introduce an optional UI feature and
see that a large portion of users activate it, then you know you should
make it the default or include it in some preset.
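
A minimal sketch of that layering in Python (all names here are
hypothetical; the point is that presets and per-user overrides stack on top
of one set of defaults, and opting into a feature emits an analytics event):

```python
# Hypothetical sketch: defaults < preset < per-user overrides,
# with a trivial analytics hook on optional features.

DEFAULTS = {"sidebar": True, "dense_layout": False, "vim_keys": False}

PRESETS = {
    "power_user": {"dense_layout": True, "vim_keys": True},
}

def effective_ui_options(preset=None, user_overrides=None):
    """Merge the layers; later layers win over earlier ones."""
    options = dict(DEFAULTS)
    options.update(PRESETS.get(preset, {}))
    options.update(user_overrides or {})
    return options

def record_opt_in(option, enabled):
    # Stand-in for a real analytics call; if enough users turn an
    # option on, promote it to a default or fold it into a preset.
    print(f"analytics: {option} -> {enabled}")

opts = effective_ui_options(preset="power_user",
                            user_overrides={"sidebar": False})
record_opt_in("vim_keys", opts["vim_keys"])
```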

~~~
oliv__
Well, see, in my opinion they don't need to be complex tools at all.

And I would argue that 80% of users probably use the tools in more or less
the same ways anyway: hence the design in the first place.

------
Nutmog
It's a pity that there isn't a ubiquitous way to tell programmers which
features are at risk of being removed or of causing incompatibility with
other software. I don't mean some document you have to look up; I mean that
if you use an at-risk feature, you're given a warning many years before it
actually gets removed. Programmers often discover features of APIs by trial
and error, not just by reading the manual.

At-risk features should include things that may not get removed but aren't
supported on every platform, like .NET classes that aren't present in Wine.
Sure, MS asks you not to target non-Windows platforms, but in reality users
will still try to do it.
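
A minimal sketch of the in-code version of this idea, using Python's
standard warnings module (the at-risk function and the removal schedule here
are hypothetical):

```python
import warnings

def fetch_legacy_profile(user_id):
    # Hypothetical at-risk API: the warning fires at the call site,
    # years ahead of removal, instead of hiding in a document.
    warnings.warn(
        "fetch_legacy_profile is at risk: scheduled for removal in v5.0 "
        "and not supported on every platform; use fetch_profile instead.",
        DeprecationWarning,
        stacklevel=2,  # attribute the warning to the caller's code
    )
    return {"id": user_id}

# DeprecationWarning is hidden by default in Python; surface it while
# developing so the at-risk usage is actually seen.
warnings.simplefilter("always", DeprecationWarning)
fetch_legacy_profile(42)
```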

~~~
spacelizard
The tendency to just remove features without so much as a word of warning
to users seems pretty popular these days. I think it is most commonly seen
as part of a push to remove complexity from the program. Unfortunately, this
mentality can also make it seem like creating and maintaining channels that
warn users of upcoming changes is an unnecessary piece of process
complexity.

Of course, all this internal simplifying accomplishes is offloading that
complexity onto the user, who will then complain and/or seek out a way to do
what they want on their own. Users searching for undocumented features is
probably a good thing and should be encouraged. But when developers are
unwilling to communicate properly, all users can do is try to fill in the
gaps themselves by guessing and speculating about what the developers are up
to. That, I don't think, is such a good thing, and it's somewhat
embarrassing when it happens to a social media company whose main product is
a communication tool.

------
wonnage
I don't believe these changes are being made as thoughtlessly as this
article portrays. Twitter employees read the same things you and I read; you
can be damn sure they're aware that swapping the Moments tab caused a bunch
of accidental clicks.

The article ascribes the decision to ignorance, or to being bad at
measuring. That doesn't feel right here. Swapping the tab order is
guaranteed to make more people look at it, and maybe the thought process is
that Twitter Moments is a great feature, so even some accidental clickers
might start using it as a result of actually seeing it for the first time.

Call it hubris or not caring about your users or whatever; the point is, all
of these "dumb" decisions are being made by people whose job it is to think
about them.

~~~
chipsy
Employees can easily experience tunnel vision even if they're happily
dogfooding the product themselves:

[http://www.theverge.com/2016/2/6/10926816/twitter-employee-e...](http://www.theverge.com/2016/2/6/10926816/twitter-employee-experiences-harassment-on-twitter)

