

Retro design is crippling innovation - PixelRobot
http://www.wired.co.uk/magazine/archive/2012/03/ideas-bank/clive-thompson

======
mechanical_fish
Comparing the headline to the contents, I feel there's room for a follow-up
article: "hyperbole is killing criticism".

I mean, it's OK to hate skeuomorphs, a valid critical position, but nobody is
going to die.

And the whole thing feels like Louis CK's riff about airplanes. You are
_riffling through entire bookstores and museums on an affordable 150dpi Star
Trek pad in the bathroom_ while complaining that innovation has stopped
because the graphics look too familiar.

Two limited defenses of skeuomorphs. One: In a world where all of tech turns
over on a timescale of months, orientation is important. It may be more
important that people can glance at your calendar app and _tell that it's
supposed to be a calendar_ than that the calendar app perform optimally in the
hands of a trained expert.

The other is: Skeuomorphs generally mimic designs that are at least decades
old, sometimes centuries old. Be cautious about casually discarding the work
of tens of generations of designers in the name of neophilia.

(Book page-flipping animations are less defensible with the latter argument,
but the former still applies.)

~~~
dkarl
I don't think it's okay to hate on skeuomorphs in shipping products, as the
author does here. It's a designer-centric rather than user-centric way of
looking at things, and it's bad news for users when value judgments that
aren't derived from the user experience are allowed to affect real products.
Going by his definition of skeuomorphs, a drop-down menu is not a skeuomorph.
A spinner (like the ones used in the iPhone to set a timer or an alarm) is a
skeuomorph. That distinction has nothing whatsoever to do with which one is
more usable in a given context.

Design space is dizzyingly unconstrained, and finding an optimal design is an
intractable problem. A skeuomorph (sticking to the definition in play here)
has taken its initial design from a particular source, a physical object. Odds
are that the optimal design does not resemble that physical object. So what?
Physical objects are a legitimate source of design ideas. Starting with a
highly optimized solution to a similar, more tightly constrained problem is a
common and effective pattern for generating good solutions to an intractable
problem.

Another way of saying "hating on skeuomorphs" is "stigmatizing skeuomorphs,"
and the likely outcome of that is that designers, mindful of their
professional image, will tend to eschew skeuomorphs in favor of inferior non-
skeuomorphic designs until the winds shift back. (Isn't that the point of
hating on skeuomorphs? To influence which designs are perceived as good or
bad?) All that means to users is that they will be forced to use inferior
designs because of a point of fashion they aren't even aware of. Another way
of saying it is that analyzing whether a design is a skeuomorph or not is like
analyzing a musician's influences: it might give you a clue to whether the
creator keeps up with current trends, but it won't tell you anything about
whether the product is good or not.

Sure, it's legitimate to say that other sources of inspiration should be mined
as well. Constraining a problem to non-skeuomorphic solutions is a legitimate
creative exercise, just like constraining the problem in any other way. It's a
creative exercise, though. It isn't a legitimate rule of thumb for designing
products any more than "use skeuomorphs!" is a good rule of thumb. Blindly
changing features to make them less skeuomorphic is likely to make the product
worse, not better. How to implement features in a shipping product such as
Google Calendar should be based on usability, not whether you think the
designer's thought process reflects a certain heuristic, overdependence on
which has historically inhibited the emergence of new and better designs.
Research efforts should be criticized on that basis, not shipping products.

~~~
reuser
Ease of learning is often decoupled from ease of efficient use, sometimes even
inversely related. 'Usable' is not one thing for everybody.

So: a skeuomorph is obviously not a problem if it does not cause gratuitous
errors or slowdowns, and it obviously is if it does.

And in any case design also concerns matters of taste, on which opinions will
naturally differ.

------
camtarn
There's a nice tip in the comments for that article:

"In Google Calendar, pick the "4 Weeks" option at the top, instead of "Month"
and it functions exactly as you describe, with the current week at the top.
-benjymous"

I didn't have a 4 Weeks view - turns out there's a setting which controls what
the button between Month and Agenda does, in Settings -> General -> Custom
View. There's also a setting above that for which view is your default when
you reload the calendar.

~~~
pedrolll
I've set 4 weeks as default in Google Calendar, so reading this article was
baffling at first. Curious that 4 weeks doesn't show up for everyone. It is
infinitely more useful than the month view.

------
jinushaun
I don't understand his criticism of the calendar. I WANT to see past dates. It
gives me a context to what day of the week it is in relation to dates before
and after today. Nothing irks me more than when a calendar does not show days
from last month on this month's calendar.

Example: Wednesday is April 2... Was Monday March 30 or 31? It forces me to
switch my entire calendar to get this info. I want to see days in March and
May on my April calendar.

For me personally, the agenda view is hard to view because my brain works on a
week/month scale, not day scale. In day view, I also like to see all my events
in blocks throughout the day with gaps, instead of a linear list of events
without gaps. It helps me visualise free time.

~~~
checker
I learned this rhyme from my parents:
<http://en.wikipedia.org/wiki/Thirty_days_hath_September>

It works pretty well (I just remember the first two lines). I hope it saves
you some time!
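
For the date arithmetic in the parent comment ("Was Monday March 30 or 31?"), the standard library can answer directly; a minimal sketch (the function name is mine, and 2014 is just a year in which April 2 falls on a Wednesday):

```python
import datetime

def previous_weekday(d, weekday):
    """Date of the most recent given weekday (0=Monday) on or before d."""
    return d - datetime.timedelta(days=(d.weekday() - weekday) % 7)

# If Wednesday is April 2 (e.g. in 2014), which Monday preceded it?
wed = datetime.date(2014, 4, 2)
print(previous_weekday(wed, 0))  # 2014-03-31
```

Of course, the complaint stands: a calendar view that showed the trailing days of March would make this glanceable instead of a computation.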

------
sambeau
I don't get it.

He spends most of the article telling us that skeuomorphs are bad, giving the
example of a calendar app that looks like a desk calendar, and then finishes
by praising an app with a screen that skeuomorphically flips up just like a
wall-calendar.

~~~
dagw
The difference (at least as I understood it) is that that app isn't a wall
calendar and chose that approach because they believed it was the best design,
not because they wanted to mimic a wall calendar. Most designers of calendar
and calculator software set out to design something as close to a paper
calendar or desktop calculator as possible without necessarily asking if it
was the best possible design.

Or, in summary: mimicking because it is good design that fits what you're
trying to do is good; mimicking simply for the sake of mimicking is bad.

~~~
itmag
Yeah, there's a lot of cached thoughts in UI design.

Someone should start an open source lab that sets out to explore and test new
design patterns in that space, without any preconceived notions of what should
and should not work.

------
dkarl
I really hate criticism that doesn't suggest improvements. You can point out
deficiencies all day, but until you have proposed or implied an improvement,
you have not even proved that what you are criticizing is suboptimal, much
less provided any helpful pointer towards improvement.

Unfortunately, the "old-fashioned, physical object" this design is based on is
the human brain, or more precisely, the useful cultural artifacts that happen
to be engrained in pretty much all the human brains on the planet. The monthly
and weekly calendar reflects how we think about time. You think you can stop
me from thinking in weeks and months by changing one calendar application?
You'd have to work a little harder than that. Here's what you need to do:

1. Stop my company from scheduling my paychecks to coincide with the
beginning and middle of each month. Stop them from organizing my work days
into groups of five days in the middle of groups of seven, and stop people
where I work from informally scheduling things with respect to month
boundaries.

2. Do the same to the companies that employ all the people I occasionally
synchronize my social schedule with.

3. Stop all the businesses I use from scheduling lessons and classes on a
monthly basis and varying their hours on a weekly basis. Persuade the state
government that it should be as easy to buy liquor on Sunday as on any other
day. (That should be easy once you've abolished the days of the week; see 5.)

4. Detach holidays from dates. It would be so much nicer if Thanksgiving was
the 329th day of the year instead of the fourth Thursday in November. That way
I could forget about months and weeks, and my online calendar could make
better use of screen space. (Again, this will follow easily once you've
accomplished number 5.)

5. Abolish the days of the week and the months of the year. Prevent anyone
from referring to Monday, Tuesday, January, etc., or to weeks and months at
all, so I never have to think, "We're shipping on the first Monday of next
month. How many days until then?"

After all that, I'll no longer want a calendar app that orients me with
respect to weeks and months. You can get rid of your "skeuomorph," and I won't
mind that my calendar app gives me no visual, non-verbal cues about what day
of the week or week of the month it is.

So, sarcasm aside, you DO need to show month boundaries. It is not helpful to
propose doing away with the one convention for that without proposing any
replacement for it.

Also, it is not helpful to propose doing away with the past entirely. People
are oriented by the past as well as by the future. Oh, dear, it's been a week
since I told Doug that information would be available soon. I had better drop
him a line and explain that it's delayed. My tooth has been hurting ever since
I went to the dentist; how long has that been? When's the last time I worked
out?

Showing the past is a valuable function! If you don't recognize that showing
the past is part of the function of the calendar, then criticizing a calendar
for showing too much of the past rings a little hollow, because you aren't
balancing the valuable functions of the interface. You're just picking one and
throwing out another.

A better statement of the problem is that the visual clues for month
boundaries are the top and bottom of the displayed grid of days on the screen,
and therefore the past to present ratio varies dramatically through the month.
During some parts of the month, you see very little of the past, and during
other parts of the month, you see very little of the future.

A useful suggestion would be to detach month boundaries from the edges of the
displayed grid and show them in some other way, perhaps by using color or
shading. That way you can keep the balance of past to present close to an
optimum value. Perhaps the current week can be the second or third from the
top. I'm not a UI designer, but I think that is a more useful analysis of the
problem, even if I didn't show off a new word I learned a few weeks ago from a
magazine article.
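
The suggestion above (pin the current week near the top of a rolling window of weeks, and mark month boundaries inline rather than with the grid edges) can be sketched in a few lines of Python. The function names and the one-week-before / three-weeks-after window are illustrative choices of mine, not anything from the article:

```python
import datetime

def rolling_weeks(today, weeks_before=1, weeks_after=3):
    """Grid of full weeks around today, with the current week near the
    top, detached from month boundaries."""
    monday = today - datetime.timedelta(days=today.weekday())
    start = monday - datetime.timedelta(weeks=weeks_before)
    return [
        [start + datetime.timedelta(weeks=w, days=d) for d in range(7)]
        for w in range(weeks_before + weeks_after + 1)
    ]

def render(grid, today):
    """Mark the 1st of each month with '|' and today with '*', so the
    month boundary is a visual cue inside the grid, not its edge."""
    for week in grid:
        print(" ".join(
            ("|%2d" if day.day == 1 else
             "*%2d" if day == today else
             " %2d") % day.day
            for day in week))
```

With this layout the past-to-future ratio stays constant through the month, which addresses the "you see very little of the past at month start" problem without discarding month boundaries altogether.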

~~~
wulczer
Not quite there, but at least it was a start:

<http://en.wikipedia.org/wiki/French_Republican_Calendar>

------
dwyer
Finally! Somebody who understands me!

Now I certainly don't think retro design is "crippling" the basic OS
experience. I personally find the whole desktop metaphor to be silly and
outdated and not so useful for a generation that has grown up with computers
and don't need a clock to look like a wall clock in order to understand that
it's telling us the time. OSX widgets may be immature and tasteless, but
they're not a hindrance.

If retro design is crippling anything, IMO, it's digital audio workstations
(DAWs). Whenever I launch Cubase I have to deal with virtual mixer boards and
virtual synthesizers that mimic the interfaces of 40-year-old hardware, to the
point that I have to turn virtual knobs with my mouse. It's ridiculous and
more often than not it's frustrating. Now I understand how this choice of
design eases the learning curve, but I'd much rather jump over a few hurdles
than run straightaway into a wall.

This is why programs like vim and emacs are still relevant decades after their
invention. They don't insult the user's intelligence, they don't pretend to be
something they're not, and they take full advantage of the platform they're
designed for.

If somebody could create a DAW that adheres to this kind of philosophy, I
don't care if it uses ncurses as an interface, I'd adopt it in a heartbeat.

------
keithpeter
"Let paper work like paper and screens like screens."

I liked that quote. Having said that, I take the point that skeuomorphs are
not exactly destroying people's minds.

Another quote I found extremely interesting was this one, from a professional
designer's blog:

"It shows the care and attention paid to the printing of a photographic image,
but also _shows how the analogue process of printing a photograph shares a lot
with the digital process of adjusting an image in Photoshop._ " [ my
italics ]

See the quote in context at the link below

<http://www.wemadethis.co.uk/blog/2012/01/shaped-by-war/>

Basically, are we reaching the point where metaphors that originally made
software more accessible (Photoshop like a wet darkroom) actually lose their
meaning? My colleagues who teach photography often illustrate aspects of the
photographic printing process using Photoshop (reverse metaphor).

Does anyone have any academic references on the anthropology of interfaces?

------
bornon5
If his argument is valid, there's no logical reason to stop where he does. You
see a continuous progression of weeks instead of discrete months; fine. But
what about days? What's so special about the end of one day and the beginning
of the next? Nothing - it's just an artifact of paper calendars, daily
planners, and so on.

What a calendar should be, if we're going to truly abolish the tragedy of
skeuomorphism, is a smoothly flowing timeline, with you at the front,
diminishing logarithmically into the future. This way, you can clearly see
your appointments. Should make the author happy.

The point is, sometimes we need to artificially break things up into
manageable pieces. We think in milestones - the beginning of a season,
midnight as a landmark showing how late you're staying up. Some skeuomorphic
designs are only "incidentally" skeuomorphic, in that they solve a problem the
right way, and just happen to resemble how people used to solve that problem.

------
nextparadigms
Isn't iOS filled with skeuomorphism, too? I know Matias Duarte said he wanted
to make Android 4.0's design completely digital and without any skeuomorphism.

------
rurounijones
First thing I saw "Retro design is crippling innovation" followed by the left
hand column "Subscribe to Wired Magazine for..."

Magazine, how retro

~~~
jmilloy
No, I don't think that magazines count as retro _design_. The problem isn't
that print magazines still exist; the problem would be if the online
subscription was designed to look like a print magazine (static page sizes,
separate pages that had to be turned, no links, etc).

------
ctdonath
I keep wondering when we'll move away from the "pages" paradigm in e-books. We
should be returning to the scroll paradigm (which is already a familiar motif
in software), not forcing content to fit a physical format which the e-reader
is precisely an attempt to get away from.

Worst case is when trying to highlight, as a single mark (with note attached),
a section of text which spans more than 2 pages (esp. when a "page" is a
large-font phone-screen size): I can't tap-and-drag from one end of the
section to the next because the "page" paradigm intervenes. Web page? Word
processor? No problem, we're used to highlighting via scrolling. E-book?
_scroll_? nope, gotta match that page-flipping animation.

~~~
bwarp
Scrolls do have one disadvantage over pages. It's hard to communicate where
you are in one without some indexing aid. Any scrolling technology seems to be
devoid of that.

I think that's why the page paradigm has persisted. It's all about
familiarity.

------
petercooper
I thought this was going to be about gaming before I read it. Retro designs
have become de rigueur in gaming over the last couple of years and it's
_boosting_ innovation by making it possible for indie developers to compete on
a more level playing field against the big studios (e.g. Minecraft).

This article demonstrates how it's almost the opposite for non-game
interfaces. Instead, it's the plucky upstarts that can take the risk on novel
interfaces since they don't have large numbers of users to annoy (and I bet if
Google tried to get radical with Calendar now, there'd be a real storm of hate
over it).

------
yabai
If nothing else, this article made me consider all of the 'retro' ideas I live
with every day. It is amazing how much things advance while staying the same.

What other 'retro' tech hasn't changed much? The automobile?

------
jmilloy
Often, I'm just as frustrated about the opposite problem... technology that is
_less_ functional than what it is replacing because it _doesn't_ replicate
important features in a natural way. For example, I feel like I can organize
my physical music collection on shelves in my house more quickly, naturally,
and helpfully than I can my digital music collection in any media player/music
manager I've tried. Shouldn't our technology be _at least as good as_ its
counterpart?

~~~
andybak
So how do you search or sort by track name in your physical music collection?

~~~
jmilloy
You're right, I can't, and that's a useful feature. However, I don't find that
I need to very often. In the interest and excitement of offering new (useful)
features such as track name searching, I feel that these programs overlook
many _more_ useful features. (This is besides the point, but... for example, I
can organize albums together on a shelf in my house. That's how I like to
decide what to listen to. Why do digital music libraries fail to offer such a
tool?)

------
aridiculous
The author is off-target.

Apple went over the top with the new iCal GUI, but it wasn't an interaction
problem. Users have mental models from cultural experience that dictate how
they want to use things. In other words, "that software feels intuitive."

The issue I suspect the author actually has with the new iCal is that it goes
over the top visually. Software doesn't have to mimic the gaudy ornamentation
of a real world object to resemble it.

------
bradt
Couldn't agree more that skeuomorphs are awful. Probably my fav example of this
is "sticky notes" apps. So f'in useless! What are your favs?

~~~
natesm
iBooks. It shows a bunch of pages on each side, regardless of whether or not
you're on the first page, actually in the middle, or on the last page.

I wish that there was an OS with the core of iOS and the appearance of WP7[1].

1. Except Helvetica.

------
zmonkeyz
One nice thing that came about with the KIN (yes as in Microsoft) was the KIN
studio. It had sort of a timeline thing going with all of the pictures and
video you uploaded grouped by days. A lot of people complained it had no
calendar but I thought it would have been kind of cool to put the calendar
functionality with the studio.

------
Rygu
Regarding Google Calendar, there's a "7 days" view that shows today+6 days.

Personally I prefer the regular week view.

------
jasonjei
Speaking of a good calendar UI widget, I really like how ITA Software OnTheFly
iOS app does it... Instead of showing you one month at a time it shows you the
current week - 2 weeks, current week + 2 weeks. Makes it really utilitarian. I
wish more apps would adopt that format, making scheduling much easier.

------
Too
HEAVILY inspired by:

<http://www.marco.org/2010/03/28/more-ideas-than-time-logarithmic-calendar-view>

------
yock
So... this article about how a design philosophy is wrong contains exactly how
many direct comparisons between flawed designs and their alternatives?

------
mbesto
This article's premise is weird on so many counts.

It's "crippling innovation"...what!? Retro design isn't an institution, or a
person, or even a "thing". It's a concept that designers take in order to
progress towards a more usable interface.

You know what "cripples" innovation - massive design changes that happen in
extended batch schedules. Technically this doesn't cripple innovation, it
simply decreases innovative adoption and user acceptance.

------
pnathan
What's wrong with skeuomorphs?

That needs to be answered first before assuming crippling of innovation.

