
The Anti-Mac User Interface - uros643
http://www.useit.com/papers/anti-mac.html
======
waxman
Summary:

In 1996 some academics tried to re-imagine human-computer interaction, and
they used the then-current Mac OS as a starting point (FYI: it was version 7.5).

How Their Predictions Panned Out:

First off, they got some relatively obvious things right: they correctly saw
that computers of the future would be hyper-connected instead of isolated (but
in 1996 this wasn't exactly a bold prediction), and that they would have
hardware orders of magnitude more powerful.

Somewhat more insightfully they predicted that the purpose of computers would
shift from mainly solo productivity-type work to games, multimedia, creative,
social, etc. They also predicted the rise of alternative forms of I/O, and I
think the I/O of Apple's iOS, for instance, is in many ways consistent with
what they envisioned.

A few of their guesses have not yet materialized, though. They emphasized the
use of "language" over point-and-click icons. Unfortunately, NLP and AI have
proven harder than they anticipated. If we again look at iOS as an example,
touch icons, it seems, are still much more useful than natural language.

The article provides a fascinating time capsule of HCI thinking at the dawn of
the Internet.

Incidentally, just weeks after this article was published Steve Jobs returned
to Apple and sparked the new generation of interfaces that they could only
imagine.

~~~
lwhi
" _Incidentally, just weeks after this article was published Steve Jobs
returned to Apple and sparked the new generation of interfaces that they could
only imagine._ "

In what way was the newer generation of interfaces different from what they'd
already described (as the 'Mac interface')?

\--

I think we're still using very WIMP-oriented interfaces .. most interfaces
haven't changed that much. Even iOS / Android use central themes from the old
desktop metaphor.

~~~
Samuel_Michon
Apple's iOS doesn't use windows; all apps are full-screen. iOS doesn't have a
menu bar, only an info bar (time, power, cell reception, etc.). It doesn't even
use a pointing device, and there's no onscreen pointer -- you can use your
finger to 'click' (touch) and drag, but you can't point.

Window, Icon, Menu, Pointing device. Only the icons survived.

There's no desktop metaphor either in iOS: no recycle bin, no central file
cabinet to browse through, no desktop to arbitrarily place things on, no way
to make file folders.

~~~
lwhi
* (W) You are presented with modal dialogs (a form of window) at various points in many iOS interfaces - ditto for Android.

* (I) The main way of representing an application is via an icon.

* (M) There are definitely menus in Android. I'd argue that there are even menus in iOS - lists of options that allow you to select which operation you'd like to perform (e.g. the mail settings menu).

* (P) You can point - you use your finger. You can't hover.

It's not really that ground-breaking - it's a relatively small iteration.

~~~
Samuel_Michon
When using Mac OS X, no matter what you do, there's always a menu bar at the
top of the screen. Until recently, all Mac OS X apps had window chrome, even
when the application window was maximized. iOS doesn't have any of that --
there are specific situations in which you are presented with a modal dialog
or a menu, but they aren't GUI elements that are always onscreen.

Then there's the pointing. In iOS, you can't just point -- to touch is to
point+click.

~~~
lwhi
I don't understand why these distinctions make any difference. Fundamentally,
I think the core elements are very similar - the differences are necessitated
by the constraints of mobile devices. Perhaps the implementation differs, but
I don't see how this makes my point less true.

Are you stating that we actually are making use of interfaces which are
groundbreaking (in a similar way to those presented in the article)?

~~~
Samuel_Michon
I doubt we'll see another leap as huge as from CLI to GUI. But as incremental
improvements in user interfaces go, I find iOS (and multitouch control in
general) pretty significant.

~~~
cturner

        I doubt we'll see another leap as huge as from CLI to GUI
    

Unless you have a particular reason to believe that the GUI is the end of
evolution, there's room for massive change.

Circumstances suggest that WIMP is not the last word on user interfaces. Some
factors to think about:

* The GUI has grown up during an era of groups fighting to establish platforms, with constant newcomers to the scene, which put a heavy emphasis on creating systems that are as newcomer-friendly as possible. Those circumstances will change.

* As the scale of data increases, user interface needs change. Consider the progression from flat file structure, to directories on a single machine to searching through large datasets, and the different tooling we use. I deal with large numbers of small files, and have built custom tools to allow myself not to drown in scale. GUI tools and CLI directory systems are inadequate.

* A lot of GUI development has been driven by the need to make glossy things that sell, rather than things that are useful. Look at the changes to Windows since XP. As free software eats away at territory that is currently commercially viable, this will change. We won't have companies pushing the gaudy bells and whistles that have shifted platforms over the last fifteen years. Stability increases in value.

* Plenty of power users have never been happy with the WIMP approach, and consider it a leap backwards from the CLI. 1980s word processing users, unix users, people who knew menu-driven mainframe systems before being "upgraded" to less-useful GUI replacements. There's strong interface loyalty to the Bloomberg interface from people who have also dealt with GUI and CLI tools, despite significant problems in that system.

It's unusual to know about groundbreaking things before they happen. I expect
our grandchildren will look back at screenshots of today's candy interfaces
and ask how we took it seriously.

------
Tichy
Upvoted in the hopes that somebody will provide a summary...

~~~
lwhi
TL;DR - the user interface paradigms that we currently use are still pretty
similar to the interfaces that were developed at Xerox PARC (and adopted by
Apple). Not much has changed.

It doesn't have to be this way.

What are the principles of current interfaces / what could the future
principles be (if we're not stuck with the constraints set by the current
batch of WIMP interfaces)?

It's a 'what if' article.

~~~
kenkam
Makes sense.

If we take anti-mac as the future principles, then many of them have in some
ways become true...

~~~
lwhi
Indeed .. I think that's a good point, but I'd also argue that I'm using a
WIMP interface right now to write this message.

Maybe Chrome OS is more of an 'anti-mac' interface?

Nielsen's initial summary points out that the article forecast the coming of
an 'Internet desktop', but I'm not sure what an 'Internet desktop' is ..

------
joe_the_user
The article is quite vague about what an anti-Mac interface would be, but it
does make some good points about the limits of the mouse-window-and-menu GUI.

Especially: language-like interfaces are better than thing-like interfaces.
Interacting through language is a far preferable approach to physically
manipulating things, especially since most of what an average user wants to
input is discrete, meaningful bits of information (visual and auditory artists
are the exception).

Both Google and Quicksilver are examples of more "language-like" interfaces,
but there's not too much flesh on the bones of the article's "anti-Mac User
Interface".

------
argimenes
This article shares with many other academic exercises a fine analytical
examination of current trends, but it provides a woefully unhelpful basis for
innovation. That sounds merely negative, but I think the problem isn't with
the scholarship of these exercises; it's the purposes they're put to. Plainly
put, academic constraints impose on all but the most fearless a kind of
self-censorship: a desire to present a tidy, well-researched argument that
only with extreme caution ventures out on a limb.

Almost the first thing I do when reading articles like this is scroll to the
end and see if any actual interface designs are offered. There are none. No
criticism is intended; mostly a statement of the obvious!

The frustrating thing is that interface design FEELS like it ought to be the
most natural thing in the world, yet even after the first Macintosh in 1984 we
are STILL steaming down the track of the window paradigm. Over 25 years it's
of course become more 'natural' - which is to simply say more mentally
efficient - but thinking outside the paradigm is kind of like trying to
imagine a third arm; or at least wiggle an eyebrow you've never wiggled ...

------
tuhin
There are two distinct ways to look at it. First, progress for the sake of
progress is not the solution. On the other hand, we have been walking on our
legs since time immemorial. Is that wrong, or does it call for rapid
evolutionary prototyping of humans? Maybe, maybe not?

I remember using one such revolutionary interface recently, "BumpTop
desktop" - and where did that lead us? I was honestly irritated by it. I am of
course talking about the visual interface, not things like remembering the
last actions performed and understanding a series of events. The nearest thing
that comes to my mind is "Clippy" from MS Office. Remember how, if you pasted
something into every slide, it would tell you about the Master Slide view?
That is the direction interfaces need to go.

Also, has anyone used Soulver? As a real calculator metaphor in the UI, it
goes out of the box to match the way people actually want to calculate on a
computer: typing expressions on the keyboard rather than pointing and clicking
1 / + / 2 / =.
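The Soulver-style idea above - type an expression in running text and have the answer appear inline, instead of pressing calculator buttons - can be sketched in a few lines. This is a hypothetical illustration, not Soulver's actual implementation; the `annotate` function and its regex are invented for the example:

```python
# Minimal sketch of a "type, don't click" calculator: each line of text
# is scanned for an arithmetic expression, and the result is appended
# inline rather than entered button-by-button.
import re

# One or more numbers joined by +, -, *, or /
EXPR = re.compile(r"[\d.]+(?:\s*[-+*/]\s*[\d.]+)+")

def annotate(line: str) -> str:
    """Append '= result' to any arithmetic expression found in the line."""
    m = EXPR.search(line)
    if not m:
        return line
    # The matched text contains only digits, dots, and operators,
    # so eval() is acceptable for this sketch.
    return f"{line}  = {eval(m.group(0))}"

print(annotate("rent 450 + 120"))  # rent 450 + 120  = 570
```

Lines with no arithmetic pass through unchanged, which is what makes the "calculator as a sheet of paper" metaphor work.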

------
aufreak3
The yearning for the richness of language in computer interfaces is a meme
that keeps coming back. I wonder whether there is a way to satisfy it. Here's
an idea (around Mac OS X) -

Push the dock over to the side as a column and set aside a fixed text box at
the bottom of the screen for both text input and output. When you do things
using the GUI, the text box continuously updates itself with a textual
description of what you're doing. It also works the other way - if you typed
that textual description in, the same actions would be performed. This might
set up a dialog between the computer and the user, gradually building a
vocabulary for linguistic interaction with your computer. Could this be a way
to leverage the explorability of a GUI to teach a language with which you can,
over time, become a power user?
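The bidirectional GUI/text idea above can be sketched with a small command registry: GUI widgets and typed text both dispatch through the same table, and every action is echoed as its textual form. This is a speculative sketch of the proposal, not any real toolkit's API; the class name `EchoShell` and the "open" command are invented for illustration:

```python
# Sketch of a GUI whose every action has a textual equivalent:
# performing the action echoes the text, and typing the text
# performs the action, through one shared code path.
class EchoShell:
    def __init__(self):
        self.commands = {}  # verb -> handler function
        self.log = []       # textual transcript of everything done

    def register(self, verb, handler):
        self.commands[verb] = handler

    def do_gui(self, verb, *args):
        """Called by GUI widgets: run the action and log its text form."""
        self.log.append(" ".join([verb, *args]))
        return self.commands[verb](*args)

    def do_text(self, line):
        """Called when the user types the same text: same code path."""
        verb, *args = line.split()
        return self.do_gui(verb, *args)

shell = EchoShell()
shell.register("open", lambda name: f"opened {name}")
shell.do_gui("open", "report.txt")      # user double-clicks an icon...
print(shell.log[-1])                    # ...and sees: open report.txt
print(shell.do_text("open notes.txt"))  # prints: opened notes.txt
```

The transcript in `log` is the "gradually built vocabulary": reading it back teaches the user the textual forms of actions they already know how to perform by pointing.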

------
Symmetry
Since the article was written, file search has become much more a part of both
Mac and Windows. And on my Linux laptop I use dmenu to launch most programs
that aren't running in a terminal or important enough to have a keyboard
shortcut.

------
EGreg
In this article they have tried to Think Different for the sake of thinking
different. Not bad. But they basically described Linux, Command-Line
interfaces, and the Web. Their idea of "language" is basically scripting
languages.

------
michaelpinto
It's funny that the example of an anti-Mac interface is a bookcase — but if
you look at the ebooks interface on an iPad, it's pretty much that...

------
Rhapso
The fact that is is only beginning, is the reason I want to go into Human
Computer Interactions and it is the reason I am truly terrified I am going try
and make this magical "anti-WIMP" and then the current paradigm will be too
deeply ingrained to make room. This is what I hope the wearable computing
niche will find symbiosis with.

------
hasenj
Using language instead of icons is only useful for advanced users. I love
gnome-do/Launchy/Quicksilver, but most people don't get them. It would be a
mistake to think of them as a "replacement" for icons. Instead, I think it's
better to think of them as complementary components.

------
AndyKelley
I think they have the "Feedback and Dialog" and "System Handles Details" in
the wrong column. Those should be reversed. Macs love handling all the little
"details" for me without letting me change them, for example the mouse
acceleration curve.

------
nbashaw
My (in-depth) summary:

The authors (in 1996) imagine what an interface would look like that is
inspired by the opposite of all the guiding principles for the Mac GUI.

Mac => Anti-Mac

* Metaphors => Reality

* Direct Manipulation => Delegation

* See and Point => Describe and Command

* Consistency => Diversity

* WYSIWYG => Represent Meaning

* User Control => Shared Control

* Feedback and Dialog => System Handles Details

* Forgiveness => Model User Actions

* Aesthetic Integrity => Graphic Variety

* Modelessness => Richer Cues

The authors then talk about the weaknesses of each of the principles guiding
Mac interfaces.

1) _Metaphors_ \- impose artificial restraints and obscure the true
capabilities of computers.

2) _Direct Manipulation_ \- repetitive work is better handled by batch
processing and simple scripting.

3) _See and Point_ \- language is more expressive.

4) _Consistency_ \- different things should be represented differently, forced
consistency is oversimplification.

5) _WYSIWYG_ \- the authors interpret this as meaning "your document, as it
appears on the screen, accurately reflects what it will look like when it is
printed," and argue that interactivity is better.

6) _User Control_ \- sometimes automation is better, and when there are
multiple actors (as in networked systems, like the internet), control must be
compromised.

7) _Feedback and Dialog_ \- interruptions should only be made when they are
valuable to the user, and over time as he/she gains proficiency they will
matter less and less.

8) _Forgiveness_ \- forgiveness means there should always be an "undo" button
and warning signs, but this can become a nuisance when the warnings are
gratuitous.

9) _Perceived Stability_ \- the real world is not stable because there are
forces beyond our control, and that's what makes life interesting. (This
principle is curiously missing from the table summary in the article).

10) _Aesthetic Integrity_ \- variety is more interesting and expressive than
unity.

11) _Modelessness_ \- this is defined as not having "modes" which restrict the
user's range of actions. The problem is that users can only cope with so much
at once, modes help chunk things up.

\-------------- Next: The Anti-Mac Interface --------------------

This was a pretty time-consuming summary, and I'm kinda wanting to get back to
work. Want me to write a summary for the second part of the article? Use the
upvote as a demand signal. If this gets 20 upvotes I'll summarize the second
part.

------
ddelony
I wonder how close Mac OS X comes to the Anti-Mac interface, given that it
includes a command line.

------
pohl
The result is Microsoft-Bob-esque.

~~~
MortenK
Not at all. The article even makes references to Bob and similar systems, and
explains why they didn't work.

------
mbateman
Appears to be from 1996.

~~~
lwhi
I think the 1990s were an interesting time for the Internet. Lots of crazy
thinking about what it could become - large ideas, less fixation on profit and
commerce. Lots of theory.

~~~
michaelpinto
If you look at the success of Windows 95 in that era (and what failed), it was
about the money as much as the theory. What's interesting is that when the
article was being written, Jobs was working at NeXT, which is still alive
today in your Mac and iPhone.

~~~
lwhi
I agree - commercialism was around .. but not many companies really understood
how to monetise the web in the early / mid nineties. For example, Bill Gates
famously didn't think the Internet would catch on, and assumed that MSN would
be most people's choice of walled-garden alternative.

The WWW was just another application for the Internet (along with Gopher / FTP
/ Usenet / IRC) - and therefore wasn't seen _as_ the Internet.

Because the technology was relatively new, I think people were interested in
taking risks - user expectations hadn't been set in stone, and innovation
could be less about iteration and more about being bold / daring.

