

The iPad; Crazy like a fox. - etherael
http://skepticalphoenix.blogspot.com/2010/01/ipad-crazy-like-fox.html

======
karzeem
I like the author's honesty; he seems to hate decision-free computing but
acknowledges that it's what most people prefer.

If his main objection is that (like everyone's saying) abstractions are sad
because they make tinkering harder, then I'm with him. But the question is
whether abstractions and tinkering are mutually exclusive. The examples I've
seen seem to suggest that they are. But it would be a shame if that were true,
because I think both are very important.

~~~
etherael
Spot on. And the question you have is quite similar to the question I have,
but I'd phrase it differently:

Are low cognitive friction abstractions implicitly the exclusive domain of a
locked down ecosystem? Or is it possible to have one's cake and eat it too?

I think it's very clear that it's much _easier_ in a locked down ecosystem,
I'm just not certain beyond that.

~~~
wtallis
It feels to me that the problem isn't so much the total cognitive
load/friction of a user interface, but the kind of thinking an interface
demands. In order for an interface composed of high-level abstractions to be
something that we can tinker with, it has to be a system that follows logical
rules that can be easily inferred, and those rules have to be Turing-complete.
Once you have formed an accurate mental model of the software, you can use it
very efficiently with a low cognitive load.

But ordinary, everyday people don't want to infer how a system works. They
don't want to think scientifically, and they won't invest any effort in
building a mental model of how the device really works. Instead, they want the
device to conform to the mental model they cobble together on the fly -
logical consistency be damned. They have no curiosity to learn how to fully
use the tool (or else what little curiosity they have is overwhelmed by the
frustrations of the software violating their expectations). If they did, they
would be programmers, because anybody who understands how computers work will
end up using that knowledge to make them do something new. (The power and
possibilities of a programmable machine are irresistible to anybody who
actually comprehends it, but first you have to overcome the initial task of
understanding how the machine computes. And to continue as a programmer, you
have to tolerate the frustrations of the leaky abstractions.)

If you want a user interface that can be used without any scientific thinking,
you have to restrict it to a finite set of capabilities so that you can
anticipate what the user will try to do. But at that point it is no longer
really a computer: it's a shiny toolbox. You can't make new tools with it; you
have to buy them at the App Store.

The problem of users being unwilling to think analytically isn't one that
Apple can overcome. I doubt that any for-profit corporation can do anything
about it, because it is a broad cultural and generational problem. Computers
haven't been around long enough for us to know how well the general population
will eventually adapt to using them, but it's a sure thing that without true
AI, people have to adapt their ways of thinking in order to use the full power
of computers. Until then, people will keep wondering why we programmers can't
manage to give their Photoshop an "enhance" button that works like the ones
they see on TV.

~~~
bad_user
> _But ordinary, everyday people don't want to infer how a system works. They
> don't want to think scientifically, and they won't invest any effort in
> building a mental model of how the device really works._

That's not true ... "everyday people" (whatever that means) always have a
mental model of how something works (be it a computer, a car or a washing
machine) ... that's how humans think; the problem is that their mental model
is usually different from reality.

And you can almost always put the blame for this on the interface or on the
education given to them.

It's like you're saying I should know how the internals of my car work (lots
of science in there too). Well I don't ... all I know is how to drive it, and
I've had problems because of that ... but it's not like I care. If it ends up
bugging me so much, I'll search for a smarter car.

Regular users are lazy indeed ... some of them just don't get it, others are
doing just fine for their day-to-day needs. Those that are doing fine, like my
wife, tend to have a good teacher nearby ... so the only meaningful way of
countering their laziness is through proper education.

~~~
wtallis
I know that people always have a mental model of how something works. But it's
rare that somebody actually stops to consider whether their mental model is
accurate or logically consistent, and unless they do, it is neither. If your
mental model of a piece of software is self-contradicting, then it obviously
can't reliably tell you how to use the software.

There's a lot you can do in designing a user interface to try to make sure
that the user develops an accurate mental model, but it gets harder as the
software gets more complex and more powerful. A car's controls are a pretty
good abstraction of its drivetrain, but most of the complexity of a car
exists for the sake of efficiency and increased power, and it's a physical,
observable system anyway.

I think that once a piece of software becomes Turing-complete, it is
impossible for user interface affordances to lead a lazy user toward an
accurate mental model. After all, to program, you have to be developing a
mental model of what you're creating, _within the context of the programming
environment_. That's two mental models that both have to be accurate, and one
of those can't be anticipated at all by the user interface designer.

Programming is inherently harder than almost any other thing we do with
computers. If a piece of software is Turing complete (and thus subject to
tinkering), then the mindlessly-easy-to-use subset of its capabilities can't
be Turing complete.

------
Tichy
Most iPhone users who are also blogging are tech freaks, who suck up every
word Apple lets out. So they know every trick for using the iPhone.

Recently a non-geek friend of mine bought an iPhone. She had already had it
for over a week when I met her and finally showed her how to zoom...

Not saying the iPhone interface isn't great, but it is still not problem free.

Also, doing specific things might require using iTunes (to buy the required
apps), and the iTunes interface is definitely not great.

