
Computing’s Fundamental Principle of No Learning - mpweiher
http://www.sicpers.info/2017/12/computings-fundamental-principle-of-no-learning/
======
Chaebixi
Related: [https://99percentinvisible.org/episode/of-mice-and-
men/](https://99percentinvisible.org/episode/of-mice-and-men/)

It's a podcast from the excellent 99% Invisible that talks about Douglas
Engelbart's (of Mother of all Demos fame) vision for computing that expanded
human capabilities through systems that required mastery, and how that vision
has been replaced by things like this "principle of no learning."

~~~
seanmcdirmid
Is there a more specific quote by Engelbart? I don't remember reading that in
his 1962 paper, but maybe I need to go back and read it again.

~~~
Chaebixi
I didn't quote him. My comment was just a summary of the podcast synthesized
with this post (since it coined a more compact term for the concept).

~~~
seanmcdirmid
Ah, too bad, that would be an awesome thing to put in a paper!

------
nerdponx
The problem is not that apps are usable without learning. The problem is that
the "power user" features are removed instead of just hidden.

It is a great achievement that apps have sane defaults and dead-simple
workflows for those who do not care to learn anything more complicated. It is
a damn shame that those who want or need more power simply do not have access
to it. More applications need an "expert mode".

------
seanmcdirmid
Programming by demonstration can solve many of these problems by substituting
real, demonstrated actions for abstract specification. For example, one could
drag and drop some item, capture that action from a log, and then repeat that
action in a script.
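The capture-and-replay workflow described above can be sketched in a few lines. This is a minimal illustration, not any real application's API: the names here (`ActionLog`, `move_item`, the `items` dict) are all hypothetical stand-ins for a GUI's command layer.

```python
# A minimal sketch of programming by demonstration via action logging:
# each GUI action is recorded as a replayable command, so a captured
# log can later be parameterized and re-run as a script.

class ActionLog:
    """Records each action as a (function, kwargs) pair for later replay."""

    def __init__(self):
        self.entries = []

    def record(self, func, **kwargs):
        self.entries.append((func, kwargs))
        func(**kwargs)  # perform the action immediately, as the GUI would

    def replay(self, **overrides):
        """Re-run the captured actions, optionally overriding parameters."""
        for func, kwargs in self.entries:
            func(**{**kwargs, **overrides})


items = {}  # toy application state: item name -> position

def move_item(name, x, y):
    items[name] = (x, y)


log = ActionLog()
log.record(move_item, name="doc.txt", x=10, y=20)  # a "drag and drop"
log.replay(name="notes.txt")                       # repeat it on another item
```

The key idea is that every interactive action already passes through a scriptable command, so turning a demonstration into automation is just copying the log and changing its parameters.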

~~~
mwkaufma
This! This! This! This is the workflow in the 3D modelling/animating program
Maya -- every UI action is backed by a MEL (Maya Embedded Language) command,
and is logged (it reads like a shell script). It's very easy to perform a
task, copy-paste the log, parameterize it, and add a new shelf or inspector
for the new compound task. I've seen technical artists without programming
backgrounds adapt to this very quickly, and create amazing extensions to the
base application. I wish every program worked this way.

~~~
satori99
Blender does the same thing. Every executed command gets added to the info log
as a regular Python snippet that can be copy/pasted into a script or the Python
console if/when you need to automate something that you have just done via the
GUI.

------
munchbunny
I'm not sure I understand what is meant here by the "principle of no
learning." Is it a rule that an interface should not require training/learning
in order to use?

------
brudgers
The People, Places, Things metaphor was evident in the early versions of
Microsoft's "Metro" interface on WP7. The form it took was "Hubs". It didn't
win. Maybe because it was never practical to build complex workflows on a
phone...or maybe because building complex workflows is complex and therefore
statistically unlikely to be preferred _in any context_ to brute force most of
the time.

~~~
goatlover
On the desktop, Lotus Notes was a popular complex workflow builder for a time
in the 80s and 90s.

~~~
bitwize
And people fucking hated it.

~~~
goatlover
I thought it was great for building custom workflows, and I don't know that
anything has replaced it since. People seem to hate Lotus Mail in favor of
Outlook, for some reason.

I always thought Microsoft was way behind Lotus in the workflow department,
but people were used to MS products. Maybe it got worse over time under IBM's
mismanagement; I haven't used it in 15 years.

I recently had to cobble together a workflow in Drupal, and it was terrible in
comparison to what could easily be done in the late 90s with Notes (which did
have a web front end by then).

------
vog
From the article:

> The Taligent frameworks look [...] like fairly standard 1990s OOP-in-C++ which almost certainly makes them less fun to use than modern Qt.

What exactly does the author mean by that? In what way does Qt deviate (for
the better) from classic OOP?

------
bitwize
The author of this article is obviously a programmer who spends much of his
time engaged in programmer status games showing off how smart he is. What he
_doesn't_ understand is what it's like for the people who do the vast, vast
bulk of computing: ordinary working-class folks found in the cubicle farms and
factory floors of the world. Their jobs are simple: enter new purchase orders
or inventory shipments, type up letters, etc. They don't _need_ a powerful
tool, and if they have to do much learning then they can't be as productive as
quickly. Apple and Microsoft are building software for this crowd because it
is -- by far -- the biggest potential market for computers, and their software
has vastly expanded the set of people who might become computer users -- or
even programmers. Easy-to-use software has benefits for programmers and other
knowledge workers too: less of a cognitive burden means more of your brain can
focus on performing the task instead of wrestling with the tool. This is why
Emacs is doomed to failure: IDEs are simply far better at onboarding people
into programming, and those people tend to stick with what they know rather
than struggle with an ancient, confusing UI. (Vim appears to have been spared
for hipster-cred reasons.)

~~~
gumby
The point of powerful tools and metaphors is not snobbishness, it's so these
activities can be automated and made easier, eliminating drudgery and reducing
mistakes.

Sometimes that process can be done by the people on the factory floor (through
Excel of course, and, _pace_ your own comment, admins famously writing their
own macros in Emacs), but mostly it's via other folks being able to write
stuff for them.

And the tragedy produced by this "no learning" premise is that most people who
operate computers spend their time imperfectly translating and transferring
from one silo to another.

