I also sell software to customers with substantially the same technical aptitude as Andy's. My comments:
1) Copy/paste: they can't reliably copy/paste. Of those customers who can copy/paste, a number of them know exactly one way to copy/paste, and will fail if it does not work in the context they need it in. (i.e. they either know what an MS Word copy/paste button combo looks like, OR they know how to right click, OR they know the keyboard shortcut -- and it is, far and away, the keyboard shortcut which is the most widely supported and least widely understood option)
Andy's suggestion is to make it easier to type things out longhand. I suggest making it unnecessary instead -- you can do some client/server trickery to avoid this (discussed here: http://www.resultsjunkies.com/blog/back-office-exposed-bingo... under the heading "When a sale comes in, can you walk us through the process?")
9) My sole point of disagreement with Andy: this type of user really wants to relate to their computer like they relate to a toaster. No one reads a toaster's documentation, and no one should have to. If the UI needs external documentation, that is probably a bug.
Post it anyway? Your articles and style fill me with a mix of joy and inspiration, and I think hundreds or thousands of other people feel the same.
I saw quite a neat trick the other day when I bought EagleFiler. Having already used the demo, the program had registered a URI protocol handler, and included a link in the registration email that looked like:
x-eaglefiler://p?n=<my name>&sn=<my serial number>
Although, the text in the email makes no mention that I could just click on the link to register the program, so the point of the original article still stands...
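The handler on the receiving end mostly just has to pull those two query parameters back out. A rough Python sketch of that parsing step (the `n` and `sn` parameter names come from the example link above; EagleFiler's actual handler is obviously not written in Python):

```python
from urllib.parse import urlparse, parse_qs

def parse_registration_link(link):
    """Extract the name and serial number from a registration URI.

    Assumes a scheme like x-eaglefiler://p?n=<name>&sn=<serial>;
    the parameter names are taken from the example link above.
    """
    parsed = urlparse(link)
    params = parse_qs(parsed.query)  # also decodes %-escapes
    return params["n"][0], params["sn"][0]

name, serial = parse_registration_link("x-eaglefiler://p?n=Jane%20Doe&sn=EF-1234-5678")
print(name, serial)  # prints: Jane Doe EF-1234-5678
```

The nice part of the trick is that all the fiddly decoding happens on the app's side, so the user only ever clicks one link.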
From the other side, I had a boss with a Master's in Computer Science who had lots of trouble with normal desktop usability. He certainly never learned to guess which actions were reversible or not. My sister, who teaches history and is barely technically self-sufficient, could run rings around him on the Windows desktop, not to mention my friend the accountant who has never written a program but could be a passable professional programmer with six months' full-time training.
So instead of saying, "this is what non-technical users don't understand," it's better to say, "these misunderstandings are part of the normal spectrum of user savviness that you will have to deal with." Whether a user is technical or non-technical only allows you to make a rough guess about the mistakes they'll make.
1. Many users don’t understand how or where data is stored or even that it is separate from the application.
That is correct. One of my neighbours, a photographer, keeps his photos 'in' Adobe Bridge; another keeps them 'in' Adobe Elements. They have no notion that those programs only offer a view of the pictures, which are stored in some directory on their hard disk (they don't understand the hierarchical file system either). When they have to import photos from their camera or email a photo, they always do it from their photo cataloging program, never from Windows Explorer.
Likewise, a few of my colleagues think their texts exist solely 'in' Microsoft Word, and that the 'Internet' is the same as the WWW.
One day I was helping a friend of a friend with transferring some texts over to another computer. We were reviewing a text when, by accident, I deleted a paragraph and restored it promptly with Control-Z (undo). The person looked at me flabbergasted: 'How did you do that?'. He told me that often, when some part of his text 'disappeared' magically without apparent reason, he had to type it all over again. He didn't know that Word has an Undo. I told him he could benefit from a basic Windows and Word training, but he wouldn't hear of it. He did remember the Undo trick, however ;-)
I recently helped my aunt install a new scanner. She asked me how to send a document to someone. After exploring the driver software for a while, I advised her to start the scanner wizard/pamphlet printer/photo editor/document manager, click "scan", click "email", and let Outlook take it from there.
Steve Jobs recognizes this, and that's his genius in a way. Instead of creating a widget that does everything, they choose to make a widget that does everything the user wants. There's a subtle but important difference here. Only a small subset of their users want extensive options so that they can put FLAC and other stuff onto it. Those are the users who fret about the product to the nth degree, and Apple knows it doesn't exist to serve them.
So, they pick a feature and implement it in a way that those other users (80%+ of the market) know how to use, and that makes all the difference. Just look at enterprise software and how hideously broken it is. You have SaaS products that try to do everything at once, which directly results in their users doing nothing at all. This causes grumbling against the software, and in the long run your sales go down. The hardest lesson in business, I think, is learning how to bite the bullet. No one wants to admit that their software can't do something. So they rush and put the feature in to satisfy themselves, without checking whether it works or not. Big mistake. What's worse is that it is so ingrained in us that we don't even realize what damage we have done.
What I am trying to say is that it's perfectly fine to not have the insanely great feature X, but it's inexcusable that the user has to read a 300 page manual to use feature X.
This is why I suggest everyone read the Apple Human Interface Guidelines. They aren't perfect, but they've gotten quite a few things right. (see: http://developer.apple.com/mac/library/documentation/UserExp... )
Then some OS X features completely baffle me. Like installing software. I'm not a "Mac guy", and each time I ask someone how to install software downloaded off the web I get a different answer, all preceded by "You just ..." and then 10 or 11 simple steps. Apparently they have never heard of installers. I'd guess from this that the average Mac has never had ANY software installed after it left the factory.
"Whenever I try to run this application, I get the warning 'This was downloaded from the internet', etc., etc. I have to click 'Okay' every time I use the app. It doesn't seem safe."
The problem? They had dragged the app straight from the disk image onto the dock.
That said, OS X installation is about as simple as you can get. (Though I wish Apple would finally include Homebrew as its default package manager.)
There's a perfectly good system wide Installer on the Mac, but nobody uses it unless they have to, because installers suck (they ask, among other things, which drive you'd like to install the software to -- what normal user knows the answer to that question?).
A number of vendors are working to improve application downloads by distributing applications as self-extracting zipfiles that put the original archive in the trash on extraction, and by making it so that if the application is launched from the Downloads folder or the Desktop, it politely asks if you'd like to "install" it. This is pretty foolproof. Even if the user declines to move it, the application will still work from the Downloads directory or the Desktop; it just won't be as easy to find.
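The "launched from a transient location, so offer to move" check those apps perform can be sketched roughly like this (illustrative Python, not how a real Cocoa app would implement it; the folder names are assumptions):

```python
from pathlib import Path

def should_offer_to_move(app_path):
    """Return True if the app appears to be running from a transient
    location (the Downloads folder or the Desktop), in which case the
    polite "move me to Applications?" prompt described above would fire.

    Sketch only: a real Mac app does this via Cocoa APIs, and the
    folder names here are assumptions.
    """
    home = Path.home()
    transient = {home / "Downloads", home / "Desktop"}
    return any(parent in transient for parent in Path(app_path).parents)
```

An app at ~/Downloads/MyApp.app would trigger the offer; one already sitting in /Applications would not.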
It didn't help that PackageMaker used to be one of the oddest, most incomprehensible pieces of software you ever had the misfortune to use. It's a bit better now, but most developers I know have sworn off it for good.
Apple does have a technology called "Internet-Enabled DMGs", which automatically mounts, copies the application, then unmounts and deletes the dmg. It's not widely known about or used.
No, drag-and-drop is NOT unintuitive, and if that was all it took, then great. But Mac guys are so used to the rigmarole that they forget this part: "Oh yeah, then mount the dmg, then open the virtual drive, then drag one of the various icons (which one???) to your Applications folder -- oh yeah, open that first in the Finder -- then close the virtual drive and unmount the dmg, then find the downloaded wad (what was it called again?) and delete it." All of those operations are accessed from different menus/tools.
First, Safari used to download things to the desktop. This meant the DMG file was sitting right there, looking ugly, taking up space.
Second, Safari used to, by default, open "Safe" attachments (which included disk images), and open the Finder window to the newly mounted image automatically.
The upshot was that, assuming the developer had made an effort to indicate what to drag where with a big honkin' arrow, after your download was done there would be a pretty clear sign of what to drag where to install the app. Unmounting and deleting the DMG has always been an issue (and the internet-enabled DMGs are better for this), although only the unmounting bit is unique to OS X. Under Windows you're still going to have an installer file lying about that has to be deleted.
Both of these things went away, though, and many developers haven't caught up with the times.
Actually, I get the impression that most Mac users have laptops.
Meanwhile, junctions on Windows largely solve the installation location problem, except for those apps that use hard links behind the scenes (especially for transactional installation changes) and expect those hard links to work, especially across temporary directories etc. In practice, I've found this is only Microsoft apps.
Given that, I agree with you in thinking that the majority of Macs haven't had any applications installed on them since they left the factory. Why? Macs come with a much wider range of applications than Windows boxes. As long as it's not common for consumer PCs to ship with Office and Firefox pre-installed, there'll be more software installation on Windows than on Macs.
I'm curious - coming from a Windows background, how did you know to do this on a Mac?
It sounds so simple when you describe it, but I think it's one of the last things a power Windows user (who knows about extracting zips, running installers, exes and dlls, etc) would think to do -- since dragging and dropping a single icon would work for a very small percentage of Windows apps.
How could any user be confused by that? (Also, Safari is by default configured to open DMGs automatically, so all the average user needs to do is click the download link and follow the arrows).
Edit: Adium's (http://cl.ly/5b23589883bbc4467a04) is probably an even better example.
Because it looks more like a static image than something you can interact with?
Not that I have the problem now, but I remember the first app I downloaded on OS X was Quicksilver, which, at the time, had a DMG that opened to a grey background and a giant Quicksilver logo. My Windows/Linux brain assumed the DMG was empty due to a corrupt download or something, and I ended up downloading it a second time. Of course, it turned out the logo WAS the app folder.
There is even a line of text that explains how to interact with the two objects presented, if you aren't already familiar with the paradigm.
Specifically, it doesn't do anything more to imply that there are two objects presented, like saying "drag the (image|icon|folder) below to (your|the) Applications folder to install."
It seems to me that while it might be a great way to install something - incredibly simple - years of Windows use would completely program someone to not consider something so simple.
Or better, something akin to Zero Install (http://0install.net/) coated with OS X system sugar. That would also resolve the problem of software updates, which are a pain on OS X unless the program handles them by itself.
I can recognize in him pretty much all the behaviours described in this article, especially the fear of setting everything on fire with a click, but I've also noticed that he tends to use the machine with a weird sequential approach.
Click the fox icon, go to thataddress.com, click there, type this, click yes etc.
For some complicated tasks (such as burning a DVD) he's got all the buttons and actions he needs to perform written down on a piece of paper.
The amount of struggle and effort he puts in it always fascinates me.
As far as they're concerned, each step in their "get my email" checklist is equally important and equally arbitrary. You could literally tell them that, after clicking on the fox icon and going to thataddress.com, the next step is to spin around counter-clockwise three times in their desk chair while singing "God Save the Queen", and they'd believe you- and then, one day when they only spun around twice in their chair, they'd be calling you in a panic to ask if that's why their "internet is broken".
This isn't a matter of stupid vs. smart, or anything like that- it's a matter of learning. Getting past that first step is hard for a lot of people, especially with computers... and, while I'm on my soap box, I gotta say that I'm not convinced that GUIs do a good job of making it any easier (although I certainly can't think of anything that does a better job). GUIs make it really easy to acquire "knowledge" about how to use a program, but they can impair their users' abilities to get past that first step and really understand what's going on.
I think that this is because GUIs sort of imbue the computing experience with a sense of, for lack of a better word, "false concreteness"- they make what are, in reality, highly abstract tasks appear to be very concrete, and let novice users get away with treating them as such... until, of course, something breaks or changes, at which point the user panics: since they don't have any sense of the larger context surrounding their use of the computer (i.e., why they need to click on one button as opposed to another, what the address in the address bar really means, etc.), they have no way of telling what "broke", or whether the changes they're seeing are important or not, and they don't have the mental tools they need to reason effectively about ways to get around whatever problem they've encountered.
Neal Stephenson touches on this subject a little bit in his essay, "In the Beginning, There Was the Command Line." It's more than a little dated, but I actually re-read parts of it a week ago and I'm happy to report that it's held up better than a lot of things that were written in 1999 about computers. Not surprisingly, the things that have held up the best are the parts that aren't tied to any specific piece of technology, but rather are about the abstract concepts and theories underpinning modern (i.e., post-1984) UI design.
Anybody who's tried to teach somebody how to program will tell you that, while students have trouble with syntax and what-not, their bigger problems are almost always related to learning to think about problems in an abstract and generalized way.
Here, I'm not necessarily talking about understanding the algorithms and data structures behind the program- I don't think it's particularly important for most users of most programs to understand the software at that level, although there are some cases where it might be desirable (certain medical or industrial applications, for example). What I'm referring to is the abstract understanding of how the different parts of the user interface fit together with one another and with the task that the program is trying to accomplish for the user- sort of like what Joel Spolsky's talking about in his "Figuring Out What They Expected" essay, but at an even more basic level.
Of course, good luck competing in a market where you're the only guy trying to (gasp) make the user understand what's going on. I don't even know if I expect the problem to get worse (as interfaces become more and more abstracted) or better (as users increasingly grow up with computers in their lives) as time goes on.
How does my car work? I press on the gas and it goes. I press on the brakes and it stops. Drive shaft? Carburetor? What are those things?
That users can't use a computer is usually, not always, the fault of developers and designers.
For example, if backing up is so crucial, why don't computers come with that functionality built in?
Why _is_ data storage separate from the application? In the iPad, it's not. And that's a brilliant improvement in computing!
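On the backup point: the core of "backup built in" really is small. A bare-bones sketch of the idea (hypothetical function and paths, not any actual OS feature; real backup tools add incremental copies, scheduling, and retention):

```python
import shutil
import time
from pathlib import Path

def snapshot(source_dir, backup_root):
    """Copy source_dir into a timestamped folder under backup_root.

    Minimal illustration of what a built-in backup boils down to:
    every run produces a dated, restorable copy the user never has
    to think about.
    """
    stamp = time.strftime("%Y%m%d-%H%M%S")
    dest = Path(backup_root) / stamp
    shutil.copytree(source_dir, dest)  # fails loudly if dest exists
    return dest
```

That something this small still isn't automatic and invisible on most systems is exactly the complaint above.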
You don't need to have a correct mental model of how it works, just a mental model that given the most common input, predicts the output.
From your comment, I can see you have a mental model of how the car works. You press on the gas, and expect the car to go faster. You press on the brakes, and expect the car to slow down until it eventually stops. You turn the steering wheel to the right and you know that the front wheels point rightwards, making the car go in that direction. This mental model allows you to predict the outcome of pressing the gas and the brakes at the same time, and therefore know that you should not do it.
You do not need to know the inner workings of the car. You might as well think there are midgets under the hood doing all the work.
The problem is that the mental model that most people form about computers is so wrong, that it doesn't predict anything. So they can't use a computer properly.
And this is mostly the fault of interface "designers". An interface to _anything_ should allow the user to form a mental model that predicts the outcome of the operations the user will need to perform. This does not mean exposing the inner workings of the machine, but it also means not over-simplifying or using metaphors excessively. Like I said, the mental model does not need to be accurate; it just needs to help users do whatever they need to do.
Why do cars need to be refilled with gas? Why do their engines require regular oil changes? Why do their doors and ignition devices have keyed locks? Why do you need to learn to drive it, coordinating your arms, legs, eyes, and ears? Why do you need to learn the legal and social rules of traffic?
There's essential complexity here. The "mental model of how a computer works" is learning the UI paradigm (rules of the road), not busses and syscalls (drive shafts and carburetors).
The iPad doesn't have a 35-year legacy of physical removable media for user data. It also requires syncing to iTunes as the sole means of initialization and backup.
And here lies the challenge of designing computers that are useful for people without the mental model, and efficient for people with it. Gadgets tend to do one or the other, but accommodating both is a good bit harder.
I use this approach all the time for bureaucratic "machines". It's so much more efficient than trying to actually figure out how the bureaucracy works.
Sure, it's a lot more difficult due to the non-deterministic nature of most social systems, but even a loose, stochastic model allows you to "hack" organizations in a way that is likely to make your endeavors within an organization that much more effective.
Interestingly, the fuzzier nature of social systems gives it a kind of "undo" function - you can, in most circumstances, make up for mistakes and misjudgements - so the cost of experimentation is more similar to software than to, say, structural engineering.
I am increasingly of the opinion that the single most valuable asset to any consumer software company is at least one "old grey". The more time I spend with ordinary end-users, the more I realise that I cannot even begin to comprehend their mental processes. They are a total black box to me, with a completely different set of instincts and intuitive responses. I think that the majority of software developers, even professional UX folks, massively misunderstand their users.
As STBs have got more and more complex, though, taking on major media centre capabilities and interfacing to Internet services such as YouTube, Netflix and Facebook, we are really struggling to keep the interface simple. But at least we are aware that this is a critical aspect of our software -- computer app developers can sometimes lose sight of that.
1) Put a dialog popup in your MS desktop app that says, "Clicking OK below will destroy the chair beneath you. Clicking Cancel will make a box of chocolates appear on your desk."
2) Observe when a user performs a task that produces the pop up.
3) Ask the user if they were disappointed that no chocolate appeared.
The user will look at you with incomprehension.
To clarify, I am not criticizing users. Users are users, we're not going to change their behavior with training or help manuals. We need to design around them.
Users don't have an a priori instinct to dismiss dialog boxes out of spite — on the contrary, the elderly noob reads every one carefully as if the wrong choice would actually destroy the chair beneath them. A new dialog box like that would ruin their day, and mine when I get their panicked phone call now that they've gone off script. If they ever do build a mental model of the UI, they'll have learned otherwise.
That network/DNS/router/etc issues aren't your fault and you can't fix them.
Everything regarding passwords and account security.
URLs, esp. typos, and that The Google (esp. its search bar) != the Internet.
Phishing, and a slew of other attacks. Also that punching the monkey is probably a bad idea.
Browsers, different browser versions, different browser manufacturers, cross-browser incompatibility. Also that you did not create the browser and you can't fix it. Also why they have to download and install a new version of the browser to use your software.
Why your webapp quits working when they aren't connected to The Google.
Related to number 4, English expressions can be as bad as technical jargon. There's no reason to call a file 'file' if the localised 'Datei' or 'Fichier' is better understood.
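A toy sketch of the point: the label shown should come from the user's locale rather than being hard-coded English jargon. (Real applications would use gettext or the platform's localisation APIs; this table is purely illustrative.)

```python
# Map locale codes to the user's own word for the concept, instead of
# forcing the English term on everyone.
FILE_LABEL = {
    "en": "File",
    "de": "Datei",    # the localised terms mentioned above
    "fr": "Fichier",
}

def label_for(locale):
    # Fall back to English when no translation is available.
    return FILE_LABEL.get(locale, FILE_LABEL["en"])
```

So a German Windows user sees 'Datei' in the menu and never has to learn what the English word 'file' means.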
This is the fundamental difference between a "techie" and a "non-techie". In fact, the inclination to explore, to fiddle, to hack is largely how someone gains the knowledge, experience, and craft to become a "techie".
I've noticed that my own depth of comprehension is far better when I explore and experiment first, and not study the formalized knowledge from a textbook/documentation until I'm satisfied that I have at least grasped the fundamental patterns on my own. This works not just in computing, but in almost every context (or at least those that don't have the potential to cause serious damage or injury).
Really!? I've never, ever observed this.
In the words of Zoolander, the files are IN the computer.
Back when I had downloadable software prominently available I'd get this about, oh, five times a month and twenty times right around the start of the school year.
I think my brother-in-law was able to get a hold of the old machine before it got re-installed.
Not really sure what software would be though.
Analogies are useful when they allow you to not think about something low-level so you can work on something high-level. If I had to think about electrons every time I sat down to write code, I'd never get anything done. If I had to learn it before I started learning to write code, I'd still be in school.
You sort of lost me there with the electrons thing. Perhaps you really mean the abstractions that enable you not to have to think about electrons while writing code?
It doesn’t mean that your users are stupid, just that they haven’t spent the thousands of hours in front of a computer that you have.
Like office workers spending 8 hours a day in front of a computer? They easily rack up that time at work alone.. not that it helps.
..they don’t know how to (or even that they can) copy and paste text.
Some users don't. But what should I do, send a copy of Computing 101 with every email I send?
Many users are used to web applications and don’t understand that they need to download and install new versions of desktop software to get access to the features in a new version.
So, what's the maintenance staff there for? In my experience regular office users don't set up or update anything, ever!
File system hierarchy and network mount points are so far off the average user horizon that you shouldn't bother anyway. You just point to it in the file explorer. Everything else is completely pointless.
As for filetypes and converting: ok, that's a valid point. It's a good idea to send instructions along, as it is a somewhat uncommon task.
The jargon you use
I agree. Speak the user's language, not computer slang.
You should therefore never put something only in a right click menu or anywhere else that it can’t easily be discovered.
Please, I beg you. Put stuff where you would normally expect it from your WM / OS. If it would be in the context menu, then put it there. Normally you also put stuff in the menu bar / ribbon / icon bar depending on conventions. Don't try to invent new interface guidelines, you'll probably fail.. horribly.
Try not to expose them or avoid it entirely (rework the workflow if necessary). Otherwise build the interface in a way that clearly notifies the user of this circumstance.
Non-technical users aren’t so confident and won’t try things in the same way. In fact some of them seem to think that a wrong move could cause the computer to burst into flames.
Most of them don't try anything at all. They will ask you how to do it. If you have some kind of system that has tweaks: train your users (that's what tech training is for).
So try to stick to conventions they will understand
The need for backups
I preach this all the time. People also always agree with me in this respect. It doesn't help; they never actually do it, much less spend money on it. Might as well talk to the wall..
That they should read the documentation
I totally agree. No one will read your precious documentation. If you roll out a new software, train them with the workflow. You do train your workforce to use the tools they use, right?
Unskilled users often don’t realize how unskilled they are. Consequently they may blame your software for problems that are of their own making.
May? They always do. I figure this must be human nature or something.
One just has to be as polite as possible in such cases.
In other words, stay professional.
Otherwise, I found the article as entertaining as irksome. It raises some good points.