>>The user tests were conducted in a specially constructed room featuring a one-way mirror, so observers could watch the tests without being intrusive. The tests were conducted by a moderator who made sure the user felt comfortable and showed her the basics of using a mouse. Then, with no further instruction, users were asked to perform specific tasks, without help from the moderator, like editing some text and saving it. The moderator encouraged each user to mumble under her breath while doing the tasks, revealing her current thinking as much as possible. Each session was audio or videotaped for later analysis.
Amazing to see the importance they gave to UX/UI and the level of sophistication they had forty years back.
That's still the right way to do UI testing. You bring in people, you give them a list of tasks to do, and you take video of screen and user. Someone then has to watch the video and mark the places where people got stuck, had to back up, or gave up. Then create a highlights reel of every problem seen more than once. The devs have to watch this.
This is generally correct, but it’s often better to go to the user rather than bringing them in. People can have quite different reactions in their own homes, especially if they have specific setup needs, than in a research lab.
True, it's much easier now, and you can get a much wider sample of end-users and a more accurate representation of their behavior in their natural context. It's absolutely worth doing most of your usability testing remotely. But to be honest there's still something powerful about having the development team watch a real person through a one-way mirror that I haven't managed to replicate using video. It reminds me of the difference between live theater and watching a movie at home.
The closest I've come to replicating the impact is to have a conference room where the team sits together and watches the usability test being streamed live.
That description should also supply a warning about a difficulty that UX designers, and Apple in particular, have struggled with over the years: they're going to a great deal of effort to optimise their designs for the first hour of use.
Over-optimizing for the walk-up-and-use experience at the expense of a more realistic subset of usage is sometimes referred to as the "Pepsi Challenge" or "sip test" observational bias. You might even consider it a special case of the "streetlight effect", where people choose their research methods primarily based on how easy it is to collect that kind of data. (When I used to teach UX I had a whole lecture dedicated to alternate approaches to avoid falling into the trap of over-optimizing for novices, things like GOMS/KLM, log analysis, heuristic evaluations, diary studies, cognitive walkthroughs, surveys, etc.)
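To give a flavour of what the GOMS/KLM approach looks like, here's a toy Keystroke-Level Model estimate. This is my own illustrative sketch, using roughly the commonly cited textbook operator times, not anything from the article; the task breakdown is made up:

    # Toy Keystroke-Level Model (KLM) estimate. Operator times (seconds) are
    # roughly the commonly cited textbook values; the task below is hypothetical.
    OPERATORS = {
        "K": 0.28,  # press a key on the keyboard
        "P": 1.10,  # point at a target with the mouse
        "B": 0.10,  # press or release the mouse button
        "H": 0.40,  # home hands between keyboard and mouse
        "M": 1.35,  # mental preparation
    }

    def klm_estimate(sequence: str) -> float:
        """Sum the operator times for a task written as e.g. 'M H P B B'."""
        return sum(OPERATORS[op] for op in sequence.split())

    # Hypothetical task: save a document via the File menu with the mouse.
    # think, home onto the mouse, point at File, click, point at Save, click
    print(round(klm_estimate("M H P B B P B B"), 2))  # ~4.35 s for a practiced user

The point is that you can compare expert task times between two designs without putting a single novice in front of a prototype.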
In 01981 if users didn't like your design in the first hour trying it out in the computer store they wouldn't buy your computer. Now if they don't like your design in the first five minutes of downloading it from the app store they'll uninstall your app. There's a real tradeoff between ease of use and ease of learning, but software merchants are in a real bind here.
Honestly, a leading zero in human-readable text is a non-solution.
Year numbers are variable length decimal strings, not a four or five digit fixed length field. Any historian concerned with 6 AD-999 AD will likely not include a leading zero in any year figures they use. I don't see why the people of the present or the future should do any different.
Machine readable implementations, sure. Have as many digits as you wish. But for human written material, it just looks weird.
Most of us have probably violated all sorts of good UI principles out of either haste or ignorance, but our crimes are mitigated by the fact that for the overwhelming majority of software the number of users, and hence the number of people cursing the idiot who designed this crappy interface, is extremely low and/or constrained by the bespoke nature of the application; hence the investment in time and money to perform this level of rigour is not justified.
The only time I've seen UI work carried out with this level of care was for an air traffic control system, where of course the consequences of a poor design can have significant and fatal consequences.
Don't forget flight simulators, the flip side of the same crucial situation, which get equally elaborate UI/UX design and simulation. And I believe power plants, especially nuclear plants, have this type of elaborate design simulation as well, as does everything in the (outer) space industry.
Equally remarkable is the amount of stuff we take for granted today that the Mac team got right the very first time.
Had it not been for that user feedback, Windows might have copied the Mac and gone with “Do It” as well and we would be living in a slightly more awkward UI timeline.
I always think about this when I look at current Apple's UI. Just a little user feedback would have gone a really long way.
That's still possible today, but nobody can actually be bothered, because there's plenty of other stuff to do as well.
I do recall ~8-10 years ago, we were building an app and did a user test; we had a phone with a webcam rigged to it so we could see from the user's point of view.
But for this you need dedicated (in multiple senses of the word) UX people, and they are both rare, and companies seem to be reluctant to hire them.
Tesler was a computer scientist. He worked on AI and Smalltalk at Xerox, and was a manager at Apple at this time.
What we need to do instead is bring down the wall between design and development that was built up in the past decade, encourage developers in the UI field to have multi-disciplinary skills, and stop assuming every minute not spent coding or in meetings is wasted.
As a UI engineer, I wish it was more common for UI engineers to learn about accessibility and how to contribute to doing UX research, rather than trying to pick up some back-end and calling that "full-stack". There's much more useful cross-pollination to be gained when branching out that way.
That would be the equivalent of a car designer complaining that engineering ruined it. You cannot produce quality work without that collaboration in either direction.
Form over function? Or beautiful but useless (snark intended). The citrus press by Philippe Starck comes to mind. Many bars own it and display it but don't use it because it sucks. That last sentence came from a bartender who has it on the shelf.
They just hadn't figured out that one can get results of the same quality from a usability testing room without a one-way mirror. I don't blame them; the standard literature on how to run tests on the cheap appeared a decade later.
The go-to place is the Nielsen Norman Group¹. You can get any of their books; they will describe how to do tests. It's basically: recruit up to five users from your target audience, ask them to do the tasks you want, observe without saying a word, and never come out with any variation of the "users are stupid" explanation, because even if they are, they are the ones you are creating the software for; fix your software, and repeat.
But if you want details about why those large "record the user" sessions aren't worth it, you'll have to dig into their papers.
I worked on a public facing project with many retiree users. We had a code audit to remove the word 'Invalid' so that users would not think they had been identified as 'a person made weak or disabled by illness or injury'.
Dude. I wrote a piece of software for a company that takes online signups, and then matches them to customers who purchase things in the store. Many years ago they asked me to write a report that output local customers who had signed up online, but never bought anything.
The database table for transactions is called `trans`. So the report I coded was labeled "No_trans_customers".
Last year I get a call from the company CEO -- can we rename this report? Apparently someone saw the name of it and freaked the fuck out.
I'm not aware of any community of disabled people, or any disabled person, who finds the usage of the term "Disable" offensive in any context.
I've never actually heard any controversy over the word "Disabled" EXCEPT in the specific context of it being used to refer to disabled people (person-first vs identity-first language debate, but that's a digression)
There's quite a long history of people arbitrarily deciding they speak on behalf of all disabled people and know that language must change to protect their dignity and feelings. Inclusive language is actually plain language, since that's the easiest for people with certain disabilities to understand, so the inclusive thing to do is to avoid arbitrarily stigmatising words.
I have a disabled son and agree that using the word "disabled" or "disable" isn't an issue. Please DON'T use the word retarded (unless you're talking about the timing of the spark plugs in your car). This word originally had a clinical connotation which was largely synonymous with "disabled", but it is now offensive because people use it to disparage an able-bodied, able-minded person who's acting stupidly by comparing them to someone who's legitimately disabled.
personally I have very little exposure to the realm of disability. So if someone suggested to me that 'disabled' might not be a good term to be customer facing, although it's not obvious to me, I know I don't have a basis to disagree. I might just accept the suggestion and keep moving. I don't think it would occur to me to suspect someone of inventing nonsense.
actually I'm not even sure that 'disabled' is a clear term. In some contexts, 'unavailable' or 'not ready' might be clearer / less complicated. Kinda like how I don't personally recognise 'master' as a problematic term (I'm not American), but in many situations I find 'main' is a better fit anyway.
With disabilities things get very complicated very fast because disabled people are a very diverse group. You'll have different groups of disabled people having different views on things, given they're often in pretty distinct and unique situations. This has resulted in a number of linguistic disputes over the years.
"Not ready" has a space in it which is frequently annoying in software. I think it also implies that thing may become ready after some time elapses or a condition is met, which disabled does not imply.
"Unavailable" and "Disabled" are already used adjacent to eachother in software, the former implies something becoming unusable for reason outside of the software's control, and "disabled" tends to imply something being made unusable intentionally. Here's F5 using the two words in this way https://support.f5.com/csp/article/K12213214
I agree with you regarding master vs main. Main is a syllable shorter, two letters shorter, and clearer. I feel people mostly resist the change to STICK IT to the politically correct.
What I find fascinating is that my comment about my personal reaction got a bunch of downvotes and several dismissive replies, including this one lecturing me about inclusion. And yet not one person who reacted negatively bothered asking me:
- whether I have a disability (yes)
- whether my reaction is consistent with views expressed to me by some others who have disabilities and/or work with/advocate for others who do (yes)
- whether I even have any interest in trying to convince others to change this language usage (not much; it would be welcome if people consider it but it’s nowhere near my priority list)
Now I’m not claiming that I—or anyone else who has a similar reaction—speak for all disabled people. But uh, you may want to re-read your comment. Particularly the last paragraph.
First, out of curiosity and practical interest, which disability? Views on these matters tend to cluster, and I'm not savvy on every disability community's views. If I'm making software for such a community, I will not use the word disable.
Second, my post(s) were consciously written with the possibility that I was talking to a disabled person in mind.
My visceral reaction to an implicit comparison being made between disable/disabled and master/slave is annoyance and trepidation, given what happened with master/slave and given the immediate replies (not from you) suggesting that somebody who DIDN'T want to get rid of terms like disable/disabled simply wasn't interested in their users' comfort. You had already influenced people on HN before I made my post; you already had the ball rolling.
I started out with an Asperger's diagnosis, which became disused in medical circles and is now considered offensive in some circles. Then I was an autistic person, which became offensive in some circles and got redefined to person with autism, which also became offensive in some circles. There is literally no non-offensive way to refer to my own disability presently. I have seen various other redefinitions of slurs, euphemisms, and medical terms being changed around due to influence from various groups, and half the time these groups are a minority of disabled people of some sort.
Disable/disabled are good BECAUSE they evoke disability. This promotes familiarity with the concept of disability and, notably, with the fact that disabled people can't do things, which is a surprisingly difficult thing for many people to understand. Conversely, disable/disabled makes a poignant analogy for what happens in these programs. I have a self-interest in making it so that more stigmas do not enter people's heads; I am so exhausted by the very concept of disability being a magnet for stigma, and I believe such stigma causes normals to behave in bizarre ways.
Thank you for this reply. Apologies if any of this is short, I’m tired and going to bed quickly but wanted to acknowledge this before I forget.
> I started out with an Asperger's diagnosis, which became disused in medical circles and is now considered offensive in some circles. Then I was an autistic person, which became offensive in some circles and got redefined to person with autism, which also became offensive in some circles. There is literally no non-offensive way to refer to my own disability presently. I have seen various other redefinitions of slurs, euphemisms, and medical terms being changed around due to influence from various groups, and half the time these groups are a minority of disabled people of some sort.
I’m sensitive to this too (as noted below I’m also autistic). We also have the misfortune of being something which the diagnostic criteria defines as external to us, so we get little say in even who we are or what we experience. That’s all the more reason to be skeptical when we feel someone is speaking for us.
> First, out of curiosity and practical interest, which disability?
Also autistic, and ADHD. Plus chronic pain and fatigue where I don't even know what's going on.
> Views on these matters tend to cluster and I'm not savvy on every disability communities views. If I'm making software for such a community, I will not use the word disable.
I really feel like I should re-emphasize that I was expressing my own feelings regarding the specific words. There are many disabled people who don’t like using the term “disabled” as a software state, and I’ll just say that you should be able to find that information if you’re looking for it. My personal negative reaction to the “disable” action is more a trauma response to verbs that conjure abuse. That is definitely more personal to intersecting experiences for me.
> Second, My post(s) were consciously written with the possibility I was talking to a disabled person in mind.
I see that now but to be honest it really didn’t feel that way at first. In hindsight I think I responded to you the same way. I appreciate that we came to this point though.
> You had already influenced people on HN before I made my post, you already had the ball rolling.
Maybe but I doubt that I influenced them much, other than making a bunch of people mad at me. All of these conversations have been going on well before this. My comment was clearly taken as suggesting language use, but I really sincerely was just saying how I personally react to the language. It’s a real challenge for me to always remember that expressing an opinion might be taken as a directive, and I forgot that in this instance.
> Disable/disabled are good BECAUSE they evoke disability[…]
There’s a lot in your last paragraph that I want to engage but I’m honestly too tired to do so tonight. I wanted to make this effort to respond. If you want to keep talking I’m here for it, but HN doesn’t make days old threads easy to find. But I’m easy to find around the net. Take care. And thank you again for a good genuine response to this. It made my day to see it.
As a (partially) blind person, I agree with you, and I know that some prominent blind people have spoken out against person-first language, for example: https://mosen.org/person-first-language-it-does-more-harm-th... (seems to be down at time of writing)
Tree [1], master [2], submission [3], abort [4], driver [5] – words can have different meanings in different contexts. Do you feel the need to "wince every time" you submit a link on HN?
hmm yeah I dunno, I don't think my boss was 'virtue signalling' by asking me to replace instances of 'invalid'.
I suspect by 'virtue signalling' you mean he was angling to impress me with how virtuous he was. (because the users never saw my code before the audit)
Why would he try to impress me? or think it would?
I never noticed the quixotic reaction against perceived 'virtue signalling' until sometime this century, and that code audit was back in the 90s.
I'm not certain but I do think it was actually prompted by a previous experience with a confused user who had the word printed on a receipt and mistakenly thought the term referred to them!
I think it's more that if users actually have mistakenly read a word and taken offense, it's easier to just use a different word than have to deal with the occasional upset person.
I do think the people going round proactively trying to change all the words with alternative negative meanings are a bit ridiculous though; if nobody's offended, you don't need to pretend to be offended on others' behalf.
Well yeah, people can be unreasonable sometimes - but from a business perspective, changing a word is better than having a few upset customers. That's literally what I was saying above.
Polysemy is orthogonal to the cases I am aware of in which people are concerned about the primary meaning, an example being "slave" to refer to an item that is bound or subservient.
And even in that case I think it depends on the word and how prominent the negative association is. "Disabled" seems a stretch to me, so I'd want to dig deeper and confirm it's actually a concern with some subset of users as opposed to trolling. But on the other end of the spectrum people who insist on using terms like "slave" at this point seem to be mostly using it as an excuse to virtue signal their adherence to principles of free speech.
Depends on what you replace it with, I don’t mind replacing ‘master’ with ‘main’, but all the hubbub around it made me obstinately stick with ‘master’.
Replacing ‘invalid’ with something like ‘incorrect’ seems much less likely to set people off.
You have an understanding of how the computer means things and what everything on the screen represents. This is by no means universal. To you it is like breathing, beneath thought.
Those who lack context as to what they're seeing, especially if frustrated and unable to parse what is visual noise to them, are likely to take it as the computer judging them negatively.
Depends... there's always that one user who seems to take things differently. Can't please everyone.
I think society is losing the concept of context, which is pretty worrying. I can't even make jokes I used to, because people seem to take things a lot more literally these days. If I do any self-mockery, people look at me like I'm crazy, for example. Most sarcasm is lost too.
In this light, I think we should KEEP the words master, slave, disabled, etc. So we can educate the meaning of context back to society.
That's a funny story, but I can't help but feel that there's another problem to be solved:
> It turns out he wasn't noticing the space between the 'o' and the 'I' in 'Do It'; in the sans-serif system font we were using, a capital 'I' looked very much like a lower case 'l'
Why couldn't most fonts be designed in such a way as to not mix up o, O, and 0, or l, I, and L, as well as other similar-looking symbols? Of course, there are often historical constraints to deal with, but it surprises me that this isn't some rule of font design that would need to be followed in 99% of cases.
The limited resolution of the time was a big constraint on the design space. The first Mac had a resolution of 512 × 342, and that was considered generous.
The original font had the additional constraint that it had to be legible after “graying out” characters by AND-ing their pixels with a 50% gray grid (the screen was black and white, so that’s how you got gray).
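For the curious, here's a rough sketch of what that graying out amounts to. This is my own illustration in Python, not the actual QuickDraw code, and the one-pixel-wide stem is just a made-up example:

    # Sketch only: dim a 1-bit glyph by AND-ing each pixel row with a 50%
    # checkerboard. Rows are plain ints, one bit per pixel, 1 = black,
    # 8 pixels wide for brevity.
    GRAY_EVEN = 0b10101010  # mask for even rows
    GRAY_ODD  = 0b01010101  # mask for odd rows, offset to form a checkerboard

    def gray_out(glyph_rows):
        """Knock out every other pixel, alternating the phase per row."""
        return [row & (GRAY_EVEN if y % 2 == 0 else GRAY_ODD)
                for y, row in enumerate(glyph_rows)]

    # A one-pixel-wide vertical stem, like a sans-serif 'l' or 'I': depending on
    # how it lines up with the grid, half of it simply vanishes when dimmed,
    # which is presumably why the font had to be designed around the grid.
    stem = [0b00010000] * 7
    for row in gray_out(stem):
        print(format(row, "08b").replace("0", ".").replace("1", "#"))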
I think this must have been the LISA system font. (Easily identified by the characteristic "v" and "w" glyphs. "I" and "l" actually look the same and both have a bit of a kerning problem, probably because of the requirement to match said 50% gray grid in a certain way.)
(Mind that the LISA didn't have square pixels and that the font would have appeared more condensed on the screen than in the printed sample due to pixel aspect ratio.)
See my comment above (I have a handicapped son ...) - I don't think that use of retard is offensive. The other common non-pejorative use that I can think of is talking about the timing of the spark igniting the fuel in your (gasoline) ICE.
For continuous, smooth braking, trucks and buses use a mechanical or electric assisting device that has the same name; holding the brake pedal for an extended amount of time is discouraged because brake pads wear out and/or lose grip when engaged for too long: https://en.wikipedia.org/wiki/Eddy_current_brake
Is that difference in stress universal across English dialects? I thought there were some dialects that stressed nouns and verbs with the same spelling in the same way.
That's interesting, it shows the context menu, and it looks like the camel case function names from the code were directly copied over, as it shows 'printIt' and 'doIt' among the options.
One of my favorite HCI books is "TOG on Interface" by Bruce Tognazzini, because it's filled with stories about why certain GUI elements work and the failures that were tried along the way as working GUI concepts were finally adopted. TOG was instrumental in making the Mac OS UI consistent - https://amzn.to/3sKKKvA.
I can’t see using “Do It,” anyway. It’s an awkward phrase.
Picking good names for confirmation/cancellation buttons is difficult. I’m actually reviewing the ones in the app that I’m writing, now. Some of them are quite weird.
> Picking good names for confirmation/cancellation buttons is difficult. I’m actually reviewing the ones in the app that I’m writing, now. Some of them are quite weird.
Especially egregious when you are confirming a cancellation. I hate when I try to perform an action that is semantically a cancellation, and I am warned of its potentially deleterious effects and then asked "OK" / "Cancel". (Where, of course, "OK" means "OK, cancel", and "Cancel" means the opposite.)
Adding a word to the beginning does however make a good marketing slogan. And as the parent of a child who's highly skilled as a procrastinator (don't put that in the skills section of your resume), adding a fourth word to the end of the marketing slogan is a common way to give someone a verbal push.
This was a very interesting connection between headline and content. Based on the headline I assumed an article about entrepreneurship and tech. Even the first paragraph seemed to support my hypothesis. Snobby academics not getting what technical progress combined with market demand looks like.
Ultimately that was not the topic of the article. Yet, reading between the lines, it still confirmed the underlying bias that led me to interpret the headline the way I did.
That reminds me of the first Windows program I wrote at my first job on Windows 3.11 in early 1995. IIRC, I referenced Programming Windows (I think) by Charles Petzold and it had a basic Windows program that had two menu items - "Do It", "Quit". I adapted that program to build my prototype and perform the task in the "Do It" menu handler. :)
I also think that at the time computer users may have been used to dismissing extra space around letters, since the computers they commonly interacted with would use monospaced fonts for technical reasons.
To sort of butcher a great line from Layer Cake: “Only very stupid backend people think that the front end is simple.”
Being a more backend-type person myself, I have to remind myself that fewer equations on the human-factors side means that side of things is harder, not easier, than the "math-y" stuff.
If you've designed a good front-end from an HCI perspective, you'll still end up with sooooo many edge cases. Human behavior is much more variable than the calls sent to an API or SQL sent to a database. (It also helps that there are syntax requirements that limit what's considered valid.)
I wonder if the newness of kerning in a computer interface threw those few people off.
It was after all a monospace world for computers (and typewriters, for that matter — the other keyboard people likely sat down in front of at that time).
>> The designers observed that a few users seemed to stumble at the point that the dialog was displayed, clicking "Cancel" when they should have clicked "Do It", but it wasn't clear what they were having trouble with.
Let me tell ya... people are still having problems with this. Red/green buttons, "OK" instead of "Do it"... doesn't matter. We're dealing with apes here, people. Goddamn apes.
I've noticed a lot of webpages recently (e.g. government sign-ups, "confirm payment" screens) are putting the action button on the left, which mildly irritates me. Maybe they're considering that people read left-to-right and aren't total degenerate computer users like I am.
Might have been a Smalltalk-influenced thing. "Do it" is a menu option in the Smalltalk scratchpad-like thing called a workspace: you can type in a bit of code, select it with the mouse, and "do it", which means run the code, or "inspect it", which will open a display of what was returned from your code when it ran.
Tesler's work was already on the HCI (human-computer interaction) syllabus in 1988. So only six years after that work at Apple it was already considered exemplary research and taught to undergrads. I remember we did stuff on Ivan Sutherland, David Canfield, Douglas Engelbart, Alan Kay and so many others, but that course actually had a profound impact on me because the lecturer had a way of getting into the psychology and philosophy - if you think about it, most UI advances are _technologically_ unremarkable.
I think what the OP article is talking about is what we called "Structured Observation", and there was something like "mind state" analysis. I visited the HCI lab at Imperial where they had the one-way mirror and talk-back microphones, like a recording studio but set up for recording computer user behaviours.
What this accords with in psychological language is mentalisation and intersubjectivity. In other words, "getting inside the user's head and feeling and thinking what they are feeling and thinking". Early HCI people spent a lot of time in that zone.
An important _negative_ shift is that we now use telemetry and logging, because today (as another comment says, "who can be bothered?") we are interested in what is happening from the _application's_ viewpoint, not the user's. That is a subtle but massive change of priority.
Look at those videos of early HCI today and you will be amazed at the intensity of engagement and excitement. Using a computer was like driving a car.
Structured observation of users today would show very long periods of zombie-like behaviour, one finger twitching, dribbling, eyes half closed and body almost motionless, neck bent over a phone screen. Then occasional outbursts of high expressed emotion, almost existential angst, throwing the device on the floor and crying, calling the developers "Nazis" and "Monsters". Then back to long periods of passive dribbling and swiping.
A quote by Engelbart that I remember was that we must amplify humankind's intellect to cope with the rapid changes and challenges of the world. But in shifting our concern from the users to the manufacturers, sellers and profiteers of technology we have betrayed that mission.
In a similar way that changing the "personnel department" to "human resources" shifted the focus of practically all industry, the move from human-computer interaction to user experience (UX) has effected an equally imperceptible but devastating outcome upon technology. We live in a time where the computer user is the _target_ (to have "experiences" imposed upon them), not an active component of a system with greater ends.