Do It (2004) (folklore.org)
283 points by kristianp on March 7, 2022 | 102 comments



>>The user tests were conducted in a specially constructed room featuring a one-way mirror, so observers could watch the tests without being intrusive. The tests were conducted by a moderator who made sure the user felt comfortable and showed her the basics of using a mouse. Then, with no further instruction, users were asked to perform specific tasks, without help from the moderator, like editing some text and saving it. The moderator encouraged each user to mumble under her breath while doing the tasks, revealing her current thinking as much as possible. Each session was audio or videotaped for later analysis.

Amazing to see the importance they gave to UX/UI and the level of sophistication they had forty years back.


That's what people did before cheap TV cameras.

That's still the right way to do UI testing. You bring in people, you give them a list of tasks to do, and you take video of screen and user. Someone then has to watch the video and mark the places where people got stuck, had to back up, or gave up. Then create a highlights reel of every problem seen more than once. The devs have to watch this.


This is generally correct, but it’s often better to go to the user rather than bringing them in. People can have quite different reactions in their own homes, especially if they have specific setup needs, than in a research lab.


True, it's much easier now, and you can get a much wider sample of end-users and a more accurate representation of their behavior in their natural context. It's absolutely worth doing most of your usability testing remotely. But to be honest there's still something powerful about having the development team watch a real person through a one-way mirror that I haven't managed to replicate using video. It reminds me of the difference between live theater and watching a movie at home.

The closest I've come to replicating the impact is to have a conference room where the team sits together and watches the usability test being streamed live.


That description should also come with a warning about a difficulty that UX designers, and Apple in particular, have struggled with over the years: they put a great deal of effort into optimising their designs for the first hour of use.


Over-optimizing for the walk-up-and-use experience at the expense of a more realistic subset of usage is sometimes referred to as the "Pepsi Challenge" or "sip test" observational bias. You might even consider it a special case of the "streetlight effect" where people choose their research methods primarily based on how easy it is to collect that kind of data. (When I used to teach UX I had a whole lecture dedicated to alternate approaches to avoid falling into the trap of over-optimizing for novices, things like GOMS/KLM, log analysis, heuristic evaluations, diary studies, cognitive walkthrough, surveys, etc.)
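
To make one of those alternatives concrete: a Keystroke-Level Model (KLM) estimate is just a sum of standard per-operator times for a task. A minimal sketch in Python, using the usual textbook approximations for the operator times (the example task breakdown is invented for illustration):

    # Rough Keystroke-Level Model (KLM) estimate.
    # Operator times are the common textbook approximations, in seconds;
    # the example task below is made up for illustration.
    OPERATORS = {
        "K": 0.28,  # press a key (average typist)
        "P": 1.10,  # point at a target with the mouse
        "B": 0.10,  # press or release the mouse button
        "H": 0.40,  # move hand between keyboard and mouse
        "M": 1.35,  # mental preparation
    }

    def klm_estimate(sequence):
        """Sum the operator times for a sequence such as 'MHPBB'."""
        return sum(OPERATORS[op] for op in sequence)

    # Example: open a menu item, type an 8-character file name, click a button.
    task = "MHPBB" + "MH" + "K" * 8 + "H" + "MPBB"
    print(f"Estimated expert task time: {klm_estimate(task):.1f} s")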


In 01981, if users didn't like your design in the first hour of trying it out in the computer store, they wouldn't buy your computer. Now if they don't like your design in the first five minutes of downloading it from the app store they'll uninstall your app. There's a real tradeoff between ease of use and ease of learning, but software merchants are in a real bind here.


Certainly. They were also optimising for positive reviews in newspapers and magazines, which were also typically based on a short period of use.

I don't remember seeing this one on the usual lists of forms of market failure, but it's real.


>In 01981

Any reason for the leading zero? Is it a native language thing?


The Long Now Foundation uses five-digit dates; the extra zero is to solve the deca-millennium bug, which will come into effect in about 8,000 years.

https://longnow.org/about/
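
(For what it's worth, the convention is just zero-padding the year to five digits; in code it's a one-liner. A trivial Python sketch, purely for illustration:)

    # Long Now style: zero-pad the year to five digits, e.g. 2022 -> "02022".
    year = 2022
    print(f"{year:05d}")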


Honestly, a leading zero in human-readable text is a non-solution.

Year numbers are variable length decimal strings, not a four or five digit fixed length field. Any historian concerned with 6 AD-999 AD will likely not include a leading zero in any year figures they use. I don't see why the people of the present or the future should do any different.

Machine readable implementations, sure. Have as many digits as you wish. But for human written material, it just looks weird.


They also promote NFTs, so I have lost my respect for them.

https://twitter.com/longnow/status/1436364868131586054



Most of us have probably violated all sorts of good UI principles out of either haste or ignorance, but our crimes are mitigated by the fact that for the overwhelming majority of software the number of users, and hence the number cursing the idiot who designed this crappy interface, is extremely low and/or constrained by the bespoke nature of the application; hence the investment in time and money to perform this level of rigour is not justified.

The only time I've seen UI work carried out with this level of care was for an air traffic control system, where of course a poor design can have significant and fatal consequences.


Don't forget that flight simulators, the flip side of the same crucial situation, have equally elaborate UI/UX design simulations. And I believe power plants, especially nuclear plants, have this type of elaborate design simulation as well, as does everything in the (outer) space industry.


Equally remarkable is the amount of stuff we take for granted today that the Mac team got right the very first time.

Had it not been for that user feedback, Windows might have copied the Mac and gone with “Do It” as well and we would be living in a slightly more awkward UI timeline.

I always think about this when I look at Apple's current UI. Just a little user feedback would have gone a really long way.


That's still possible today, but nobody can actually be bothered, because there's plenty of other stuff to do as well.

I do recall ~8-10 years ago, we were building an app and did a user test; we had a phone with a webcam rigged to it so we could see from the user's point of view.

But for this you need dedicated (in multiple senses of the word) UX people, and they are rare; companies also seem reluctant to hire them.


> But for this you need dedicated UX people

Tesler was a computer scientist. He worked on AI and Smalltalk at Xerox, and was a manager at Apple at this time.

What we need to do instead is bring down the wall between design and development that was built up in the past decade, encourage developers in the UI field to have multi-disciplinary skills, and stop assuming every minute not spent coding or in meetings is wasted.


As a UI engineer, I wish it was more common for UI engineers to learn about accessibility and how to contribute to doing UX research, rather than trying to pick up some back-end and calling that "full-stack". There's much more useful cross-pollination to be gained when branching out that way.


But this might break the beautiful visual design.


That would be the equivalent of a car designer complaining that engineering ruined it. You cannot produce quality work without that collaboration in both directions.


Form over function? Or beautiful but useless (snark intended). The citrus press by Philippe Starck comes to mind. Many bars own it and display it but don't use it because it sucks. The last sentence came from a bartender who has it on the shelf.


If you look at GUI/web design over at least the past decade, it has been form over function.


They also had a target user base who'd literally never used a computer before, making UX testing critical for product adoption.

There was a time when tech products would start with a tutorial on the differences between clicking, right-clicking and double-clicking.


They just hadn't figured out that one can get results of the same quality from a usability testing room without a one-way mirror. I don't blame them; the standard literature on how to run tests on the cheap appeared a decade later.


could you point to that literature?


The go-to place is the Nielsen Norman Group¹. You can get any of their books; they will describe how to do tests. It's basically: recruit up to 5 users from your target audience, ask them to do the tasks you want, observe without saying a word, and don't ever come out with any variation of the "users are stupid" explanation, because even if they are, they are the ones you are creating the software for; fix your software, and repeat.

But if you want details about why those large "record the user" sessions aren't worth it, you'll have to dig into their papers.

1 - https://www.nngroup.com/


I worked on a public-facing project with many retiree users. We had a code audit to remove the word 'Invalid' so that users would not think they had been identified as 'a person made weak or disabled by illness or injury'.


Dude. I wrote a piece of software for a company that takes online signups, and then matches them to customers who purchase things in the store. Many years ago they asked me to write a report that output local customers who had signed up online, but never bought anything.

The database table for transactions is called `trans`. So the report I coded was labeled "No_trans_customers".

Last year I get a call from the company CEO -- can we rename this report? Apparently someone saw the name of it and freaked the fuck out.


That's golden


I similarly wince every time I use software which inflexibly uses “disabled” or especially the more affirmative “disable”.


I'm not aware of any community of disabled people, or any disabled person, who finds the usage of the term "Disable" offensive in any context.

I've never actually heard any controversy over the word "Disabled" EXCEPT in the specific context of it being used to refer to disabled people (person-first vs identity-first language debate, but that's a digression)

There's quite a long history of people arbitrarily deciding they speak on behalf of all disabled people and know that language must change to protect their dignity and feelings. Inclusive language is actually plain language, since that's the easiest for people with certain disabilities to understand, so the inclusive thing to do is to avoid arbitrarily stigmatising words.


I have a disabled son and agree that using the word "disabled" or "disable" isn't an issue. Please DON'T use the word retarded (unless you're talking about the timing of the spark plugs in your car) - this word originally had a clinical connotation which was largely synonymous with "disabled", but it is now offensive because people use it to disparage an able-bodied, able-minded person who's acting stupidly by comparing them to someone who's legitimately disabled.


personally I have very little exposure to the realm of disability. So if someone suggested to me that 'disabled' might not be a good term to be customer-facing, although it's not obvious to me, I know I do not have a basis to disagree. I might just accept the suggestion and keep moving. I don't think it would occur to me to suspect someone of inventing nonsense.

actually I'm not even sure that 'disabled' is a clear term. In some contexts, 'unavailable' or 'not ready' might be more clear / less complicated. Kinda like how I don't personally recognise 'master' as a problematic term (I am not American) but in many situations I find 'main' is a better fit anyway


With disabilities things get very complicated very fast because disabled people are a very diverse group. You'll have different groups of disabled people having different views on things, given they're often in pretty distinct and unique situations. This has resulted in a number of linguistic disputes over the years.

"Not ready" has a space in it which is frequently annoying in software. I think it also implies that thing may become ready after some time elapses or a condition is met, which disabled does not imply.

"Unavailable" and "Disabled" are already used adjacent to eachother in software, the former implies something becoming unusable for reason outside of the software's control, and "disabled" tends to imply something being made unusable intentionally. Here's F5 using the two words in this way https://support.f5.com/csp/article/K12213214

I agree with you regarding master vs main. Main is a syllable less, two letters shorter, and more clear. I feel people mostly resist the change to STICK IT to the politically correct.


ah yeah rite,

I'm horrible at naming things, which is another reason I wouldn't push back on this kind of change, even if I didn't see an issue.


What I find fascinating is that my comment about my personal reaction got a bunch of downvotes and several dismissive replies, including this one lecturing me about inclusion. And yet not one person who reacted negatively bothered asking me:

- whether I have a disability (yes)

- whether my reaction is consistent with views expressed to me by some others who have disabilities and/or work with/advocate for others who do (yes)

- whether I even have any interest in trying to convince others to change this language usage (not much; it would be welcome if people consider it but it’s nowhere near my priority list)

Now I’m not claiming that I—or anyone else who has a similar reaction—speak for all disabled people. But uh, you may want to re-read your comment. Particularly the last paragraph.


First, out of curiosity and practical interest, which disability? Views on these matters tend to cluster and I'm not savvy on every disability community's views. If I'm making software for such a community, I will not use the word disable.

Second, my post(s) were consciously written with the possibility I was talking to a disabled person in mind.

My visceral reaction to an implicit comparison between disable/disabled and master/slave being made is one of annoyance and trepidation, given what happened with master/slave and given the immediate replies (not from you) that somebody who DIDN'T want to get rid of terms like disable/disabled simply wasn't interested in their users' comfort. You had already influenced people on HN before I made my post; you already had the ball rolling.

I started out with an aspergers diagnosis, that became disused in medical circles, and is now considered offensive in some circles. Then I was an autistic person, that became offensive in some circles, and got redefined to person with autism, which also became offensive in some circles. There is literally no non-offensive way to refer to my own disability presently. I have seen various other redefinition of slurs, euphemisms, and medical terms being changed around due to influences from various groups and half the time these groups are a minority of disabled people of some sort.

Disable/disabled are good BECAUSE they evoke disability. This promotes familiarity with the concept of disability, and notably, that disabled people can't do things, which is a surprisingly difficult thing for many people to understand. Conversely, disable/disabled makes a poignant analogy for what happens in these programs. I have a self-interest in making it so that more stigmas do not enter people's heads; I am so exhausted by the very concept of disability being a magnet for stigma, and I believe such stigma causes normals to behave in bizarre ways.


Thank you for this reply. Apologies if any of this is short, I’m tired and going to bed quickly but wanted to acknowledge this before I forget.

> I started out with an aspergers diagnosis, that became disused in medical circles, and is now considered offensive in some circles. Then I was an autistic person, that became offensive in some circles, and got redefined to person with autism, which also became offensive in some circles. There is literally no non-offensive way to refer to my own disability presently. I have seen various other redefinition of slurs, euphemisms, and medical terms being changed around due to influences from various groups and half the time these groups are a minority of disabled people of some sort.

I’m sensitive to this too (as noted below I’m also autistic). We also have the misfortune of being something which the diagnostic criteria defines as external to us, so we get little say in even who we are or what we experience. That’s all the more reason to be skeptical when we feel someone is speaking for us.

> First, out of curiosity and practical interest, which disability?

Also autistic, and ADHD. Chronic pain and fatigue that I don’t even know what’s going on.

> Views on these matters tend to cluster and I'm not savvy on every disability communities views. If I'm making software for such a community, I will not use the word disable.

I really feel like I should re-emphasize that I was expressing my own feelings regarding the specific words. There are many disabled people who don’t like using the term “disabled” as a software state, and I’ll just say that you should be able to find that information if you’re looking for it. My personal negative reaction to the “disable” action is more a trauma response to verbs that conjure abuse. That is definitely more personal to intersecting experiences for me.

> Second, My post(s) were consciously written with the possibility I was talking to a disabled person in mind.

I see that now but to be honest it really didn’t feel that way at first. In hindsight I think I responded to you the same way. I appreciate that we came to this point though.

> You had already influenced people on HN before I made my post, you already had the ball rolling.

Maybe but I doubt that I influenced them much, other than making a bunch of people mad at me. All of these conversations have been going on well before this. My comment was clearly taken as suggesting language use, but I really sincerely was just saying how I personally react to the language. It’s a real challenge for me to always remember that expressing an opinion might be taken as a directive, and I forgot that in this instance.

> Disable/disabled are good BECAUSE they evoke disability[…]

There’s a lot in your last paragraph that I want to engage but I’m honestly too tired to do so tonight. I wanted to make this effort to respond. If you want to keep talking I’m here for it, but HN doesn’t make days old threads easy to find. But I’m easy to find around the net. Take care. And thank you again for a good genuine response to this. It made my day to see it.


As a (partially) blind person, I agree with you, and I know that some prominent blind people have spoken out against person-first language, for example: https://mosen.org/person-first-language-it-does-more-harm-th... (seems to be down at time of writing)


Tree [1], master [2], submission [3], abort [4], driver [5] – words can have different meanings in different contexts. Do you feel the need to "wince every time" you submit a link on HN?

[1] https://en.wikipedia.org/wiki/Tree_(data_structure)

[2] https://en.wikipedia.org/wiki/Mastering_(audio)

[3] https://en.wikipedia.org/wiki/Electronic_submission

[4] https://en.wikipedia.org/wiki/Space_Shuttle_abort_modes

[5] https://en.wikipedia.org/wiki/Device_driver


some of us are interested in taking small simple steps to make our users more comfortable.


You mean virtue signaling. Replacing an established and perfectly fine word has never helped a single user in the history of the universe.


hmm yeah I dunno, I don't think my boss was 'virtue signalling' by asking me to replace instances of 'invalid'.

I suspect by 'virtue signalling' you mean he was angling to impress me with how virtuous he was. (because the users never saw my code before the audit)

Why would he try to impress me? or think it would?

I never noticed the quixotic reaction against perceived 'virtue signalling' until sometime this century, and that code audit was back in the 90s.

I'm not certain but I do think it was actually prompted by a previous experience with a confused user who had the word printed on a receipt and mistakenly thought the term referred to them!


I think it's more that if users actually have mistakenly read a word and taken offense, it's easier to just use a different word than have to deal with the occasional upset person.

I do think the people going round proactively trying to change all the words with alternative negative meanings are a bit ridiculous though; if nobody's offended, you don't need to pretend to be offended on others' behalf.


Polysemy is a fundamental concept in most (all?) languages. I am not aware of a language without polysemes.

Should we replace "driver" because people might confuse it with people driving cars? How about "tree"? Are you confused by river "banks"?

“What, commit trees don’t grow on soil?”

“What do you mean? I can’t sit on the same bank that manages my money?”

You cannot and should not design for the lowest common denominator, which is a person not familiar with basic language concepts.

If you think that “master” branch is offensive, YOU are the problem, not the language.


I might not find "master branch" to be offensive, but if somebody I'm working with asks to rename it, I'm okay with that.


Well yeah, people can be unreasonable sometimes - but from a business perspective, changing a word is better than having a few upset customers. That's literally what I was saying above.


Polysemy is orthogonal to the cases I am aware of in which people are concerned about the primary meaning, an example being "slave" to refer to an item that is bound or subservient.

And even in that case I think it depends on the word and how prominent the negative association is. "Disabled" seems a stretch to me, so I'd want to dig deeper and confirm it's actually a concern with some subset of users as opposed to trolling. But on the other end of the spectrum people who insist on using terms like "slave" at this point seem to be mostly using it as an excuse to virtue signal their adherence to principles of free speech.


Depends on what you replace it with. I don't mind replacing 'master' with 'main', but all the hubbub around it made me obstinately stick with 'master'.

Replacing ‘invalid’ with something like ‘incorrect’ seems much less likely to set people off.


I don't get it. If it says something's disabled, why would you think the software is talking about you? Or personally insulting you?

I mean, I also don't understand the guy who thought the software was calling him a "Dolt"


You have an understanding of how the computer means things and what everything on the screen represents. This is by no means universal. To you it is like breathing, beneath thought.

Those who lack context as to what they're seeing, especially if frustrated and unable to parse what is visual noise to them, are likely to take it as the computer judging them negatively.


we don't need to understand it or agree:

either we are interested in and sensitive to how users react, or not.


Depends... there's always this one user who seems to take things differently. Can't please everyone.

I think society is losing the concept of context, which is pretty worrying. Can't even make jokes I used to, because people seem to take things a lot more literally these days. If I do any self-mockery, people look at me like I'm crazy, for example. Most sarcasm is lost too...

In this light, I think we should KEEP the words master, slave, disabled, etc. So we can educate the meaning of context back to society.


edit: said something rong?


You're arguing in bad faith [1][2]. Please don't do that here.

[1] https://en.wikipedia.org/w/index.php?title=Wikipedia:Assume_...

[2] https://en.wikipedia.org/wiki/Good_faith


What one-word replacement would you use? ('Deactivate'?)


Thank you for engaging this with curiosity and good faith! Yes, I think de/activate is a great alternative.


Activate would make sense for opt-in, no?


That's a funny story, but I can't help but feel that there's another problem to be solved:

> It turns out he wasn't noticing the space between the 'o' and the 'I' in 'Do It'; in the sans-serif system font we were using, a capital 'I' looked very much like a lower case 'l'

Why couldn't most fonts be designed so as not to mix up o, O and 0, or l, I and L, as well as other symbols like that? Of course, there are often historical constraints to deal with, but it surprises me that this isn't some rule of font design that would need to be followed in 99% of cases.


The limited resolution of the time was a big constraint on the design space. The first Mac had a resolution of 512 × 342, and that was considered generous.


The original font had the additional constraint that it had to be legible after "graying out" characters by AND-ing their pixels with a 50% gray grid (the screen was black and white, so that's how you got gray).

That’s why characters in Chicago 12 (http://kare.com/apple-icons/) have ‘fat’ stems (https://en.wikipedia.org/wiki/Typeface_anatomy)

I don’t think there was much room for making lowercase l different from both uppercase L and uppercase I while keeping the font look nice.


I think this must have been about the LISA System font. (Easily identified by the characteristic "v" and "w" glyphs. "I" and "l" actually look the same and both have a bit of a kerning problem, probably because of the requirement to match said 50% gray grid in a certain way.)

Compare: https://www.applefritter.com/content/lisa-font-charts

(Mind that the LISA didn't have square pixels and that the font would have appeared more condensed on the screen than in the printed sample due to pixel aspect ratio.)

Edit: For an on-screen sample with word spacing in button text as eventually released compare: http://toastytech.com/guis/lisaos1Clock_Close_Dialog_Large.j...


A (poor quality) image of the "Do It" screen: https://usability.typepad.com/photos/uncategorized/dolt.jpg


Given the random spaces in "R ename" and "R emove", I'm not surprised that they ignored the space in "Do It"


I'm not convinced that this is the one, the article says it was in a sans serif font.


It would have been better to have it as "Do it"

But yeah, OK is better


Reminds me of cockpit recordings of the automatic callout from the Airbus landing system: "50...40...30...Retard! Retard! Retard!"


See my comment above (I have a handicapped son ...) - I don't think that use of retard is offensive. The other common non-pejorative use that I can think of is talking about the timing of the spark igniting the fuel in your (gasoline) ICE.


For continuous, smooth braking (which would otherwise require stepping on the pedal for an extended amount of time, something that is discouraged because brake pads wear out and/or lose grip when engaged for too long), trucks and buses use a mechanical or electric assisting device that has the same name: https://en.wikipedia.org/wiki/Eddy_current_brake


I think it's a Family Guy episode where the callout also gets called... names in reply to the announcement.


Is it RE-tard (noun) or re-TARD (command)?


Is that difference in stress universal across English dialects? I thought there were some dialects that stressed nouns and verbs with the same spelling in the same way.


Isn't this a special case, because isn't the noun actually just short for "retarded"?

Which I think just has one pronunciation.



That's interesting, it shows the context menu, and it looks like the camel case function names from the code were directly copied over, as it shows 'printIt' and 'doIt' among the options.


I found that in quick search for a screenshot of the context menu. I thought I remembered "do it" from a ParcPlace Smalltalk implementation.


This shows why it's so important to be with the person or at least be able to talk to the user directly when conducting user research.

I've seen too many founders write by email/instant messaging, or worse, send forms when asking questions about their product.

Talking to a person directly will get you so much further!


One of my favorite HCI books is "TOG on Interface" by Bruce Tognazzini because it's filled with stories about why certain GUI elements work and the failures that were tried as they finally adopted working GUI concepts. TOG was instrumental in making the Mac OS UI consistent - https://amzn.to/3sKKKvA.


That’s a riot.

I can’t see using “Do It,” anyway. It’s an awkward phrase.

Picking good names for confirmation/cancellation buttons is difficult. I’m actually reviewing the ones in the app that I’m writing, now. Some of them are quite weird.


> Picking good names for confirmation/cancellation buttons is difficult. I’m actually reviewing the ones in the app that I’m writing, now. Some of them are quite weird.

Especially egregious when you are confirming a cancellation. I hate when I try to perform an action that is semantically a cancellation, and I am warned of its potentially deleterious effects and then asked "OK" / "Cancel". (Where, of course, "OK" means "OK, cancel", and "Cancel" means the opposite.)

EDIT: mlok (or, ironically for this thread, do I mean mIok?) made the same point an hour ago: https://news.ycombinator.com/item?id=30587293 .
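
The usual fix is to label the buttons with the actions themselves rather than "OK"/"Cancel", so neither answer is ambiguous. A quick tkinter sketch of the idea (the wording, layout, and scenario here are just an illustration, not a prescription):

    # A "confirm cancellation" dialog with explicit verb labels instead of OK/Cancel.
    import tkinter as tk

    def confirm_cancellation():
        result = {"choice": "keep"}  # default to the safe option
        root = tk.Tk()
        root.title("Cancel subscription?")
        tk.Label(root, text="Cancelling ends your subscription immediately.",
                 padx=20, pady=10).pack()

        def choose(value):
            result["choice"] = value
            root.destroy()

        row = tk.Frame(root, padx=20, pady=10)
        row.pack()
        # Safe choice first, destructive action last; both say what they do.
        tk.Button(row, text="Keep Subscription",
                  command=lambda: choose("keep")).pack(side=tk.LEFT, padx=5)
        tk.Button(row, text="Cancel Subscription",
                  command=lambda: choose("cancel")).pack(side=tk.LEFT, padx=5)
        root.mainloop()
        return result["choice"]

    if __name__ == "__main__":
        print("User chose:", confirm_cancellation())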


Adding a word to the beginning does however make a good marketing slogan. And as the parent of a child who's highly-skilled as a procrastinator (don't put that in the skills section of your resume), adding a fourth word to the end of the marketing slogan is a common way to give someone a verbal push.


This was a very interesting connection between headline and content. Based on the headline I assumed an article about entrepreneurship and tech. Even the first paragraph seemed to support my hypothesis. Snobby academics not getting what technical progress combined with market demand looks like.

Ultimately that was not the topic of the article. Yet, reading between the lines, it still confirmed my underlying bias that led me to interpret the headline the way I did.


That reminds me of the first Windows program I wrote at my first job on Windows 3.11 in early 1995. IIRC, I referenced Programming Windows (I think) by Charles Petzold and it had a basic Windows program that had two menu items - "Do It", "Quit". I adapted that program to build my prototype and perform the task in the "Do It" menu handler. :)


Bad keming strikes again.


But the issue wasn't kerning. There was a space between Do and It; kerning is how two letters look next to each other.


Keming is also when two letters not adjoined to each other look like they are.


I've never heard it used that way but it makes sense. Thanks!


I also think that at the time computer users may have been used to dismissing extra space around letters, since the computers they commonly interacted with would use monospaced fonts for technical reasons.


To sort of butcher a great line from Layer Cake: “Only very stupid backend people think that the front end is simple.”

Being a more backend-type person myself, I have to remind myself that fewer equations on the human factors side means that side of things is harder, not easier, than the “math-y” stuff.


If you've designed a good front-end from an HCI perspective, you'll still end up with sooooo many edge cases. Human behavior is much more variable than the calls sent to an API or SQL sent to a database. (It also helps that there are syntax requirements that limit what's considered valid.)


I wonder if the newness of kerning in a computer interface threw those few people off.

It was after all a monospace world for computers (and typewriters, for that matter — the other keyboard people likely sat down in front of at that time).

Or we can blame the Chicago 12 font.


>> The designers observed that a few users seemed to stumble at the point that the dialog was displayed, clicking "Cancel" when they should have clicked "Do It", but it wasn't clear what they were having trouble with.

Let me tell ya... people are still having problems with this. Red/green buttons, "OK" instead of "Do it"... doesn't matter. We're dealing with apes here, people. Goddamn apes.


Well, in 2022 I still stumble upon dialog boxes in the vein of:

Do you really want to cancel?

[OK] [Cancel]


Which do you prefer? (Which is better design)

[ OK ] [Cancel]

or

[Cancel] [ OK ]


There seems to be a lively debate across the internet on this one. I've always preferred the "action" button to be on the right like this example:

https://miro.medium.com/max/1838/1*RSGWrWvqA0ubbg7BW8rz8w.pn...

I've noticed a lot of webpages recently (eg government sign ups, "confirm payment" screens) are putting the action button on the left which mildly irritates me, maybe they're considering that people read left-to-right and aren't total degenerate computer users like I am.


I prefer

    [ᶜᵃⁿᶜᵉˡ]        [ OK ]
I think more visual weight should be given to the action you expect/wish the user to take.

Also I always put the happy path on the right, but I can't articulate a reason for this.


I’ve had some users that definitely needed a “dolt” button……


What did the user interfaces at Xerox PARC say? Do it, OK, or something else?


Might have been a Smalltalk-influenced thing. "Do it" is a menu option in the Smalltalk scratchpad-like thing called a workspace: you can type in a bit of code, select it with the mouse, and "do it", which means run the code, or "inspect it", which will open a display of what was returned from your code when it ran.


Tesler's work was already on the HCI (human-computer interaction) syllabus in 1988. So only 6 years after that work at Apple it was already considered exemplary research and taught to undergrads. I remember we did stuff on Ivan Sutherland, David Canfield, Douglas Engelbart, Alan Kay and so many others, but that course actually had a profound impact on me because the lecturer had a way of getting into the psychology and philosophy - if you think about it, most UI advances are _technologically_ unremarkable.

I think what the OP article is talking about is what we called "Structured Observation", and there was something like "mind state" analysis. I visited the HCI lab at Imperial where they had the one-way mirror and talk-back microphones, like a recording studio but set up for recording computer user behaviours.

What this accords with in psychological language is mentalisation and intersubjectivity. In other words, "getting inside the user's head and feeling and thinking what they are feeling and thinking". Early HCI people spent a lot of time in that zone.

An important _negative_ shift is that we now use telemetry and logging, because today (as another comment says "who can be bothered?") we are interested in what is happening from the _application's_ viewpoint, not the user's. That is a subtle but massive change of priority.

Look at those videos of early HCI today and you will be amazed at the intensity of engagement and excitement. Using a computer was like driving a car.

Structured observation on users today would show very long periods of zombie like behaviour, one finger twitching, dribbling, eyes half closed and body almost motionless, neck bent over a phone screen. Then occasional outbursts of high expressed emotion, almost existential angst, throwing the device on the floor and crying, calling the developers "Nazis" and "Monsters". Then back to long period of passive dribbling and swiping.

Some quote by Engelbart I remember was that we must amplify humankind's intellect to cope with the rapid changes and challenges of the world. But in shifting our concern from the users to the manufacturers, sellers and profiteers of technology, we have betrayed that mission.

In a similar way that changing the "personnel department" to "human resources" shifted the focus of practically all industry, the move from human-computer interaction to user experience (UX) has effected an equally imperceptible but devastating outcome upon technology. We live in a time where the computer user is the _target_ (to have "experiences" imposed upon) not an active component of a system with greater ends.


> In a similar way that changing the "personnel department" to "human resources" shifted the focus of practically all industry, ….

I read recently, though where I can't remember, that this was not just a gradual change but a specific one championed by, among others, Ernestine Gilbreth Carey (https://en.wikipedia.org/wiki/Ernestine_Gilbreth_Carey), co-author of Cheaper by the dozen (https://en.wikipedia.org/wiki/Cheaper_by_the_Dozen). The Wikipedia page https://en.wikipedia.org/wiki/Human_resources#Origins_of_the... does not mention her, but https://en.wikipedia.org/wiki/Human_resources#History suggests that her work at Macy's was around the right time.



