Hacker News
Disrespectful Design – Users aren’t stupid or lazy (somehowmanage.com)
402 points by Ozzie_osman on Sept 13, 2020 | 227 comments

A better model for thinking of users is to realise that they are not stupid, just busy and distracted. From an app point of view you can count yourself lucky if you get even 10% of their attention at any given time.

Users aren't thinking about your app when they use it.

That person in MSWord isn't thinking about the ribbon bar. They are thinking about how to address that new prospective customer.

That person in Excel isn't thinking about cells either. They are thinking about whether Robin in R&D is going to get the figures for Q2 to them in time.

Users have tasks to concentrate on. They don't have the attention left over to deal with your app's crappy UI.

The side effect of being busy and distracted is that they're extremely thrifty with their attention, and sometimes their split-second subconscious decision not to look closely at something gets locked in and not reevaluated. About 70% of the IT support I do for my family goes like this:

Voice from the next room: "Can you help me, dkarl? The computer won't save my document."

Me, comfy in my chair: "What does it say?"

"It's doing something weird."

"Weird how? What does it say?"

"I don't know, I was just saving like I always do, and now there's this window and it wants me to do something."

"What is it asking you to do?"

"It wants me to click something? I don't know, maybe it wants a password or it's not going to do it because of iCloud or something? It's been weird lately."

"Okay. What are the words in the window where it wants you to click something?"

"It says... oh, it says there's another file with this name already."

"Okay, do you want me to come over there and help you find the other file and see what's in it?"

"No, I can do it. Thank you!"

I think one of the reasons I find computers relatively easy is that I compulsively read whatever you put in front of me. I always read cereal boxes when I was a kid, even the non-kid cereal boxes that were all about colon health and fiber. But even I get this blindness sometimes, especially when I'm writing code and running builds and tests and trying to work quickly and efficiently. I'll get hung up on something for ten minutes where the answer is literally spelled out for me in front of my face.

This description rang pretty true for me. My mother is much, much better about it now, but practically all of my tech support with her for a while was little more than stopping her from closing error messages immediately, then just asking her to read them. At times I was almost impressed with how quickly and accurately she could click the Cancel button.

> I think one of the reasons I find computers relatively easy is that I compulsively read whatever you put in front of me.

Telling my mom this is what eventually got her to read error messages instead of closing them. She had to learn not to close them and then, separately, learn to also read them. I'll admit there were times where I was on the verge of crying out of frustration, begging her to "please just read the screen" because she would leave the error, but then still not actually read it and go with:

Me: "What does it say?"

Mom: "I don't know."

So glad things are better, now.

> I think one of the reasons I find computers relatively easy is that I compulsively read whatever you put in front of me. I always read cereal boxes when I was a kid, even the non-kid cereal boxes that were all about colon health and fiber. But even I get this blindness sometimes, especially when I'm writing code and running builds and tests and trying to work quickly and efficiently. I'll get hung up on something for ten minutes where the answer is literally spelled out for me in front of my face.

This is an astute observation.

I also didn't know that other kids read cereal boxes like that... I thought it was just me!

There are likely some generational differences here... I definitely read even the nutritional information on the cereal boxes, but that was because there was nothing else whatsoever to do during breakfast. If I'd had a smartphone, I'd have found something better to read.

I totally did this, and I found reading labels even more fun in Canada because they were also in French, and I didn't know any French!

There's a fair number of them. If there are any parents here with children like that, see if you can get them introduced to local trivia competitions such as Science Bowl: it's an invaluable skill that will put them leagues ahead, since other kids will never be able to replicate those "years of studying".

Another datapoint for developer and also reading all the cereal boxes.

Or even just signs when driving down the highway. I can't help but read everything.

Same here.

It took me the longest time to realize that for some people reading is something that doesn't happen until they choose to do it.

We need some data points from non-developers now.

Engineering student. Compulsive reader of all text within eyesight.

UX designer here who read all the cereal boxes. Still a compulsive reader, and the internet makes this a dangerous trait.

The other day my wife clicks "Save" in word. A folder comes up that she previously saved to. She types in a new file name.

Then says "hey hon, It won't let me save."

I see the error: can't access disk. Immediately I think, well, I replaced the HDD with an SSD, so that's probably not the problem. I look at the folder she's accessing; everything looks normal. It shows a bunch of other files in it. I go up a folder and back into the folder, everything looks great, click Save -> same error again.

Finally I go to My Computer -> and navigate my way all the way back into the same folder, click Save. It works and she says thank you!

And I'm thinking, what the fuck, that shouldn't have fixed it... walking away dumbfounded. No problems since; there's nothing wrong with the disk, and she's been using the computer for days. It was literally a bug in Word's specialized save dialog.

I wonder if it was a bug in the way Windows handles "libraries". Like going to "documents" instead of going to c:\users\blah\documents. In fact I bet that's what it was. Maybe it thinks a USB stick is part of the "Documents" library or something.

Non-English Windows has magic names for the Program Files, Downloads, Documents, etc. folders.

I see a lot of apps using hardcoded paths, since they create a "downloads" folder and save stuff to it.

A lot of bugs are exposed in a non-US-English install.

I mostly run into this with people terrified of pressing the wrong thing and breaking the computer. Might be a little like how the first thing you do when you're learning to, say, kayak is to flip the boat so that you're confident you can recover from the worst case scenario.

I have broken and repaired so many computers in my life that I am not very worried about pressing the wrong thing, because I am confident I can recover from a mistake. Someone who has never repaired their own computer runs an unknown (to them) risk that a mistake would force them to take their computer to an actual store, with the hard drive probably being erased without asking.

Always reading the dialog boxes is usually a very good idea, but I've also seen that strategy backfire.

Specifically, I was once trying to help a CS professor register a new domain name. This was at some low-cost registrar and there were 6 pages of upsells on the way to checkout. For whatever reason, he was definitely taking his time reading the entire pitch, and eventually I had to step in and click 'next' several times. "No, you don't need a new virus scanner as part of your domain name purchase. No you don't need a company to submit your new domain name to search engines. No you don't need SEO services. ..."

Chances are, if you actually read through the terms and conditions before clicking the checkbox, your session will have timed out and you'll have to start over.

> I compulsively read whatever you put in front of me

I'm going through a contract-heavy legal process right now, and it is amazing how strongly the agents/lawyers/etc. involved in the process clearly expect me to just sign without reading. Obviously they're not pressuring me not to read, but the implied expectation of the time/effort it should take is much less than what's actually necessary.

Yep, I always have this with mortgages. People have been pretty good about agreeing to send me the documents the night before so that I can actually read them. But I also at least skim them before signing to make sure they say the same thing they did the night before. I haven't found anything objectionable, so maybe the rational thing to do would actually be to just do what everyone else does and sign based on what the person says they say.

Did you know that hospitals and urgent care providers are now having you sign agreements you don't even see? They say "ok, you need to sign the general release"..."now you need to sign that we can bill your insurance company"... and it's entirely verbal/on faith. You're just signing an electronic pad that displays no information.

Any lawyers around who could discuss how enforceable this kind of thing is(n't)? I'd assume at the least it's treated like a contract of adhesion -- what about how it's handled compared to verbal contracts?

The ones I see explicitly say that your signature acknowledges that you've read the policy. You have to ask to see it before signing. It feels skeevy to me, to be sure, but honestly it's not all that different from signing something you have been given but still haven't read.

Edit to add: also, I've seen this in more places than just health care.

Nope. No verbal or written statement about a written policy being read or existing. I don't know what happens if you demand to see one, but one isn't offered at all. There's no fine print on the display saying you have read anything. It's just verbally "sign for this" and you do. Yet they don't condense it down to one signature.

I complained and was ignored in the same way people are ignored when they complain about the wait in the ER or ask how much treatment will cost.

See I find this an example of a user being lazy, and I experience this frequently. They are not even bothering to read what the computer is saying, just giving up and knowing/assuming that you will figure it out. They don't value your time, and cannot be bothered to expend the slightest effort to solve their own problem.

It is easy for those who are comfortable with computers to read an error message then determine whether we can solve the problem on our own or request support.

For people who are not comfortable with computers, they just see an error message and expect it to be something that they cannot handle.

Which is dumb. 80% of error messages are likely something they are perfectly equipped to handle.

Perhaps, but it is that 20% of error messages that leave people believing that they are unable to handle the other 80%.

Part of the problem is how error messages are presented. A lot of software uses tools like modal dialog boxes to present everything from "a file with that name already exists" to "the file cannot be written". The former can easily be handled. The latter may involve a call to technical support (and part of the reason for that is the ambiguity of the error message).
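A sketch of that distinction (the error kinds, wording, and action labels here are hypothetical, not any real toolkit's API): a recoverable error can offer concrete next actions, while a hard failure should at least say what to try.

```python
# Sketch: map an error kind to a message plus concrete next actions, so a
# recoverable error ("a file with that name exists") reads differently from
# a hard failure ("the file cannot be written"). Hypothetical names/wording.
def present_error(kind, filename):
    if kind == "name_exists":
        return (f'A file named "{filename}" already exists here.',
                ["Replace it", "Save under a new name", "Cancel"])
    if kind == "write_failed":
        return (f'"{filename}" could not be written. The disk may be full, '
                'read-only, or disconnected.',
                ["Retry", "Save somewhere else", "Cancel"])
    return ("Something went wrong.", ["Cancel"])

message, actions = present_error("name_exists", "report.docx")
```

The point isn't the code; it's that the two cases deserve different presentations, where most software funnels both into the same generic modal.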

I always have the same problem with novice developers too --
"I have an error"

What error?

"It won't compile"

Whats the error message?

"Function lib.X not found"

Did you import the library?

"No"

Import it

"It worked"

Most everyone seems to be trained to ignore error messages, regardless of background. Expecting them to go the next step and google the error is a lost cause, too. Even devs with five years of experience sometimes need to be taught this... I'm still not clear why, but it feels like people are being trained to use computers this way -- I just can't figure out when, why, and who's doing this mis-training.

This is likely because many error messages are not helpful. Blame all the developers who just put an error code in their error messages and called it a day.

I'm the same way. Modern UI is often (seemingly) purposely obtuse. It's maddening. It (the UI) doesn't seem to respect people who read the information given to them by actually presenting that information. I shouldn't have to click through a dozen links to get to documentation.

A big frustration of mine is sites/apps that bury their support info. If I need support, I'd like support. Expose your FAQ to me, point me towards some forums, but don't bury the actual contact info somewhere like the footer, or worse, omit it completely. If I type my problem into the supplied form and the chatbot that opens up asks me to type that info again (often after a redirect where the session info gets overwritten so that I can't even copy-paste my already written text), I'm much more likely to be unnecessarily rude to the support person on the other side... assuming it's a person and not the aforementioned chatbot.

You're being overly generous. If some feature that usually works suddenly stops working and now there's some text on the screen instead, and you go all the way to ringing up a family member for help without even reading the text then I struggle to call that anything other than "stupid and lazy".

Don't think for a second that IT professionals are any better.

You have no idea how many times I've seen the RDP certificate warning: "Click here to never show this again."

Nope, no. Nobody but me ticks that checkbox. NOBODY. It's infuriating.

That, and the VMware vSphere first-time tab that "explains what a cluster is". Click here to close and never show again.

Nope. Every time I look over the shoulder of a dedicated, full-time, VMware-certified ESXi cluster administrator... there it is. The certificate warning. The first-time popup. Every time.

I'm one of those people that never checks the "click here to never show this again" box on RDP cert warnings so I'll chime in.

In my mind, I know I really should manually grab a copy of the cert from work so I'm actually certain I'm not being MITM'd, but on the other hand I know the likelihood of that being the case is pretty darn low and I'm lazy so I don't. But I make sure that warning keeps popping up because I don't want to forget that I'm doing something dangerous.

This method is much, much worse for security. It's literally the worst possible thing that you could do.

The easy way is decently secure: if you tick the checkbox, it memorises the certificate serial number for you, the same as if you had manually trusted a self-signed cert. Any change to the certificate will show the warning again. This is comparable to trusting an SSH certificate the first time you connect to a host with Putty.

The best way is to issue RDP Certificates from an Enterprise PKI: They're free, you never get warning popups ever again, and they're secure.

Your method means that if the certificate changes (due to a MitM attack), you won't notice. The warning looks the same with a new certificate as with an old certificate that you didn't choose to trust. You're 100% vulnerable, and you've both hidden the warning for an actual attack and trained yourself to ignore all such warnings.
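The difference between the two behaviours can be sketched as trust-on-first-use (TOFU) pinning, the same model SSH clients use. This is a simplified illustration, not the actual RDP client logic:

```python
# Trust-on-first-use, simplified: remember a host's certificate fingerprint
# the first time you see it, then warn only if it later changes.
pinned = {}  # host -> fingerprint; a real client persists this to disk

def check_cert(host, fingerprint):
    if host not in pinned:
        pinned[host] = fingerprint   # "never show this again": pin it
        return "first-seen: warn once, then trust"
    if pinned[host] != fingerprint:
        return "CHANGED: possible MITM, warn loudly"
    return "ok: matches the pinned certificate"

# Dismissing the warning every time instead (never pinning) means a changed
# certificate looks exactly like the familiar, always-ignored popup.
```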

> The easy way is decently secure: if you tick the checkbox, it memorises the certificate serial number for you, the same as if you had manually trusted a self-signed cert. Any change to the certificate will show the warning again. This is comparable to trusting an SSH certificate the first time you connect to a host with Putty.

Then it should say so.

"Click here to never show this again"—never show for this certificate? For this connection? For any connection?

Be clear and specific, even if it takes three more words.

Thanks. I will rectify this now :)

This is so spot on. My family tech support goes EXACTLY like this. Once I get them to actually articulate the issue I usually just google it and after I help them I send them the google search.

Basically me, when I tell my IT colleagues that if they just run "git status" in 99% of the cases it will tell them what to do if something didn't work "magically"...

> A better model for thinking of users is to realise that they are not stupid, just busy and distracted.

I'd put it differently. Realize that your app is not the center of the world. Your users have things to do, and using your app is a small fraction of their life. Even though it might be your baby and you might be spending all of your waking hours thinking about your app, the user's perspective is different.

It reminds me of teachers' and professors' attitude towards their classes/courses. From their point of view, it's maybe the only thing they teach or one of 2-3 things and it's the general topic they spend all their time on. From a student's perspective it's one of many courses in various topics. They can't be expected to laser focus on just the class that you teach. But many teachers don't get this concept. They think everything revolves around their course only.

That's funny, because the inverse problem exists too. Many students think that teaching is all that professors do. They assume that the prof's whole life revolves around teaching their class, when in fact the bulk of the prof's job is research, supervision and administration.

Or, more bluntly: realize that your app is a tool, not a goal. Tools should be predictable, reliable, and easy to use correctly.

And intuitive! Since apps are like a black box to the user, they shouldn’t do things that make the user hesitate or get confused. With feature creep, most apps become an unmanageable mess.

This is what frustrates the heck out of me when platforms decide to redo their entire user experience, just because they feel the need to "change something".

Thereby forcing all of their users to learn the new UI immediately, not even with due warning.

One or two platforms doing this in a week or so is a pain. But when lots of platforms do it in a short period of time, it can be a "throw your hands up in the air and do something else" moment. :(

Maybe the user is like a person walking down the street and your app is your kid with girl scout cookies boxes it's trying to sell, and you're helping it out. :) And it's not about the money as much as the app growing up, or something.

>A better model for thinking of users is to realise that they are not stupid, just busy and distracted

but this is functionally the same thing. whether the user is an idiot giving you 100% of their mental capacity, or a genius giving you 1%, you have to make the same decisions when you're designing a product. "Users are stupid" isn't necessarily a judgemental thing meant to demean your users, just a reality that you have to accommodate the people who are going to use your product in a way that's indistinguishable from a stupid person.

You can't assume that anybody will remember anything they did in a previous step, or that anybody will correctly interpret the label on a button no matter how clearly you think it is worded, or when presented with multiple options will be able to correctly choose the one they want.

The product's design is the tool you use to distinguish.

Say that, unbeknownst to you, 90% of your users are non-native English speakers. Is a non-native English speaker "stupid" because they misinterpret the simple English idioms, metaphors, and symbology sprinkled throughout your product?

A group of actually-stupid-but-native English speakers might exhibit the same behavior.

You hypothesize that you have a large percentage of non-native English speakers using your app and change the design to accommodate. Behavior improves, distinguishing the bulk of your users from merely "stupid" English speakers.

The problem is that if you conceptualize your users as "stupid" the hypothesis that something else might be going on _never even enters your head_. You never do the experiment. Or, if you do, it's not faithful to an alternative model and is basically a wild stab in the dark.

This is a really good point. “Stupid” is a fun and dramatic framing (I will sometimes use “drunk user” as the equivalent), but it’s also reductionist and less useful for specific design decisions. Thinking deeply about why they are operating at less than peak efficiency helps you make decisions that accommodate those reasons.

Or sleep deprived, to be less judgemental.

Imagine your user was not able to sleep on a Trans-Pacific flight in bad weather, they have hay fever and a very sore neck.

Better yet, get user studies.

> whether the user is an idiot giving you 100% of their mental capacity, or a genius giving you 1%, you have to make the same decisions when you're designing a product


An idiot giving you 100% will never learn your product.

A genius giving you 1% will eventually develop a highly optimized way to use your product.

If we assume all users are the former, we never design for the latter. That pretty much sums up the last 20 years of de facto UX.

And if you disagree, I'd recommend you go find a call center still using mainframe apps, and watch a random sample of 100 users.

Completely agree and your call center example is dead on. I work in telephony and our feedback from clients is always to simplify displays and reports so that they can manage the call flows better.

It was hilarious watching some of the reactions in our team when a client gave us a screenshot that was just boxes with numbers in them arranged neatly with no more than red/yellow/green for colors. Some saw ugly... I saw enhanced productivity.

Personally I think more UI/UX designers need to spend time doing data entry with time constraints so they learn the pains of repeated useless actions and time wasting features.


perhaps, if the user is a continual user. but most users of most products are infrequent, and if your UI relies on being learned over time, it's probably not going to work for most people.

That only applies to... niche stores and food ordering, maybe.

If you're working on SaaS that is meant to help someone make money or work faster, there's a good chance you'll have people using your software as a part of their dayjob - i.e. 2-8 hours a day, 5 days a week, for years. If you treat your users as idiots and don't implement space for them to grow (i.e. power user features), you'll be wasting a lot of life for a lot of people (and a lot of money of their bosses).

That's a heuristic I'd like to see being used in UX design: assume your software becomes a tool in some company, and you have full-time users spending their workday on it. Design to that group.

"Not going to work for most people" in this context feels more like "going to impact my growth metrics."

Which, yes, probably. But there are more goals in software development than hyper growth.

Particularly if your growth metric is the number of registered users, silently ignoring those who didn't delete their account (almost nobody ever does). Following such a metric prioritizes first-use experience (i.e. baiting registrations with UI glitter) over actual utility. Which is what I suspect happens a lot today (along with its related problem, optimizing for sale over continued use).

What you are talking about is consumer entertainment. Even utility apps like Google Maps are something you have to learn, and yes, I'm tired of all the "What's new! Don't do anything until you click me!" garbage. I need the app to open, do what I want, and then close. Thank you.

It’s qualitatively different in that one assumes that your product deserves 100% of anyone’s attention and the other doesn’t.

In most situations most of the time, your code is between someone and something they want. Your main importance in their life is when you fuck it up, not when you get it right. You just aren’t that important. The user is.

Which is partly why we need books championing good design. Because good design gets out of the way, and would go mostly unnoticed if not for other designers and aficionados giving them kudos.

> whether the user is an idiot giving you 100% of their mental capacity, or a genius giving you 1%, you have to make the same decisions when you're designing a product.

I disagree that you have to make the same decisions. I think you need to design your product contextually. Sometimes, users might be giving you 100% of their capacity, sometimes they will only give you 1%. It depends on context and motivation. I'm working on a personal finance app. Sometimes, users are in a "I just want to make sure that latest transaction isn't fraud"-mode and you'll have seconds of attention. But sometimes (albeit much more rarely), users are in a "I want to sit down and think about my financial future mode".

> "Users are stupid" isn't necessarily a judgemental thing...

I hope not, but unfortunately, I've often found that it is.

I think the idea is about developer attitudes. I've heard variations of "the user is dumb" too many times to count in the last 15 years.

People who think about users as idiots dismiss their issues. People who think about them as distracted change app design.

At least in my experience, the two ways of framing users lead to different behavior of development team.

You really can't bucket your users on a single mental model. It depends entirely on application and usage patterns. Daily users will become experts pretty quickly and look for fast, simple design. Doubly so if it's a line of business application with a captive audience (like a POS interface). Something like my homeowner insurance is a site I log in to like once a year. I'm not ever going to even remember my password let alone how to do anything without giant, colorful buttons leading me along. This is why UX is a profession.

The seminal work on interaction design isn't Don't Make Me Think, it's About Face. About Face states the general rule that you should not design for beginners or experts but for the perpetual intermediate.

I can see how this breeds resentment in some contexts, though.

A developer might spend a huge amount of time and effort modeling a workflow or a process into something crisp enough for a machine to process. That might take months of what feels like pulling teeth from people to extract requirements - people who are supposed to be domain experts. By the time he's done, he probably understands everything better than the people whose job it is to use the tool - the tool that now enables them to ignore and forget about parts of their job and complete it with 10% of their attention.

"Stupid users - they barely understand their own job and I spent months learning the finest details about it to build them a tool so they can keep doing it without understanding it at all."

Not saying it's right, but I can see how it happens.

> Users have tasks to concentrate on.

Right, we also need to remember that simple is not better than possible.

If your interface is so simple and feature-limited that your user cannot complete their task, then you have gone too far.

We want to reduce friction as much as possible while still allowing a customer to solve their problem.

And then we should reduce the friction further to allow the user to solve their problem as fast as possible. That means batch-processing and some other complex features. This is because it's very likely that a chunk of your users will not be solving a problem once, but will be solving 200 identical problems a day, 5 days a week, for a year.

> simple is not better than possible.

Dang, that's just the most perfectly succinct phrasing of (an answer to) my pet hate in software/developer attitude/life in general.

But it also seems that the 'possible' facet is just nowhere near as financially lucrative as 'simple'.

It depends. Mostly on what kind of users you might have.

If the underlying model is very powerful but so complicated that it's very difficult to convey to anyone, is it of any use? "Financially lucrative" in that case is simply a proxy for "can I get anyone to understand what is even going on?"

I've built an online pub quiz, and I'm doing as much hallway usability testing (in the Zoom hallway) as I can. Early on, every single one ended up with me changing things from being more flexible and powerful to being simpler. Either the underlying model, or just what is being presented/allowed in the user interface at any given time.

There is just no point in trying to provide lots of options instead of a well thought out middle-of-the-road path of obvious next steps if no-one will be able to get around to doing what they came for. People can't become power users if they aren't users.

UX design is so hard, but luckily it's also very interesting :)

>> Users aren't thinking about your app when they use it.

Yep. It's probably over used as an example, but I always come back to pinch-zoom. People don't care about your UI, the best you can let them do is directly manipulate their data. Anything else is a compromise. That compromise is always going to be there, but we need to minimize it.

MS Office has turned into a sort of industrial machine with a control panel. You can do anything to your data if you know what sequence of buttons to push to tell the machine what to do. Actually that peaked several years ago and they have been getting better about more direct access to features. OTOH they also have so much functionality that such UI will never go away completely.

On a related tangent, I feel like tools for programmers suffer even more from this. Because we're used to editing text files to get things done, the notion of editing a config file or writing a script to get something to do what you want doesn't seem that unreasonable to software developers. Maybe it should.

Programming tools should work this way because we write code; config files are effectively code that doesn't execute (or sometimes does.) A lot of the config files we use (think stuff like: gitignore, eslint/babel configuration, Dockerfiles, package.toml, etc.) we version into source control so that we can track changes over time. I can't think of a good UI that would make editing these files better; VSCode probably has done the best by providing autocompletion in YAML and JSON files using JSONSchema.

Text files aren't perfect, but they're hard to beat. It's not like the hardest part of programming is actually syntax, anyways. You can always improve on this by providing better assisted editing.

If there's anything I'd actually want it would be for software to have pluggable configuration so I can instruct it to pull its configuration from a database or something along those lines.

edit: and on that note, it’s probably the command line interface that actually would strongly benefit from some consideration. CLI arguments are currently a flat unstructured list... but almost everything at least has key value pairs.
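A toy illustration of that last point (the `--key=value` convention here is hypothetical, not a real CLI spec): even a flat argv list is usually key/value pairs in disguise.

```python
def parse_kv(argv):
    """Parse a flat argv-style list into a {key: value} dict.

    Assumed convention: "--key=value" pairs plus bare "--flag"s.
    """
    parsed = {}
    for arg in argv:
        key, sep, value = arg.lstrip("-").partition("=")
        parsed[key] = value if sep else True  # bare flag -> True
    return parsed

config = parse_kv(["--host=db.local", "--port=5432", "--verbose"])
# config == {"host": "db.local", "port": "5432", "verbose": True}
```

Most tools reinvent something like this ad hoc, which is exactly the "flat unstructured list" complaint above.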

> I can't think of a good UI that would make editing these files better

Alternative views into those same text files would be nice. You could color-code the type of a snippet (yaml string/float issues) or nested blocks. You could offer a graph view of dependencies (context-dependent -- could be services, function calls, modules, docker base images, etc...).

Text files are nice (and I've yet to see anything that would be a good replacement for most uses), but that doesn't mean we can't offer extra tooling/views into that data when it would be more convenient.

Now tell us what graphical interface developers spend their time thinking about.

Are they real-world problems like the examples given above?

What is more important?

Someone once said the best interface is no interface.

If I do not have to think about an "interface", if the program is doing its job without requiring interaction, then that is more time I have to focus on what is important.

Not all software can be like this but a lot can. As the article indicates, there is an enormous amount of "forced interaction" in today's computers. This is in part because companies that employ interface developers rely on the online advertising business to make money.

That person in MSWord isn't thinking about the ribbon bar

I think about it every time I'm forced to use it. I think about how much I hate it, how constantly clicking between tabs slows me down, and how much more productive I was without it.

For mainly this reason, I still use Word 2003 and have no plans to switch until the fad passes and someone in UI at Microsoft finally stops designing for the lowest common denominator and remembers the point of toolbars wasn't to replace menus, but rather to put commonly used functions one click away.

I would think using Office 2003 on an internet-connected machine would be a bit of a liability these days...

Third party tools like 0patch help. Obviously so do practices like sensible firewall settings and not opening attachments you don't trust.

I have the latest version of Office installed for when it's needed, I simply prefer doing my serious Word work in '03.

I don't think I'd hold up any version of Office as the paradigm of "internet safe" software.

busy, distracted, obliged and not taught

a recipe for rote disaster

I'm a dev and I can't bring myself to cringe at a user's failure to operate a piece of software.

Even those with a solid enough mind and education will have issues. A French teacher told me she couldn't bear the improper use of verbs and nouns in her daily usage. It's a gigantic mess.

> realise that they are not stupid, just busy and distracted.

Though from a practical PoV those two states are often hard to distinguish. Very similar hand-holding is required in each case.

> Users have tasks to concentrate on. They don't have the attention left over to deal with your app's crappy UI.

This is very true. We provide accountability/competence/compliance management software for regulated industries (mainly savings and investment banking ATM) and for most of our users touching our software is a secondary task linked to their main roles. Even the day-to-day admins and UAT testers for new releases are seconded from other areas or, worse, are effectively asked to use our software on top of their usual roles (an extra task given with no extra time assigned and no other tasks paused).

A good UI is the one that the user doesn't notice. That means it has to be intuitive and consistent, and shouldn't ever distract the user.

Distractions include sudden popups that aren't the result of an immediately preceding interaction (looking at you, iOS). And those login forms. No one ever wants to see a login form for something they've already logged into from this browser/device.

Also, it might seem silly to point out, but use your goddamn product yourself the way your users would. That alone really helps with many UI/UX issues.

Years ago I was switching my father from DOS to Windows. He was resisting learning about the computer. Not because he was stupid, he has a PhD from Harvard. He was busy trying to consolidate his understanding of 1000 years of English literature and move the field forward. He needed the computer to write with, and was willing to learn how to do that. Everything else was a distraction.

The article talks about respecting users but at the same time describes using Facebook as "mindlessly scrolling through feeds of what can most easily be described as garbage content". There is a big disconnect here. If people are actually smart and able to make their own decisions, maybe there is something more to Facebook than "mindless scrolling"?

I'm taking the "minimum cognition principle"... the less time required to make a decision, the better. It's like riding a bike: if you had to think about it you might get in an accident.

> Users have tasks to concentrate on. They don't have the attention left over to deal with your app's crappy UI.

Therefore, it shouldn't distract from the task at hand.

I get what the author is trying to say, and to a large extent I agree with it. But...

I have to admit the first thing that came into my head after reading the title was “well actually, some of them are, dude”. There’s a reason why no-one really wants to work the hell-desk and act as the front line for user support. There’s a reason why websites like ‘Not always right’ exist. There’s a reason why Reddit has /r/TalesFromTheFrontDesk...

I think it’s as much a disservice to assume that all users are enlightened angelic creatures, who only need that little pointer to go their own happy way, as it is to assume they’re all lazy and stupid. Sure, start off with the helpful approach, but shutting down people who are only there to scam a deal or make themselves a nuisance would go a long way to curbing that behaviour.

Stupid, on the other hand, is something that needs to be coped with well - and good design can certainly help. Stupid needs the hand-holding, because there’s nothing the client can do to help it. “Stupid” is also often just unfamiliarity, so good hand-holding will prevent a repeat of the situation.

Lazy/entitled I have no truck with. If you’re not willing to help yourself, I’m sure as hell not going to do it for you. I’d never make it past the first day in customer support...

Yes, users are fucking dumb. I used to work help desk for a number of years, and the stories I could tell...

Though, I sorta object to treating them all like they're stupid. There are stupid users, and then there are users who are stupid because you treat them like they are. Let me explain.

When I worked in healthcare I was pretty stupid. It wasn't that I didn't understand the mechanics behind it; it just didn't click with me because the training I had was deplorable. I was regularly yelled at for making mistakes and didn't have a very supportive environment. My haste to do everything quickly without making mistakes actually caused me to either overthink or underthink certain steps due to anxiety and nervousness. In other words, when someone started treating me like I was stupid, I was more prone to acting like I was stupid...

So when I worked at help desk, no matter how stupid a user was, I always did my best to treat them as kindly as possible. In the end what wound up happening is that I got much better results than my coworkers who wouldn't treat them respectfully. That wasn't even my super weapon, though. It was anticipation of future problems...

I always really liked Roman Mars's perspective on design: good design is effortless. When you go through a door that has a handle on it and you pull that handle, you feel stupid when you see that it says push only. So in my own way I am always trying to view the things I do in IT, whether programming or help desk, from a user-centric perspective. If I think ahead of time about what sort of potential issue someone would have, and I can change something to prevent it, then I get much better results.

This is something I see being a problem all up and down the entire stack in IT. It is time to stop treating our users like idiots. But it’s also time to start returning to idiot proof design.

Computers can be really frustrating to use, the biggest offender in my opinion is hidden state. So many programs don't mention something has changed, and so if you fumble a click or a shortcut you can end up in a program state you didn't want to be in and may not even realise it until something goes wrong.

You wouldn't accidentally hit a switch in your car and find you can now only turn left, with no indication of why and no transitional information given during the switch. Yet programs and websites will happily modify their state without so much as a brief text flash saying "Oh hey, you're in Paint Only Mode now", for example.

My absolute favorite way to deal with this is small status indicators in some kind of toolbar, that when I hover over it, it gives me a tooltip on what it is and how to toggle it.

Accidentally hit debug mode? No worries, a little debug symbol just flashed briefly, I can go look for it in the toolbar and hover over it to figure it out.

> Computers can be really frustrating to use, the biggest offender in my opinion is hidden state. So many programs don't mention something has changed, and so if you fumble a click or a shortcut you can end up in a program state you didn't want to be in and may not even realise it until something goes wrong.

There is a high degree of irony to this when you think about the extremely popular Unix philosophy, which is kind of the antithesis to what you're saying.

I know that it's meant for power users, but even for those in many cases it's still bad UX.

You mean the "silence is golden" rule?

If so, that's completely different. It just states that when the user changes something and the thing actually changes the way the user expects it to, you should not interrupt the UI flow. It has no relation at all to hidden state.

State on a classical Unix system is pretty much entirely explicit.

Isn't Unix philosophy the one about small, focused tools that compose well? It fits the GP's view, since the composition is meant to be user-driven, explicit and visible.

(Also composability usually implies lack of hidden state - in Unix-land, state issue may just come from the baggage of C language.)

Yeah, but all those tools by default don't print anything if they run successfully, which is indistinguishable from silent failure.

If you look at all modern tools, this approach has been abandoned, especially for long running operations, since it's obviously bad UX.

You mean like grep not returning anything when it searched for something and didn't find it?
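Though grep's silence isn't quite indistinguishable from failure: its exit status differs per outcome (0 = match, 1 = no match, greater than 1 = error, per POSIX). A quick sketch, assuming a POSIX-style grep is on the PATH (`grep_status` is a made-up helper name):

```python
import subprocess
import tempfile

def grep_status(pattern: str, path: str) -> int:
    """Run grep quietly and return only its exit status."""
    result = subprocess.run(["grep", "-q", pattern, path],
                            capture_output=True)
    return result.returncode

# Set up a small file to search.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write("hello world\n")
    path = f.name

print(grep_status("hello", path))           # 0: match found (silently)
print(grep_status("absent", path))          # 1: no match (also silent)
print(grep_status("hello", "/no/such/f"))   # >1: an error occurred (missing file)
```

So a careful caller can tell the three cases apart, but only if it remembers to check the return code; nothing in the output itself distinguishes them.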

It is more about a lack of feedback than a hidden state. It is a problem but not the same one. Talking about Unix tools, the hidden state would be more like environment variables like the PATH or locale.

Environmental variables are essentially dynamic binding at process level. Super useful, but I wish there was a way for programs to declare what env variables they're using, in a way that can be inspected programmatically.

Related: cmdline arguments are essentially lexical binding at process level. I wish there was an idiom there like there is in Lisps (Common Lisp in particular), where a function would take an optional argument whose default value is the value of a dynamic variable. In process terms, that would be declaratively specifying that an optional cmdline parameter takes its default from an environment variable.
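That idiom can at least be approximated by convention at the process level. A minimal sketch in Python, where the flag name, the MYAPP_LOG_LEVEL variable, and the fallback value are all hypothetical:

```python
import argparse
import os

# An optional command-line parameter whose default is read from an
# environment variable -- the process-level analogue of a Lisp optional
# argument defaulting to the value of a dynamic variable.
parser = argparse.ArgumentParser()
parser.add_argument(
    "--log-level",
    default=os.environ.get("MYAPP_LOG_LEVEL", "info"),
    help="defaults to $MYAPP_LOG_LEVEL, then to 'info'",
)

print(parser.parse_args(["--log-level", "debug"]).log_level)  # explicit flag wins
print(parser.parse_args([]).log_level)  # env value if set, else 'info'
```

The difference from a real language-level idiom is that nothing here is declarative: a caller can't inspect the process and discover that `--log-level` falls back to that environment variable.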

Or just some kind of generic way to advertise where they're reading configuration. The worst are programs that search 20 places for configuration, so you have no idea whether you've turned some feature on or off.

What I really want is a generic system of algebraic effects for processes so "read configuration" becomes something I can inspect or wrap with my own handler.

> which is indistinguishable from silent failure

That is why the same tools do not have silent failures. And OS failures aren't silent.

> If you look at all modern tools, this approach has been abandoned

Gladly, you are wrong. Modern tools have indeed added optional noisy modes, nearly all opt-in.

> That is why the same tools do not have silent failures. And OS failures aren't silent.

Thankfully, there are no bugs in our software :-)

> Gladly, you are wrong. Modern tools have indeed added optional noisy modes, nearly all opt-in.

Almost everything network facing (which is almost everything, these days), is verbose by default. Git, curl, etc.

> Thankfully, there are no bugs in our software :-)

Yep. Bugs are why OS failures are not silent.

But you've got a point. Network facing software need some progress confirmation.

Right. That I agree with.

I agree with a lot of things you say, but I think that “lack domain knowledge” is a better description than “stupid”.

There are many people who have little interest in the underlying “how” and “why” of computers. They haven’t been educated; more importantly, they don’t want to be educated. They want to focus their time and energy on (what they regard as) more important things.

I have two bright teenage girls who need to use computers every day to get school work done. But their focus is art and reading and writing. Their eyes glaze over whenever I try to explain why their computer is behaving the way it is.

My eldest did an amazing project in just a couple of hours the first time she ever used Photoshop: she imperceptibly removed a person from an old scanned image, realigned it to be upright after a crooked scan, and recolored it to look present-day without oversaturating it.

Most of this she learned from in-app tutorials. But I had to show her how to save the file and explain file formats and the difference between cloud and local storage. She just didn’t care about that part any more than she needed to in order to turn the assignment in.

But how to use Photoshop is like another form of sketching or painting, so she ate those tutorials up like chiclets.


Yet how often do you ask a waiter about the challenges of scheduling the maintenance of a commercial refrigerator?

> Yes, users are fucking dumb. I used to work help desk for a number of years, and the stories I could tell...

That sounds like textbook selection bias; users will usually end up at helpdesk only when they're out of their depth.

I see your point, but I dealt with users far more often than just whenever they needed tech support.

I provisioned all the computers for the company I worked for. I also upgraded any software, usually manually for whatever reason, and dealt with a lot of the day-to-day maintenance.

Like I said, users are not universally stupid, but when they are they can be especially stupid. A lot of it could be mitigated with better design. A lot of it could be mitigated by critical thinking on behalf of the users.

Exactly. Users rarely if ever go to tech support to show off things they figured out themselves.

Tech support pros saying users are all dumb is like brain surgeons saying everybody has brain damage.

> when I worked at help desk no matter how stupid a user was I always did my best to treat them as kindly as possible.

This resonates with me because it's something I try to do all the time.

The XKCD "Ten Thousand" comic[0] really spoke to me and ever since reading it I've tried to treat people as 'temporarily inconvenienced geniuses' rather than morons when they ask me for help and it seems to work wonders.

If you enter requests for help with the mindset of "let me show you how to solve your problem" and not "let me solve your problem", you end up speaking to people as equals, which always seems to provoke better responses.

Of course the reward for being good at digging holes is more holes and to coin a malaphor if you teach a man to fish he'll come annoy you every time his boat won't start, but it's also pretty nice to be the person everyone assumes can fix anything.

(Also it's a great outlet for the typically annoying desire to solve everyone's problems)

[0]: https://xkcd.com/1053/

One of the things I have discovered is that treating people in this way can be a bit of a double-edged sword, though. When I was working at my last company I was on a team of three, and I wound up being the guy who dealt with all of the difficult users. That was fine by me, because I had previously worked in a group home with individuals with developmental disabilities and emotional problems, so a difficult user to me was just whatever. But... I can see how it might negatively affect someone else. lol

But! On the other hand, if I ever needed something it wasn't that hard to get someone to help me. Kindness in my experience has always multiplied. There is, as you said, always something that someone doesn't know, including yourself. So there was always something that the users were willing to teach me in return.

PS. I love this comic. When I am finally done going to school maybe I'll hang that in my cubicle or something. I estimate with all of the programming I have done for college, and all of my personal time on side projects that when I am finally ready to get back into IT again I'll end up being stuck in help desk again, anyway...

> So there was always something that the users were willing to teach me in return.

This is very true and why I love mentoring people. Once you have shared knowledge with someone, they usually want to return the favour and you can learn a lot from that.

> I love this comic. When I am finally done going to school maybe I'll hang that in my cubicle or something.

Nice! Prior to that comic I had Hanlon's razor in a frame above my desk which uh, while a good sentiment isn't the most positive mindset.

Author here. I actually agree.

Cheesy (but true) anecdote: As I was writing my article this morning, there were two distractions. First, I tabbed to Facebook where I saw one of my "friends" had posted something political that was both factually incorrect and also very offensive. It reminded me of All the Bad Things, and made me doubt my argument, to the point of almost just scrapping the article. Not just because that FB post was an example of someone being stupid and lazy, but because I had mindlessly interrupted myself and tabbed to Facebook and was about to engage with that post.

My second distraction was my 2-year-old toddler, who had just woken up and was tugging at my sleeve. And I thought about the world I want him to grow up in and the view I want him to have of it.

I know I can come across as idealistic (hell, I've even gotten that as formal feedback in a performance review—but I've also gotten feedback that I'm cynical, so shrug). But honestly, one of the wonders of us as humans is that we can carry all these different capacities. We can be smart and motivated and social. We can be stupid and lazy and anti-social. All those capacities are worthy of admiration, not contempt. Technology is a great tool, we can decide where we focus it.

I agree completely. If we pander to the lowest common denominator we only serve to lower that denominator even further. A “race to the bottom” where the bottom is never-ending.

We should instead approach issues in our lives with a “how the world should be” attitude, instead of “how do I make gains out of the world that is”. This means treating people with respect and courtesy even when we believe they are wrong. We work with them to achieve a better result, because lack of knowledge is not equal to stupidity and laziness in all cases. For instance, I do not have a medical degree, but I appreciate not having doctors talk down to me. Work with me, help fill in the gaps, don’t pander with hand-wavy explanations. If it requires too much time in the moment, then point me to where I can independently find the information and study on my own.

In other words, live by the “means” not by the “ends”.

Ah, a fellow idealist :)

I really respect this ideal.

All too often I've found people dealing with external users become calloused. We forget that questions with obvious answers (to us) may not be obvious others and that this doesn't imply that others are stupid.

I feel that it's our responsibility to design things to be as obvious to as many people as possible. I also think a default attitude of respecting the users of our products can go a long way to making this happen.

I have a bias toward designing user interfaces with simple composable mechanisms (that are easy to understand and compose), rather than single-purpose actions.

Yes, sometimes the action is really frequent or it's part of the main workflow, and having a shortcut for it is good, but I think one often ends up with a more resilient tool by allowing the user more control so they can recover from mistakes and exceptional cases you didn't think of.

A recent example is an app I'm working on that involves helping the user work through a list (they do a task for each item, then move to the next one, and so on). It's a high priority for the user to get through the whole list efficiently, so the PM wanted us to have only a "next" button because showing other buttons "might be confusing." The thinking was that it would streamline the workflow if the UI is minimal: do the task, click "next", do the task, click "next", etc.

This is the kind of situation where I really want to give the user more control — if we provide "next" and "previous" buttons, they can go back if they made a mistake, or check one of the earlier items and then return to what they were doing, look ahead to see what's coming up and plan for it, etc.

They may also be in a rush, and have more important things to do. That can appear to be lazy or stupid, but it isn't really.

I have that attitude towards things like expense filing software. It's a nuisance that I don't want to spend time learning, and it doesn't come up often enough to learn it well. So my click stream probably looks stupid or lazy.

I feel the same. An app shouldn’t be too complicated especially if you used different similar apps in the past and can sense the complication is unnecessary. Why spend energy and effort on something that is stupidly designed and which would probably be a one time use anyway?

I think you're looking at it wrong.

Have you ever taught someone to drive? They have their attention completely occupied by ... everything at once.

After a while you can see the pattern. "turn here. right here. WAIT TURN HERE!" "here?" "no, back there."

It's not that they're stupid, it's that you have the pattern ingrained in you and you don't remember what it's like to experience something for the first time.

Now yes, there are idiots - in terms of social interactions - they lack maturity or patience or speak too quickly (or are having a bad day). But don't lose compassion for 99 people for the one unenlightened person.

Yep. Not everyone comes from the same baseline. Different biases, experiences, knowledge, affinity for a topic, capacity to understand something at a certain speed, etc.

Tangentially related, I’ve noticed some online products almost purposefully make their websites less appealing to filter out certain types of users. I think landing page copy call it “speaking to your audience,” but it’s the same thing where you’re selecting for the kind of customer you want.

I recently went live with a project that does Okta integration with MFA. The most common support ticket is people complaining they didn't get their MFA code. Root cause: they never clicked the "send code" button.

It's not clear whether you intend to blame the user or the tool, but my inclination, absent any other information, would be to blame the tool.

Maybe they clicked forward from a previous screen and were presented with an active text field and the "send code" button, but didn't recognize it as a button, or misunderstood its purpose, expecting the code to be delivered automatically, or thinking that it meant "send code from the user to the website" (i.e., after they have typed it in).

Whatever the details, a bunch of users have an expectation that doesn't match the tool's reality. It's the job of the designer to recognize or discover that gap and find ways to remedy it.

A user submitting a support ticket is rather like your code throwing an exception. If just one pops up in Sentry you might add a note to look at it later, but if it happens constantly you need to go fix something.

If one user submits a support ticket you can count on there being ten others that had the same problem but didn't bother.

I filled out a custom job application on a software company’s site yesterday. When I finished I couldn’t figure out how to submit the form. I must have stared at it for minutes, not wanting to abandon my work, until I scrolled down and the submit button popped into view. On my large-screen monitor.

While I’m sure your “send code” button was visible, apparently that’s not what they expected. Maybe they expected “Send MFA code”, or “Text me my code”. Maybe the terms you are using don’t match up well with their limited understanding of the domain problem.

Sounds like a problem A/B testing or simple user interviews could help with.

It's your fault because of poor naming; it should be "get code". They aren't the ones sending you something.

If the primary goals of your product's design are engagement and ad revenue, of course you're going to chafe at the fact that users don't want to sit comatose in front of screens all day making you money in whichever way you've found is the most efficient. And, of course, when you notice this sort of ~~disengagement~~ behavior, you're going to try and figure out how to trick the user into doing what you want.

These days it seems just as likely that some other EvilCorp™ with more money than you has done a better job hijacking your potential users' attentions as that your users really do have something better to do. But either way, I think the "focus on the user" kind of mantras painted on the walls of so many tech offices have been twisted into something horribly exploitative or beaten into irrelevance by rote repetition without thought. So many "products" being "sold" to users these days aren't really products at all, in the sense that if they do offer the user something of real value it's almost by accident. There is no value, absolutely none, placed on building a quality product for its own sake.

If you're all tangled up in this mess and "users are stupid and lazy" is what lets you sleep at night, I don't blame you.

As someone with ADHD, I really struggle with this.

I spend a lot of time turning off features that were built to distract me. Notification badges, unread item counts, related content, news feeds put in places where they don't belong etc.

Some simple tasks are made deliberately difficult by this. For example, why is it so hard to get just the weather, without any bullshit?

FWIW, if there is a government website for the information you're after, it is likely to have way less bullshit than the ad-supported alternatives. In the US, weather.gov is far from the worst place to get weather information.

We could just go all the way in and say "users are not sentient [therefore we can exploit them to our heart's content]".

Related is my issue with most "data-driven" companies these days. Extensive telemetry and A/B testing isn't "being user centered", it's just the feedback leg of the control system over your users that you're designing to mine them for all they're worth.

A favorite quote of mine from The Humane Interface:

An interface is humane if it is responsive to human needs and considerate of human frailties.

There is a temptation as a designer/programmer to present the world with MY PERFECT VISION of what a piece of software should be like. Nothing wrong with that, as far as experimenting with and presenting new ideas. But a mindset that produces better software is one of serving the needs of the users. Of course there is an entire profession in identifying the "needs" and studying the "users". But the mindset: "we build tools that enables users", is the first step.

Far more insidious and destructive to the general computing experience has been the rise of businesses whose entire purpose is to produce software that exploits its users. I don't know the solution to this phenomenon, but I find it repulsive and strive to avoid such software as much as I'm able. Raskin's dedication for the above book:

We are oppressed by our electronic servants. This book is dedicated to our liberation.

Might have been tongue in cheek then but feels disturbingly serious now.

I’m going to be honest. I have many times said, “users are stupid, if we don’t make this obvious they won’t get it”.

Now. Am I really saying that the users are stupid? Of course not. Saying that a user is stupid is just a way of asserting that you really have to design as if users were highly incapable of inferring complex mechanics or interactions, because the reality is that there’s always a subset of users who legitimately won’t get it.

I get what the author is saying though. Referring to a user as stupid is derogatory, but when I have personally used that phrasing I always include myself as part of that group of users. I don’t know anyone that uses it differently.

It’s us collectively as software users who are stupid. It’s not a direct insult to a specific user or cohort of users. It’s just a token to assert that your software should be “stupidly” easy to use.

If someone has used this phrasing beyond its rhetorical meaning, then they don’t have the empathy to design software. Simple.

I just believe that nobody uses this phrasing to personally insult another person, and that we shouldn’t be focusing on such small trivialities. Designing software has more complex ethical implications than trying to poke holes in your communication style.

Again. Most people who use this phrasing do it to assert a point. I don’t think I have ever seen this phrasing used in anything formal. It’s just a desk phrase to say that a particular thing is too complex. It seems really unnecessary to even argue about this, honestly.

> If someone has used this phrasing beyond its rhetorical meaning, then they don’t have the empathy to design software. Simple.

We're several generations into the field of designing user interfaces. As with all other saws and pieces of folk wisdom, when you learn an adage in the context of the knowledge and experiences it summarizes, you understand it's just a shorthand, a helpful reference in your mind. But if you learn the adage without that context, you derive your own meaning that's not necessarily the intended one (or you just dismiss it as trite).

It's a problem as old as human societies: you have to live through the relevant situations to understand a piece of received wisdom. There is no known solution, but it still helps if an adage is phrased in such a way as to minimize misunderstanding by people who hear it without the experience to understand it. "Users are stupid" fails that.

> Am I really saying that they users are stupid? Of course no.

> If someone has used this phrasing beyond it’s rhetorical meaning, then they don’t have the empathy to design software. Simple.

If you assume that engineers are also (rhetorically) stupid, then if they regularly don't realise that it's a metaphor, does that not mean we need to change the way we communicate to help them understand the system of design?

Exactly. It can be harmful to your culture and the perspective of people on the team even if it's implied to be a rhetorical device.

Regarding the example of people mindlessly scrolling: this is also in part a result of the objectives/metrics the company measures. When a company like Facebook over-optimizes for likes and shares, as opposed to, say, depth of interpersonal connections or something else, the product evolves to treat users as if they have no personality beyond engagement with the feed. That happens even if no explicit judgement is passed on the users. Source: I used to work on the algorithmic news feed of a social media company, not FB.

If you optimize for sheep, you get sheep

I designed software for quantum computing researchers. They were clearly not stupid people, and the design was completely tailored to their needs, but in early iterations of the design, they still struggled with a lot of the same things users of an ordinary website would have struggled with: trouble orienting, trouble finding the primary actions in a screen, etc. It wasn't a catastrophe, we just observed them and fixed the problems.

But, I realized that I'd taken for granted that they were "expert users", not dummies, and that they would easily figure this stuff out. Heck, they might even appreciate not having their hands held! Unsafe assumption.

From this experience, I took away the lesson that users may not be stupid, but you won't go wrong pretending they are and taking the same precautions you would if they were.

I noticed an interesting dichotomy watching people use software I worked on:

- When engineers design software, we design it with the mindset that people like using software and will appreciate all the cool features.

- When non-engineers use software, they do it because they need to use it, not because they like it. They have other stuff to do. If anything it's an obstacle to what they need to do, so they interact with it as little as possible to do what they need.

So, imo that line about respect for your users was spot-on. Don't get in your user's way more than you have to.

While I am a software engineer, I don't particularly identify with your self-ascribed and generalised engineers' view. Maybe I'm different because I've previously manned help desks. Maybe I'm more neurotic than you are.

I always design from a perspective of having experienced a lot of truly horrible software, from websites that require spelunking to find the nugget you came to find to HR time-tracking applications that don't even allow you to define a standard week.

While I do assume people like using software and all the cool features, that stance takes a backseat to thinking about how to:

* fit flows to what they actually want to achieve

* minimise the amount of data entry and clicks

* produce meaningful messages (errors, tooltips and descriptive text)

* not overwhelm the user with features irrelevant to the current view/flow.

I'm by no means particularly knowledgeable in "formal" UX, but I find focusing on usability does make users appreciate your software more. YMMV

I think you're interpreting my comment too literally; I'm basically saying the same thing you are. I also try to think of software from the perspective of the people who will actually be using it. All I'm saying is that I've noticed engineers and product owners (or whatever we call a boss nowadays) are often out of touch with what their end users actually want and need.

> I think you're interpreting my comment too literally

That's been known to happen :)

This post really resonated with me; I share a lot of its perspectives. And if the author is reading, self-determinism is exceptionally useful when raising children[1].

And if there is anything that my life has taught me, it is that potentially everyone responds differently to the same stimulus. There are few "universal" things about people and so generalizations are difficult if not impossible with regard to how people will react.

Understanding that can help avert the 'dual user stories' quagmire that some products find themselves in. Imagine your text editor product development team is half hard-core vim users and half hard-core emacs users. Maybe you put this team together because you wanted a product that appealed to all text editor users, or maybe it was just thrown together; however the team was formed, my experience is that it will produce a substandard product that appeals to no one. This seems to happen because the conflicting visions of "a good editor" within the team seep into the product, and users of the result are often confused, as different parts of the same product seem to have different philosophies about how the product should work.

That said, I'd love a product like 'Monarch Money' that was a) not subscription based and b) not cloud hosted :-).

[1] Anecdotally of course, just a survey of my general social group of which roughly 40% chose to have children and pretty much half took a "self determination" route in child raising and the other half took a "guided/trained" route.

> And if the author is reading

Author here, reading, and thanks for this, I really enjoyed your comment. And the 'dual user story' problem (aka split-brain) is so true, I wish someone had told me about it earlier in my career.

> That said, I'd love a product like 'Monarch Money' that was a) not subscription based and b) not cloud hosted

We chose a subscription-based business model to try and align ourselves with users. If I had to write a second part of my article, I'd write something like "Your product becomes your business model". I've worked on enough "free" products to know that they lead to the very dynamics I criticize in my article (there's another comment in this thread that summarizes that quite nicely[1]). As for not being cloud-based, I hope that's something we can deliver some day down the road, but for now, cloud-based products are the easiest way to deliver the type of cross-platform product we're trying to build.

[1] https://news.ycombinator.com/item?id=24462452

FWIW, I got to see the 'split brain' product team when Sun Micro decided to do a 386 based workstation and had the east coast team design the system software. Between their Systems Group and the west coast Systems Group much confusion was created.

On the product: I'm not interested in a 'free' product. I'd pay you $300 for such a product and buy periodic updates for $100/year. I tend to do this with all 'tools' type applications (illustration, CAD, simulation, etc.). I love having the company invested in making it something wonderful; open source products, with their by-necessity indiscriminate addition of developers, suffer for that reason[1].

My concern, especially with financial software, is that your cloud gets compromised and I and all of your customers are scrambling to avoid losses. Frankly I am amazed that people can get liability insurance for that sort of thing given the stakes!

That said, Quicken and Intuit in general, really needs competition to help them focus.

Edit: And per the summary comment, I am not suggesting a 'freemium' or ad-supported model, just one that keeps the data where I can be assured that if someone else lets their guard down it won't compromise my data. That is the big weakness with 'cloud' in my mind.

[1] FreeBSD tries really really hard and comes close, I expect it would really shine if it had a revenue stream to support a full time product management team and lead developers.

Ah sorry, I thought by "not a subscription" you just meant free.

What you're saying makes sense and is definitely something worth thinking about. We do take security very seriously, of course, but I understand your concern about data still living in the cloud.

An interesting model here is Evernote which has the option of storing your notebooks in the cloud or locally.

Regarding [1], would you be willing to expand on what you mean by a "self determinism" versus "guided/trained" parenting approaches?

Thanks ahead from one parent to another.

Sure, nothing too deep. With the understanding that this is just peers and acquaintances, not some refereed psychology paper or anything.

Among my friends, there exists generally two very broad schools of thought with respect to raising children.

One school of thought, which I relate to self-determinism and is how I raised my kids, holds that as a parent you have no idea what the "best way" is for your kids; they aren't you. As a result, parenting is a process consisting of three things: teaching your children the underlying reasons and rationale of the world; sharing with your kids the choices you made, why you made them, and what you learned; and finally, helping them explore different things while giving them a framework for capturing any new learning or understanding that their exploration yields.

The other school of thought, which I think of as "You'll do better than me because I'll tell you how to be a better me," is about using your understanding of the world and your experiences in it to create a path for your child that will be more successful than (or as successful as) your own path. Parenting in that case focuses mostly on setting the direction for your children, giving them goals to help them evaluate how far they have moved along the path, and correcting them when they stray from the chosen path.

Both styles fail at the extremes. For self determinism that would be completely disengaged "observer" parents, and for guided path, that would be over functioning "helicopter" parents.

Parenting is a big job, but you have to remember that you will be lucky if you get to spend a total of five years' worth of time with them[1]. Planned-path parenting fails kids who are completely different from their parents (in my experience), and that situation can be seen in people who graduate from college with a degree in X and then completely change their life after they have run off the end of the list of goals their parents gave them.

Anyway, my 0.02 for what it is worth.

[1] Assuming at age 18 they go off to college: from about 3 to 5 years old you get to spend all day with them, then half days, then maybe a quarter day plus weekends, and then mostly weekends until they are gone. Chronologically it is nearly two decades, but in terms of face-to-face time, especially time where you can talk and really share ideas and thoughts, it is incredibly short.

The purpose of UX in my mind is to be able to constantly create beginners that can become engaged and advanced users.

Knowing your audience, and how to create beginners is critical. Helping them get up to speed as quickly as possible is the goal, not assuming they are stupid or lazy.

This article was plausible to me, right up until I noticed it doesn't really account for users who lack the writer's own level of digital literacy or capability. There might be a bit of a blind spot there, ironically.

Many of our major services started the same way: Facebook, Twitter, and Instagram all started as very simple services through which people learned new digital experiences and interactions.

With these social media interactions as a foundation, maybe it's possible to place more complex initial interactions in front of some users. But I feel that the digital alienation of people is real, and it may be one of the things fueling the divide in society when it comes to access to opportunity.

This post by Google's CEO articulates the alienation that not helping create beginners can fuel: https://www.nbcnews.com/think/opinion/digital-technology-mus...

Edsger Wybe Dijkstra in 1982 on "users" [1] (please read the rest of the EWD as it is too much to paste here):

"The computer “user” isn’t a real person of flesh and blood, with passions and brains. No, he is a mythical figure, and not a very pleasant one either. A kind of mongrel with money but without taste, an ugly caricature that is very uninspiring to work for. He is, as a matter of fact, such an uninspiring idiot that his stupidity alone is a sufficient explanation for the ugliness of most computer systems. And oh! Is he uneducated! That is perhaps his most depressing characteristic. He is equally education-resistant as another equally mythical bore, “the average programmer”, whose solid stupidity is the greatest barrier to progress in programming. "

[1] https://www.cs.utexas.edu/users/EWD/transcriptions/EWD06xx/E...

As a product designer, I don't really see many fellow designers who would think that "users are stupid".

What I do see a LOT, though, is designers disrespecting users' time. Many see their product as the center of the world, expecting and kind of forcing the user to spend much more time and energy on their product than is really needed.

At some level, I wonder how possible it is for your bespoke software product to be as usable as a door. A door is simpler than most software, and we've built the same exact door millions of times. We may be spending a lot of time on design when no design will ever help anything.

I am instead going to propose what I think software should aim for.

1) Software should be safe to explore. Trying things out is how you learn. Accumulating learning is how you become productive. With that in mind, you should never allow the user to run a command that creates a situation where they can't revert to the previous state. When one wrong move can erase hours of work, your users will never become experts. (Historically, software has been absolutely awful here. I think this is the number one reason why people are afraid of software -- the one UI paradigm they've learned throughout every new "UI innovation" is "one wrong move will destroy your work". So they tread carefully, and being careful is slow and painful.)

2) A user that knows the jargon that describes some command should be able to run that command by name. Teaching users the jargon can be hard, but once they've learned the word, they should be able to at least run the command. An example is watching a YouTube tutorial for some operation that was recorded a few versions ago. Since the tutorial was made, someone moved every menu item around (hello, Fusion 360). Now the tutorial is useless -- you know the command you want to run and how to use it, but you have to dig around in the UI looking for it. That's a big fail. (I called out Fusion 360 for this, but it has a command search, so it's not a big deal. The search could be better, like maybe telling you how to do it faster next time ("next time, just press X"), but at least it's there. I'll also point out that Emacs does this, and even tells you the shortcut after the command runs. If a 1970s LISP app can do it, so can your whizbang Electron app.)
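Point 1 boils down to keeping an undo history so no action is irreversible. A minimal sketch of the idea (the `Editor` class here is hypothetical, not from any real product):

```python
class Editor:
    """Toy document editor whose every mutation can be reverted."""

    def __init__(self):
        self.text = ""
        self._history = []  # stack of previous states

    def replace(self, new_text):
        # Snapshot the current state before any destructive change.
        self._history.append(self.text)
        self.text = new_text

    def undo(self):
        # Reverting is always possible, so exploring commands
        # never costs the user their work. Empty history is a no-op.
        if self._history:
            self.text = self._history.pop()
```

Real applications usually store commands or diffs instead of full snapshots, but the invariant is the same: every state the user can reach has a path back.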
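Point 2 amounts to a command registry keyed by name, so a user who knows the jargon can always find and run the command even after the menus move. A minimal sketch in the spirit of Emacs's M-x (all names here are made up for illustration):

```python
class CommandPalette:
    """Maps command names to actions, searchable by name."""

    def __init__(self):
        self._commands = {}   # name -> callable
        self._shortcuts = {}  # name -> key binding, if any

    def register(self, name, action, shortcut=None):
        self._commands[name] = action
        if shortcut:
            self._shortcuts[name] = shortcut

    def search(self, query):
        # Substring match so users can find a command by jargon alone,
        # no matter where it lives in the menu hierarchy.
        return [n for n in self._commands if query.lower() in n.lower()]

    def run(self, name):
        result = self._commands[name]()
        hint = self._shortcuts.get(name)
        if hint:
            # Emacs-style nudge: teach the faster path after the fact.
            print(f"(next time, just press {hint})")
        return result
```

The shortcut hint after execution is the part most apps skip: it turns every use of the search box into a small lesson.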

Anyway, I'm not sure I'd understand how to use a door the first time I saw one. Normally you can't walk through walls, but with this innovation, you can. Weird! But once you see someone else use it, or someone walks you through it (literally), it's pretty easy. With software, we are creating new concepts like that several times a day. At some point, we have to accept that the user will have to try it out and learn how to use it. There is a first time for everything, after all.

> Software should be safe to explore. Trying things out is how you learn.

When I was a kid, I was surrounded by computer-illiterate adults. The biggest fundamental difference between their approach and mine was that they were "afraid to explore" and could only understand things if they were provided in the form of a prescribed set of instructions. Meanwhile, I was comfortable casually exploring, and thus gained a far better intuition for how to actually use things.

> I'm not sure I'd understand how to use a door the first time I saw one.

You wouldn't. You'd see this great, menacing piece of matter in front of you. If the door was closed, that would be it for you. But if you saw it being opened, or better yet, encountered it open, then the fun would start.

You'd touch it and it'd move. You'd notice it has inertia. You'd grab it by the side or push, and see it move more. Eventually, you'd close it by accident, and then trying to move the door back would fail. You'd cry. Your parent/caretaker would come and open the door for you. You'd grab it again, and close it again. Then maybe you'd play a game with your caretaker, with them holding you in their hands and opening the door, and you closing it with a strong push, laughing at the noise. Some time later, you'd figure out how the door handle works.

Source: I have a 15 month old daughter. This is how she learned how to operate doors. Nowadays, when she doesn't want some thing we want from her (e.g. getting dressed), she sometimes runs ahead and closes the door behind her. She understands the door in more abstract terms now. But it took a lot of trial and error.


Many doors really are difficult to use the first time. Recent experiences:

A door I needed to open for an elderly person who didn’t have the strength to open it (fire stop door with strong closing spring - a potentially fatal UI).

A door that needs two hands to open - one for key (spring loaded) and one for the handle. My friend with one arm complained about it because they wouldn’t be able to open it.

Teaching someone how to use a touch unlock door on a car (great UX once learnt, but multiple weird UI interactions that one needs to learn to use).

An automatic door that wouldn’t open even with face up or waving hands (I’m guessing I was close to background temperature, but who knows why? Took 20 seconds)

A door that seemed to be locked (needed to push harder? Or push handle harder? Someone else then opened it).

Many locks are very unobvious which way the key turns (e.g. on my car anticlockwise unlocks the drivers, clockwise unlocks passenger: I regularly still get it wrong).

So many shop doors that are unclear whether you push or pull.

A shop door that says “PULL” but I don’t realise I am reading it from the reverse side so I pull when I should have pushed... (I often don’t notice I am mirror/reverse reading something).

> "And I’ve found that good products, ones that respect their users, give them more control. Bad products take away control."

Thank you for this.

Also, even better products have ways to signal how they are providing control. Sadly, over the years UIs have reduced the means of doing so in a standardised, immediately accessible way.

I've often seen that this attitude doesn't apply to just "users". Indeed, there are many people from college backgrounds who generally treat "the public" as stupid. To get into the good graces of such people, you have to agree with them.

When wading through the actual issues "the public" has, individually, you'll find that the issues are real. Usually there's some unturned nugget of wisdom that was not applied in a very specific way. People learn things in a variety of ways, and different learning methods don't make people stupid either.

Considering the current approval number for tubbs-in-chief, the general public are stupid.

Or actively malicious, which I ran into consistently when doing internal support for a company that employed a lot of pharmacists. There are plenty of even well paid people who are just miserable and will intentionally break things for self gain or even no reason whatsoever.

People are trash. I've made a career out of designing to accommodate that fact.

This strikes me as semantics. I've never heard anybody say "users are stupid" in a truly derogatory way. It's a shortcut for a long-winded explanation that you'll deal with a lot of different people, some of whom will barely understand what's happening.

An example I like is Amazon's question feature. If you buy a product and somebody else has a question about it, Amazon will mail you and ask whether you can answer the question. When you browse Amazon, you'll notice that quite a few people don't understand what's happening at all. They believe that some person has directed a question at them personally, and feel compelled to answer, even if they have no idea. Amazon being Amazon lets those answers go live, and so you'll see answers like "Dear Mr. Doe, unfortunately this was a gift for my nephew, so I cannot tell you how large the item is compared to a banana". Others are using it for support, I've seen things like "Hi Amazon, I don't know. Can you please tell me how to find more from manufacturer X?"

"Users are stupid" is short and catchy, and it beats holding an hour long explanation session each time you want to remind somebody that they're not designing something to be used by people like them, but by society at large. And many users are, in regard to this technology, stupid.

Is anyone actually judging them, saying they're horrible people or something similar?

Android 10 is so full of usability bugs, it always makes me think! After months I still cannot use my phone in a semi-automatic pattern. There are always notifications that cannot be dismissed, Smart Lock working randomly, etc.

Take Android Auto: the order of the icons on the front screen always changes. That is so stupid I have no words for it!

Google Drive: you have to scroll up to the top so that new items appear at the bottom of the list.

And so on ...

Agreed. The narrative of users being stupid seems less and less prevalent (compared to 5-10 years ago), but it still remains in some people's minds, though I'd like to think not the majority's. If discussions like this one can shift those people's opinions even just a bit, then it's a net benefit.

The cure to this sort of thinking is two spoonfuls of empathy: one for others (truly putting ourselves into other people's shoes or lack thereof), and another one for ourselves (understanding our own issues better).

You might think that a GUI is a waste of your time because you can use CLIs, yet non-intuitive flags or help messages lead you to google time and time again the same awk or sed command (or whatever else). You're always someone else's stupid.

Meet people where they're at, not where you're at. That is by the way true for users as much as it is for your manager, your sales team, and so on.

Brings this to mind: https://xkcd.com/1053/

One of my all-time favorites.

I have no good idea about the origin of users as lazy. However, there is a good argument for an origin of users as stupid. And this might be the close connection of Human Computer Interaction with Cognitive Science, which understands thinking as computational. Thus, the ideal way of thinking is computer-like and diversions from this are framed as due to limitations that cause problems and need to be worked around: The user is like a limited and erratically failing computer.

Gedenryd showed the problems of understanding the user and designer as thinking computationally in his PhD thesis: https://www.semanticscholar.org/paper/How-designers-work-mak...

The user and the designer have very different motives. It is no wonder that there is a significant misunderstanding.

The user just wants to do what they want to do. Usually the motive is very straight-forward, and almost fleeting. A small subset of their life is affected by this product, and they can easily choose away from it. The relationship is thin.

The designer not only designs for themselves, but for other people. This includes understanding the world, but also has to do with social value. What you create, in many contexts, dictates how you are viewed. The designer cannot choose away from their own product. Their relationship is very thick, complex, and personal.

When the users do not understand your own brainchild, it is hard to lose your own ego. It is easier to keep your own ego and criticize those that threaten it.

Seems like a distinction without a difference.

ok, so you've created a model X for the emotional/mental state of the user.

This article is heavy on the psychology, and very light on any actual design advice.

Tell me how this actually influences the design in contrast to what one would design if what you are arguing against was true.

The best description I heard from an old boss was "Users may be inexperienced or stuck in their methods, but they are capable."

Design your software for your users. They might be lazy. They might be eager. They might be both.

Certainly, an app for those struggling with depression should be structured differently than one for those struggling to beat their weightlifting personal record.

Good design in many products does what it is intended to do for all users, down to the lowest common denominator. So sometimes you have to think about users who aren't super intelligent and might actually be lazy. Still, the author has a point.

The Windows 95 MFC-style toolbars and menus, and everything that inspired or cloned them, were probably the peak of general UX usability. If you had to take an MFC application and "modernize" it, a lot of the functionality that would have been staring right at you on the application's main screen would end up buried 10 menus deep in a hamburger menu.

Doing anything other than the most frequently used operation (which would be on the application's main screen) now requires remembering its path through a series of menus that cannot all be on the screen at the same time.

I was verifying the Alan Kay quote: “Simple things should be simple, complex things should be possible” and came across a Quora answer by him that provides a lot of useful context. See https://www.quora.com/What-is-the-story-behind-Alan-Kay-s-ad... Here are two excerpts

"I think I came up with this slogan at Parc during discussions wrt children, end-users, user-interfaces, and programming languages. Chuck Thacker (the genius behind the Parc hardware) also liked it and adopted it as a principle for many of his projects.

So e.g. Smalltalk needed to work with children and end-users even more intuitively than (say) JOSS or Logo. But we also wanted to write the entire system in itself, so that those who were curious — especially later on — could “pop any hood” in the system and see a live program/object written in exactly the same terms as what the children were learning.

Similarly, the GUI had to be easily learnable by children, but — looking ahead — it had to handle “50,000 kinds of things we hadn’t thought of done by 50,000 programmers we hadn’t met” and be as simple as possible." Alan Kay

Neal Stephenson explores similar ideas in his "In the Beginning was the Command Line."

"Another part of this was that we were determined to have a very easy to learn UI would also incorporate end-user programming (scripting) as a natural part of it — in other words to combine what had to be simple yet possible with the programming language with what had to be simple yet possible with the UI.

The general zeitgeist was against this idea — both back then and now. Basically: those artifacts that do simple things usually wall off next levels of complexity, and those that do complex things don’t do anything simply.

But, given that there have been some really good examples of how to do both, it’s hard not to see most computer people as (a) not caring, or (b) being lazy or unskilled, or (c) both." Alan Kay

Bonnie Nardi explores the value of this approach in "A Small Matter of Programming: Perspectives on End User Computing"

Thanks for this. I hadn't read that Quora answer before, even though I used to work at Quora.

I will use it for my "Quotes for entrepreneurs collected in Sep-2020" blog post (full set going back to 2006 at https://www.skmurphy.com/blog/category/quotes/). I will show the basic version and this twitter-length one that incorporates context from his Quora answer:

"Simple things should be simple, complex things should be possible. Despite good examples to the contrary, it's unfortunate that those artifacts that do simple things usually wall off next levels of complexity, and those that do complex things don’t do anything simply." Alan Kay

I like to find the original source of a good quote. One of two things often happens: the person credited did not actually say it, and someone else did who has other insights to offer; or the quote is part of a longer passage that adds value to the original quote.

Many users ARE stupid and lazy, though certainly not all.

But when you create a platform that incentivizes lazy, dumb behavior from both your consumers and your content creators, you end up with a platform of mostly lazy, dumb content that mostly appeals only to lazy, dumb users, which then creates a feedback loop.

Twitter is a prime example of this. The limited tweet length and the nature of the algorithm incentivize the "content creators" on the platform, i.e. the top 1% of users with a significant follower base, to tweet quick, low-information "hot takes" that mostly appeal to dumb partisanship. Also, the way "engagement" (i.e. "views") works incentivizes "zinger"-style shit-slinging rather than actual discussion.

This ends up driving away people looking for interesting discussions, opinions, or information, reducing the "market" on that platform for a content-creator who wants to create informed or interesting content.

This feedback loop cycles over and over, until eventually, a platform that might have once provided interesting content ends up being a place that only provides the lowest brow kind of echo chamber drivel.

And this is how Twitter has ended up being a place where the most popular voices are on either side of a binary partisan line, either celebrating the ambush and shooting of two police officers or pushing the idea that anyone with even the slightest past criminal history has no human rights and deserves to be shot by the police.

I think this is a situation that's only going to get worse. Tech used to appeal to the curious and geeky because those were the only people who could get value out of using it. Now it's about the lowest common denominator. You make more money selling to a thousand fools than a hundred nerds. So the menu button goes away. The UI customization goes away. The root privilege goes away. 90% of everyday people will trade 50% of their control for 50% more convenience every time.

UX is often thought of as solving an issue "for the user", which dismisses or redirects attention from the communication problem onto the individual. No one reframes it, nor should they, as solving engineer/developer "laziness or stupidness". Reading "The Design Of Everyday Things" made me more aware and empathetic, from both perspectives, when I can't find the right button to click or have to think about which way a door will open.

I wish the author had provided more in the way of examples to ground his points. This is a broad topic that could manifest itself in different ways, some of which are more and less applicable. For example:

- Ads with claims like "This single mom makes $$$ working from home, you can too!"

- Clickbait news titles. (You could define "clickbait" here as only the most egregious cases, or as the sensational nature of headlines across the industry.)

- More subtle but pervasive manipulation, like $_.99 price tags, or arranging grocery stores not for customer convenience but to encourage customers to buy more things

- Even more subtle: the ubiquitous practice of advertising (perhaps on a product webpage) the positive qualities upfront without mentioning the drawbacks or competition. Is that disrespectful, or just good business practice?

- Video game puzzles designed to be easy to solve. E.g., I could paraphrase some of the Portal levels with dev commentary as "We wanted to make this more challenging, but our playtesters kept getting confused, so we made the solution obvious."

- Junk food manufacturers micromanaging the sugar etc content to keep the user hooked and get just the right amount of good-feelings flowing.

- Restaurants loading every dish with salt.

Does the article apply to all these? Maybe? They're on a spectrum, with some more clear-cut as disrespectful than others.

The article doesn't really apply to those, because it assumes UI/UX designers are only disrespectful of their users. Meanwhile, all your examples except the Portal one, are ones in which the vendor is openly malicious - they want to exploit other people, make money by making other people's life worse.

There's another article to be written on those cases. One that points out these practices for what they are: immoral.

Also, many users are not willing to accept, or even good at adapting to, a complete redesign of your UI/UX.

Skype would be a great example of an application that I think very much had it in the bag before the buyout and redesign. Windows 8 would be the largest example of the worst redesign of all time, and I dare say it will go down in history as the worst version of Windows (hopefully!), even beating ME, which at least didn't make the same rudimentary mistake.

The issue is that especially older users and non-tech-savvy users in general know how to use a handful of applications a handful of ways, and that's it.

I'm sure that on this forum in particular, many of us have been the typical son/daughter that some family member calls when they need tech help.

It's not about users being stupid or lazy, it's about designers having no sense of perspective and living in a bubble of the advancement of tech. We always think we need to adopt new things and it leaves most of our users in the dust as they are just not on the same level as we are, nor should we expect them to be.

> Firstly, products should definitely be as simple as possible.

Absolutely not. Simplicity is _beneficial_ - as the author argues and assumes - but it is not the alpha and the omega of product and interface design. When something has both simple and advanced uses, both a default mode of use and complex customizability, the design needs to balance the extent to which it caters to the different levels of use-case complexity. Just simplifying everything all the way makes complex uses entirely or effectively impossible, and medium-complexity uses cumbersome and difficult.

> Because I respect users’ time, not because I look down on their intelligence.

The author of this piece (and perhaps the author of "Don't make me think", which I haven't read) respects only the simple desires of the majority of users. S/he does not respect the complex desires and needs of minorities of users, nor the potential for the majority-user to refine and develop their needs and desires.

Sounds like you're re-making the arguments in the article without realizing it.

No amount of work will make the inherently complex less complex, but it doesn't have to be more complicated than it has to be. That's what "as simple as possible" means. If you have to peel away a strange interpretation of a problem or a misguided attempt to "dumb it down" before you get to the complexity, it's definitely more complicated. Over-simplification is pretending the complexity doesn't exist, or chipping away at parts until it looks simple, or straight-up disbelieving that there are complex needs and forgoing them.

I haven't read "Don't make me think", but I have read other books by the same author and his general message is that everything should be clear and approachable such that you can tell what things are and spend your energy actually solving the problem or accomplishing what you want to accomplish.

> The author of this piece (and perhaps the author of "Don't make me think", which I haven't read) respects only the simple desires of the majority of users. S/he does not respect the complex desires and needs of minorities of users, nor the potential for the majority-user to refine and develop their needs and desires.

This reading pairs poorly with this quote from the article: "And I’ve found that good products, ones that respect their users, give them more control." If the author really was out to knee-cap software, that's an odd sentiment to hold. I think you're in agreement with the author, but somehow take offense at the word "simple", maybe taking it as meaning "stripped down in function". There are certainly a lot of products, services, companies and people who take it to mean that.

Agreed. Mostly. But some UI decisions do no favors for either newbies or power users. (Disappearing scrollbars, and links and buttons without any visual cues that they are links and buttons, might look nice, but they obscure the controls of the interface for everyone. I'm still thinking somebody having a stroke is the reason for the ribbon interface.)

A saying I remember from the Army: treat a man like a man, and you have a man. Treat him like a child, and you’ll have a child.

You might not like the word stupid, so how about "computer illiterate"? And instead of lazy, perhaps "uninterested and distracted"? The end result is exactly the same, and it's indisputable, but I suppose it sounds better.

Classic doublespeak.

End users are stupid. This isn't something people are just finding out. As technology advances, more segments of the tech population will simply stop learning and get comfortable in whatever software or processes they have relied on.

It's not about thinking the user is stupid... they are. But then again, I'm not a mechanic, plumber, or tow driver. I don't need to know all their "things". But I know how to put gas in my car, change a tire, and drive.

I'm stupid beyond the very basics, just like most end users are. Turning it on, checking email, looking at Facebook - these are all things people just do. It's process.

Not everyone needs to understand specific errors or mpd.

Make the effort... earn the understanding.

Any discussion like this has no meaning unless you first define which users you are talking about. People are more than willing to invest a lot of time and effort into learning your product if they use it professionally and it is going to make them much more productive in the long run.

The same person can build their own custom Emacs distribution with Evil mode, and at the same time not figure out how to use your web app made of a couple of forms. Because she needs Emacs for her work, but it's you who's trying to convince her to use the web app.

I guess Krug pissed this person off.

I'll withdraw the comment. lexicality was correct, and I probably didn't express my thoughts well. I do believe in promptly admitting when I'm wrong.

> I guess Krug pissed this person off

Why do you think that? The author seems to be lamenting that Krug's advice isn't followed more closely.

Good point.

(Author here). Appreciate the correction. And yes I do love Krug and I think his thinking has been misinterpreted by many.

It’s a pretty emotionally charged area. I worked for a Japanese camera manufacturer (one of their top ones), and had an up-close-and-personal relationship with the way they designed the UX of their devices. They have tremendous respect for their users, who often wrap their entire careers around the equipment. The company would often do things like make an angle just a tiny bit steeper in the handgrip, which would necessitate a great deal of re-engineering of the internals, because users had reported issues with the grip.

But I allowed myself to read your article colored by my own emotional cant, and I should know better. I can get a bit grumpy about what I consider the ham-handed approach to usability practiced these days.

To offer a slightly different frame: making products that suit people with differing cognitive ability doesn’t have to include a derogatory idea about them. Lots of wonderful people have difficulty with certain kinds of cognition. In addition to thinking through cognitive approaches to design for users with high cognitive ability, we should also make simple, self-evident interfaces that are therefore accessible to a broader range of users (when we can).

lol, at the risk of sounding callous, there exists a certain set of users who are indeed "stupid" for lack of a better word, and it's not worth wasting the last 10-20% of your team's time trying to account and perfect your product for them.

anybody who has sat behind someone not familiar with modern web UI conventions and watched them hunt through the UI just trying to log in will understand what i'm talking about.

give people autonomy in something like a site builder and "trust their competence", and 9 times out of 10 you will end up with some fugly frankenstein site which might be what they "want" but doesn't serve their goals.

it's a lesson in futility and frustration. it's the same people who never bother to learn shortcuts, opting to go to the edit menu -> copy every single time. my mother traded in her iphone for an "old" candybar phone because it was too complicated in her mind.

in the end, identify your audience and build for familiarity while balancing usability. this fetish for the holy grail of efficiency and simplicity is damaging IMO.

I'm getting a 'name not resolved' error on the article link. Here's a mirror:


I call this quality (perhaps the opposite of the one in the article), "considerate design". Has the creator of this tool, directory layout, documentation, source code tree, help function, key-shortcut, or whatever, been considerate of the context in which this function or feature is being used and what state of mind the user is likely to be in at the moment of use?

It depends on the product.

If you're designing for specialists in a space, you can make some assumptions.

If you're designing for the average user, and you have little information about their preferences, style, desires, or how they approach problem solving, you have to remember that half of everyone is below average. So if you want mass appeal, you're going to have to account for that half.

> half of everyone is below average

This is almost never true unless you have a sharply bimodal distribution. Assuming a normal distribution, ~38% are right around average (within half a standard deviation of the mean). That would leave only about 31% below that band.

By definition, 50% are below average in a normal distribution.
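For what it's worth, the claim holds for any distribution whose mean equals its median (all symmetric ones, the normal included), and fails for skewed ones. A quick illustrative simulation in Python:

```python
import random
import statistics

random.seed(0)
N = 100_000

# Symmetric (normal) distribution: mean == median, so ~50% of
# samples fall below the mean -- "half are below average" holds.
normal = [random.gauss(0, 1) for _ in range(N)]
m = statistics.mean(normal)
below_normal = sum(x < m for x in normal) / N
print(f"normal: {below_normal:.1%} below the mean")   # ~50%

# Skewed (exponential) distribution: the mean sits above the
# median, so noticeably more than half fall below the mean.
skewed = [random.expovariate(1.0) for _ in range(N)]
m = statistics.mean(skewed)
below_skewed = sum(x < m for x in skewed) / N
print(f"skewed: {below_skewed:.1%} below the mean")   # ~63%
```

For the exponential distribution, 1 - 1/e ≈ 63% of samples fall below the mean, so whether "half of everyone is below average" depends entirely on the shape of the distribution.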

> So what explains the rise of products like Facebook, which have gotten a large part of humanity mindlessly scrolling through feeds of what can most easily be described as garbage content?

That’s what I think of LinkedIn every time I log in to read messages and accidentally start reading the goddamn feed of utter business nonsense.

The first point is quite valid. The second point, however, is 100% culturally biased, and if you want to respect your users, don't assume "people value autonomy, relatedness, and competence", but instead first find out which type of cultural grounding applies to which parts of your demographic.

Yet we should call these people "users," thereby dehumanizing them? Design scolding begets design scolding.

Actually, I prefer apps that allow me to not think too much. Because I am lazy.

A certain subset of developers and/or companies want to keep users stupid and lazy, so they can push the users in the direction they want without too much opposition.

This is the exact quote from tech support email: "Hi. I downloaded your software. Something is wrong. What do I do?"

I can't help but notice the name of the blog, and I wonder if it's in any way related to "The Office".

Ha, good catch! I'm a big fan of The Office. That said, the name of the blog is more about me reminding myself not to take myself too seriously :)

was anything said in this article?

Examples would have been helpful.

After reading this design critique, I looked up the author's bio and I'm unsurprised to see he's not a designer, but a long-time developer/dev manager. As a mid-career developer and art school trained designer, I encounter a lot of exclusively technical people who think they completely understand the product design process because technical pursuits themselves involve some amount of design, and they help implement the decisions and plans made by the design team. Sometimes I encounter well-reasoned external input that helps break down echo chambers and challenge group think; I don't think this article falls into that category.

Within the single example given— infinite scroll traps, specifically citing Facebook— the author did not present any real evidence that those designs were even partially driven by contempt for users' intelligence. It's a pretty controversial assertion that he treated as self-evident. Also, the author citing Facebook as a prime example of low-agency infinite scroll traps when services like TikTok and Instagram are far better examples hints he might not be very familiar with the current design landscape which he is critiquing.

Lots of people— mostly designers— have presented very valid critiques of infinite scroll. Generally, the goal of infinite scroll is to reduce the cognitive barriers to staying engaged with a product by having content just appear as soon as users look for it. It's incredibly effective. Too effective. Removing these natural pause points removes one of the ways users subconsciously gauge how much time they've spent on an activity, and removes the places that users tend to insert natural stopping points. For example, when reading a book, flipping pages and passing chapters both gives us a general sense of how long we've spent reading without having to manually track it, and gives us a nice place to hit the pause button and put the book down. I agree that infinite scroll is disrespectful to the user and poor design because it favors business needs over many users' need to not get sucked into scrolling through TikTok for a few hours without even realizing it.

Overall, this article seems to argue against a design perspective I've never once seen a designer present. (Support? Maybe.) I've not once seen anybody justify infinite scroll by saying users are too dumb to use something more complicated. Either this guy has worked for some companies with seriously bad design departments, or he's a bit overconfident in his layman's interpretation of designers' perspective and intent.

I'm sorry you seem to have interpreted this article as a critique of designers from an "exclusively technical" person. I'm sure this wasn't your intention, but some of your choice of words supports what I think is a pretty harmful idea in software development: that technical people aren't qualified to critique or discuss product design. Ultimately, you are right that I am not a designer, but I've actually worn product management and business hats in the past. Still, I don't claim to "completely understand the product design process". Ultimately, this was an article I wrote and posted to a primarily technical website. It was primarily intended for technical people, but would hopefully be relevant to anyone involved in building software products.

> the author citing Facebook as a prime example of low-agency infinite scroll traps when services like TikTok and Instagram are far better examples hints he might not be very familiar with the current design landscape which he is critiquing.

This felt a bit presumptuous. I am familiar with Instagram and TikTok, but I chose Facebook as the prime example because they were the _first_ massively popular site to implement this type of interface.

> Overall, this article seems to argue against a design perspective I've never once seen a designer present.

I'm somewhat envious of you, then. I've unfortunately worked with product managers, designers, and engineers who have said that users are stupid. Sometimes, it's well-meaning and in a sort of abstract form, meaning "we should make it so simple that even an idiot could use it". But often, it's unfortunately quite literal.

> he's a bit overconfident in his layman's interpretation of designers' perspective and intent

Again, this was more of a critique of software development than of designers' intent in particular.

> I'm sorry you seemed to have interpreted this article as a critique of designers from an "exclusively technical" person. I'm sure this wasn't your intention, but some of your choice of words supports what I think is a pretty harmful idea in software development: that technical people aren't qualified to critique or discuss product design.

I think that developers have just as much right to critique the design process as designers have to critique the development process, and I think it's just as likely to be useful. Sometimes it is, sometimes it's born from assumptions or misunderstandings. I specifically highlighted the benefits that arguments from outside the field can bring, and I said that I didn't think your article, specifically, was beneficial in that way. Frankly, I think you said this because you wanted to refute my comment on principle rather than refuting the argument that I made, which is that you didn't actually present any evidence of what you were asserting.

> This felt a bit presumptuous. I am familiar with Instagram and TikTok, but I chose Facebook as the prime example because they were the _first_ massively popular site to implement this type of interface.

Sorry if I was being presumptuous. It wasn't clear from your article that you were attempting to show a progression or highlight the history of these sorts of design decisions so your example, completely out of context, seemed outdated and non-representative. Personally, I didn't associate infinite scroll with Facebook more than Twitter or Myspace or Instagram, but maybe that's me.

>I'm somewhat envious of you, then. I've unfortunately worked with product managers, designers, and engineers who have said that users are stupid. Sometimes, it's well-meaning and in a sort of abstract form, meaning "we should make it so simple that even an idiot could use it". But often, it's unfortunately quite literal.

I have 23 years or so of experience working in technology in maybe half a dozen industries, and I too have heard people who actually make this stuff say that users tend to behave stupidly, as a metaphor for them not being patient or attentive enough to interact with things that aren't immediately apparent— the sort of well-meaning, abstract form that you allude to. How did you know that in your particular situation it was intended to be literal? How do you know that bad design decisions were made based on that? How even could you know? Did they say they ran their copy through a language analyzer to make sure it was at a first-grade reading level when they knew they were targeting an adult demographic without diagnosed intellectual disabilities? With what specific criteria do you differentiate beneficially simplified designs— which you note are touted in Don't Make Me Think— from designs that are simplified out of disdain for users' intelligence? How does that difference negatively impact users? How did you come to the conclusion that this is pervasive throughout the industry?

So in this article I did not see a) that product designers, in general, actually disdain their users' intelligence, b) what concrete effects that has on design, and c) how those things negatively affect users.

> Again, this was more of a critique of software development than of designers' intent in particular.

In what way? You mention product design, highlight a design trend, talk about designers, and the only external work you cite is a product design tome. I did not once see a single reference made to software development, coding, or programmers.

> talk about designers

I think this is the main issue here. You think this is about designers. It wasn't. I literally only use that word once, and I use it in the abstract. I consider anyone with input into a product as one of the "product's designers". They don't literally have to have that as their job description. A good example is someone like Alan Kay, whom I also cite. I use a lot of "we" and "us", to refer to the tech industry collectively. I never draw a line between engineers and designers.

> It wasn't clear from your article that you were attempting to show a progression or highlight the history of these sorts

I attempt nothing of the sort. I just chose the product I thought people would most associate with infinite scroll. FB, which by raw numbers is much larger than Instagram/TikTok/Twitter, and had an infinite scroll in front of hundreds of millions (billions?) of people even before TikTok was started, seemed like the most obvious example. And so I say "products like Facebook". Maybe MySpace had an infinite scroll first, idk, but MySpace isn't used by billions of people.

> How did you come to the conclusion that this is pervasive throughout the industry?

There's enough argument in the overall comment thread with some people saying "oh, when people say that, it isn't literal" and people responding "well, actually, I've encountered times when it is". I think that proves it does happen, and not just to me.

> talk about designers

You took those three words out of context and avoided the larger point. You said in your last comment that this was about software development and it's not. It's about product design. This has nothing to do with the construction of software and everything to do with the designing of products that happen to be made with software. This isn't a semantic point. The whole point of what you are trying to say is that people who design software products— regardless of what you call them— think negatively of users and that makes products worse. All I did was ask for proof— repeatedly— and all you did was avoid the question.

> I attempt nothing of the sort.

Like I said, "maybe it's just me."

> There's enough argument in the overall comment thread with some people saying "oh, when people say that, it isn't literal" and people responding "well, actually, I've encountered sometimes when it is". I think that proves that is does happen, and not just to me.

This was a relatively minor point in a whole list of assertions that you made that you didn't back up.

I'm done. You're clearly on the defensive and not particularly interested in feedback or pushback of any sort. You're just taking individual bits of things that I said out of context and not even addressing the larger point that I've repeatedly tried to make. Have a nice day.

You're right, this unfortunately doesn't seem constructive. And yes, I did feel the need to defend myself because I felt like my ideas were not only misinterpreted, but also given less merit because it was written by (what you assumed was) an "exclusively technical" person (even if you acknowledged that "sometimes" those types can provide well-reasoned input).

You asked me to prove that some product designers view their users with contempt, and I said the only proof I can provide is that others in the comment threads seem to have encountered it. You're free to disagree if you haven't, others in the comment thread disagreed as well (though without alluding to whether I was technical or not, or making assumptions about whether I was familiar with certain products du jour). A simple "oh I've never encountered that". It's not like you provided proof that that sentiment _doesn't_ exist either.

Anyway, peace to you brother. We're all in this tech thing together—product designers, software developers, exclusively technical people, exclusively non-technical people... we pour our hearts into our work, and I hope we can have respect for the people we build for. If you think we all already do, then maybe it's just good to remind ourselves every now and then.

The reality is that neither career prepares you for interaction design or HCI, so being an art school trained designer or a software engineer has little bearing on how deeply you understand design; it comes down to your own learning path.

My major was interaction design, which is in the design department and shares many points of affinity with graphic design. I have 23 years of experience in technology.

Was more of a general comment as that kind of degree is still rare, but good for you. Watch out for gatekeeping.

I’d like to add that I don’t want to flick my finger down an infinite feed. It’s easy to convince me to (well it was, until I realized the psychological effects) and I might even do it for a while but eventually I’m going to be upset about it.

Yes, nobody ever asked for this, it was built as a manipulative device to drive "engagement."

Lol. Apple seems to think users are stupid and lazy. And they’ve sold billions of devices.

Some of their designs are so minimalist for the sake of the design, that it makes their products harder to use.

One good example is getting rid of keys on the MacBook keyboard, like the print screen key, which they replaced with the Command-Shift-4 keystroke. I think a single print screen button is far simpler to use.

Another missing feature: they put the Mac application menus at the top of the screen, but there is no clear keystroke to invoke them and navigate through them with the arrow keys. Instead, you must click on them with the mouse.

By doing this, they eliminated a discrete, digital keystroke action using the keyboard and turned it into an analog action, forcing you to navigate the menu with the mouse. On Windows, the keystroke is Alt-F.

I call this simplifying madness, Fisher Pricing.

I am lazy and I reserve the right to be stupid.


Good thing laziness drives developer productivity.

Users: hold my beer.

Yeah it's really the Apple attitude. "We'll lock everything down so you can't screw it up".

First with iOS, and now they're bringing this model to the Mac (on Big Sur you can't even edit config files in /etc anymore, I've been told).

I really hate that attitude, I feel the user should always have the final say in everything that happens on their device. Defaults are fine, locking stuff down is not. There should always be an override for the power user.

> on Big Sur you can't even edit config files in /etc anymore, I've been told

Huh? I'm running the Big Sur beta and just edited my hosts file minutes ago.

I get what you're saying and kind of agree on some fronts but macOS has never really gotten in my way as a technical user and I can run whatever I want on it, I even dual booted Arch linux on this machine for a while. iOS is a whole different story though.

OK, great if it is possible. Like I said, I didn't try it myself; I don't have my testing Mac due to lack of office access.

I was told by people on ##apple on Freenode that on Big Sur it's no longer possible to change flags in /etc/ssh/sshd_config, for example. But as long as this is possible, it's fine. For me that would be a dealbreaker, as at work we're not allowed to run SSH daemons with password auth, and they often scan for them.
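For reference, the flags in question are standard OpenSSH sshd_config directives, something like:

```
# /etc/ssh/sshd_config -- disable password logins, allow keys only
PasswordAuthentication no
ChallengeResponseAuthentication no
PubkeyAuthentication yes
```

(After editing, sshd has to be restarted for the change to take effect.)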

PS: I believe (also need to test) that on models with T2 chip you can boot linux but you can't access the internal SSD then. This is really one of those things where I'd want to see an override.

It's really hard to test right now :( I have a pile of test MacBooks but they're all locked in the office.

But even on iOS.. It's too locked down IMO. I don't use it for this reason, for one I need full NFC access for my Yubikeys (OpenPGP mode). I really miss stuff like that.

> PS: I believe (also need to test) that on models with T2 chip you can boot linux but you can't access the internal SSD then. This is really one of those things where I'd want to see an override.

That's a bummer if true. I'm still kicking around on a 2013 MBP that's been a really reliable work machine for a long time now, but I've been considering a replacement soon.

I don't think OSX has ever supported modifying /etc/ssh/sshd_config? I've always had to modify /System/Library/LaunchDaemons/ssh.plist.

It works fine for me up to Catalina at least.

That plist is for launching it, not for setting the configuration. You could probably pass some config as command-line arguments (or point it to another config file), but that is a more roundabout way.

It is simply OpenSSH server so it should be possible to change its config of course.

I get that this is a problem that bugs you, but it has little to do with the author's points.

Big Sur isn’t much different from Catalina (which did lock more things down). But macOS isn’t Linux; despite its greatness as a developer platform, it’s aimed at consumers, so Apple is going to serve their needs first. And they are going to err on the side of more security.

And on iOS it’s a phone OS, security and consistency is ten times more important and is the main reason users buy iPhones. Apple is never going to open it up for power users because before they know it tens of millions may have installed some app that slipped through review, rooted their phones and filled them with malware.

You hating that attitude doesn't mean you're right. Also, the attitude you describe is not actually Apple's attitude; it's your interpretation.

Anyway, you say you feel that way – Apple's design is based on decades of research and experience in the field. They don't get everything right, but you would be hard pressed to find a more user-centric company.

One issue I see here is that some of us just don’t want all that control. When I use a new tool, I want it to work. When I see tools mentioned here, I see a lot of people looking for “super duper mega ultimate fine point control 259”. I don’t want a company spending time on the 5% of genius computer users or experts. I want things to work. I don’t want 90 buttons on my PS5 controller. I’m OK if I can’t do the crazy über loop that gets you 3rd place in the world. I want fewer buttons and will trade that for less control.

Many on here will seriously disagree with this, but almost my entire corporate office won’t.

But that is exactly the point the author makes with the quote from Alan Kay: "simple things should be simple, complex things should be possible". He is not advocating complexity; he is advocating that the possibility of "power user" complexity should be there.

An example of this used to be Firefox :( You can use the browser from the start unchanged, but the possibility of complex add-ons and customizing 'about:config' is there.

The point that annoys a lot of users is the 'Google UX' type of decision arrogance that forces users to operate in a padded room with no sharp objects. That mindset of "we know better than you how you should want to use this app" is what I suspect frustrates a lot of users.

I'm starting to use the term "space for user to grow" instead of "power user features" - because that's what power user features are. Power users aren't born, they're made through repeat exposure and their own desire to solve their problems faster/better.

This implies that these features don't need to be up front, or conflicting with simpler, streamlined methods. But they need to be there, and be easily discoverable.
