Users aren't thinking about your app when they use it.
That person in MS Word isn't thinking about the ribbon bar. They are thinking about how to address that new prospective customer.
That person in Excel isn't thinking about cells either. They are thinking about whether Robin in R&D is going to get the figures for Q2 to them in time.
Users have tasks to concentrate on. They don't have the attention left over to deal with your app's crappy UI.
Voice from the next room: "Can you help me, dkarl? The computer won't save my document."
Me, comfy in my chair: "What does it say?"
"It's doing something weird."
"Weird how? What does it say?"
"I don't know, I was just saving like I always do, and now there's this window and it wants me to do something."
"What is it asking you to do?"
"It wants me to click something? I don't know, maybe it wants a password or it's not going to do it because of iCloud or something? It's been weird lately."
"Okay. What are the words in the window where it wants you to click something?"
"It says... oh, it says there's another file with this name already."
"Okay, do you want me to come over there and help you find the other file and see what's in it?"
"No, I can do it. Thank you!"
I think one of the reasons I find computers relatively easy is that I compulsively read whatever you put in front of me. I always read cereal boxes when I was a kid, even the non-kid cereal boxes that were all about colon health and fiber. But even I get this blindness sometimes, especially when I'm writing code and running builds and tests and trying to work quickly and efficiently. I'll get hung up on something for ten minutes where the answer is literally spelled out for me in front of my face.
> I think one of the reasons I find computers relatively easy is that I compulsively read whatever you put in front of me.
Telling my mom this is what eventually got her to read error messages instead of closing them. She had to learn not to close them and then, separately, learn to also read them. I'll admit there were times when I was on the verge of crying out of frustration, begging her to "please just read the screen," because she would leave the error open but still not actually read it, and we'd end up with:
Me: "What does it say?"
Mom: "I don't know."
So glad things are better now.
This is an astute observation.
I also didn't know that other kids read cereal boxes like that... I thought it was just me!
Or even just signs when driving down the highway. I can't help but read everything.
It took me the longest time to realize that for some people reading is something that doesn't happen until they choose to do it.
Then she says, "Hey hon, it won't let me save."
I see the error: can't access disk. Immediately I think, well, I replaced the HDD with an SSD, so that's probably not the problem. I look at the folder she's accessing; everything looks normal. It shows a bunch of other files in it. I go up a folder and back into the folder, and everything looks great. Click Save -> same error again.
Finally I go to My Computer and navigate my way all the way back into the same folder, click Save. It works, and she says thank you!
And I'm thinking, what the fuck, that shouldn't have fixed it... walking away dumbfounded. No problem since; there's nothing wrong with the disk, and she's been using the computer for days since. It was literally a bug in Word's specialized save dialog.
I see a lot of apps using hardcoded paths, since they create a "downloads" folder and save stuff to it.
A lot of bugs are exposed in a non-US-English install.
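A common version of this bug is assuming the folder is literally named "Downloads". Here's a minimal Python sketch of the safer approach on Linux, where the XDG config records the real (possibly localized) folder name; the config path and fallback behavior are assumptions about a typical setup, and Windows would need the known-folder API instead:

```python
import sys
from pathlib import Path

def downloads_dir() -> Path:
    """Resolve the user's real Downloads folder instead of assuming
    an English folder name exists."""
    home = Path.home()
    if sys.platform.startswith("linux"):
        # XDG user dirs: on a localized install the folder may be named
        # e.g. "Téléchargements", so read the config instead of guessing.
        cfg = home / ".config" / "user-dirs.dirs"
        if cfg.exists():
            for line in cfg.read_text().splitlines():
                if line.startswith("XDG_DOWNLOAD_DIR"):
                    raw = line.split("=", 1)[1].strip().strip('"')
                    return Path(raw.replace("$HOME", str(home)))
    # Fallback: the conventional English name. It may not exist on a
    # localized install, so callers should still check before writing.
    return home / "Downloads"
```

The point isn't this exact code, it's that the folder name is user/OS data, not a constant you get to hardcode.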
I have broken so many computers in my life and repaired them that I am not very worried about pressing the wrong thing, because I am confident I can recover from a mistake. Someone who has never repaired their own computer runs an unknown (to them) risk of being wrong, one that could force them to take their computer to an actual store, where the hard drive would probably be erased without asking.
Specifically, I was once trying to help a CS professor register a new domain name. This was at some low-cost registrar and there were 6 pages of upsells on the way to checkout. For whatever reason, he was definitely taking his time reading the entire pitch, and eventually I had to step in and click 'next' several times. "No, you don't need a new virus scanner as part of your domain name purchase. No you don't need a company to submit your new domain name to search engines. No you don't need SEO services. ..."
I'm going through a contract-heavy legal process right now, and it is amazing how strongly the agents/lawyers/etc. involved in the process clearly expect me to just sign without reading. Obviously they're not pressuring me not to read it, but the implied expectation of how much time/effort it should take is much less than what's actually needed.
Edit to add: also, I've seen this in more places than just health care.
I complained and was ignored in the same way people are ignored when they complain about the wait in the ER or ask how much treatment will cost.
For people who are not comfortable with computers, they just see an error message and expect it to be something that they cannot handle.
Part of the problem is how error messages are presented. A lot of software uses tools like modal dialog boxes to present everything from "a file with that name already exists" to "the file cannot be written". The former can easily be handled. The latter may involve a call to technical support (and part of the reason for that is the ambiguity of the error message).
"I have an error"
"It won't compile"
What's the error message?
"Function lib.X not found"
Did you import the library?
Most everyone seems to be trained to ignore error messages, regardless of background. Expecting them to go the next step and google the error is a lost cause, regardless of background. Even devs with 5 years' experience sometimes need to be taught this... I'm still not clear why, but it feels like people are being trained to use computers this way -- I just can't figure out when, why, and who's doing this mis-training.
A big frustration of mine is sites/apps that bury their support info. If I need support, I'd like support. Expose your FAQ to me, point me towards some forums, but don't bury the actual contact info somewhere like the footer, or worse, omit it completely. If I type my problem into the supplied form and the chatbot that opens up asks me to type that info again (often after a redirect where the session info gets overwritten so that I can't even copy-paste my already written text), I'm much more likely to be unnecessarily rude to the support person on the other side... Assuming it's a person and not the aforementioned chatbot.
You have no idea how many times I've seen the RDP certificate warning: "Click here to never show this again."
Nope, no. Nobody but me ticks that checkbox. NOBODY. It's infuriating.
That, and the VMware vSphere first-time tab that "explains what a cluster is". Click here to close and never show again.
Nope. Every time I look over the shoulder of a dedicated, full-time, VMware-certified ESXi cluster administrator... there it is. The certificate warning. The first-time popup. Every time.
In my mind, I know I really should manually grab a copy of the cert from work so I'm actually certain I'm not being MITM'd, but on the other hand I know the likelihood of that being the case is pretty darn low and I'm lazy so I don't. But I make sure that warning keeps popping up because I don't want to forget that I'm doing something dangerous.
The easy way is decently secure: if you tick the checkbox, it memorises the certificate serial number for you, the same as if you had manually trusted a self-signed cert. Any change to the certificate will show the warning again. This is comparable to trusting an SSH host key the first time you connect to a host with PuTTY.
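The trust-on-first-use scheme described above can be sketched in a few lines of Python. The store shape and return strings here are made up for illustration (a real RDP client persists this in the registry, not a dict):

```python
import hashlib

def check_tofu(store: dict, host: str, cert_der: bytes) -> str:
    """Trust-on-first-use: pin the certificate the first time we see a
    host, and flag any later change so it can't be swapped silently."""
    fp = hashlib.sha256(cert_der).hexdigest()
    if host not in store:
        store[host] = fp          # first contact: remember this cert
        return "trusted-first-use"
    if store[host] == fp:
        return "ok"               # same cert as before, no prompt
    return "mismatch"             # cert changed: warn loudly
```

As with SSH known_hosts, the first connection is the unauthenticated moment; every connection after that is pinned against the remembered fingerprint.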
The best way is to issue RDP Certificates from an Enterprise PKI: They're free, you never get warning popups ever again, and they're secure.
Your method means that if the certificate changes (due to a MITM attack), you won't notice. The warning looks the same for a new certificate as for an old certificate that you didn't choose to trust. You're 100% vulnerable, and you've both hidden a warning for an actual attack and trained yourself to ignore all such warnings.
Then it should say so.
"Click here to never show this again"—never show for this certificate? For this connection? For any connection?
Be clear and specific, even if it takes three more words.
I'd put it differently. Realize that your app is not the center of the world. Your users have things to do, and using your app is a small fraction of their life. Even though it might be your baby and you might be spending all of your waking hours thinking about your app, the user's perspective is different.
Thereby forcing all of their users to learn the new UI immediately, not even with due warning.
One or two platforms doing this in a week or so is a pain. But when lots of platforms do this in a short period of time, it can be "throw hands up in the air and do something else". :(
but this is functionally the same thing. whether the user is an idiot giving you 100% of their mental capacity, or a genius giving you 1%, you have to make the same decisions when you're designing a product. "Users are stupid" isn't necessarily a judgemental thing meant to demean your users, just a reality that you have to accommodate the people who are going to use your product in a way that's indistinguishable from a stupid person.
You can't assume that anybody will remember anything they did in a previous step, or that anybody will correctly interpret the label on a button no matter how clearly you think it is worded, or when presented with multiple options will be able to correctly choose the one they want.
Say that, unbeknownst to you, 90% of your users are non-native English speakers. Is a non-native English speaker "stupid" because they misinterpret the simple English idioms, metaphors, and symbology sprinkled throughout your product?
A group of actually-stupid-but-native English speakers might exhibit the same behavior.
You hypothesize that you have a large percentage of non-native English speakers using your app and change the design to accommodate them. Behavior improves, distinguishing the bulk of your users from merely "stupid" English speakers.
The problem is that if you conceptualize your users as "stupid" the hypothesis that something else might be going on _never even enters your head_. You never do the experiment. Or, if you do, it's not faithful to an alternative model and is basically a wild stab in the dark.
Imagine your user wasn't able to sleep on a trans-Pacific flight in bad weather, has hay fever, and has a very sore neck.
Better yet, get user studies.
An idiot giving you 100% will never learn your product.
A genius giving you 1% will eventually develop a highly optimized way to use your product.
If we assume all users are the former, we never design for the latter. That pretty much sums up the last 20 years of de facto UX.
And if you disagree, I'd recommend you go find a call center still using mainframe apps, and watch a random sample of 100 users.
It was hilarious watching some of the reactions in our team when a client gave us a screenshot that was just boxes with numbers in them arranged neatly with no more than red/yellow/green for colors. Some saw ugly... I saw enhanced productivity.
Personally I think more UI/UX designers need to spend time doing data entry with time constraints so they learn the pains of repeated useless actions and time wasting features.
perhaps, if the user is a continual user. but most users of most products are infrequent, and if your UI relies on being learned over time, it's probably not going to work for most people.
If you're working on SaaS that is meant to help someone make money or work faster, there's a good chance you'll have people using your software as a part of their dayjob - i.e. 2-8 hours a day, 5 days a week, for years. If you treat your users as idiots and don't implement space for them to grow (i.e. power user features), you'll be wasting a lot of life for a lot of people (and a lot of money of their bosses).
That's a heuristic I'd like to see being used in UX design: assume your software becomes a tool in some company, and you have full-time users spending their workday on it. Design to that group.
Which, yes, probably. But there are more goals in software development than hyper growth.
In most situations most of the time, your code is between someone and something they want. Your main importance in their life is when you fuck it up, not when you get it right. You just aren’t that important. The user is.
Which is partly why we need books championing good design. Because good design gets out of the way, and would go mostly unnoticed if not for other designers and aficionados giving them kudos.
I disagree that you have to make the same decisions. I think you need to design your product contextually. Sometimes, users might be giving you 100% of their capacity, sometimes they will only give you 1%. It depends on context and motivation. I'm working on a personal finance app. Sometimes, users are in a "I just want to make sure that latest transaction isn't fraud"-mode and you'll have seconds of attention. But sometimes (albeit much more rarely), users are in a "I want to sit down and think about my financial future mode".
> "Users are stupid" isn't necessarily a judgemental thing...
I hope not, but unfortunately, I've often found that it is.
At least in my experience, the two ways of framing users lead to different behavior of development team.
The seminal work on interaction design isn't Don't Make Me Think, it's About Face. About Face makes the general rule that you should not design for beginners or experts but for the perpetual intermediate.
A developer might spend a huge amount of time and effort modeling a workflow or a process into something crisp enough for a machine to process. That might take months of what feels like pulling teeth from people to extract requirements - people who are supposed to be domain experts. By the time he's done, he probably understands everything better than the people whose job it is to use the tool - the tool that now enables them to ignore and forget about parts of their job and complete it with 10% of their attention.
"Stupid users - they barely understand their own job and I spent months learning the finest details about it to build them a tool so they can keep doing it without understanding it at all."
Not saying it's right, but I can see how it happens.
Right, we also need to remember that simple is not better than possible.
If your interface is too simple and feature limited that your user can not complete their task, then you have gone too far.
We want to reduce friction as much as possible while still allowing a customer to solve their problem.
Dang, that's just the most perfectly succinct phrasing of (an answer to) my pet hate in software/developer attitude/life in general.
But it also seems that the 'possible' facet is just nowhere near as financially lucrative as 'simple'.
If the underlying model is very powerful but so complicated that it's very difficult to convey it to anyone, is it of any use? "Financially lucrative" in that case is simply a proxy for "can I get anyone to understand what is even going on?"
I've built an online pub quiz, and I'm doing as much hallway usability testing (in the Zoom hallway) as I can. Early on, every single one ended up with me changing things from being more flexible and powerful to being simpler. Either the underlying model, or just what is being presented/allowed in the user interface at any given time.
There is just no point in trying to provide lots of options instead of a well thought out middle-of-the-road path of obvious next steps if no-one will be able to get around to doing what they came for. People can't become power users if they aren't users.
UX design is so hard, but luckily it's also very interesting :)
Yep. It's probably over used as an example, but I always come back to pinch-zoom. People don't care about your UI, the best you can let them do is directly manipulate their data. Anything else is a compromise. That compromise is always going to be there, but we need to minimize it.
MS Office has turned into a sort of industrial machine with a control panel. You can do anything to your data if you know what sequence of buttons to push to tell the machine what to do. Actually that peaked several years ago and they have been getting better about more direct access to features. OTOH they also have so much functionality that such UI will never go away completely.
On a related tangent, I feel like tools for programmers suffer even more from this. Because we're used to editing text files to get things done, the notion of editing a config file or writing a script to get something to do what you want doesn't seem that unreasonable to software developers. Maybe it should.
Text files aren't perfect, but they're hard to beat. It's not like the hardest part of programming is actually syntax, anyways. You can always improve on this by providing better assisted editing.
If there's anything I'd actually want it would be for software to have pluggable configuration so I can instruct it to pull its configuration from a database or something along those lines.
edit: and on that note, it’s probably the command line interface that actually would strongly benefit from some consideration. CLI arguments are currently a flat unstructured list... but almost everything at least has key value pairs.
Alternative views into those same text files would be nice. You could color-code the type of a snippet (yaml string/float issues) or nested blocks. You could offer a graph view of dependencies (context-dependent -- could be services, function calls, modules, docker base images, etc...).
Text files are nice (and I've yet to see anything that would be a good replacement for most uses), but that doesn't mean we can't offer extra tooling/views into that data when it would be more convenient.
Are they real-world problems, like the examples given above? What is more important?
Someone once said the best interface is no interface.
If I do not have to think about an "interface", if the program is doing its job without requiring interaction, then that is more time I have to focus on what is important.
Not all software can be like this but a lot can. As the article indicates, there is an enormous amount of "forced interaction" in today's computers. This is in part because companies that employ interface developers rely on the online advertising business to make money.
I think about it every time I'm forced to use it. I think about how much I hate it, how constantly clicking between tabs slows me down, and how much more productive I was without it.
For mainly this reason, I still use Word 2003 and have no plans to switch until the fad passes and someone in UI at Microsoft finally stops designing for the lowest common denominator and remembers the point of toolbars wasn't to replace menus, but rather to put commonly used functions one click away.
I have the latest version of Office installed for when it's needed, I simply prefer doing my serious Word work in '03.
I don't think I'd hold up any version of Office as the paradigm of "internet safe" software.
a recipe for rote disaster
I'm a dev and I can't bring myself to cringe at a user's failure to operate a piece of software.
Even those with a solid enough mind and education will have issues; a French teacher told me she couldn't bear the improper use of verbs and nouns she sees in daily usage. It's a gigantic mess.
Though from a practical PoV those two states are often hard to distinguish. Very similar hand-holding is required in each case.
> Users have tasks to concentrate on. They don't have the attention left over to deal with your app's crappy UI.
This is very true. We provide accountability/competence/compliance management software for regulated industries (mainly savings and investment banking ATM) and for most of our users touching our software is a secondary task linked to their main roles. Even the day-to-day admins and UAT testers for new releases, are seconded from other areas or worse effectively being asked to use our software on top of their usual roles (an extra task given with no extra time assigned or other tasks paused).
Distractions include sudden popups that aren't the result of an immediately preceding interaction (looking at you, iOS). And those login forms. No one ever wants to see a login form for something they've already logged into from this browser/device.
Also, it might seem silly to point out, but use your goddamn product yourself the way your users would. That alone really helps with many UI/UX issues.
Therefore, it shouldn't distract from the task at hand.
I have to admit the first thing that came into my head after reading the title was “well actually, some of them are, dude”. There’s a reason why no-one really wants to work the hell-desk and act as the front line for user support. There’s a reason why websites like ‘Not always right’ exist. There’s a reason why Reddit has /r/TalesFromTheFrontDesk...
I think it’s as much a disservice to assume that all users are enlightened angelic creatures, who only need that little pointer to go their own happy way, as it is to assume they’re all lazy and stupid. Sure, start off with the helpful approach, but shutting down people who are only there to scam a deal or make themselves a nuisance would go a long way to curbing that behaviour.
Stupid, on the other hand, is something that needs to be coped with well - and good design can certainly help. Stupid needs the hand-holding, because there’s nothing the client can do to help it. “Stupid” is also often just unfamiliarity, so good hand-holding will prevent a repeat of the situation.
Lazy/entitled I have no truck with. If you’re not willing to help yourself, I’m sure as hell not going to do it for you. I’d never make it past the first day in customer support...
Though, I sorta object to treating them all like they're stupid. There are stupid users, and then there are users that are stupid because you treat them like they are. Let me explain.
When I worked in healthcare I was pretty stupid. It wasn't that I didn't understand the mechanics behind it, but it just didn't click with me because the training I had was deplorable. I was regularly yelled at for making mistakes, and didn't have a really supportive environment. So in my haste to ensure I did everything quickly, and didn't make any mistakes it actually caused me to either overthink or underthink certain steps due to anxiety and nervousness. So in other words, when someone started treating me like I was stupid I was more prone to acting like I was stupid...
So when I worked at a help desk, no matter how stupid a user was, I always did my best to treat them as kindly as possible. In the end, what wound up happening is that I got much better results than my coworkers who wouldn't treat them respectfully. That wasn't even my super weapon, though. It was anticipation of future problems...
I always really liked Roman Mars's perspective on design. Good design is effortless. When you go through a door with a handle on it and pull that handle, you feel stupid when you see that it says push only. So in my own way I am always trying to view things I do in IT, whether programming or help desk, from a user-centric perspective. If I think ahead of time about what sort of potential issue someone would have, and I can change something to prevent that, then I have much better results.
This is something I see being a problem all up and down the entire stack in IT. It is time to stop treating our users like idiots. But it’s also time to start returning to idiot proof design.
You wouldn't accidentally hit a switch in your car and suddenly only be able to turn left, with no indication of why, and no transitional information given during the switch. Yet programs and websites will happily modify state without so much as a brief text flash saying "Oh hey, you're in Paint Only Mode now", for example.
My absolute favorite way to deal with this is small status indicators in some kind of toolbar, that when I hover over it, it gives me a tooltip on what it is and how to toggle it.
Accidentally hit debug mode? No worries, a little debug symbol just flashed briefly, I can go look for it in the toolbar and hover over it to figure it out.
There is a high degree of irony to this when you think about the extremely popular Unix philosophy, which is kind of the antithesis to what you're saying.
I know that it's meant for power users, but even for those in many cases it's still bad UX.
If so, that's completely different. It just states that after the user changes something and the thing actually changes the way the user expects it to, you should not interrupt the UI flow. It has no relation at all to hidden state.
State on a classical Unix system is pretty much entirely explicit.
(Also, composability usually implies a lack of hidden state - in Unix-land, state issues may just come from the baggage of the C language.)
If you look at all modern tools, this approach has been abandoned, especially for long running operations, since it's obviously bad UX.
It is more about a lack of feedback than a hidden state. It is a problem but not the same one. Talking about Unix tools, the hidden state would be more like environment variables like the PATH or locale.
Related: cmdline arguments are essentially lexical binding at process level. I wish there was an idiom there like there is in Lisps (Common Lisp in particular), where a function would take an optional argument whose default value is the value of a dynamic variable. In process terms, that would be declaratively specifying that an optional cmdline parameter takes its default from an environment variable.
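That idiom can be sketched with Python's argparse; the flag and variable names here are hypothetical. The precedence is: explicit flag, then environment variable, then built-in default.

```python
import argparse
import os

def build_parser() -> argparse.ArgumentParser:
    """Flag > MYAPP_REGION env var > built-in default: the process-level
    analogue of a Lisp optional argument defaulting to a dynamic variable."""
    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--region",
        # The env var is read when the parser is built, so the default
        # picks up whatever the calling environment has bound.
        default=os.environ.get("MYAPP_REGION", "us-east-1"),
        help="deployment region (env: MYAPP_REGION)",
    )
    return parser
```

One design note: reading the environment at parser-build time (rather than at argument-access time) makes the effective default visible in `--help` output, which keeps the binding inspectable.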
What I really want is a generic system of algebraic effects for processes so "read configuration" becomes something I can inspect or wrap with my own handler.
Which is why the same tools do not have silent failures. And OS failures aren't silent.
> If you look at all modern tools, this approach has been abandoned
Thankfully, you are wrong. Modern tools have indeed added optional noisy modes, nearly all opt-in.
Thankfully, there are no bugs in our software :-)
> Thankfully, you are wrong. Modern tools have indeed added optional noisy modes, nearly all opt-in.
Almost everything network facing (which is almost everything, these days), is verbose by default. Git, curl, etc.
Yep. Bugs are why OS failures are not silent.
But you've got a point. Network facing software need some progress confirmation.
There are many people who have little interest in the underlying “how” and “why” of how computers work. They haven’t been educated; more importantly, they don’t want to be educated. They want to focus their time and energies on (what they regard as) more important things.
I have two bright teenage girls who need to use computers every day to get school work done. But their focus is art and reading, and writing. Their eyes glaze over whenever I try to explain why their computer is behaving the way it is.
My eldest did an amazing project in just a couple hours the first time she ever used Photoshop: removed a person imperceptibly from an old scanned image, realigned it to be upright from a skewed scan, recolored it to make it look present-day without oversaturating it.
Most of this she learned from in-app tutorials. But I had to show her how to save the file and explain file formats and the difference between cloud and local storage. She just didn’t care about that part any more than she needed to in order to turn the assignment in.
But how to use Photoshop is like another form of sketching or painting, so she ate those tutorials up like chiclets.
Yet how often do you ask a waiter about the challenges of scheduling the maintenance of a commercial refrigerator?
That sounds like textbook selection bias; users will usually end up at helpdesk only when they're out of their depth.
Interactions with users were more common in my position than just when they needed help. I provisioned all the computers for the company I worked for. I also upgraded any software, usually manually for whatever reason, and dealt with a lot of the day to day maintenance.
Like I said, users are not universally stupid, but when they are they can be especially stupid. A lot of it could be mitigated with better design. A lot of it could be mitigated by critical thinking on behalf of the users.
Tech support pros saying users are all dumb is like brain surgeons saying everybody has brain damage.
This resonates with me because it's something I try to do all the time.
The XKCD "Ten Thousand" comic really spoke to me and ever since reading it I've tried to treat people as 'temporarily inconvenienced geniuses' rather than morons when they ask me for help and it seems to work wonders.
If you enter requests for help with the mindset of "let me show you how to solve your problem" and not "let me solve your problem" you end up speaking to people as if they are an equal which always seems to provoke better responses
Of course the reward for being good at digging holes is more holes and to coin a malaphor if you teach a man to fish he'll come annoy you every time his boat won't start, but it's also pretty nice to be the person everyone assumes can fix anything.
(Also it's a great outlet for the typically annoying desire to solve everyone's problems)
But! On the other hand, if I ever needed something it wasn't that hard to get someone to help me. Kindness in my experience has always multiplied. There is, as you said, always something that someone doesn't know, including yourself. So there was always something that the users were willing to teach me in return.
PS. I love this comic. When I am finally done going to school maybe I'll hang that in my cubicle or something. I estimate with all of the programming I have done for college, and all of my personal time on side projects that when I am finally ready to get back into IT again I'll end up being stuck in help desk again, anyway...
This is very true and why I love mentoring people. Once you have shared knowledge with someone, they usually want to return the favour and you can learn a lot from that.
> I love this comic. When I am finally done going to school maybe I'll hang that in my cubicle or something.
Nice! Prior to that comic I had Hanlon's razor in a frame above my desk which uh, while a good sentiment isn't the most positive mindset.
Cheesy (but true) anecdote: As I was writing my article this morning, there were two distractions. First, I tabbed to Facebook where I saw one of my "friends" had posted something political that was both factually incorrect and also very offensive. It reminded me of All the Bad Things, and made me doubt my argument, to the point of almost just scrapping the article. Not just because that FB post was an example of someone being stupid and lazy, but because I had mindlessly interrupted myself and tabbed to Facebook and was about to engage with that post.
My second distraction was my 2-year-old, who had just woken up and was tugging at my sleeve. And I thought about the world I want him to grow up in and the view I want him to have of it.
I know I can come across as idealistic (hell, I've even gotten that as formal feedback in a performance review—but I've also gotten feedback that I'm cynical, so shrug). But honestly, one of the wonders of us as humans is that we can carry all these different capacities. We can be smart and motivated and social. We can be stupid and lazy and anti-social. All those capacities are worthy of admiration, not contempt. Technology is a great tool, we can decide where we focus it.
We should instead approach issues in our lives with this “how the world should be” attitude, instead of “how do I make gains out of the world that is”. This means treating people with respect and courtesy even when we believe they are wrong. We work with them to achieve a better result, because lack of knowledge is not equal to stupidity and laziness in all cases. For instance, I do not have a medical degree, but I appreciate not having doctors talk down to me. Work with me, help fill in the gaps, don’t pander with hand-wavy explanations. If it requires too much time in the moment, then point me to where I can independently find the information and study on my own.
In other words, live by the “means” not by the “ends”.
All too often I've found people dealing with external users become calloused. We forget that questions with obvious answers (to us) may not be obvious to others, and that this doesn't imply that others are stupid.
I feel that it's our responsibility to design things to be as obvious to as many people as possible. I also think a default attitude of respecting the users of our products can go a long way to making this happen.
Yes, sometimes the action is really frequent or it's part of the main workflow, and having a shortcut for it is good, but I think one often ends up with a more resilient tool by allowing the user more control so they can recover from mistakes and exceptional cases you didn't think of.
A recent example is an app I'm working on that involves helping the user work through a list (they do a task for each item, then move to the next one, and so on). It's a high priority for the user to get through the whole list efficiently, so the PM wanted us to have only a "next" button because showing other buttons "might be confusing." The thinking was that it would streamline the workflow if the UI is minimal: do the task, click "next", do the task, click "next", etc.
This is the kind of situation where I really want to give the user more control — if we provide "next" and "previous" buttons, they can go back if they made a mistake, or check one of the earlier items and then return to what they were doing, look ahead to see what's coming up and plan for it, etc.
I have that attitude towards things like expense filing software. It's a nuisance that I don't want to spend time learning, and it doesn't come up often enough to learn it well. So my click stream probably looks stupid or lazy.
Have you ever taught someone to drive? They have their attention completely occupied by ... everything at once.
After a while you can see the pattern. "turn here. right here. WAIT TURN HERE!" "here?" "no, back there."
It's not that they're stupid, it's that you have the pattern ingrained in you and you don't remember what it's like to experience something for the first time.
Now yes, there are idiots - in terms of social interactions - they lack maturity or patience or speak too quickly (or are having a bad day). But don't lose compassion for 99 people for the one unenlightened person.
Tangentially related, I’ve noticed some online products almost purposefully make their websites less appealing to filter out certain types of users. I think landing-page copywriters call it “speaking to your audience,” but it’s the same thing: you’re selecting for the kind of customer you want.
Maybe they clicked forward from a previous screen and were presented with an active text field and the "send code" button, but didn't recognize it as a button, or misunderstood its purpose, expecting the code to be delivered automatically, or thinking that it meant "send code from the user to the website" (i.e., after they have typed it in).
Whatever the details, a bunch of users have an expectation that doesn't match the tool's reality. It's the job of the designer to recognize or discover that gap and find ways to remedy it.
While I’m sure your “send code” button was visible, apparently that’s not what they expected. Maybe they expected “Send MFA code”, or “Text me my code”. Maybe the terms you are using don’t match up well with their limited understanding of the domain problem.
Sounds like a problem A/B testing or simple user interviews could help with.
These days it seems just as likely that some other EvilCorp™ with more money than you has done a better job hijacking your potential users' attention as that your users really do have something better to do. But either way, I think the "focus on the user" kind of mantras painted on the walls of so many tech offices have been twisted into something horribly exploitative or beaten into irrelevance by rote repetition without thought. So many "products" being "sold" to users these days aren't really products at all, in the sense that if they do offer the user something of real value it's almost by accident. There is no value, absolutely none, placed on building a quality product for its own sake.
If you're all tangled up in this mess and "users are stupid and lazy" is what lets you sleep at night, I don't blame you.
I spend a lot of time turning off features that were built to distract me. Notification badges, unread item counts, related content, news feeds put in places where they don't belong etc.
Some simple tasks are made deliberately difficult by this. For example, why is it so hard to get just the weather, without any bullshit?
Related is my issue with most "data-driven" companies these days. Extensive telemetry and A/B testing isn't "being user centered", it's just the feedback leg of the control system over your users that you're designing to mine them for all they're worth.
An interface is humane if it is responsive to human needs and considerate of human frailties.
There is a temptation as a designer/programmer to present the world with MY PERFECT VISION of what a piece of software should be like. Nothing wrong with that, as far as experimenting with and presenting new ideas. But a mindset that produces better software is one of serving the needs of the users. Of course there is an entire profession in identifying the "needs" and studying the "users". But the mindset: "we build tools that enables users", is the first step.
Far more insidious and destructive to the general computer experience has been the rise of businesses whose entire purpose is to produce software that exploits its users. I don't know the solution to this phenomenon, but I find it repulsive and strive to avoid this software as much as I'm able. Raskin's dedication for the above book:
We are oppressed by our electronic servants. This book is dedicated to our liberation.
Might have been tongue in cheek then but feels disturbingly serious now.
Now. Am I really saying that users are stupid? Of course not. Saying that a user is stupid is just an assertion that you really have to design as if users were highly incapable of inferring complex mechanics or interactions, because the reality is that there’s always a subset of users that legitimately won’t get it.
I get what the author is saying though. Referring to a user as stupid is derogatory, but when I have personally used that phrasing I always include myself as part of that group of users. I don’t know anyone that uses it differently.
It’s us collectively as software users who are stupid. It’s not a direct insult to a specific user or cohort of users. It’s just a token to assert that your software should be “stupidly” easy to use.
If someone has used this phrasing beyond its rhetorical meaning, then they don’t have the empathy to design software. Simple.
I just believe that nobody uses this phrasing to personally insult another person, and that we shouldn’t be focusing on such trivialities. Designing software has more complex ethical implications than trying to poke holes in your communication style.
Again. Most people that use this phrasing do it to assert a point. I don’t think I have ever seen this phrasing used in anything formal. This is just a desk phrase to say that a particular thing is too complex. Seems really unnecessary to even argue around this, honestly.
We're several generations into the field of designing user interfaces. As with all other saws and pieces of folk wisdom, when you learn an adage in the context of the knowledge and experiences it summarizes, you understand it's just a shorthand, a helpful reference in your mind. But if you learn the adage without that context, you derive your own meaning that's not necessarily the intended one (or you just dismiss it as trite).
It's a problem as old as human societies: you have to live through relevant situations to understand a piece of received wisdom. There is no known solution, but it still helps a great deal if an adage is phrased in such a way as to minimize misunderstanding by people who hear it without the experience to understand it. "Users are stupid" fails that test.
> If someone has used this phrasing beyond its rhetorical meaning, then they don’t have the empathy to design software. Simple.
If you assume that engineers are also (rhetorically) stupid, then if they regularly don't realise that it's a metaphor, does that not mean we need to change the way we communicate to help them understand the system of design?
If you optimize for sheep, you get sheep
But, I realized that I'd taken for granted that they were "experts users", not dummies, and that they would easily figure this stuff out. Heck, they might even appreciate not having their hands held! Unsafe assumption.
From this experience, I took away the lesson that users may not be stupid, but you won't go wrong pretending they are and taking the same precautions you would if they were.
- When engineers design software, we design it with the mindset that people like using software and will appreciate all the cool features.
- When non-engineers use software, they do it because they need to use it, not because they like it. They have other stuff to do. If anything it's an obstacle to what they need to do, so they interact with it as little as possible to do what they need.
So, imo that line about respect for your users was spot-on. Don't get in your user's way more than you have to.
I always design from a perspective of having experienced a lot of truly horrible software, from websites that require spelunking to find the nugget you came to find to HR time-tracking applications that don't even allow you to define a standard week.
While I do assume people like using software and all the cool features, that stance takes a backseat to thinking about how to:
* fit flows to what they actually want to achieve
* minimise the amount of data entry and clicks
* produce meaningful messages (errors, tooltips and descriptive text)
* not overwhelm the user with features irrelevant to the current view/flow.
I'm by no means particularly knowledgeable in "formal" UX, but I find focusing on usability does make users appreciate your software more. YMMV
That's been known to happen :)
And if there is anything that my life has taught me, it is that potentially everyone responds differently to the same stimulus. There are few "universal" things about people and so generalizations are difficult if not impossible with regard to how people will react.
Understanding that can help avert the 'dual user stories' quagmire that some products find themselves in. Imagine your text editor's product development team is half hard-core vim users and half hard-core emacs users. Maybe you put together this team because you wanted a product that appealed to all text editor users, or maybe it was just thrown together; however such a team was formed, my experience is that it will produce a substandard product that will not appeal to anyone. This seems to happen because the conflicting visions of "a good editor" within the team seep into the product, and the users of the result are often confused, as different parts of the same product seem to have different philosophies about how the product should work.
That said, I'd love a product like 'Monarch Money' that was a) not subscription based and b) not cloud hosted :-).
Anecdotally, of course: just a survey of my general social group, of which roughly 40% chose to have children, and pretty much half took a "self determination" route in child raising while the other half took a "guided/trained" route.
Author here, reading, and thanks for this, I really enjoyed your comment. And the 'dual user story' problem (aka split-brain) is so true, I wish someone had told me about it earlier in my career.
> That said, I'd love a product like 'Monarch Money' that was a) not subscription based and b) not cloud hosted
We chose a subscription-based business model to try and align ourselves with users. If I had to write a second part of my article, I'd write something like "Your product becomes your business model". I've worked on enough "free" products to know that they lead to the very dynamics I criticize in my article (there's another comment in this thread that summarizes that quite nicely). As for not cloud-based, I hope that's something we can deliver some day down the road, but for now, cloud-based products are the easiest way to deliver the type of cross-platform product we're trying to build.
On the product: I'm not interested in a 'free' product; I'd pay you $300 for such a product and buy periodic updates for $100/year. I tend to do this with all 'tools' type applications (illustration, CAD, simulation, etc.). I love having the company invested in making it something wonderful, and open source products, with their by-necessity indiscriminate addition of developers, suffer for that reason.
My concern, especially with financial software, is that your cloud gets compromised and I and all of your customers are scrambling to avoid losses. Frankly I am amazed that people can get liability insurance for that sort of thing given the stakes!
That said, Quicken and Intuit in general, really needs competition to help them focus.
Edit: And per the summary comment, I am not suggesting a 'freemium' or ad-supported model, just one that keeps the data where I can be assured that if someone else lets their guard down it won't compromise my data. That is the big weakness with 'cloud' in my mind.
FreeBSD tries really, really hard and comes close. I expect it would really shine if it had a revenue stream to support a full-time product management team and lead developers.
What you're saying makes sense and is definitely something worth thinking about. We do take security very seriously, of course, but I understand your concern about data still living in the cloud.
Thanks ahead from one parent to another.
Among my friends, there exists generally two very broad schools of thought with respect to raising children.
One school of thought, which I relate to self-determinism and is how I raised my kids, holds that as a parent you have no idea what the "best way" is for your kids; they aren't you. As a result, parenting is a process consisting of three things: teaching your children the underlying reasons and rationale of the world; sharing with your kids the choices you made, why you made them, and what you learned; and finally, helping them explore different things while giving them a framework for capturing any new learning or understanding that their exploration yielded.
The other school of thought, which I think of as "You'll do better than me because I'll tell you how to be a better me," is about using your understanding of the world and your experiences in it to create a path for your child that will be more successful than (or as successful as) your own path. Parenting in that case focuses mostly on setting the direction for your children, giving them goals to help them evaluate how far they have moved along the path, and correcting them when they stray from the chosen path.
Both styles fail at the extremes. For self determinism that would be completely disengaged "observer" parents, and for guided path, that would be over functioning "helicopter" parents.
Parenting is a big job, but you have to remember that you will be lucky if you get to spend a total of five years' worth of time being with them. Planned-path parenting fails kids who are completely different from their parents (in my experience), and that situation can be seen in people who graduate from college with a degree in X and then completely change their life after they have run off the end of the list of goals their parents gave them.
Anyway, my 0.02 for what it is worth.
Assuming at age 18 they go off to college: from about 3 to 5 years old you get to spend all day with them, then half days, then maybe a quarter day plus weekends, and then mostly weekends until they are gone. Chronologically it is nearly two decades, but in terms of face-to-face time, especially time where you can talk and really share ideas and thoughts, it is incredibly short.
Knowing your audience, and how to create beginners is critical. Helping them get up to speed as quickly as possible is the goal, not assuming they are stupid or lazy.
This article was plausible to me, up until I realized it might not account for users who do not have digital literacy or capabilities like the writer's own. There might be a bit of a blind spot there, ironically.
Many of our major services started the same way.. Facebook, Twitter, Instagram all started as very simple services for people to learn new digital experiences and interactions.
With these social media digital interactions as a foundation, maybe it's possible to place more complex initial interactions in front of some users.. but I feel that the digital alienation of people is real and one of the things that may be fueling the divide in society when it comes to access to opportunity.
This post by Google's CEO articulates the alienation that not helping create beginners can fuel: https://www.nbcnews.com/think/opinion/digital-technology-mus...
"The computer “user” isn’t a real person of flesh and blood, with passions and brains. No, he is a mythical figure, and not a very pleasant one either. A kind of mongrel with money but without taste, an ugly caricature that is very uninspiring to work for. He is, as a matter of fact, such an uninspiring idiot that his stupidity alone is a sufficient explanation for the ugliness of most computer systems. And oh! Is he uneducated! That is perhaps his most depressing characteristic. He is equally education-resistant as another equally mythical bore, “the average programmer”, whose solid stupidity is the greatest barrier to progress in programming."
What I do see a LOT, though, is designers disrespecting users' time. Many see their product as the center of the world, expecting and kinda forcing the user to spend much more time and energy with their product than is really needed.
I am instead going to propose what I think software should aim for.
1) Software should be safe to explore. Trying things out is how you learn. Accumulating learning is how you become productive. With that in mind, you should never allow the user to run a command that creates a situation where they can't revert to the previous state. When one wrong move can erase hours of work, your users will never become experts. (Historically, software has been absolutely awful here.
I think this is the number one reason why people are afraid of software -- the one UI paradigm they've learned throughout every new "UI innovation" is "one wrong move will destroy your work". So they tread carefully, and being careful is slow and painful.)
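The "safe to explore" principle above can be sketched in a few lines (my own illustration, not anything from the thread): every state-changing command snapshots the current state first, so even a fully destructive action is one undo away.

```python
# Minimal sketch of "every command is revertible": snapshot before mutating.
class Document:
    def __init__(self):
        self.text = ""
        self._history = []  # snapshots of prior states

    def run(self, command):
        """Run a state-changing command, saving the old state first."""
        self._history.append(self.text)
        self.text = command(self.text)

    def undo(self):
        """Revert to the state before the last command, if any."""
        if self._history:
            self.text = self._history.pop()


doc = Document()
doc.run(lambda t: t + "hello")
doc.run(lambda t: "")  # a "destructive" command: wipes everything
doc.undo()             # ...but one wrong move is still recoverable
print(doc.text)        # prints: hello
```

Real editors use more memory-efficient schemes (deltas rather than full snapshots), but the user-facing guarantee is the same: exploring can't permanently destroy work.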
2) A user that knows the jargon that describes some command should be able to run that command by name. Teaching users the jargon can be hard, but once they've learned the word, they should be able to at least run the command. An example is watching a YouTube tutorial for some operation that was recorded a few versions ago. Since the tutorial was made, someone moved every menu item around (hello, Fusion 360). Now the tutorial is useless -- you know the command you want to run and how to use that command, but you have to dig around in the UI looking for it. That's a big fail. (I called out Fusion 360 for this, but they have a command search, so it's not a big deal. The search could be better, like maybe telling you how to do it faster next time ("next time, just press X"), but at least it's there. I'll also point out that Emacs does this, and even tells you the shortcut after the command is over. If a 1970s LISP app can do it, so can your whizbang Electron app.)
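A minimal sketch of that "run by name" idea, loosely in the spirit of Emacs's M-x: a command registry searchable by name that also prints the shortcut hint for next time. The command names and functions here are made up for illustration, not any real app's API.

```python
# Commands registered by name, runnable by (partial) name, with a shortcut hint.
commands = {}

def register(name, shortcut=None):
    """Decorator: add a function to the command registry under a stable name."""
    def wrap(fn):
        commands[name] = (fn, shortcut)
        return fn
    return wrap

@register("extrude", shortcut="E")
def extrude():
    return "extruded"

def run_by_name(query):
    """Find a command by substring match (stand-in for fuzzy search) and run it."""
    for name, (fn, shortcut) in commands.items():
        if query.lower() in name:
            result = fn()
            if shortcut:
                print(f"Tip: next time, just press {shortcut}")
            return result
    raise KeyError(f"no command matching {query!r}")

run_by_name("extr")  # works even if every menu item moved since the tutorial
```

The key property is that the name outlives the menu layout: a tutorial (or a user's memory) that says "run Extrude" stays valid even after the UI is reorganized.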
Anyway, I'm not sure I'd understand how to use a door the first time I saw one. Normally you can't walk through walls, but with this innovation, you can. Weird! But once you see someone else use it, or someone walks you through it (literally), it's pretty easy. With software, we are creating new concepts like that several times a day. At some point, we have to accept that the user will have to try it out and learn how to use it. There is a first time for everything, after all.
When I was a kid, I was surrounded by computer-illiterate adults. The biggest fundamental difference between their approach and mine, was that they were "afraid to explore" and could only understand things if they were provided in the form of a prescribed set of instructions. Meanwhile, I was comfortable casually exploring, and thus gained a far better intuition for how to actually use things.
You wouldn't. You'd see this great, menacing piece of matter in front of you. If the door was closed, that would be it for you. But if you saw it being opened, or better yet, encountered it open, then the fun would start.
You'd touch it and it'd move. You'd notice it has inertia. You'd grab it by the side or push, and see it move more. Eventually, you'd close it by accident, and then trying to move the door back would fail. You'd cry. Your parent/caretaker would come and open the door for you. You'd grab it again, and close it again. Then maybe you'd play a game with your caretaker, with them holding you in their hands and opening the door, and you closing it with a strong push, laughing at the noise. Some time later, you'd figure out how the door handle works.
Source: I have a 15 month old daughter. This is how she learned how to operate doors. Nowadays, when she doesn't want some thing we want from her (e.g. getting dressed), she sometimes runs ahead and closes the door behind her. She understands the door in more abstract terms now. But it took a lot of trial and error.
A door I needed to open for an elderly person who didn’t have the strength to open it (fire stop door with strong closing spring - a potentially fatal UI).
A door that needs two hands to open - one for key (spring loaded) and one for the handle. My friend with one arm complained about it because they wouldn’t be able to open it.
Teaching someone how to use a touch unlock door on a car (great UX once learnt, but multiple weird UI interactions that one needs to learn to use).
An automatic door that wouldn’t open even with face up or waving hands (I’m guessing I was close to background temperature, but who knows why? Took 20 seconds)
A door that seemed to be locked (needed to push harder? Or push handle harder? Someone else then opened it).
Many locks are very unobvious which way the key turns (e.g. on my car anticlockwise unlocks the drivers, clockwise unlocks passenger: I regularly still get it wrong).
So many shop doors that are unclear whether you push or pull.
A shop door that says “PULL” but I don’t realise I am reading it from the reverse side so I pull when I should have pushed... (I often don’t notice I am mirror/reverse reading something).
Thank you for this.
Also, even better products have ways to signal how they are providing control. Sadly, UIs have reduced the means of doing so in a standardised, immediately accessible way over the years.
When wading through the actual issues "the public" has, individually, you'll find that the issues are actually real. Usually some un-turned nugget of wisdom that was not applied in a very specific way. Many people learn things in a variety of ways. Different learning methods don't make people stupid either.
Or actively malicious, which I ran into consistently when doing internal support for a company that employed a lot of pharmacists. There are plenty of even well paid people who are just miserable and will intentionally break things for self gain or even no reason whatsoever.
People are trash. I've made a career out of designing to accommodate that fact.
An example I like is Amazon's question feature. If you buy a product and somebody else has a question about it, Amazon will mail you and ask whether you can answer the question. When you browse Amazon, you'll notice that quite a few people don't understand what's happening at all. They believe that some person has directed a question at them personally, and feel compelled to answer, even if they have no idea. Amazon being Amazon lets those answers go live, and so you'll see answers like "Dear Mr. Doe, unfortunately this was a gift for my nephew, so I cannot tell you how large the item is compared to a banana". Others are using it for support, I've seen things like "Hi Amazon, I don't know. Can you please tell me how to find more from manufacturer X?"
"Users are stupid" is short and catchy, and it beats holding an hour long explanation session each time you want to remind somebody that they're not designing something to be used by people like them, but by society at large. And many users are, in regard to this technology, stupid.
Is anyone actually judging them, saying they're horrible people or something similar?
Take AndroidAuto: the order of the icons always changes on the front screen. That is so stupid I have no words for it!!!
Google Drive: you have to scroll up to the top so that new items appear at the bottom of the list.
And so on ...
The cure to this sort of thinking is two spoonfuls of empathy: one for others (truly putting ourselves into other people's shoes or lack thereof), and another one for ourselves (understanding our own issues better).
You might think that a GUI is a waste of your time because you can use CLIs, yet non-intuitive flags or help messages lead you to google time and time again the same awk or sed command (or whatever else). You're always someone else's stupid.
Meet people where they're at, not where you're at. That is by the way true for users as much as it is for your manager, your sales team, and so on.
Gedenryd showed the problems of understanding the user and designer as thinking computationally in his PhD thesis: https://www.semanticscholar.org/paper/How-designers-work-mak...
The user just wants to do what they want to do. Usually the motive is very straight-forward, and almost fleeting. A small subset of their life is affected by this product, and they can easily choose away from it. The relationship is thin.
The designer not only designs for themselves, but for other people. This includes understanding the world, but also has to do with social value. What you create, in many contexts, dictates how you are viewed. The designer cannot choose away from their own product. Their relationship is very thick, complex, and personal.
When the users do not understand your own brainchild, it is hard to lose your own ego. It is easier to keep your own ego and criticize those that threaten it.
ok, so you've created a model X for the emotional/mental state of the user.
This article is heavy on the psychology, and very light on any actual design advice.
Tell me how this actually influences the design in contrast to what one would design if what you are arguing against was true.
Certainly, an app for those struggling with depression should be structured differently than one for those struggling to beat their weightlifting personal record.
Doing anything other than the most frequently used operation (which would be on the application's main screen) now requires remembering which path it lives under in a series of menus that cannot all be on the screen at the same time.
"I think I came up with this slogan at Parc during discussions wrt children, end-users, user-interfaces, and programming languages. Chuck Thacker (the genius behind the Parc hardware) also liked it and adopted it as a principle for many of his projects.
So e.g. Smalltalk needed to work with children and end-users even more intuitively than (say) JOSS or Logo. But we also wanted to write the entire system in itself, so that those who were curious — especially later on — could “pop any hood” in the system and see a live program/object written in exactly the same terms as what the children were learning.
Similarly, the GUI had to be easily learnable by children, but — looking ahead — it had to handle “50,000 kinds of things we hadn’t thought of done by 50,000 programmers we hadn’t met” and be as simple as possible." Alan Kay
Neal Stephenson explores similar ideas in his "In the Beginning was the Command Line."
"Another part of this was that we were determined to have a very easy to learn UI would also incorporate end-user programming (scripting) as a natural part of it — in other words to combine what had to be simple yet possible with the programming language with what had to be simple yet possible with the UI.
The general zeitgeist against this idea — both back then and now. Basically: those artifacts that do simple things usually wall off next levels of complexity, and those that do complex things don’t do anything simply.
But, given that there have been some really good examples of how to do both, it’s hard not to see most computer people as (a) not caring, or (b) being lazy or unskilled, or (c) both." Alan Kay
Bonnie Nardi explores the value of this approach in "A Small Matter of Programming: Perspectives on End User Computing"
"Simple things should be simple, complex things should be possible. Despite good examples to the contrary, it's unfortunate that those artifacts that do simple things usually wall off next levels of complexity, and those that do complex things don’t do anything simply."
When I try to find the original source of a good quote, one of two things often happens: the person credited did not actually say it, and someone else did who has other insights to offer; or the quote is part of a longer passage that adds value to the original quote.
But when you create a platform that incentivizes lazy, dumb behavior from both your consumers and your content creators, you end up having a platform with mostly lazy, dumb content that mostly only appeals to lazy, dumb users, which then creates a feedback loop.
Twitter is a prime example of this. The limited tweet length, and the nature of their algorithm incentivizes the "content-creators" on the platform, I.E. the top 1% of users with a significant follower base, to tweet quick, low information, "hot takes" that mostly appeal to dumb partisanship. Also, the way "engagement", I.E. "views" works, incentivizes "zinger" style shit-slinging rather than actual discussion.
This ends up driving away people looking for interesting discussions, opinions, or information, reducing the "market" on that platform for a content-creator who wants to create informed or interesting content.
This feedback loop cycles over and over, until eventually, a platform that might have once provided interesting content ends up being a place that only provides the lowest brow kind of echo chamber drivel.
And this is how Twitter has ended up being a place where the most popular voices are on either side of a binary partisan line, either celebrating the ambush and shooting of two police officers or pushing the idea that anyone with even the slightest past criminal history has no human rights and deserves to be shot by the police.
- Ads with claims like "This single mom makes $$$ working from home, you can too!"
- Clickbait news titles. (You could define "clickbait" here as the most egregious cases, or as the sensational nature of headlines across the industry.)
- More subtle but pervasive manipulation, like $_.99 price tags, or arranging grocery stores not for customer convenience but to encourage customers to buy more things
- Even more subtle: The ubiquitous practice of advertising (Perhaps on a product webpage) the positive qualities upfront without mentioning the drawbacks or competition. Is that disrespectful, or just good business practice?
- Video game puzzles designed to be easy to solve. E.g. I could paraphrase some of the Portal levels with dev commentary as "We wanted to make this more challenging, but our playtesters kept getting confused, so we made the solution obvious."
- Junk food manufacturers micromanaging the sugar etc content to keep the user hooked and get just the right amount of good-feelings flowing.
- Restaurants loading every dish with salt.
Does the article apply to all these? Maybe? They're on a spectrum, with some more clear-cut as disrespectful than others.
There's another article to be written on those cases. One that points out these practices for what they are: immoral.
Skype would be a great example of an application that I think very much had it in the bag before the buyout and redesign. Windows 8 would be the largest example of the worst redesign of all time, and I dare say it will go down in history as the worst version of Windows (hopefully!), even beating ME, which at least didn't make the same rudimentary mistake.
The issue is that especially older users and non-tech-savvy users in general know how to use a handful of applications a handful of ways, and that's it.
I'm sure that on this forum in particular, we have all been the typical example of the son/daughter that some family member calls when they need tech help.
It's not about users being stupid or lazy, it's about designers having no sense of perspective and living in a bubble of the advancement of tech. We always think we need to adopt new things and it leaves most of our users in the dust as they are just not on the same level as we are, nor should we expect them to be.
Absolutely not. Simplicity is _beneficial_ - as the author argues and assumes - but it is not the alpha and the omega of product and interface design. When something has both simple and advanced uses, both a default mode of use and complex customizability, the design needs to balance the extent to which it caters to the different levels of use-case complexity. Just simplifying everything all the way makes the complex uses entirely or effectively impossible, and medium-complexity uses cumbersome and difficult.
> Because I respect users’ time, not because I look down on their intelligence.
The author of this piece (and perhaps the author of "Don't make me think", which I haven't read) respects only the simple desires of the majority of users. S/he does not respect the complex desires and needs of minorities of users, nor the potential for the majority-user to refine and develop their needs and desires.
No amount of work will make the inherently complex less complex, but it doesn't have to be more complicated than it has to be. That's what "as simple as possible" means. If you have to peel away a strange interpretation of the problem or a misguided attempt to "dumb it down" before you get to the complexity, it's definitely more complicated than necessary. Over-simplification is pretending the complex doesn't exist, or chipping away at some parts until it looks simple, or straight-up disbelieving that there are complex needs and forgoing them.
I haven't read "Don't make me think", but I have read other books by the same author and his general message is that everything should be clear and approachable such that you can tell what things are and spend your energy actually solving the problem or accomplishing what you want to accomplish.
> The author of this piece (and perhaps the author of "Don't make me think", which I haven't read) respects only the simple desires of the majority of users. S/he does not respect the complex desires and needs of minorities of users, nor the potential for the majority-user to refine and develop their needs and desires.
This reading pairs poorly with this quote from the article: "And I’ve found that good products, ones that respect their users, give them more control." If the author really was out to knee-cap software, that's an odd sentiment to hold. I think you're in agreement with the author, but somehow take offense at the word "simple", maybe taking it as meaning "stripped down in function". There are certainly a lot of products, services, companies and people who take it to mean that.
It's not about thinking the user is stupid... They are. But then again, I'm not a mechanic, plumber, or tow driver. I don't need to know all their "things"... But I know how to put gas in my car, change a tire, and drive a car.
I'm stupid after the very basics, just like most end users are. Turning it on, checking email, looking at Facebook: these are all things people just do. It's process.
Not everyone needs to understand specific errors or mpd.
Make the effort... earn the understanding.
The same person can build their own custom Emacs distribution with Evil mode, and at the same time not figure out how to use your web app made out of a couple of forms. Because she needs Emacs for her work, but it's you who's trying to convince her to use the web app.
I'll withdraw the comment. lexicality was correct, and I probably didn't write my thoughts well. I do believe in promptly admitting when I was wrong.
Why do you think that? The author seems to be lamenting that Krug's advice isn't followed more closely.
But I allowed myself to read your article colored by my own emotional cant, and I should know better. I can get a bit grumpy about what I consider the ham-handed approach to usability practiced these days.
Anybody who has sat behind someone not familiar with modern web UI conventions and watched them hunt through the UI just trying to log in will understand what I'm talking about.
Give people autonomy in something like a site builder and "trust their competence", and 9 times out of 10 you will end up with some fugly Frankenstein site which might be what they "want" but doesn't serve their goals.
It's a lesson in futility and frustration. It's the same people who never bother to learn shortcuts, opting to go to the Edit menu -> Copy every single time. My mother traded in her iPhone for an "old" candybar phone because it was too complicated in her mind.
In the end, identify your audience and build for familiarity while balancing usability aspects. This fetish for the holy grail of efficiency and simplicity is damaging IMO.
If you're designing for specialists in a space, you can make some assumptions.
If you're designing for the average user, and you have little information about their preferences, style, desires, or how they approach problem solving, you have to remember that half of everyone is below average. So if you want mass appeal, you're going to have to account for that half.
This is almost never meaningfully true unless you have a sharply bimodal distribution. Assuming a normal distribution, ~38% are right around average (within half a standard deviation of the mean). That would leave only about 31% clearly below average.
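Those fractions are easy to sanity-check with the standard normal CDF; here's a quick sketch (the half-standard-deviation cutoff is just an illustrative choice, not a standard definition of "around average"):

```python
import math

def phi(x: float) -> float:
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Fraction within +/- 0.5 standard deviations of the mean ("around average")
around_average = phi(0.5) - phi(-0.5)   # ~0.383
# Fraction more than 0.5 standard deviations below the mean
clearly_below = phi(-0.5)               # ~0.309

print(f"around average: {around_average:.1%}, clearly below: {clearly_below:.1%}")
```

So under a normal distribution, roughly 31% (not 50%) fall clearly below the "around average" band, though where exactly you draw that band is arbitrary.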
That’s what I think of LinkedIn every time I log in to read messages and accidentally start reading the god damn feed of utter business non-sense.
Actually, I prefer apps that allow me to not think too much. Because I am lazy.
Within the single example given— infinite scroll traps, specifically citing Facebook— the author did not present any real evidence that those designs were even partially driven by contempt for users' intelligence. It's a pretty controversial assertion that he treated as self-evident. Also, the author citing Facebook as a prime example of low-agency infinite scroll traps when services like TikTok and Instagram are far better examples hints he might not be very familiar with the current design landscape which he is critiquing.
Lots of people— mostly designers— have presented very valid critiques of infinite scroll. Generally, the goal of infinite scroll is to reduce the cognitive barriers to staying engaged with a product by having content just appear as soon as users look for it. It's incredibly effective. Too effective. Removing these natural pause points removes one of the ways users subconsciously gauge how much time they've spent on an activity, and removes the places that users tend to insert natural stopping points. For example, when reading a book, flipping pages and passing chapters both gives us a general sense of how long we've spent reading without having to manually track it, and gives us a nice place to hit the pause button and put the book down. I agree that infinite scroll is disrespectful to the user and poor design because it favors business needs over many users' need to not get sucked into scrolling through TikTok for a few hours without even realizing it.
Overall, this article seems to argue against a design perspective I've never once seen a designer present. (Support? Maybe.) I've not once seen anybody justify infinite scroll by saying users are too dumb to use something more complicated. Either this guy has worked for some companies with seriously bad design departments, or he's a bit overconfident in his layman's interpretation of designers' perspective and intent.
> the author citing Facebook as a prime example of low-agency infinite scroll traps when services like TikTok and Instagram are far better examples hints he might not be very familiar with the current design landscape which he is critiquing.
This felt a bit presumptuous. I am familiar with Instagram and TikTok, but I chose Facebook as the prime example because they were the _first_ massively popular site to implement this type of interface.
> Overall, this article seems to argue against a design perspective I've never once seen a designer present.
I'm somewhat envious of you, then. I've unfortunately worked with product managers, designers, and engineers who have said that users are stupid. Sometimes, it's well-meaning and in a sort of abstract form, meaning "we should make it so simple that even an idiot could use it". But often, it's unfortunately quite literal.
> he's a bit overconfident in his layman's interpretation of designers' perspective and intent
Again, this was more of a critique of software development than of designers' intent in particular.
I think that developers have just as much right to critique the design process as designers have to critique the development process, and I think it's just as likely to be useful. Sometimes it is, sometimes it's born from assumptions or misunderstandings. I specifically highlighted the benefits that arguments from outside the field can bring, and I said that I didn't think your article, specifically, was beneficial in that way. Frankly, I think you said this because you wanted to try and refute my comment on principle rather than refuting the argument that I made, which is that you didn't actually present any evidence of what you were asserting.
> This felt a bit presumptuous. I am familiar with Instagram and TikTok, but I chose Facebook as the prime example because they were the _first_ massively popular site to implement this type of interface.
Sorry if I was being presumptuous. It wasn't clear from your article that you were attempting to show a progression or highlight the history of these sorts of design decisions so your example, completely out of context, seemed outdated and non-representative. Personally, I didn't associate infinite scroll with Facebook more than Twitter or Myspace or Instagram, but maybe that's me.
>I'm somewhat envious of you, then. I've unfortunately worked with product managers, designers, and engineers who have said that users are stupid. Sometimes, it's well-meaning and in a sort of abstract form, meaning "we should make it so simple that even an idiot could use it". But often, it's unfortunately quite literal.
I have 23 years or so of experience working in technology across maybe half a dozen industries, and I too have heard the people who actually make this stuff say that users tend to behave stupidly, as a metaphor for them not being patient or attentive enough to interact with things that aren't immediately apparent— the sort of well-meaning, abstract form that you allude to. How did you know that in your particular situation it was intended to be literal? How do you know that bad design decisions were made based on that? How even could you know? Did they say they ran their copy through a language analyzer to make sure it was at a first-grade reading level when they knew they were targeting an adult demographic without diagnosed intellectual disabilities? With what specific criteria do you differentiate beneficially simplified designs— which you note are touted in Don't Make Me Think— and designs that are simplified out of disdain for users' intelligence? How does that difference negatively impact users? How did you come to the conclusion that this is pervasive throughout the industry?
So in this article I did not see a) that product designers, in general, actually disdain their users' intelligence, b) what concrete effects that has on design, and c) how those things negatively affect users.
> Again, this was more of a critique of software development than of designers' intent in particular.
In what way? You mention product design, highlight a design trend, talk about designers, and the only external work you cite is a product design tome. I did not once see a single reference made to software development, coding, or programmers.
I think this is the main issue here. You think this is about designers. It wasn't. I literally only use that word once, and I use it in the abstract. I consider anyone with input into a product as one of the "product's designers". They don't literally have to have that as their job description. A good example is someone like Alan Kay, whom I also cite. I use a lot of "we" and "us", to refer to the tech industry collectively. I never draw a line between engineers and designers.
> It wasn't clear from your article that you were attempting to show a progression or highlight the history of these sorts
I attempt nothing of the sort. I just chose the product I thought people would most associate with infinite scroll. FB, which by raw numbers is much larger than Instagram/TikTok/Twitter, and had an infinite scroll in front of hundreds of millions (billions?) of people even before TikTok was started, seemed like the most obvious example. And so I say "products like Facebook". Maybe MySpace had an infinite scroll first, idk, but MySpace isn't used by billions of people.
> How did you come to the conclusion that this is pervasive throughout the industry?
There's enough argument in the overall comment thread, with some people saying "oh, when people say that, it isn't literal" and people responding "well, actually, I've encountered times when it is". I think that proves that it does happen, and not just to me.
You took those three words out of context and avoided the larger point. You said in your last comment that this was about software development and it's not. It's about product design. This has nothing to do with the construction of software and everything to do with the designing of products that happen to be made with software. This isn't a semantic point. The whole point of what you are trying to say is that people who design software products— regardless of what you call them— think negatively of users and that makes products worse. All I did was ask for proof— repeatedly— and all you did was avoid the question.
> I attempt nothing of the sort.
Like I said, "maybe it's just me."
> There's enough argument in the overall comment thread, with some people saying "oh, when people say that, it isn't literal" and people responding "well, actually, I've encountered times when it is". I think that proves that it does happen, and not just to me.
This was a relatively minor point in a whole list of assertions that you made that you didn't back up.
I'm done. You're clearly on the defensive and not particularly interested in feedback or pushback of any sort. You're just taking individual bits of things that I said out of context and not even addressing the larger point that I've repeatedly tried to make. Have a nice day.
You asked me to prove that some product designers view their users with contempt, and I said the only proof I can provide is that others in the comment threads seem to have encountered it. You're free to disagree if you haven't; others in the comment thread disagreed as well (though without alluding to whether I was technical or not, or making assumptions about whether I was familiar with certain products du jour). A simple "oh, I've never encountered that" would have sufficed. It's not like you provided proof that that sentiment _doesn't_ exist either.
Anyway, peace to you brother. We're all in this tech thing together—product designers, software developers, exclusively technical people, exclusively non-technical people... we pour our hearts into our work, and I hope we can have respect for the people we build for. If you think we all already do, then maybe it's just good to remind ourselves every now and then.
Some of their designs are so minimalist for the sake of the design that it makes their products harder to use.
One good example is getting rid of keys on the MacBook keyboard, like the Print Screen key, which they replaced with the Command-Shift-4 keystroke. I think a single Print Screen button is far simpler to use.
Another missing feature: they put the Mac application menus at the top of the screen, but there is no obvious keystroke to invoke them and navigate through them with the arrow keys. Instead, you must use the mouse to click on them.
By doing this, they eliminated a discrete, digital keystroke action on the keyboard and turned it into an analog action, forcing you to navigate the menu with the mouse. On Windows, the keystroke is Alt-F.
First with iOS, and now they're bringing this model to the Mac (on Big Sur you can't even edit config files in /etc anymore, I've been told).
I really hate that attitude, I feel the user should always have the final say in everything that happens on their device. Defaults are fine, locking stuff down is not. There should always be an override for the power user.
Huh? I'm running the Big Sur beta and just edited my hosts file minutes ago.
I get what you're saying and kind of agree on some fronts but macOS has never really gotten in my way as a technical user and I can run whatever I want on it, I even dual booted Arch linux on this machine for a while. iOS is a whole different story though.
I was told by people in ##apple on Freenode that on Big Sur it's no longer possible to change flags in /etc/ssh/sshd_config, for example. But as long as this is possible, it's fine. For me that would be a dealbreaker, as at work we're not allowed to have SSH daemons with password auth, and they often scan for them.
PS: I believe (I also need to test this) that on models with a T2 chip you can boot Linux, but you can't access the internal SSD then. This is really one of those things where I'd want to see an override.
It's really hard to test right now :( I have a pile of test MacBooks but they're all locked in the office.
But even on iOS... it's too locked down IMO. I don't use it for this reason; for one, I need full NFC access for my YubiKeys (OpenPGP mode). I really miss stuff like that.
That's a bummer if true. I'm still kicking around on a 2013 MBP that's been a really reliable work machine for a long time now, but I've been considering a replacement soon.
That plist is for launching it, not for setting the configuration. Though you could probably pass some config as command-line arguments (or point it to another config file), it's a more roundabout way.
It is simply OpenSSH server so it should be possible to change its config of course.
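For what it's worth, OpenSSH's sshd does accept `-o` on the command line to override any sshd_config directive, so a launchd plist fragment along these lines (a sketch, not Apple's actual shipped plist) could in principle force key-only auth without touching /etc/ssh/sshd_config:

```xml
<!-- Hypothetical ProgramArguments fragment for a launchd ssh plist.
     sshd's -o flag overrides any sshd_config directive; -i runs it
     in inetd mode, which is how launchd typically invokes it. -->
<key>ProgramArguments</key>
<array>
    <string>/usr/sbin/sshd</string>
    <string>-i</string>
    <string>-o</string>
    <string>PasswordAuthentication=no</string>
</array>
```

Whether Big Sur lets you edit that plist (it lives under /System on recent macOS) is a separate question, of course.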
Big Sur isn’t much different from Catalina (which did lock more things down). But macOS isn’t Linux; despite its greatness as a developer platform, it’s aimed at consumers, so Apple is going to serve their needs first. And they are going to err on the side of more security.
And on iOS it’s a phone OS, security and consistency is ten times more important and is the main reason users buy iPhones. Apple is never going to open it up for power users because before they know it tens of millions may have installed some app that slipped through review, rooted their phones and filled them with malware.
Anyway, you say you feel that way – but Apple's design is based on decades of research & experience in the field. They don't get everything right, but you would hardly find a more user-centric company.
Many on here will seriously disagree with this, but almost my entire corporate office won’t.
A (sadly, former) example of this is Firefox: you can use the browser from the start unchanged, but the possibility of complex add-ons and 'about:config' customization is there.
The point that annoys a lot of users is the 'Google UX' type of decision arrogance that forces users to operate in a padded room with no sharp objects. That mindset of "we know better than you how you should want to use this app" is what I suspect frustrates a lot of users.
This implies that these features don't need to be up front, or conflicting with simpler, streamlined methods. But they need to be there, and be easily discoverable.