Too Many Knobs (neverworkintheory.org)
161 points by ashitlerferad on June 10, 2016 | 116 comments

I've noticed a trend in many web apps to stop doing something that used to be standard.

It's this: you're displaying a list of things that have various properties. It used to be the case that you'd be allowed to sort and filter by any property. Many web apps nowadays seem to 'curate' my sort and filter options, and in many cases a particular use is crippled because the property I want to sort or filter by is one the author deemed a minority use-case.

Now the programming cost of allowing filtering by anything is minimal. The cognitive load on the user is minimal, too (once you've got the UI for one property, adding more doesn't make much difference).

I've found this deeply frustrating on many occasions.

"Now the programming cost of allowing filtering by anything is minimal"

Not only is this untrue outside your imaginary app, it also disregards the incredibly bad UI that can come from allowing everything to be filtered.

A good design has constraints and thoughtful choices. A shitty design has everything possible thrown on the screen because "allowing filtering by anything is minimal".

This is highly application-specific. If you're looking for components to use in some design, for example, it's useful to have the component value, power rating, cost, form-factor, etc, all displayed together and filterable/sortable. If the application doesn't allow for filtering in this context, then that's something that the user ends up doing mentally. Worse, if the application doesn't allow all this information to be displayed as a summary, you end up with 50+ products for which you have to manually click a link and skim the product brief of each one individually.

For example, here's Newark's product chooser [1]. You can sort and filter by any field. It remains uncluttered by default-populating only the most commonly-used fields, but it avoids sacrificing any capabilities by making the hidden fields easy to access (click the "extended" button right above the table header).

I'd like to point out that in lists like these, the parent comment is 100% correct in saying "[the] cost of allowing filtering by anything is minimal". And while I'm sure an experienced designer could improve the UI, this is a scenario in which sacrificing capability for beauty is absolutely an inappropriate thing to do.

[1] http://www.newark.com/eeprom

A good metadata layer can generate appropriate filter controls, so a system can indeed be architected so that filtering by anything adds little programming overhead. The system that I work on is like this: whenever you see a grid, you can filter by any column, with appropriate controls for that column's datatype. Whether it is desirable to do such a thing depends on the nature of the product and its users, but my point is that the dev overhead can indeed be minimal.
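To make that concrete, here is a minimal, hypothetical sketch of the metadata-driven approach: each column declares its datatype once, and the filter predicate (and, in a real system, the UI control) is derived from that metadata. All names and the in-memory filtering below are my own illustration, not the actual system described.

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Column:
    name: str
    datatype: str  # e.g. "text", "number"

# Map a datatype to a predicate builder; supporting a new filterable type
# means adding one entry here, not touching every grid in the app.
PREDICATES: dict[str, Callable[[Any, Any], bool]] = {
    "text":   lambda cell, needle: needle.lower() in str(cell).lower(),
    "number": lambda cell, bound: cell <= bound,
}

def filter_rows(rows, columns, filters):
    """Keep rows matching every (column name -> filter value) pair."""
    by_name = {c.name: c for c in columns}
    def matches(row):
        return all(
            PREDICATES[by_name[name].datatype](row[name], value)
            for name, value in filters.items()
        )
    return [r for r in rows if matches(r)]

columns = [Column("part", "text"), Column("price", "number")]
rows = [
    {"part": "EEPROM 24LC256", "price": 1.50},
    {"part": "EEPROM 25AA02",  "price": 0.40},
    {"part": "Flash W25Q64",   "price": 0.90},
]
print(filter_rows(rows, columns, {"part": "eeprom", "price": 1.0}))
# -> [{'part': 'EEPROM 25AA02', 'price': 0.4}]
```

A real implementation would emit SQL predicates or UI widgets per datatype instead of Python lambdas, but the shape is the same: one registry, any column filterable.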

> except in your imaginary app

That kind of language isn't going to further the discussion.


Arbitrarily adding constraints doesn't make a design better. The default container for tabular data probably should allow sorting by any column. Give the OP the benefit of the doubt.

One reason for this is that if the list is lazy loaded (e.g. paginated) in any way, it can be more difficult to make sorting & filtering work on more dimensions - mainly it requires the list to be indexed in extra ways.

Not impossible, but some applications weren't designed to support it from the beginning.

If it's common to sort by X, you should add an index. But if it's rare to sort by X, then doing it the hard way is at worst O(n log n), which is generally a non-issue out to a few billion records.

Remember, "give me the next 100 of X less than Y" is an O(N) operation.
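One way to read that claim: a "next 100 below Y" page can be served by a single linear scan plus a bounded heap, i.e. O(N log k) for page size k, with no index at all. A hedged sketch (names are illustrative):

```python
# Keyset-style pagination over unindexed data: one pass, bounded memory.
import heapq

def next_page(values, last_seen, page_size=100):
    """Largest `page_size` values strictly below `last_seen`, i.e. the
    next page when walking the data in descending order. Single scan;
    heapq.nlargest keeps only `page_size` candidates in memory."""
    return heapq.nlargest(page_size, (v for v in values if v < last_seen))

data = [7, 42, 3, 19, 25, 11, 30]
print(next_page(data, last_seen=25, page_size=3))  # -> [19, 11, 7]
```

With an index the same query drops to roughly O(log N + k), which is the parent's point about when an index is actually worth its write-time cost.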

> But if it's rare to sort by X, then doing it the hard way is at worst O(n log n), which is generally a non-issue out to a few billion records.

Not really true. Users aren't understanding when "sometimes the list page takes over 10 seconds to load." They don't understand (or care) that they're running a non-indexed query, or that the alternative is not getting to do it at all. Removing a feature is often not acceptable, so often the only resolution to a problem like this is to add an index.

Of course, indexes increase the write times to your database. Indexes on every field increase the write times by a lot.

Except sometimes you can't simply add an index, because you're displaying a calculated value and only calculating it for the page you're displaying. To sort by that value, you have to calculate it for the entire dataset, and if each row's value itself aggregates over other rows, you're into O(n^2) or worse. You can also denormalize or pre-cache calculated values in some way (even going as far as adding something like ElasticSearch on top), but that introduces another dimension of problems (expensive updates, more potential for bugs and/or stale data).

Do this for several fields, and you're now talking about quite a significant development effort, both up-front and for on-going maintenance, to support sorting that wasn't ever an actual requested feature and that only a small minority of users actually use.

I know this because I have inherited exactly this problem before. Even as thousands turn to millions (let alone billions), anything worse than O(n log n) becomes painfully obvious rather quickly, and it tends to happen to the entire system at once. Suddenly you have to fix a whole bunch of "this feature is unusable slow, you broke my business" problems at the same time, while getting further and further behind your actual planned schedule.

> Remember, "give me the next 100 of X less than Y" is an O(N) operation.

…iff (if and only if) you've spent the time to implement caching (and dealt with the additional cache-invalidation complexity).

Many apps do paging on the database, or only do expensive joins after paging. Heck, many apps are even stupider and query the entire dataset from the DB, do paging in code, then throw it all away afterwards (so every time the client asks for the next page, it transfers the entire result set from the DBMS).

> Heck, many apps are even stupider and query the entire dataset from the DB, do paging in code

I would only expect to see something like this at a company with glacial bureaucracy because that's a technique that was already garbage in 2007 when I had to babysit an app whose devs decided to retrieve a user's entire object graph whenever doing a User#show.

10-second load times are similarly the domain of the DMV and trivial hobby apps on a free Heroku.

At the risk of exposing myself as an FB user and thus older than 25, I think a better example for discussion is FB's wall sort: "Top Stories" vs. "Most Recent." If you're like me, you like "Most Recent," but Zuck has his own ideas about how I should live my life and will switch me back to "Top Stories" anywhere from 6 to 24 hours after my last visit. You cannot hard-set your wall to "Most Recent." Educated me would think that sorting by date descending is the O(n log n)-iest of sorts, yet here we are, with Big Data Massage turning out to be a forced default.

Point being, I hope: it's not just a choice between 10-second loads and the elimination of features. The difference is rarely that stark, and with simple attribute sorts (even on multiple attributes) the cost should be nigh-on unnoticeable.

> Except sometimes you can't simply add an index, because you're displaying a calculated value and only calculate it for the page you're displaying. In order to sort by that value, you have to calculate for the entire system, so now you're into O(n^2) or worse complexity. You can also denormalize or pre-cache calculated values in some way (even going as far as adding something like ElasticSearch on top), but that introduces another dimension of problems (expensive updates, more potential for bugs and/or stale data).

I have a heuristic that if you're doing temp tables and derived values and whatnot, you're edging into reporting, so denormalizing and aggregating makes a lot of sense for that.

Apologies if I've ranted entirely past your point!

> At the risk of exposing myself as an FB user and thus older than 25...

As an FB nonuser much older than that, I'm amused... The internet wins eventually.

> I have a heuristic that if you're doing temp tables and derived values and whatnot, you're edging into reporting, so denormalizing and aggregating makes a lot of sense for that.

Totally agree.

> 10sec load times are similiarly the domain of the DMV and trivial hobby apps on a free Heroku.

Unfortunately, the app I was talking about:

1) Allows customization of several views, including adding several columns that are expensive (and users can add all of them, if they want). This was not a big deal with hundreds of items, but is with hundreds of thousands.

2) Has a hierarchy above these objects, but doesn't enforce any context. So literally you can load this view for the entire system (which can be hundreds of thousands of objects, with some columns that summarize data from tens of millions of rows)

3) Some of the expensive calculated values are time-dependent, eg "past hour", "past day" etc, which also makes pre-calculating a very expensive operation (not impacting user time directly, but adding overall load to the system).

4) Was built on top of MS SQL Server, deployed both as a "cloud" model and as an on-premise product at customer sites. This meant several customers were heavily invested in SQL licenses and beefy hardware to run it all. Even if we overlooked the challenge of updating our own product (half a million lines? give or take) and data migration (dozens of ten-to-hundred-GB databases, and hundreds of smaller ones), changing to another DBMS and telling our customers that their investments of tens of thousands of dollars in hardware/licenses were useless was unacceptable.

So yes, in this case it was possible to get multiple-minute load times in the worst cases. Why would anyone want to see many of these columns for hundreds of thousands of rows at once? I have no idea, but they could, and it was slow, and they complained.

To your point: this absolutely should have been considered 'reporting', but the app didn't make that distinction. This was the main view of data, essentially, and it was possible for users to make it really slow.

We ended up redoing part of the UI at one point, and during that time removed sorting from some of the columns. Users complained about this at the time, but got over it. We also put a hard limit of 20,000 objects: more than that and it just says "please pick a group" first. Again, users complained, but got over it. Going through that was crappy, but the constant complaints about being slow stopped. I should note, there is a separate reporting UI that does allow sorting by everything.

I wanted to go much further than that and completely re-think the UI and some of the design concepts (such as not allowing a single list to be viewed for the entire system at once, and instead either summarizing multiple groups or displaying detail for one), but at the time the business side (including up to the CEO) was unwilling to make a change like that, and we also had no formal product management to really push the issue.

I am thinking of a pretty expensive product that we are currently using that is very similar to what you are describing.

It has an old search/filter interface that allowed filters on arbitrary fields, and indeed filtered results would sometimes take 10-30 seconds to display.

Now it is being phased out in favor of a new, prettier, faster interface that only allows filters on some fields and, on top of that, has a much lower limit on filtered results displayed.

It does load a page of results faster, but leads to never-ending frustration when trying to do anything useful: now, instead of filtering on what I actually want, I have to iteratively apply a zillion filters on irrelevant items, trying to fit within the row limit, combine the output, and then filter that again manually, sometimes taking hours to do what took minutes in the old version.

On top of that, for the longest time an absolutely essential filter was hardcoded to be on in the "new" version, making the whole thing entirely useless :(

The whole discussion reminds me of discussions of "feature creep" in products like Office. Indeed, most people use only ~20% of the features -- the problem is that these 20% are not the same for everyone.

> Indeed, most people use only ~20% of the features -- the problem is that these 20% are not the same for everyone.

Yep. The trick is walking the line: if you don't develop the features in the first place, you may not get some of those customers because you don't meet their needs. If you try to be everything to everyone, you end up with everyone being upset later when it's slow/buggy and you're inevitably forced to change it.

There's also going to be some portion, call it 20% or 5% or whatever, of your features that almost nobody uses, and that's the stuff that really can bog you down. I think it's healthy for a company to really take a hard look at this from time to time: take into account the costs of those features to calculate the profitability of those customers, consider their future potential, the market reaction/perception of having them (or losing them), and really consider if those features (and customers) are truly worth keeping.

Also, regarding "So yes, in this case it was possible to get multiple-minute load times in the worst cases. Why would anyone want to see many of these columns for hundreds of thousands of rows at once? I have no idea, but they could, and it was slow, and they complained." -- most likely, because they either need to compute statistics or do an analysis that your app does not provide, or in order to make custom queries based on that output.

> most likely, because they either need to compute statistics or do an analysis that your app does not provide, or in order to make custom queries based on that output.

Very well could be, and this is actually a very good example of why unbounded features like this are terrible to have.

Using a main interactive view in the UI is not the way to achieve this. Data export, reporting systems, and/or APIs are the ways, and the big difference is performance is less of a factor. Over a second for a UI is a long time, but several seconds, or even several minutes, can be acceptable for data dumps/exports (that usually happen in the background, not user-interactive).

One way to look at this: write a user story of the form "As a _____, I want to ____ so I can _____", and then use it to rationalize modifying a primary UI (in a performance-negative way). If it's truly compelling enough, then spend the time to make the UI work. In my experience, that type of thing is better left to background exports (where performance is not critical).

Edit: The specifics of how things get broken are complex, and fixing them can be non-trivial. But punting these issues often means they keep showing up when you want to do some other completely reasonable thing. People talk about unit tests as a huge thing, but when your software can't handle basic functionality, things are already horribly broken.

As to removing features to avoid slow pages when using the feature, that's a horrible tradeoff.

And further, the UI for sorting/filtering does NOT get in the user's way. Sure, a config screen with 100 check boxes on a page is bad UI, but a table that you can sort by any column looks almost exactly like one you can't sort, just that in the former case the column headers are clickable (or little arrows appear when you hover over the heading).

> Now the programming cost of allowing filtering by anything is minimal

Only in simple CRUD applications where the data being displayed is a 1-1 with how the data is stored.

In any non-trivial CRUD application this is not the case, and paging/filtering/sorting on a field that is being displayed quickly becomes non-trivial if you are trying to do it for every field.

> Now the programming cost of allowing filtering by anything is minimal.

The programming cost of allowing sorting/filtering on any field (especially if performance is not a concern) is small. Depending on the dataset, the resource (and programming, though this is usually less of an issue) cost of supporting efficient filtering and sorting on every column rather than a limited subset may not be.

I think I can isolate the entire issue to this one line, emphasis mine:

> Only a small percentage (6.1%-16.7%) of configuration parameters are set by the majority of users; a significant percentage (up to 54.1%) of parameters are rarely set by any user.

The necessary question is, how many installations used at least one rarely-set parameter? Would those installations have happened without that parameter? How much effort went into developing that parameter, versus the profits of those installations?

Reminds me of this article: http://www.joelonsoftware.com/articles/fog0000000020.html

The pull-out quote being:

> A lot of software developers are seduced by the old "80/20" rule. It seems to make a lot of sense: 80% of the people use 20% of the features. So you convince yourself that you only need to implement 20% of the features, and you can still sell 80% as many copies.

> Unfortunately, it's never the same 20%. Everybody uses a different set of features.

The same idea applies to those advocating for having "simple" and "expert" configuration menus: everybody's expertise is different, and changes over time as we learn and forget things.

Plus there are power users and sheep. Sheep will follow power users, power users will expect extra features. No configurability means no power users means no sheep.

Right - just because I rarely change a parameter doesn't mean it's not important. In fact, some of the most important parameters are set once and then forgotten about.

Like, say, IFS in bash. Have you ever needed to set IFS? Do you know what it does? I've set it maybe 5 times and practically each one was to $'\n' so I can read newline-delimited input values in a while loop. But I appreciate the fact that I can set it to literally anything, if I ever need to.

> I've set it maybe 5 times and practically each one was to $'\n' so I can read newline-delimited input values in a while loop.

Does "| while read line" not do what you want here?
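For anyone following along, a quick illustration of what IFS actually changes for `read` in bash/POSIX sh: plain `read line` word-splits on the default IFS (space, tab, newline) and so trims leading/trailing whitespace, while `IFS= read -r` keeps the line intact. A small sketch:

```shell
# Plain read: leading/trailing IFS whitespace is stripped from the line.
printf '  two  spaced  words  \n' | while read line; do
    printf '[%s]\n' "$line"        # prints [two  spaced  words]
done

# IFS cleared (the IFS=$'\n' trick has the same effect for single lines),
# plus -r to stop backslash interpretation: the line survives untouched.
printf '  two  spaced  words  \n' | while IFS= read -r line; do
    printf '[%s]\n' "$line"        # prints [  two  spaced  words  ]
done
```

So `| while read line` does work for simple newline-delimited input; setting IFS matters once leading whitespace, multiple `read` variables, or word splitting of the results come into play.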

Also, until you have lots of users, how can you guess which 6.1%-16.7% of parameters most of them will use?

This makes me think of a tiered model of settings: the most-used can be configured in-app; less-used ones require editing the config file; the least-used are set at build/install time.
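That tiered model amounts to a simple resolution order: look each setting up in the most dynamic layer that defines it, falling back to the next. A hypothetical sketch (the layer contents and setting names are made up for illustration):

```python
# Three layers, from least to most dynamic. A real app would load these
# from the build, a config file, and the in-app settings UI respectively.
BUILD_DEFAULTS = {"threads": 4, "arch": "x86_64", "log_level": "info"}
CONFIG_FILE    = {"log_level": "debug"}   # less-used: edit the config file
IN_APP         = {"threads": 8}           # most-used: exposed in the UI

def resolve(key):
    """Return the effective value: in-app overrides config file,
    which overrides build-time defaults."""
    for layer in (IN_APP, CONFIG_FILE, BUILD_DEFAULTS):
        if key in layer:
            return layer[key]
    raise KeyError(key)

print(resolve("threads"))    # -> 8        (in-app override)
print(resolve("log_level"))  # -> debug    (config file)
print(resolve("arch"))       # -> x86_64   (build-time default)
```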

I'd rather have confusing software that does what I want, than simple software that doesn't do what I want.

I'd suggest the proper response to this is, "Make the common case easy, make the uncommon case possible."

I'd rather have five simple tools that do what I want than one tool that can in principle do what the five tools do if you massage its configuration enough.

Would you rather massage the configurations of five tools? Or figure out how to transfer data from one tool to another? Because I bet that in most situations there will be slight (and annoying) incompatibilities between what each tool expects.

At any rate, it's useless to talk about this in absolutes. There are times when your option is better and times when it's not.

I'd imagine it's when you're in full control of the input and output format for each tool, each tool can be configured to work with varying formats, when there are times you want to skip steps in the processing pipeline...

I agree it's useless to talk about this in absolutes.

But personally, I find composing separate tools helps me learn and debug because things are compartmentalized.

I was messing with a tool (where the equivalent at other places I've worked was 5 or 6 separate command-line tools) whose command-line help is 1,441 lines long. I dread having to look through it and still don't really understand how to use most of it.

In other words, Unix (modular agile) way vs Windows (bloated monolithic) way.

The problem is that the Windows way seems prevalent (or maybe I'm overestimating it). Perhaps, much to my joy, the focus will shift soon, but it doesn't look like it will, given the abundance of new installs of the Win10 nagware.

The way this plays out, it sort of reminds me of Linux (you can do whatever you want, but you're on your own!) versus Apple (we know what you want even better than you do). As a developer, I'm loath to see my precious configuration disappear, but I can see the appeal of minimalist interfaces from the user perspective.

My experience tells me that the price you pay for being able to do everything is having to figure out how to do anything.

That's a reasonable trade-off, if the sole alternative is being unable to do something that you need to.

In my real-world tool kit for household use, I've got a configurable screwdriver with multiple heads of various sizes and shapes (slot, cross-point, etc.). I also have a bunch of conventional screwdrivers. I use the latter far more.

Opposite anecdote: I have a bunch of conventional screwdrivers in a toolkit, too, but I also have a single configurable screwdriver in the kitchen in a drawer. Rather than going out to the garage to dig through the toolbox, I find that most of the time I just use the configurable screwdriver. It's not as good at actually driving screws, but it's faster to change the head and do whatever needs to be done.

But when you come across a Torx screw, I bet you're really happy you have the configurable screwdriver.

Actually I have a Torx screwdriver too -- and the multi-head configurable screwdriver doesn't.

I often feel the same way too, but not because of difficulty of configuration -- in fact, in this case, you would have to figure out how to configure each of the five tools, so if anything things become trickier on that front.

Why? And what if that one tool came with five common configurations for the five usages?

Because simple tools are easy to understand and easy to replace when you find something better. Once you invested a lot of time into configuring your GenericTool(tm) to do what you want you're locked into its ecosystem forever. Also, simple tools that do one thing only tend to do it better than tools that do lots of things.

What if that one tool came with five configurations for your five usages, that you wouldn't have to touch?

You can still replace one usage with another tool, if you like. I think the idea of many simple tools is good in theory, but seeing that most important codebases last for ten or twenty years, using many tools increases the likelihood that you've introduced a dependency on a product that may be discontinued or unmaintained. "Easy to replace" is fine if you're very familiar with the code and all the assumptions it makes about the product's behavior, but if a tool that some module of your system (one no one has touched in more than five years) depends on suddenly disappears[1], replacing it may be almost as big an undertaking as replacing the one big tool, and the chances of that happening are greater (because you have more such tools, and small tools are more likely to be discontinued than big ones).

[1]: Say, by no longer supporting your new hardware/OS.

Replacing just part of a big tool because there is a better alternative doesn't fly with management ("we already paid for big tool and it does what you want, why should we replace a part of it? It will just increase the complexity!"). I see small tools disappearing occasionally as a good thing. You should always be prepared to change parts of your system and a larger organization should have people that continuously work on improving the tooling. Assuming that your tooling will work indefinitely leads to ossified systems that can't be upgraded at all.

Imho, tools should be treated just like code. You wouldn't write one big function that takes half a dozen parameters to change its behavior. You refactor it into multiple functions that are easier to understand and improve when requirements change.

I'd rather have simple software that does what I want.

Exactly. Our tools should not add a layer of unnecessary complexity on top of already complex domains. If anything, they should take complexity away so we can focus on what is most relevant for the goals that we want to achieve.

A lot of configuration options come into existence not because they are actually useful, but because the team had a disagreement that they couldn't settle, or because writing code that just figured out the right thing to do would be too complex. Options should be a way of empowering the user, not simply offloading work to them.

It's very important to realise that our goals are not just related to the concrete problem at hand, but include things like "get out of the way and don't make me feel stupid".

But then a lot of options exist because the domain itself is complex and can't be captured in a one-size-fits-all solution.

The only way to make something both very simple and very useful for a domain is to cunningly cut out parts of the domain. The end result is a pretty, simple and opinionated tool that doesn't really solve anything for anyone, but sure looks good on screenshots.

This. Most "disagreements a team can't settle" are rooted in fundamentally different use cases of a complex domain.

Then either other people don't get software that does what they want (because there are so many different needs in any domain), or the complexity is transferred to finding and evaluating the huge selection of software.

LOL that's called having your cake and eating it too, AKA impossible.

The real takeaway here should be that engineers tend to make UX that feel good to them as engineers, and most of their clients will typically not be engineers.

> I'd rather have confusing software that does what I want, than simple software that doesn't do what I want.

Of course you would, or I would, or anyone who browses HackerNews would -- but it's not necessarily the best UX.

If this is business software, once the sale is done, users are going to pick it up because they pretty much have to. But the sale can easily fail in the first place if the product lacks a feature that a particular customer wants and the competing offering has it.

Sensible defaults.

I remember learning a saying years ago on Slashdot about the difference between Windows and Linux: in Linux, easy things are difficult and difficult things are possible; in Windows, easy things are trivial and complicated things are impossible.

This was in XP era. Today the likes of Ubuntu/Gnome try their best to dumb down Linux.


That's how you get shiny toys, not useful tools.

The advice here is good if your primary goal is to successfully sell glorified interactive ads in a market where people buy software by looking how pretty it is and making their choice in 2 seconds. If that's your goal, then ok, one has to make a living, but in this case I don't want your software, and I'll discourage everyone from using it.

Sane defaults + flexibility are the way to go. Software should empower the user; part of that empowerment may be suggesting a particular workflow, but it should not force one to use that workflow and never stray from it. And don't give me the excuse that "more options == much more complexity == much harder to add new features." That's only true if you write utter spaghetti code, and fuck it, programmers get paid well precisely so that they do this right.

Emphasis on sane defaults.

One issue is that "sane" changes over time, but it's hard to change defaults without upsetting people who suddenly find their defaults changed because of an update.

(Although I'd argue that's best solved by letting them be upset until they understand why the default changed.)

e.g. a Video capture which previously defaulted to 320x240 might later default to 640x480 and now default to 1440x720.

Or better yet, intelligently handle the change as a suggestion during the upgrade process. "We see you are still using the legacy default 320x240 setting. We are now recommending 640x480. Do you want to change to this new recommendation, or keep your existing setting?" Something like that.
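Something like that upgrade-time suggestion might look as follows. This is a hedged sketch: only users still on the old *default* get prompted, and an explicit choice is left alone (with the caveat, as noted, that a user who deliberately picked the old default value can't be distinguished by value alone). The resolution strings are taken from the example above; the function names are made up.

```python
LEGACY_DEFAULT = "320x240"
NEW_DEFAULT = "640x480"

def migrate_resolution(setting, accept_suggestion):
    """Return the post-upgrade capture resolution.

    `accept_suggestion` stands in for asking the user; a real app would
    show a dialog here ("We see you are still using the legacy default...").
    """
    if setting != LEGACY_DEFAULT:
        return setting  # an explicit (non-default) choice: don't touch it
    return NEW_DEFAULT if accept_suggestion else setting

print(migrate_resolution("320x240", accept_suggestion=True))    # -> 640x480
print(migrate_resolution("320x240", accept_suggestion=False))   # -> 320x240
print(migrate_resolution("1920x1080", accept_suggestion=True))  # -> 1920x1080
```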

I think Firefox is a great example here.

It has a simple UI to guide the user into finding and setting the most common options. That UI must be as simple as possible for new users and can be changed regularly, according to the changing needs of the majority of users.

It also has an advanced, generic but type-safe UI (about:config) for finding and setting any possible option present in the software.

Configuration files are the traditional way to achieve this second level, but they don't always help the user in finding the options and are generally not type-safe.
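A rough sketch of what "type-safe" buys you over a freeform config file: each key is registered with an expected type, and writes are validated against it, so a typo'd value is rejected instead of silently misparsed. The keys below are illustrative, not real Firefox preferences.

```python
# Hypothetical typed preference registry, about:config-style.
REGISTRY = {
    "browser.tabs.max": int,
    "ui.theme.dark": bool,
}
SETTINGS = {}

def set_pref(key, value):
    """Store a preference, rejecting unknown keys and wrong types."""
    expected = REGISTRY[key]  # unknown keys raise KeyError
    if not isinstance(value, expected):
        raise TypeError(f"{key} expects {expected.__name__}")
    SETTINGS[key] = value

set_pref("browser.tabs.max", 50)        # accepted
try:
    set_pref("ui.theme.dark", "yes")    # wrong type: rejected
except TypeError as e:
    print(e)                            # prints: ui.theme.dark expects bool
```

A plain text config file would happily store `ui.theme.dark = yes` and leave the app to guess what that means.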

I think about:config is terrible. It's really hard to find anything in it. It's very much "if you know it's there you will use it", but it has no discoverability.

For instance, I was wondering if I could have one tab connect over a VPN while the rest use my normal connection. I have no idea if that's an option somewhere in the millions of toggles.

EDIT: You know how applications used to have an "Advanced" button that'd give you a nicely organized display of all the "extra" toggles? Whatever happened to that paradigm?

> You know how applications used to have an "Advanced" button that'd give you a nicely organized display of all the "extra" toggles?

I think your example of about:config is also an explanation for why that went away. It used to be possible to reasonably put all of the more esoteric settings on a single page (or a small, single-digit number of pages), but that's no longer the case for many pieces of software. Also, realistically, the overwhelming majority of users probably already weren't going into the advanced settings, and those that were are probably more capable of handling about:config.

At least the options there were grouped in a sane fashion and discoverable while I was in a particular screen groping around for a setting I needed to change. about:config is not discoverable in any way, shape, or form, and we've lost the ability to stumble on capabilities we weren't specifically looking for via a broader search. The ability to find something in passing that is related to your mission has severely diminished.

The problem with something like that is maintaining and testing all the possible settings.

I can't possibly fathom how Firefox tests all of their settings. There's just no way. And that leads to edge cases and buggy software.

It isn't as big a problem as you might expect.

A very low percentage of users even change the advanced settings, so any issues which might exist have a very low impact.

If a user does hit a problem with some combination of settings, it is more likely they will report it with some useful reproduction information, since they are likely more of a power user than the average user.

Finally, anyone changing advanced settings is more likely to understand that they may break something and are less likely to be upset, when compared to something fundamental breaking for all users.

In fact, Firefox even warns you upon navigating to about:config that you could very well break something.

And chrome warns you too, but it doesn't stop the flood of help posts in the forums, on reddit, and other places when something is broken because of a setting that someone forgot they had enabled.

Just recently one of the updates for chrome broke the ability to interact with extensions if the user had enabled the flag "Material design in the browser's top chrome". There were tons of angry posts everywhere about how unusable chrome was because of that change, and how they should test their software better...

As for only "advanced users" knowing what happens in there, just do a web search for "chrome://flags" and look at any one of the hundreds of blog posts, news articles, and tutorials showing you how to go in and turn on these few flags that "make chrome better".

People compile binaries off of the internet and sudo make install them right over /usr. Go Google for all of those blog posts.

Evidence of the stupidity of the masses is not evidence that there is anything wrong with leaving advanced options behind a scary warning and letting people break things sometimes.

In fact, I'd wager that if you took away those advanced options, people would find some more dangerous way of achieving their goals, and there would probably be even more breakage and more blog spam complaints over the status quo.

Things are fine. The sky is not falling.

Well the chrome team is making that gamble, so I guess time will tell if it pays off!

And yet more knobs is the difference between Photoshop and Elements, or Elements and Photo Viewer. I'll bet Photoshop is full of features with <1% usage, and that's how it should be. Those components don't have to be (shouldn't be) displayed at equal visibility to others, but they shouldn't go anywhere. The entire value of the program is that if you do want them, they're there.

Removing low-use features is a dangerous game; much of the best software out there (Excel, Photoshop, Viz, etc) is defined by an enormous feature set and a high skill ceiling.

Arrghh...jesus there are a lot of "engineers" in here that can't seem to see the forest for the trees.

Yes, we developers and engineers as a class of human beings love being able to set things to the nth degree, as can be seen by the comments here, but that's not necessarily the right choice for the vast majority of projects.

Most normal human beings want simple tools that do what they say on the box and don't require much cognitive load at all.

Every setting, every "knob" introduces to the average user a feeling of frustration, confusion, complication.

To an engineering minded person they might find all those options exciting but to someone who just needs to GET SHIT DONE those extra options are hellish and anxiety inducing.

In a few rare cases an application is best designed with more knobs, but that's very much the exception. It's almost always a clear sign of shitty design made by engineers for people they imagine are like them, rather than by professional designers who have actually worked with the general public and know what users actually want, how they think, and what they need.

More knobs is almost always a sign of lazy thoughtless design.

> More knobs is almost always a sign of lazy thoughtless design.

I'll put forth an economic justification: specialization. Experts are happy to cope with the explosion of features, and are so willing to accept the tradeoff of less usable software, provided their one additional feature is in there, that they'll pay a premium for it. They spend a lot of time with the software, learning the keyboard shortcuts and tuning knobs.

Conversely, novices like you and me don't like any of that. We want to get shit done, even if that shit isn't exactly what we wanted, or held to the high standards of a professional. The software we need is easier to write -- you don't have to implement obscure features, and you don't have to make the UI cope with all of them.

So the fact that Photoshop exists isn't a sign of lazy design, but of a market for professional users whose niche is being catered to.

Also, the dataset the paper in question focuses on is Apache httpd, MySQL, and Hadoop. These are not technologies for average users or the general public. These are clearly the domain of experts. It's not entirely clear to me that the fact that few people use a given Apache config option means I shouldn't be allowed to change it. Sure, most Chef recipes don't use ModRewrite rules, but that doesn't mean I would be better served if Apache removed those knobs.

Building software for novices with simple ui is much harder than building software for experts with a gazillion settings. Where there are multiple ways to design something, you can punt the problem to the user and add a setting, or think hard about how the software will be used and figure out a settingless design. The second strategy is definitely the harder path.

> Building software for novices with simple ui is much harder than building software for experts with a gazillion settings.

IME, people advocating for simple UI end up advocating for the removal of features. That's often fine, but it's also why people like Torvalds bounce between desktop environments as things are Simplified with no secret handshake to undo them.

Even the research suggests that over 50 percent of Apache config settings can be removed without affecting more than 1 percent of the customer base. I'm not clear if that means each feature is 1 percent or 1 percent total. It's also not clear how they can feasibly disentangle people who use obscure features and don't need help with them. (Also, I'd love to see concrete suggestions for improvement from that pool to better understand what counts as obscure here.)

That said, the Apache config has plenty of room for improvement. Why do we have to write chef scripts to calculate MPM settings and avoid OOM'ing the machine?
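
For illustration, the MPM arithmetic people end up scripting is roughly this. A sketch only: `reserved_mb` and `per_proc_mb` are assumed numbers, not measured values; in practice you'd sample the RSS of running httpd processes.

```python
# Rough sketch of sizing Apache prefork MaxRequestWorkers so the
# machine doesn't OOM: fit workers into whatever RAM is left after
# reserving room for the OS and other daemons. All numbers here are
# illustrative assumptions, not recommendations.

def max_request_workers(total_ram_mb, reserved_mb=512, per_proc_mb=50):
    """Number of httpd workers that fit in the remaining RAM."""
    usable = total_ram_mb - reserved_mb
    if usable <= 0:
        raise ValueError("not enough RAM left for httpd")
    return usable // per_proc_mb

# e.g. a 4 GB machine:
print(max_request_workers(4096))  # -> 71
```

Nothing in Apache itself does this calculation, which is exactly the complaint: the knob exists, but the safe value is left as an exercise for the operator.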

The solution to the "too many options" problem is to organize the options better, not to remove options.

Every single function in Excel, or Photoshop, is there because someone needs it. Should we leave those people out in the cold? Should we fragment major applications into dozens of sub-versions, each with a single specialized feature set, and force people who need more than one to juggle multiple programs?

I can get behind the idea of a "lite" program and a full-featured program in any given category. And, indeed, there are plenty of simple image manipulation programs for people who don't need all of Photoshop. (I'm not sure if there's an equivalent for Excel, but I'm also not sure Excel needs any changes to be accommodating to very basic spreadsheet use.) What's the benefit in removing options that someone is using?

I have a lot of respect for the walled garden design pattern. It's very rarely the right decision to remove options, but it's often the right decision to hide them. Even power users apply 10% of the features 90% of the time, and those are the features that deserve hotkeys and big obvious buttons.

Other options can live comfortably behind buttons like "One Time Configuration Settings" or "PROBABLY NOT WHAT YOU WANT, DON'T CLICK" - there's no reason that Insert mode in Word needs a bumpable hotkey. But they should still be there. There's nothing worse than a program that obviously supports an operation, but has removed your ability to do it any way except a few presets.

Of course, all of this conflates 'professional' software like Excel and Photoshop with 'casual' software, but even in the casual case there's a lot to be said for not actually removing features.

I don't think I'm missing the forest for the trees, I'm just complaining about preferences presented as absolutes.

It's absolutely true that "more knobs" is often evidence of laziness, or bad design, or sloppy thinking. Many (most?) programs should have fewer knobs than they do, and should elevate a few of their knobs while burying the rest. Knobs are distracting and confusing even for power users, and there's a lot to be said for putting 90% of the content behind a big red button reading "ALMOST CERTAINLY NOT WHAT YOU WANT".

I primarily objected to two things here:

One is the failure to differentiate general-use software from poweruser software; Photoshop and its ilk are hard to use, but their feature-rich approach is what makes them employment-worthy skills. Glibly applying usage statistics to Photo Viewer and Photoshop equally feels like a category error to me.

The other is bad statistics. "many features are used by <2% of users", well great. But that isn't grounds for hiding/removing them unless it's the same set of users! Most of the data gathered in this article overlooks questions like "will you break 2% of workflows, or 90%?" and "among rarely-used features, how many are one-time preference settings?"

I don't mean to imply that most software is good, we all know better than that. Fewer knobs should be a goal, when all too often more knobs is a goal. But that doesn't mean I'm especially impressed with this treatment of it.

Not sure why you're getting downvoted. I completely agree.

I use photoshop maybe once every two months and have found that its interface is completely inaccessible if you don't use it full time.

I understand that it's great to have full control over everything, but there's a point where it's overkill in Getting Shit Done. Hell, it's gotten to the point with me where I'd rather use cv2 over photoshop/gimp for most things...

Meanwhile, I fire up the average application that uses the Gnome HIG spec, and I find it horrendous to use. They take it a number of steps further and remove, completely and thoroughly, any ability to customize. I very much like the Apple mentality circa 2005 with sane, intelligent defaults, a small number of configuration options in a small number of screens, and an advanced button for everything else. Apple did a very good job back then of making their software feature complete and customizable without making it a shit show in either the too dumb or too complex direction.

There's a middle ground. I'm fine with tons of configuration, but at least make some sane defaults or prompts so that I don't have to read a giant wiki to do an initial setup. Everybody initially runs through it, so why not spend the time it took to write the wiki and write prompts for an installer, instead?

Chances are Photoshop is the wrong tool for you. It's the wrong tool for a lot of people.

It doesn't make Photoshop a bad tool or overkill. It makes your choice in using it bad, and makes it overkill for your specific task.

A nail gun isn't a bad tool because a hammer is simpler and more intuitive, and a hammer isn't a bad tool because a rock is simpler and more intuitive. We don't expect roofers to use rocks, and we don't expect someone hanging a picture up to use a nail gun. Specialization is not a bad thing.

Bear in mind, though... This study profiled MySQL, Hadoop, Apache, and Storage-A. None of those are casual-user tools, and removing feature options, even rarely-used options, would be a serious and unjustified breaking change.

Right, I get that, and I still agree. There are WAY too many features in these things that should definitely exist, but sane defaults would go a long way.

Really, I'm tired of being told "read the documentation" and "You just haven't used it enough" when documentation is given 5% of the energy it should be given.

I'm convinced that the only reason so many SV projects exist is bad config management in OSS projects.

If your relationship with your users is purely as a parent, knowing better than they what they do, then your analysis is correct.

It is possible that these children might have some useful ideas about how to use their tools, which you did not anticipate.

I would encourage you to use this analysis to decide 'what to put on the basic config knob box', rather than 'what freedoms will I permit my users'.

I'm not sure what your point is. The fact that Photoshop and Elements both sell well enough to continue their development kind of proves the point. If removing knobs didn't work, Elements would have failed. And in fact, there's a level in between - Lightroom, which removes a lot of the functionality (knobs) of Photoshop but keeps several bits important mainly to photographers. There are some users for whom the knobs work, but not for everyone.

> a significant percentage (up to 48.5%) of configuration issues are about users' difficulties in finding or setting the parameters to obtain the intended system behavior

this is the important bit to me.

i don't like using software that feels like it could fall over because i turned the wrong combination of knobs the developer didn't anticipate.

opinionated, fixed configuration is a nicer experience than an app that can do anything, if you bend it to your will.

The thing is - some of us want software not for "experience", but for its utility. If I want "experience", I can go to a theme park or play a video game. I expect my software to help me achieve my goals faster and more efficiently, or to at least eliminate clutter in my life.

Also, nothing really prevents one from having the settings hidden in the "advanced" menu. It's not either/or.

Nobody ever says they want fewer options, but that's because it's never such a pressing issue as not having the one you want.

In a program I develop, I sometimes remove an option and people complain; then it turns out they were randomly tweaking that option to try to solve some problem it was never going to solve in the first place. They just liked the feeling of being able to control _something_. So those are cases of bad options that should have been removed because they made things harder for people. People just can't help trying to change settings when something doesn't work, and hiding them in an "advanced" menu doesn't deter them. That's where all the most powerful settings are, after all! It encourages frustrating time wasting.

Or a configuration file with proper documentation.

> opinionated, fixed configuration is a nicer experience

Even when those opinions are wrong?

Compiz is the perfect example of this: the defaults are broken, there are thousands of options with no constraints, and if you change one option to fix a broken default you're guaranteed to break some other feature.

This is a very important question to ask, but I was a little frustrated reading the paper. I went in expecting answers to these questions:

1. How much is the problem of having too many configuration options mitigated by having sensible defaults?

"a significant percentage (up to 48.5%) of configuration issues are about users’ difficulties in finding or setting the parameters to obtain the intended system behavior; a significant percentage (up to 53.3%) of configuration errors are introduced due to users’ staying with default values incorrectly."

The former means the configuration options provided didn't match the ones the user desired, not that there were too many or too few. If anything, it encourages software authors to provide more knobs. The latter doesn't have a strong enough correlation to the number of configuration parameters at all. For example, say all these errors happened at Google because of high load, and they were using Apache with a default configuration built for small and medium-scale websites.

2. How does having many configuration options affect the software update process?

No idea

3. What percentage of users are unhappy due to having too many knobs (decision fatigue, fear of missing out)?

17.3%∼48.5% of users' calls to the technical support center and questions posted on forums. I assume this is a conservative estimate.

4. Does having too few knobs cause software to be forked and cause fragmentation?

No idea

5. Does the result differ when applied to application software (vs. system software)?

Not in scope.

Need more research.

As soon as you have multiple dimensions of configuration it might be impossible to provide better defaults-- that "wrong for 53.3%" could be a good number reflecting the best possible defaults.

Not really. The video players on Linux have tons of options that I could set, but I hardly ever do.

And do those options clutter the ui?

I use VLC. 99% of the time all I need is the standard ff, play, rew and the cue bar. But I'm also one of those users who uses many of the other buried features. If they weren't there I'd switch players, but they're buried, so the basic UI stays uncluttered.

I can attest that configuration bloat is especially relevant to complex enterprise products, for obvious reasons. What we normally do in our products is provide a simple interface to view a filtered list of non-default settings. We also have a way to export this list of custom setting names (not values), so we can query our customer base once in a while to check which settings are not utilized. We've used this trick to internalize or EOL a lot of "knobs" that hadn't been used at all.
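
A minimal sketch of that diff-against-defaults trick; the setting names, defaults, and customer values below are invented for illustration, not from any real product.

```python
# Sketch: report which settings a customer has changed from the
# defaults, by name only (not values), so usage can be surveyed
# across the customer base. DEFAULTS and `customer` are made up.

DEFAULTS = {"max_connections": 100, "timeout_s": 30, "use_tls": True}

def non_default_names(current, defaults=DEFAULTS):
    """Names of settings whose current value differs from the default."""
    return sorted(k for k, v in current.items()
                  if k in defaults and v != defaults[k])

customer = {"max_connections": 500, "timeout_s": 30, "use_tls": True}
print(non_default_names(customer))  # -> ['max_connections']
```

Exporting names rather than values keeps the survey useful while avoiding shipping customer data back to the vendor.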

>configuration bloat is especially relevant to complex enterprise products, for obvious reasons

I've always assumed that enterprise software has complex configuration options because it's more bespoke than the supplier would have you believe. So rather than delivering a piece of software tailored to the needs of one or a few customers, the new behaviour just becomes a setting in a much, much larger system. The goal being to have just one code base.

Our product is on the small side of "enterprise". We have many configuration options that are customer-driven, but we're not trying to replace bespoke software.

For example, we have 5-6 different settings for password policy. Most customers use the defaults, but the ones in banking and government usually have very specific security rules that require them to tweak the settings. One allows 10-char passwords but requires them to expire in 60 days. Another is 12 characters and 90 days. And so on.
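
Expressed as code, a policy like that might look something like the following; the field names and checks are hypothetical, not the product's actual settings.

```python
# Hypothetical sketch of a parameterized password policy, where most
# customers keep the defaults and regulated customers tweak the knobs.
from dataclasses import dataclass

@dataclass
class PasswordPolicy:
    min_length: int = 8       # assumed default
    expiry_days: int = 365    # assumed default

    def accepts(self, password: str) -> bool:
        """Minimal check: just the length knob, for illustration."""
        return len(password) >= self.min_length

# Defaults for most customers; stricter settings for banking/government:
bank_policy = PasswordPolicy(min_length=10, expiry_days=60)
gov_policy = PasswordPolicy(min_length=12, expiry_days=90)

print(bank_policy.accepts("s3cret"))       # -> False (6 chars)
print(bank_policy.accepts("s3cret-pass"))  # -> True (11 chars)
```

The point is that one code path serves every customer; only the knob values differ per deployment.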

We had that too. At some point we added LDAP authentication and it took care of user credential management. LDAP (AD) is universally present in the enterprise.

tl;dr: "there are too many settings; most people won't even use them". The solution IMO is not to remove those knobs completely, but to set good defaults and let users 'pop under the hood' when they need to, since by then they probably know what they're doing. A good use case is an editor: most recent successful editors such as Sublime, VSCode etc. are minimalistic to begin with, but very configurable behind the scenes. For a starting user it's nothing but a text editor; for an advanced user it can be a full-blown build, testing and deployment environment.

Getting the 80/20 rule right on defaults is extremely important. I see this especially in well-written mobile sites, where the GUI complexity cost of configuration is higher.

When making Office 2007, the MS Office team had the problem that 99% of their feature requests were for features already existing. Users couldn't find them.

Their solution was a UI change, from menus and toolbars, to the oft-maligned Ribbon. Advanced settings were kept almost the same as in previous versions, and the button to access those was an arrow in the section containing common settings.

Users couldn't find them.

That's my problem with the oft-maligned Ribbon. It changes its appearance based on the window width.

So the first time an option might be a large-icon-with-text.

The next time you look for it, it may be a small icon, have no text, or be completely hidden. That's 4+ visuals for the same option.


Never mind that menus could be navigated by keyboard. You hit alt, then either the highlighted letters or used the arrows to move around.

Do it frequently enough and you can pretty much do it blind.

“Star had many fewer commands than today’s systems, and it didn’t do it by having fewer functions. It just had fewer commands.” — Dave Smith¹ at The Final Demonstration of the Xerox ‘Star’ Computer²

Most UX designers are not as good as David Smith; they worship Apple/Jobs' “function follows form” misunderstanding of the Star GUI, and think you need to remove functionality in order to remove complexity.

¹ https://en.wikipedia.org/wiki/David_Canfield_Smith

² video: https://www.youtube.com/watch?v=_OwG_rQ_Hqw text: http://archive.computerhistory.org/resources/access/text/201...

The VMS product managers at Digital figured this out in about 1985. VMS was horrendously configurable, with all sorts of system settings to tweak.

As I recall, product management forced the issue by making it so the dev team didn't have access to their own machines to tweak parameters. It didn't take long in VAX years for the OS to become more self-configuring. It was a draconian approach, but helped customer satisfaction.

This blog is providing a valuable service, aggregating and commenting on research. Are there others like this?

On topic, I do think studying interfaces is exceptionally helpful. Just take a look at all the approaches in music! Faders, knobs, buttons, toggles...the gamut. Such a fascinating issue...and can be confusing, no lie.

But too many knobs? How about Too Many Buttons?!

(It's a DJ parody video and actually hilarious):


Havoc Pennington in 2002: http://ometer.com/preferences.html

This is the kind of thinking that destroyed GNOME 3.

I was thinking precisely this.

"User-friendly" depends entirely on the user in question. I find Gnome 2 to be considerably more "user-friendly" than Gnome 3, because Gnome 3 frustrated my attempts to do what I expected to be able to do.

For that matter, I consider bash (and coreutils and family) considerably more user friendly than Gnome, for a given value of "user" (me).

And that's why I think KDE back in the day, before Plasma and all that bling, had the right idea. The basic behavior emulated Windows 9x pretty closely, so anyone passably familiar with Windows could jump right in and use KDE.

This in turn lessened the transition friction, since people weren't trying to do something and failing to find it where memory told them it should be.

So, users don't change the settings on your software? Good for you. That means you have good defaults, which should be one of your goals.

Unlike an interface on the net with too many choices, in a non-emergency situation I love having too many knobs.

There are some good observations here. I believe a large part of the interface problem is that people try to design interfaces that make tools easy to use, but it's usually better to design interfaces that make tools easy to understand. There's a difference.

It's funny that they call it "over-designed" when in fact over configuration is usually the result of the opposite: lack of design.

My own saying on this:

"That which _can_ be configured _must_ be configured",

corr: "Defaults Aren't".

Inspired by my early experiences with early Windows 3.
