I think you picked a hypothesis and assumed it was true and ran with it.

Consider that all the following are true (despite their contradictions):

- "Bloated busy interface" is a common complaint of some of Google, Apple, Microsoft, and Meta. people here share a blank vscode canvas and complain about how busy the interface is compared to their 0-interface vim setup.

- flat design and minimalism are in fashion, and have been for a few years now.

- /r/unixporn and most Linux people online who "rice" their distros do so by hiding all controls from apps, because minimalism is in fashion.

- Have you tried GNOME recently?

A minimal interface where most controls are hidden is a certain look that some people prefer. Plenty of people prefer to "hide the noise," and if they need something, they are perfectly capable of looking it up. It's not like digging through manuals is the only option.






If I had to pin most of this on anything, I'd pick two things:

- Dribbble-driven development, where the goal is to make apps look good in screenshots, with little bearing on their practical usability

- The massive influx of designers from other disciplines (print, etc.) into UI design, who are great at making things look nice but don't bring many of the skills necessary to design effective UIs

Being a good UI designer means seeking out existing usability research, conducting new research to fill the gaps, and understanding the limits of the target platform, on top of having a good footing in the fundamentals. The role is part artist, part scientist, and part engineer. It's knowing when to put ego aside and admit that the beautiful design you just came up with isn't usable enough to ship. It's not just a sense for aesthetics and the ability to wield Photoshop or Figma or whatever well.

This is not what hiring selects for, though, and that’s reflected in the precipitous fall in quality of software design in the past ~15 years.


> Dribbble-driven development

I've been calling modern designers "dribbble-raised" for a while now precisely for these reasons. Glad to see I'm not the only one.


I agree with you that it's very fashion-driven, which is why you see it in all kinds of places beyond the core drivers of it. But my argument is that those fashions themselves are driven by the major players deciding to do this for less-than-honorable reasons.

I do think it's likely more passive than active. People at Google aren't deviously plotting to hide buttons from the user. But what is happening is that when these designs get reviewed, nobody pushes back: when someone asks "but how will the user know to do that?", it doesn't get listened to. Instead, the people responsible sign off on it, saying "it's OK, they'll just learn it; once they get to know it, it'll be fine." It's all passive, but it rests on an implicit assumption that users are staying around: optimising for the ones that do, and making it harder for the ones who want to come and go or stop in temporarily.

Once three or four big companies start doing it, everybody else cargo-cults it, and before you know it, it looks like fashion and GNOME is doing it too.


> I do think it's likely more passive than active. People at Google aren't deviously plotting to hide buttons from the user.

This is important, thank you for mentioning it: actions have consequences besides those that motivated the action. I don't like it when people say "<actor> did <action>, and it leads to this nefarious outcome, therefore look how evil <actor> must be". Yes, there is always a chance that <actor> really is a scheming, cartoonish villain who intended that outcome all along. But how likely is it that <actor> is just naive, or careless, or overly optimistic?

Of course, the truth is almost certainly somewhere in the middle: familiarity with a hard-to-learn UI as a point of friction that promotes lock-in may not be a goal, but when it manifests, it doesn't hurt the business, so no one does anything about it. Does that mean the designers should be called out for it? If the effect is damaging enough to the collective interest, then maybe yes. But we needn't assume nefarious intentions to do so.

Then again, everyone thinks their own actions are justified within their own value system, and corporate values do tend toward the common denominator (usually involving profit-making). Maybe the world just has way more cartoonish villains than I give it credit for.



Somehow your theory omits the fact that people can learn how to use a new interface. It's not as if you're entitled to a UI that never adds functionality, ever. Sure, vendors ought to provide onboarding tutorials and documentation and such, but using that material is on the user.

> I think you picked a hypothesis and assumed it was true and ran with it.

The tone of your post, and especially this phrase, is inappropriate imo. The GP's comment is plausible. You're welcome to make a counter-argument, but you seem to be claiming, without evidence, that there was no thinking behind their post.


UIs tend to mirror how people structure their broader environments. Minimalism is super hot outside of software design too. Millennial Gray is a cliché for a reason, and Frutiger Aero wasn't limited to technology. The video for JLo's debut single captures the aesthetic well: https://www.youtube.com/watch?v=lYfkl-HXfuU

> Have you tried GNOME recently?

God, no. I switched to Xfce when GNOME decided it needed to compete with Unity by copying whatever Unity did, no matter how loudly their entire user base complained.

Why would I try GNOME again?


> Why would I try GNOME again?

It's widely used, it's the default DE in many installs, and it can be handy to be familiar with, for starters.



