Good times create weak men (tonsky.me)
84 points by gonational on Jan 2, 2020 | hide | past | favorite | 40 comments

Rather than blaming abstractions, wouldn't it be better to blame the structural pathologies built into these huge companies? Maybe the abstractions to blame are the ones on the corporate ladder. Among other things, there can be poor encapsulation (e.g. micromanagement), bad memory management (teams being shuffled, broken up, pulled off one project and put to work on another) and misguided, dogmatic application of design principles (e.g. Apple: everything must look beautiful and sleek, even if, increasingly often, it is hard to use in practice. Various companies: "we need to change the interface every time we release a new version", with nary a thought for those who would prefer to master a UI rather than constantly adapt to its sweeping changes.)

Part of it may also have to do with big companies' product teams being largely insulated from failure, through lack of competition and lock-in. Where's Apple Music app's competitor? How are Twitter's users going to rebuild their network and brand if they switch to another microblogging service? These companies are too-big-to-fail in a different sense (that their profitable and successful products underwrite their mediocre offerings in other areas.)

Big software companies aren't the only big companies around producing ugly, unusable products with few alternatives. Until one of those rare moments when a giant is mightily and unexpectedly defeated by an upstart, we get to suffer their mediocrity, especially those of us who are, in a sense, UI connoisseurs.

Yeah. Just look at Boeing in this context (the problem involved software but it was fundamentally a hardware problem). "How can a company with such resources, experience, and stake, mess up something as fundamental as steering the plane?" Organizational failures, that's how.

Solid post, but complaining is easy. Programmers have complained about slow software, excessive complexity and abstraction for almost the whole history of computing.

What are some non-toy examples of people doing it right? What are concrete steps we can take to write cleaner, faster software?


I think HN itself is a pretty inspirational example. It's one of very few sites that always loads fast, even on a bad network. It has zero ads, zero trackers, and very little fluff.

So what does HN put significant effort into? Moderation, tools for moderation, algorithms to detect spam and brigading, etc. So one concrete suggestion is to focus your engineering effort and your hiring "below the waterline". The best software is often minimalist, with lots of thought and effort invested below the surface. A classic example is the original google.com: just two buttons and a text box.

Much bloat comes from teams that only know how to think about the part of the software they can see.

Sublime Text would be IMO an example of doing it right, at least from the perspective of this author. It's quite a bit faster than other recently developed text editors, while remaining attractive and functional.

However! There is one big problem with "doing it right" in the Sublime Text way. It's clear, at least to me, that VSCode is overall easier to use, because the plugin ecosystem is better (for C++). This is despite the fact that VSCode is a bit slower, being based on Electron and web technologies rather than native GUI libraries.

I'm not positive why it is that the plugin ecosystem is so much better. I will say that when I once tried to write a plugin for Sublime Text 2, the API was not great, and it was difficult to build asynchronous work into the plugin. I wonder if the choice of Python as the plugin language was worse than Javascript, if only because Javascript has some well thought-out solutions to asynchronous operations.

To be clear on why Visual Studio Code is better: there are 20-25 people working on it at Microsoft[0]. Both Sublime Text and TextMate, the editors that dominated before GitHub and Microsoft got into this game, were largely written and maintained by a single developer each. So there's your answer about why the newer editors are better: they have an order of magnitude more resources poured into them, and they're free.

Maintaining the best text editor in the world can be done on a budget that's still a rounding error to a big company, and for whatever reason some of those companies are interested in owning that market now.

[0]: https://blog.robenkleene.com/2019/04/25/visual-studio-code-c...

Sublime Text was mostly developed by one person and costs $80.

That's a decent guess, but Sublime Text anecdotally seemed to be used by a ton of people, and for a while the plugin ecosystem was much more developed.

Sublime Text had a 7 year head start. Some people used it but always wanted an open source equivalent. Some people switched because ST3 was in beta for 4 years.

And any issue can be fixed the HN way - write a rule forbidding discussion of the issue.

>Apple has been known for its attention to detail in the past and it has served them well

This was true 15 years ago, but not today. I tried to sync my mom's photos on her iPad with her Mac when I visited a few days ago, and was unable to accomplish the task in the time I was there. I had to do a software update for the Mac to talk to the iPad, which didn't exactly fail, yet never succeeded either: it just showed a 100%-complete loading bar indefinitely. No feedback, nothing. Bad UX. I restarted many times. At one point I thought I had bricked her computer because it wouldn't boot. Maybe I'm not that smart, but should I need to be intelligent to use Apple products?

I think what’s different is who is screwing up software today. Apple used to be great at UI/UX. Ironically, the best UI/UX I use today is GNOME 3 on Debian, and I like Windows 10 more than OS X now; that would have been unheard of in Apple’s prime, and the opposite was true 10 years ago. You could find examples of software gore in the previous era too. I think this is cyclical; there are just more consequences now than before.

That headline, as a sweeping statement about society, is crap.

The idea that people are becoming lazier programmers because they can, however, seems intuitively obvious. But that's kind of the idea: increasing productivity inevitably means doing _less_ of something or other. It's just a matter (hopeful use of the word, to be sure) of making sure you don't create false economies in the process.

> They do not need to know, exactly, how X is built, why it was built that way, or how to write an alternative X from scratch.

This speaks to me so strongly. I’ve been doing web development since the mid 90s. I see so many newer developers who just know the latest frameworks (e.g., React), but don’t really understand what came before and why things are done the way they are. I’m flabbergasted at some of the things they do because they don’t really understand what is going on under the hood. They don’t understand when a library/framework/feature is the appropriate tool, and when it is not.

I would like to see every tutorial, documentation, etc. explain WHY it is done in this way, WHAT are other/older ways to accomplish the same task and why the new abstraction might fix some issues, WHEN this thing is the appropriate solution and when it might not. Most seem to concentrate on the HOW to use it.

There are people who thought C was a “high level language” and thus inefficient. We are always abstracting away until we can deliver value.

Sometimes our abstractions consume so many resources that, for the value not to be entirely consumed by them, we optimise.

But the abstraction trend continues.

This doesn’t just apply to software design or digital service infrastructure; it also applies to our entire society. From farming to accounting.

I've run into this recently when trying to launch a new site after being away from web design for nearly 15 years (i.e. since smartphones). The tools, expectations, and target devices are so different now. After looking at the source of most sites these days, I'm honestly surprised that anything works at all.

What seems to be a simple site on the surface might pull in dozens if not hundreds of pages of attached code. Gone are the days when sites used just a few pages of HTML+CSS to display everything. It's rare to view the source of a site designed in the past three years and think "someone definitely hand-coded this", and even rarer to find a site that loads less than 1MB of resources; even an old-school site like Slashdot is >5MB now. That's insane.

Unless they're goat farming and milling ores with a stone mortar and pestle, everyone exists somewhere in the middle of the overall tech stack. There are more-fundamental things below their range of expertise that they'll have fuzzy theoretical knowledge of, if any. There are more-abstract things above their range of expertise that they probably think are frippy wastes of time. And then there's "real engineering" in the middle.

The longer humanity exists, the taller the tech stack gets and the further we get, on average, from goats and ochre.

I find Jonathan's complaints about Language Servers and LSP a little . . . wrong? We have editors written in many different languages. We want them all to be able to show helpful information about many different languages. The languages themselves are written in many different languages, with many different linking conventions. Supposing you'd rather not make each language-editor integration a significant effort, what is the solution other than some sort of protocol?

You could argue that the protocol should simply be an API, but what kind of API? A C API? Well enough, but that means you have to provide some kind of C API even if you're developing the language capabilities of a self-hosting programming language like Go. That is a little annoying, and it's not obvious to me that there's a huge win compared to JSON-RPC. I mean, it'll be faster, yes, but not in a way that's going to matter compared to the other work that's being done.

Don't most LSP implementations just communicate with their parent process over stdin/stdout? I mean, that's pretty Unix-y, and not so different from how things were done in the old days, besides the communication having some structure instead of plain text.
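That is indeed how most implementations work, and the framing is deliberately simple: an HTTP-style Content-Length header followed by a JSON-RPC body. As a minimal sketch (the file URI and cursor position below are purely illustrative), this is roughly what an editor writes to a language server's stdin:

```python
import json

def frame(payload: dict) -> bytes:
    """Encode a JSON-RPC message with the LSP Content-Length header."""
    body = json.dumps(payload).encode("utf-8")
    header = f"Content-Length: {len(body)}\r\n\r\n".encode("ascii")
    return header + body

# A hover request, asking the server what's under the cursor.
request = frame({
    "jsonrpc": "2.0",
    "id": 1,
    "method": "textDocument/hover",
    "params": {
        "textDocument": {"uri": "file:///tmp/example.c"},
        "position": {"line": 10, "character": 4},
    },
})
```

Any editor in any language can produce these bytes, which is the whole point of a protocol over an API.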

This is something I know a tiny bit about, but I know less about the other complaints Jonathan has. It does make me wonder a little if there's some cherry-picking going on. I don't by any means disagree with his conclusions: simplification is best. However, I don't know how to get from here to there (where here is: "we're just treading water keeping up with privacy and security mandates and adding urgent features for our customers").

Anyone who expects Apple will ship any high quality software these days has had their head in the sand for quite a while.

I see this posted all the time on various platforms, but as someone who has an iPhone and an Android tablet, my iPhone is consistently better and causes fewer headaches than any of my prior phones. Can you explain what you mean by this?

My Nokia 3310 had significantly fewer headaches than my iPhone 11.

My iPhone 3GS had significantly fewer headaches than my iPhone 11.

Apple is particularly weak when it comes to cloud and services. Apple is pushing iCloud and other services really hard. There is absolutely no ability to troubleshoot anything in these platforms.

Every time I have a fault with my Apple Watch or my iPhone or iPad, I'm told I need to do a factory reset and not restore from backup to validate if it's something in my working environment that's caused it. If it is, well then, goodbye years of history, because there's no selective restore.

I tried to upgrade my Mac from "10.14.6 (Mojave)" to what the System Preferences software update screen describes as "macOS Catalina (null) - 8.18GB", and when I hit the Upgrade Now button, I get a popup that says "This Mac is configured in a way that is incompatible with macOS Catalina." I have a single disk inside my 2015 MacBook Pro, single partition, FileVault 2 enabled, and after contacting Apple Support and going through a screen share, their solution is that I need to... back up my data with Time Machine and reboot into recovery mode to do an Internet Recovery reinstall. Apparently it may be because I have the weirdly non-standard configuration of full disk encryption enabled, but the support person couldn't actually point me to anything on my machine that would indicate that, as even they have no ability to troubleshoot anything.

Seriously. Fuck Apple.

Note: I am thoroughly into the Apple ecosystem, and loving it about as much as I loved being in Microsoft ecosystem in the 2000s. Like Microsoft offerings were at the time, Apple's are better than the alternatives. They're still terrible.

This is total horseshit. The supporting evidence for this argument is a bunch of recent public software failures that the author says 'must have been due to complexity', but then offers no proof. Sure, it could have been due to complexity, but it could have been due to many other things, such as mismanagement.

Then at the end the author says:

> Docker and Electron are the most hyped new technologies of the last five years. Both are not about improving things, figuring out complexity or reducing it.

That's not why these are 'hyped' technologies. These things solve real problems. Electron solves the problem of a cross-platform GUI with permissive licensing, in a market where cheaper frontend devs are abundant and Qt experts are not.

Sure, Jonathan Blow said we could just copy programs between computers in the 1960s, but that's because every computer in the lab was the same back then. That isn't reality today. If you want to ship some server application to different business customers, you'll find yourself maintaining N different install scripts for the N different distros your customers run. Or you could simplify, use Docker, and have a consistent environment for your application.
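To make that concrete, here is a minimal sketch of what a single, distro-independent build definition might look like (the base image, file names, and start command are all hypothetical):

```dockerfile
# Build one image; every customer runs the same environment,
# regardless of which distro sits underneath.
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "server.py"]
```

Instead of N install scripts, the same `docker run` invocation works on any host with a Docker daemon.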

> They do not need to know, exactly, how X is built, why it was built that way, or how to write an alternative X from scratch.

Why on earth would they? I want them to build Y, not build me an X from scratch. But I assume they are competent engineers who can solve novel problems, and they would research and build me an X if tasked with it. After all, most software engineers got their engineering knowledge from solving problems they were tasked with over their careers, not in a classroom.

There is no intelligent discourse in this piece. The author could have looked at why these technologies were created and adopted, and maybe made the argument that their adoption is unwarranted due to X, supporting X with good examples and proof.

I think the last line is quite telling: "[docker and electron] are just compromised attempts to hide accumulated complexity from developers because it became impossible to deal with"

The alternative is what, infinitely increasing complexity? Every industry has division of labour: if you are a chef baking cheesecake, you don't need to know how to make cheese. If you are a cheese-maker, you don't need to know how to milk a cow. If you are a farmer, you don't need to be a cattle vet, or be skilled in building a barn, repairing tractors, or smelting the stainless steel they're made of.

Step into the software sphere, though, and the abstractions we are building are leaky and unstable. The best example of this is Kubernetes: it has a hundred moving parts, and if you get an error it could be because of a problem in the kubelet, or in the underlying Docker, or in Flannel, or in iptables, or in Linux kernel settings, or in the settings of the hypervisor, etc., etc.

Instead of creating a productive division of labour, in IT we create an UNPRODUCTIVE one: oh, this is written in .NET and that is written in Java, but I only know Golang, JavaScript, C++ and Python!

The most elementary problems don't have a standard solution: how many ways are there to format a settings file? Probably a few dozen. How many of them afford the user syntax highlighting?

I believe the root cause of bad-quality software is a lack of accountability: an average Android phone ships with dozens of bugs. You could not ship a car with a missing wheel, but if WiFi calling doesn't work, nobody cares. Companies know they can get away with it, and they do. If your management knows that shipping buggy software is fine, and so does the rest of the industry, the actions of an individual developer can only go so far.

I don't disagree that the ladder of abstraction is perilously tall, but every example of slower/buggier software in this post is assumed to be caused by abstraction. So many other things go into creating software - especially at large companies where coordination across teams and departments often results in compromises both in design and architecture.

Isn't this why universities teach subjects like compilers and operating systems? Not that knowing those things would necessarily help; these are more QA issues.

In the past, there were very few programs and they had to run very well. The cost of developing and deploying software was so high that care had to be taken, and deep expertise brought in, so that low-level code, which you might never get to patch, could run reliably for years.

Now that we can write, update and deploy software so easily, there’s a very long tail of mediocre programs that get the job done. The average program will of course be slower and buggier than the average program 20-40 years ago. But I have millions of commoditized apps at my fingertips compared to relatively few 20-40 years ago.

It’s also the case that in the past we were forced to prematurely optimize. Google can update gmail tomorrow to make it faster for everyone. Even 10 years ago as a native software dev, I had to obsess over the slowest piece of hardware I needed to support.

The stuff about Apple bugs is a bit of a stretch. The most likely reason is they had to prioritize other things to get it out the door, and it had a few warts. Who hasn't worked on a project like that. :)

This is UI. Imagine the ML mess we will have in 20 years.

I haven't read the page yet, but I'm watching the video it linked to, and I'm finding it fascinating. Older post: https://news.ycombinator.com/item?id=19945452

good times create weak men, weak men create hard times, hard times create strong men, strong men create good times.

Counterpoint: strong men were responsible for both world wars, and in fact every war ever. Hundreds of millions dead, two sincere attempts at self-extermination by the human race. Good times.

I would argue that weak men (with big egos) sent young men to war, giving them the hard times and the fortitude and determination to make good times for their children (boomers).

Stalin was weak? Mao? This sounds like a very odd definition of weak, cut to fit the argument rather than working with the facts at hand.

Both won wars and ruled over hundreds of millions or even billions. They can't be brushed away, nor any of the other dictators who perpetuated all those historical horrors.

Based on Lenin’s depiction of Stalin, he was paranoid, anxious and easily angered/upset. These certainly are characteristics of weak people. Just because people have power to do what they like doesn’t mean they are strong.

I get the impression that he worked tirelessly to exude the idea that he was strong. Much like Putin or Gurbanguly Berdimuhamedow

I will admit that I do not know much of anything about Mao

I dunno dude, Lenin absolutely hated Stalin so maybe he's not an impartial character witness.

I've made my point - OP is obviously wrong and so are you - so that's all from me.

It's a little more subtle.

Strong men take advantage of perceived weakness.

Hitler declared war on Europe and Russia because Germany perceived itself as stronger.

Japan declared war on the USA because it believed it had the stronger navy. It did, but it awakened a colossus.

Credit: Prof. Victor Davis Hanson.

What about women

Adopting the term (man) for the human species to refer to males is a common feature of Romance and Germanic languages, but is not found in most other European languages (Slavic čelověkъ vs. mǫžь, Greek ἄνθρωπος vs. άνδρας, Finnish ihminen vs. mies etc.).

I'm not really sure how the title relates to the rambling article contained within, beyond its somehow being a 'generational issue', with examples of bad UI design in large systems.

The reality, I can guarantee, is that it has nothing to do with the developers. These bugs are likely well documented in whatever shitty issue-tracking system management has chosen to saddle the developers with, and promptly buried underneath other concerns that have been prioritized as more valuable (see: management thinks they'll earn more money).

The issue has nothing to do with the developers or the generations thereof; it has to do with the constant firefighting we have to do and the constant demand for new things now. Broken, buggy software has to be pushed out on tight deadlines because They Demand It, and often there's no time devoted to polishing passes. And that's without even considering whether the development team has access to a robust QA team dedicated to finding these issues and maintaining the quality bar.

This sounds like the kind of rant I would have written before I had to make those kinds of decisions.

- What bug tracking software do we use? You have two hours to research and pick one, and someone's gonna hate it no matter what you do.

- Do we focus on fixing a niche bug which affects one customer, or on adding a feature that all of our biggest customers have been screaming for?

- Do we get the new feature in for our next release or make everyone wait another three months because one dev is anxious?

- Do we build out our QA team ("nobody cares that us devs are overworked and drowning") or do we reinforce the dev team ("cheap bastards won't even give us QA")?

Once you have to think about keeping the lights on for a business as a whole, you realise that all of the choices that seemed so huge and obvious as devs are small tradeoffs in a much bigger picture.

As bad as management can be, you can't entirely shift the blame from the developers.

Having a good issue tracking system is addressing the symptom, not the cause. If the developers are writing software with these kinds of fundamental bugs, then there will simply be too many issues in the issue tracker, and management will have no choice but to prioritise other things because fixing all the issues would take too much time. It's very easy to get into "bug whack-a-mole" where you just add more and more code to address problems without ever fixing the root cause.

The article says this is because there are too many abstractions. I think that's sort-of right but could give the wrong idea. Abstractions are a good thing: they are the only way to contain complexity. The problem is when the abstractions are poor, when the implementation fails to match the intended abstraction or when there are no abstractions at all and you simply have an unworkable mess of spaghetti code.

The reason things are worse now than they once were is that the environment in which code is running is infinitely more complicated than it used to be. More layers of abstraction are required for the program to understand its environment, and as a result, there is a higher chance that one of those abstractions is problematic. The solution is not always to reduce the number of layers of abstraction, because it's a really good thing that our programs can understand complex environments - it's what makes those programs more accessible, portable, etc. The solution is to always be looking to refactor and/or strengthen the foundations on which we build, and to be better at picking robust foundations to use in our own projects so that good abstractions gain the success they deserve, and bad abstractions are replaced quickly.

Unfortunately, those skills are very hard to teach and to learn, and now developers without those skills can do a lot more long-lasting damage than they would have been able to before.

Having computed in the '10s, '00s, '90s and partly the '80s, never before has software been so functional, upgradable and flexible. Most software is past its apex, which becomes a problem in itself: how do you keep selling already-perfected software? You start to change it, because "customers demand it". It doesn't matter if grandma tears her hair out in despair and power users cry bitter tears in the fetal position: change it must, for Change is good!

Perhaps the problem is Capitalism itself.
