Part of it may also have to do with big companies' product teams being largely insulated from failure, through lack of competition and lock-in. Where's the Apple Music app's competitor? How are Twitter's users going to rebuild their network and brand if they switch to another microblogging service? These companies are too-big-to-fail in a different sense: their profitable and successful products underwrite their mediocre offerings in other areas.
Big software companies aren't the only big companies producing ugly, unusable products with few alternatives. Until one of those rare moments when an upstart unexpectedly topples them, we get to suffer their mediocrity, especially those of us who are, in a sense, UI connoisseurs.
What are some non-toy examples of people doing it right? What are concrete steps we can take to write cleaner, faster software?
I think HN itself is a pretty inspirational example. It's one of very few sites that always loads fast, even on a bad network. It has zero ads, zero trackers, and very little fluff.
So what does HN put significant effort into? Moderation, tools for moderation, algorithms to detect spam and brigading, etc. So one concrete suggestion is to focus your engineering effort and your hiring "below the waterline". The best software is often minimalist, with lots of thought and effort invested below the surface. A classic example is the original google.com: just two buttons and a text box.
Much bloat comes from teams that only know how to think about the part of the software they can see.
However! There is one big problem with "doing it right" in the Sublime Text way. It's clear, at least to me, that VSCode is overall easier to use, because the plugin ecosystem is better (for C++). This is despite the fact that VSCode is a bit slower, being based on Electron and web technologies rather than native GUI libraries.
Maintaining the best text editor in the world can be done on a budget that's still a rounding error to a big company, and for whatever reason some of those companies are interested in owning that market now.
The idea that people are becoming lazier programmers because they can, however, seems intuitively obvious. But that's kind of the point - increasing productivity inevitably means doing _less_ of something or other. It's just (hopeful use of the word, to be sure) a matter of making sure you don't create false economies in the process.
This speaks to me so strongly. I’ve been doing web development since the mid 90s. I see so many newer developers who just know the latest frameworks (e.g., React), but don’t really understand what came before and why things are done the way they are. I’m flabbergasted at some of the things they do because they don’t really understand what is going on under the hood. They don’t understand when a library/framework/feature is the appropriate tool, and when it is not.
I would like to see every tutorial, documentation, etc. explain WHY it is done in this way, WHAT are other/older ways to accomplish the same task and why the new abstraction might fix some issues, WHEN this thing is the appropriate solution and when it might not. Most seem to concentrate on the HOW to use it.
Sometimes our abstractions consume so many resources that, to keep their value from being entirely eaten up, we optimise.
But the abstraction trend continues.
This doesn’t just apply to software design or digital service infrastructure; it also applies to our entire society. From farming to accounting.
What can seem to be a simple site on the surface might contain dozens if not hundreds of pages of attached code. Gone are the days where sites used just a few pages of HTML+CSS to display everything. It's rare to find a site designed in the past 3 years and to view the source and think "someone definitely hand coded this" and even more unlikely to find a site that loads less than 1MB of resources these days -- even an old school site like Slashdot is >5MB now. That's insane.
The longer humanity exists, the taller the tech stack gets and the further we get, on average, from goats and ochre.
This was true 15 years ago and not today. I tried to sync my mom’s photos on her iPad with her Mac when I visited a few days ago, and was unable to accomplish the task in the time I was there. I had to do a software update for the Mac to talk to the iPad, which didn’t exactly fail, yet never succeeded either - it just showed a 100% complete loading bar indefinitely. No feedback, nothing. Bad UX. Restarted many times. At one point I thought I had bricked her computer because it wouldn’t boot. Maybe I’m not that smart, but should I need to be intelligent to use Apple products?
I think what’s different is who is screwing up software today. Apple used to be great at UI/UX. Ironically, the best UI/UX I use today is GNOME 3 on Debian, and I like Windows 10 more than OS X now. That would have been unheard of in Apple’s prime - 10 years ago it was the opposite. You could find bad examples of software gore in the previous era too. I think this is cyclical; there are just more consequences now than before.
You could argue that the protocol should simply be an API, but what kind of API? A C API? Well enough, but that means you have to provide some kind of C API even if you're developing the language capabilities of a self-hosting programming language like Go. That is a little annoying, and it's not obvious to me that there's a huge win compared to JSON-RPC. I mean, it'll be faster, yes, but not in a way that's going to matter compared to the other work that's being done.
Don't most LSP implementations just communicate with their parent process over stdin/stdout? I mean, that's pretty Unix-y, and not so different from how things were done in the old days, besides the communication having some structure instead of plain text.
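The wire format is simple enough to sketch in a few lines. This is a minimal, illustrative take on LSP-style framing over a byte stream (the `Content-Length` header and the `jsonrpc`/`id`/`method` fields follow the JSON-RPC 2.0 / LSP conventions; the specific `initialize` request here is just a stand-in):

```python
import json

def frame(msg: dict) -> bytes:
    """Serialize a JSON-RPC message with an LSP-style Content-Length header."""
    body = json.dumps(msg).encode("utf-8")
    return b"Content-Length: %d\r\n\r\n" % len(body) + body

# A stand-in "initialize" request, as an editor might send to a server's stdin.
request = {"jsonrpc": "2.0", "id": 1, "method": "initialize", "params": {}}
data = frame(request)

# Reading it back: split the header from the body, then parse the JSON payload.
header, _, body = data.partition(b"\r\n\r\n")
assert header.startswith(b"Content-Length: ")
decoded = json.loads(body)
```

That's the whole trick: length-prefixed JSON over stdin/stdout, which is why it composes so well with plain old process spawning.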
This is something I know a tiny bit about, but I know less about the other complaints Jonathan has. It does make me wonder a little if there's some cherry-picking going on. I don't by any means disagree with his conclusions: simplification is best. However, I don't know how to get from here to there (where here is: "we're just treading water keeping up with privacy and security mandates and adding urgent features for our customers").
My iPhone 3GS had significantly fewer headaches than my iPhone 11.
Apple is particularly weak when it comes to cloud and services, yet it's pushing iCloud and other services really hard. There is absolutely no ability to troubleshoot anything on these platforms.
Every time I have a fault with my Apple Watch or my iPhone or iPad, I'm told I need to do a factory reset and not restore from backup to validate if it's something in my working environment that's caused it. If it is, well then, goodbye years of history, because there's no selective restore.
I tried to upgrade my Mac from "10.14.6 (Mojave)" to what the System Preferences software update screen describes as "macOS Catalina (null) - 8.18GB", and when I hit the Upgrade Now button, I get a popup that says "This Mac is configured in a way that is incompatible with macOS Catalina." I have a single disk inside my 2015 MacBook Pro, single partition, FileVault 2 enabled, and after contacting Apple Support and going through a screen share, their solution is that I need to... back up my data with Time Machine and reboot into recovery mode to do an Internet Recovery reinstall. Apparently it may be because I have the weird non-standard configuration of having full disk encryption enabled, but the support person couldn't actually point me to anything on my machine that would indicate that, as even they have no ability to troubleshoot anything.
Seriously. Fuck Apple.
Note: I am thoroughly into the Apple ecosystem, and loving it about as much as I loved being in the Microsoft ecosystem in the 2000s. Like Microsoft's offerings were at the time, Apple's are better than the alternatives. They're still terrible.
Then at the end the author says:
> Docker and Electron are the most hyped new technologies of the last five years. Both are not about improving things, figuring out complexity or reducing it.
That's not why these are 'hyped' technologies. These things solve real problems. Electron solves the problem of a cross-platform GUI with permissive licensing, in a world where there is an abundance of cheaper frontend devs versus Qt experts.
Sure, Jonathan Blow said we could just copy programs between computers in the 1960s, but that's because every computer in the lab was the same back then. That isn't reality today. Today, if you want to ship some server applications to different business customers, you'll find yourself maintaining N different install scripts for the N different distros your customers run. Or you could simplify, use Docker, and have a consistent environment for your application.
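A minimal Dockerfile sketch of that "consistent environment" argument - the base image, file names, and flags here are all hypothetical placeholders, not a real deployment:

```dockerfile
# Pin a specific base image so every customer runs the same userland,
# regardless of which distro sits underneath.
FROM debian:12-slim

# Install dependencies once, in the image, instead of per-distro scripts.
RUN apt-get update && apt-get install -y --no-install-recommends \
        ca-certificates \
    && rm -rf /var/lib/apt/lists/*

# The application binary and its config travel together.
COPY ./server /usr/local/bin/server
COPY ./server.conf /etc/server.conf

ENTRYPOINT ["/usr/local/bin/server", "--config", "/etc/server.conf"]
```

One image, N customers; the per-distro install scripts collapse into a single build.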
> They do not need to know, exactly, how X is built, why it was built that way, or how to write an alternative X from scratch.
Why on earth would they? I want them to build Y not build me an X from scratch. But I assume they are competent engineers that can solve novel problems and they would research and build me an X if tasked to it. After all most software engineers got their engineering knowledge from trying to solve problems they were tasked with in their career, not in a classroom.
There is no intelligent discourse in this piece. The author could have looked at why these technologies were created and adopted, and maybe made the argument that their adoption is unwarranted for some reason X, supporting X with good examples and evidence.
The alternative is what, infinitely increasing complexity? Every industry has division of labour: if you are a chef baking cheesecake, you don't need to know how to make cheese. If you are a cheese-maker, you don't need to know how to milk a cow. If you are a farmer, you don't need to be a cattle vet, or be skilled in building a barn, repairing tractors, or smelting the stainless steel they're made of.
Step into the software sphere, though, and the abstractions we are building are leaky and unstable. The best example of this is Kubernetes: it has a hundred moving parts, and if you get an error, it could be a problem in the kubelet, or in the underlying Docker, or in Flannel, or in iptables, or in Linux kernel settings, or in the hypervisor's settings, etc., etc.
The most elementary problems don't have a standard solution: how many ways are there to format a settings file? Probably a few dozen. How many of them afford the user syntax highlighting?
I believe the root cause of bad-quality software is a lack of accountability: an average Android phone ships with dozens of bugs. You could not ship a car with a missing wheel, but if WiFi calling doesn't work, nobody cares. Companies know that they can get away with it, and they do. If your management knows that shipping buggy software is fine, and so does the rest of the industry, the actions of an individual developer can only go so far.
Now that we can write, update and deploy software so easily, there’s a very long tail of mediocre programs that get the job done. The average program will of course be slower and buggier than the average program 20-40 years ago. But I have millions of commoditized apps at my fingertips compared to relatively few 20-40 years ago.
It’s also the case that in the past we were forced to prematurely optimize. Google can update gmail tomorrow to make it faster for everyone. Even 10 years ago as a native software dev, I had to obsess over the slowest piece of hardware I needed to support.
Both won wars and ruled over hundreds of millions or even billions. They can't be brushed away, nor any of the other dictators who perpetuated all those historical horrors.
I get the impression that he worked tirelessly to exude the idea that he was strong, much like Putin or Gurbanguly Berdimuhamedow.
I will admit that I do not know much of anything about Mao.
I've made my point - OP is obviously wrong and so are you - so that's all from me.
Strong men take advantage of perceived weakness.
Hitler declared war on Europe and Russia because Germany perceived itself as stronger.
Japan declared war on the USA because it believed it had a stronger navy. It did, but it awakened a colossus.
Credit: Prof. Victor Davis Hanson.
I can guarantee the reality has nothing to do with the developers. These bugs are likely well documented in whatever shitty issue-tracking system management has decided to saddle the developers with, and promptly buried underneath other concerns that have been prioritized as more valuable (read: management thinks they'll earn more money).
The issue has nothing to do with the developers or the generations thereof; it has to do with the constant firefighting we have to do and the constant demand for new things now. Broken, buggy software has to be pushed out on tight deadlines because They Demand It, and often there's no time devoted to polishing passes. That's before even considering whether the development team has access to a robust QA team dedicated to finding these issues and maintaining the quality bar.
- What bug tracking software do we use? You have two hours to research and pick one, and someone's gonna hate it no matter what you do.
- Do we focus on fixing a niche bug which affects one customer, or on adding a feature that all of our biggest customers have been screaming for?
- Do we get the new feature in for our next release or make everyone wait another three months because one dev is anxious?
- Do we build out our QA team ("nobody cares that us devs are overworked and drowning") or do we reinforce the dev team ("cheap bastards won't even give us QA")?
Once you have to think about keeping the lights on for a business as a whole, you realise that all of the choices that seemed so huge and obvious as devs are small tradeoffs in a much bigger picture.
Having a good issue tracking system is addressing the symptom, not the cause. If the developers are writing software with these kinds of fundamental bugs, then there will simply be too many issues in the issue tracker, and management will have no choice but to prioritise other things because fixing all the issues would take too much time. It's very easy to get into "bug whack-a-mole" where you just add more and more code to address problems without ever fixing the root cause.
The article says this is because there are too many abstractions. I think that's sort-of right but could give the wrong idea. Abstractions are a good thing: they are the only way to contain complexity. The problem is when the abstractions are poor, when the implementation fails to match the intended abstraction or when there are no abstractions at all and you simply have an unworkable mess of spaghetti code.
The reason things are worse now than they once were is that the environment in which code is running is infinitely more complicated than it used to be. More layers of abstraction are required for the program to understand its environment, and as a result, there is a higher chance that one of those abstractions is problematic. The solution is not always to reduce the number of layers of abstraction, because it's a really good thing that our programs can understand complex environments - it's what makes those programs more accessible, portable, etc. The solution is to always be looking to refactor and/or strengthen the foundations on which we build, and to be better at picking robust foundations to use in our own projects so that good abstractions gain the success they deserve, and bad abstractions are replaced quickly.
Unfortunately, those skills are very hard to teach and to learn, and now developers without those skills can do a lot more long-lasting damage than they would have been able to before.
Perhaps the problem is capitalism itself.