"In 2019, the company released an entirely new version of its terminal for Windows 10, promising a better performance. However, user Casey Muratori was not convinced, claiming that the performance was still far from what could have been done without much effort. To prove his point, Casey developed in just a few days an open source prototype that performed 200 times better than Microsoft's new terminal [32]."
"In the article Software Disenchantment, author Nikita Prokopov exposes how bad software has become commonplace in our daily lives [35]. From programs that get incredibly bigger and slower with each new version, offering little or nothing more in return, to the fact that they get worse as they are used, to the most complete chaos that is the software development process.
For example, when creating an empty (!) project in npm for web development more than 50,000 files are downloaded, taking about 350 MB. This house of cards is so fragile that it gives rise to situations that are, to say the least, embarrassing. In 2016, an npm library called Left-pad, downloaded about 2 million times a month, was removed from the repository by its author, resulting in many web projects around the world crashing, including large corporations, forcing npm to revert the deletion [36]. The library, which contained 10 trivial lines, could be implemented in a few minutes even by programming students."
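To give a sense of just how small that library was, here is a rough sketch of the same left-padding logic. This is purely an illustration in C; the real left-pad was a handful of lines of JavaScript published on npm, not this code.

    #include <stdio.h>
    #include <string.h>

    /* Sketch of the "pad a string on the left to a given width" idea.
     * Only an illustration of how trivial the logic is; the original
     * left-pad was a handful of lines of JavaScript, not this C code. */
    static void left_pad(char *dst, size_t dst_size,
                         const char *s, size_t width, char fill)
    {
        size_t len = strlen(s);
        size_t pad = (len < width) ? width - len : 0;
        if (len + 1 > dst_size)
            return;                        /* destination too small, give up */
        if (pad + len + 1 > dst_size)
            pad = dst_size - len - 1;      /* clamp padding to what fits     */
        memset(dst, fill, pad);            /* write the fill characters      */
        memcpy(dst + pad, s, len + 1);     /* then the string and its NUL    */
    }

    int main(void)
    {
        char buf[32];
        left_pad(buf, sizeof buf, "42", 5, '0');
        printf("%s\n", buf);               /* prints "00042" */
        return 0;
    }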
Some, but not all, developers get defensive when the issue of "bloat" is raised.
What blew my mind is not that incident, or that Microsoft devs refused[1] to fix it, but that about a year later they updated Notepad the same way and it visibly lags and stutters on a suuuuper high end gaming machine. Like. What. The. Fuck!?
They literally never tested Notepad with large text files! It can take a minute to open a file that every other text editor on Windows opens instantly. It misaligns tabs and four spaces by one or two pixels, which means they utterly failed to render using a fixed width font in a way I have never seen before.
This minimalist text editor takes over a second to launch from an SSD that can read a 4K movie end-to-end in the same time.
It’s just a staggering level of incompetence.
[1] Their excuses were mostly that they couldn’t even look at his code, lest their minds be infected somehow and be in violation of copyright. Meanwhile Casey uploaded it as open source and they could have at least listened to his videos with their eyes closed. Instead they elected to stick their fingers in their ears and yell “La-la-la can’t hear you!”. Clearly I’m exaggerating for effect, but not much.
The Terminal team could have simply asked a Microsoft Linux team member to watch the videos, and then explain the gist of the algorithm without ever referencing any specific code.
Algorithms can only be patented, not copyrighted, and any IP lawyer would tell you that.
Not to mention that Casey's code is dead obvious, totally standard, done-this-way-for-decades, not some sort of exotic thing he invented. Heck, I've personally implemented something nearly identical back in 2000, and I'm pretty sure every game engine console (as seen in Source games) works nearly identically also.
It's so simple that a one or two sentence description is sufficient. Nobody is suggesting that Microsoft copy-paste reams of GPL code. Instead, Microsoft is refusing to listen to a single sentence of good advice, and stubbornly forging ahead with a demonstrably stupid solution.
Nobody needs to copy the code verbatim anyway. The essential part is the algorithm (or the "approach" to the problem), not some sort of specific C++ wizardry or a huge library.
The correct solution can be thrown together in a couple of weekends, it's not something that has to be imported into the Windows Terminal codebase as-is.
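As I understand it (an assumption on my part; the thread never spells it out), the "approach" being pointed at is essentially a glyph cache: rasterize each unique character/style combination once, then reuse the cached pixels on every later frame. Below is a minimal sketch of that general idea in C. This is my own illustration of the technique, not refterm's or Windows Terminal's actual code, and the rasterizer and framebuffer here are trivial stand-ins.

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* Minimal glyph-cache sketch: each unique (codepoint, style) key is
     * rasterized at most once; after that, drawing a cell is just a copy.
     * The "rasterizer" and "framebuffer" below are trivial stand-ins.    */

    #define CACHE_SLOTS 1024
    #define TILE_W 8
    #define TILE_H 16
    #define COLS 80
    #define ROWS 25

    typedef struct {
        uint32_t key;                       /* codepoint + style packed together */
        uint8_t  pixels[TILE_W * TILE_H];   /* cached coverage for one cell      */
        int      valid;
    } GlyphSlot;

    static GlyphSlot cache[CACHE_SLOTS];
    static uint8_t   frame[ROWS * TILE_H][COLS * TILE_W];
    static int       rasterize_count;       /* how often the slow path ran       */

    /* Stand-in for an expensive font rasterizer. */
    static void rasterize_glyph(uint32_t key, uint8_t out[TILE_W * TILE_H])
    {
        rasterize_count++;
        for (int i = 0; i < TILE_W * TILE_H; i++)
            out[i] = (uint8_t)((key + (uint32_t)i) & 0xFF);
    }

    /* Draw one character cell, using the cache whenever possible. */
    static void draw_cell(int col, int row, uint32_t key)
    {
        GlyphSlot *slot = &cache[key % CACHE_SLOTS];
        if (!slot->valid || slot->key != key) {      /* miss: rasterize once */
            rasterize_glyph(key, slot->pixels);
            slot->key = key;
            slot->valid = 1;
        }
        for (int y = 0; y < TILE_H; y++)             /* hit: just copy pixels */
            memcpy(&frame[row * TILE_H + y][col * TILE_W],
                   &slot->pixels[y * TILE_W], TILE_W);
    }

    int main(void)
    {
        for (int f = 0; f < 60; f++)                 /* 60 "frames" of output */
            for (int r = 0; r < ROWS; r++)
                for (int c = 0; c < COLS; c++)
                    draw_cell(c, r, 'A' + (uint32_t)((r + c) % 26));
        printf("cells drawn: %d, glyphs rasterized: %d\n",
               60 * ROWS * COLS, rasterize_count);   /* 120000 vs 26 */
        return 0;
    }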
You can, anyone can, write all your programs with the mindset that you are given X amount of memory, and you use it as a cache or buffer like they did back when computers had 64K or whatever.
It would solve a lot of what I see as the problems of modern software. For so many years, people have pursued the false vision of a programming environment where resources can be assumed to be infinite.
The problem is, no matter how much memory you have, if you take a data structure that reasonably fits in your vast amount of memory, and then you inadvertently multiply its size, you won't have space.
Or to put it another way, all finite numbers are infinitely smaller than infinity, so the delusion that any finite number is equivalent to infinity will always bring people to grief.
Could someone please write a manifesto along the lines of "malloc considered harmful"? Just. Say. No.
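As a small illustration of that mindset (my own sketch, not something from the thread): the program below counts the lines in an arbitrarily large file using one fixed 64 KB buffer and no dynamic allocation at all, the way you would if you took the memory budget as a hard limit.

    #include <stdio.h>

    /* Fixed-memory-budget sketch: process an arbitrarily large file
     * through one statically sized buffer, with no malloc anywhere.   */

    #define BUDGET (64 * 1024)              /* the "X amount of memory" we allow */

    static char buf[BUDGET];

    int main(int argc, char **argv)
    {
        FILE *f = (argc > 1) ? fopen(argv[1], "rb") : stdin;
        if (!f) { perror("fopen"); return 1; }

        unsigned long long lines = 0, bytes = 0;
        size_t n;
        while ((n = fread(buf, 1, sizeof buf, f)) > 0) {   /* one chunk at a time */
            bytes += n;
            for (size_t i = 0; i < n; i++)
                if (buf[i] == '\n')
                    lines++;
        }

        printf("%llu lines, %llu bytes, buffer used: %zu bytes\n",
               lines, bytes, sizeof buf);
        if (f != stdin) fclose(f);
        return 0;
    }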
Javascript, Java, C#, Golang, Python, Ruby: all garbage collected. Aside from C/C++, those are the languages of modern software dev.
Yeah you could arena allocate and do some nonstandard techniques like that, but basically you are at the mercy of a garbage collector, and those LOVE to use memory.
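For readers who haven't seen it, the "arena allocate" technique mentioned above looks roughly like this: grab one block up front, hand out pieces by bumping a pointer, and free everything at once by resetting an offset. A minimal sketch of my own, not tied to any particular library:

    #include <stddef.h>
    #include <stdio.h>

    /* Minimal bump-pointer arena: one fixed block, allocation is a pointer
     * bump, and "freeing" is resetting the offset. A sketch of the idea,
     * not production code (no growth, no per-object free).               */

    typedef struct {
        unsigned char *base;
        size_t         size;
        size_t         used;
    } Arena;

    static void *arena_alloc(Arena *a, size_t n)
    {
        size_t aligned = (a->used + 15u) & ~(size_t)15u;   /* 16-byte alignment */
        if (aligned + n > a->size)
            return NULL;                                   /* budget exhausted  */
        a->used = aligned + n;
        return a->base + aligned;
    }

    static void arena_reset(Arena *a) { a->used = 0; }     /* bulk free         */

    int main(void)
    {
        static unsigned char storage[1 << 20];             /* 1 MB budget       */
        Arena a = { storage, sizeof storage, 0 };

        for (int frame = 0; frame < 3; frame++) {
            int *scratch = arena_alloc(&a, 1000 * sizeof *scratch);
            if (scratch)
                scratch[0] = frame;                        /* ... use it ...    */
            printf("frame %d: %zu bytes used\n", frame, a.used);
            arena_reset(&a);                               /* free everything   */
        }
        return 0;
    }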
So... use C++? use Rust? As you point out, malloc and the heap are kinda the same problem as GC.
Personally, I think some kind of "runs on 1 GHz and 256 MB of RAM" claim would probably be the most effective real-world way to advertise some degree of restraint.
We can't keep wasting processing power; the "laws" are running out of steam. MHz/GHz has been stalled for a decade, and die shrinks are going to start running out of steam as well.
I can't imagine the coming AI coding boom will help things either.
We also lack stability and sharing in the runtime libraries (and containers are NOT helping this, they just load all their own mini-OSes for each application), and efficient means of extracting only the necessary routines from big libraries in build processes.
I don't think it's about the amount or the language. I have 16 GB of memory, and I don't expect it to decrease. I don't expect the number of cores and MHz to go down either.
I'm just saying that everything should be written to take what you give it and like it. If you do that, it doesn't matter how much memory there is, or if there is a GC or not - you have none of the problems of unpredictable resource usage.
I don't think (although I haven't been in school in over a decade) most people come out of college with the experience of writing small programs that process everything in chunks. It's so... impure. I didn't have the experience of being taught in Scheme or something, but I can imagine.
> write all your programs with the mindset that you are given X amount of memory, and you use it as a cache or buffer like they did back when computers had 64K or whatever
Even better, cache-oblivious data structures & algorithms avoid the need to pick or know X, and will essentially auto-adjust to near-optimal behavior regardless of cache size or pressure.
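A textbook example of that idea is a recursive, cache-oblivious matrix transpose: no cache size appears anywhere in the code, yet each level of recursion works on a smaller block, so it behaves reasonably on whatever cache hierarchy it happens to run on. A minimal sketch of the standard technique:

    #include <stdio.h>

    /* Cache-oblivious out-of-place transpose: recursively split the larger
     * dimension until blocks are tiny. No cache size appears anywhere, yet
     * every recursion level works on a smaller, more cache-friendly block. */

    #define N 512
    static double A[N][N], B[N][N];

    static void transpose(int r0, int r1, int c0, int c1)
    {
        if (r1 - r0 <= 16 && c1 - c0 <= 16) {       /* small base case */
            for (int r = r0; r < r1; r++)
                for (int c = c0; c < c1; c++)
                    B[c][r] = A[r][c];
        } else if (r1 - r0 >= c1 - c0) {            /* split rows      */
            int rm = (r0 + r1) / 2;
            transpose(r0, rm, c0, c1);
            transpose(rm, r1, c0, c1);
        } else {                                    /* split columns   */
            int cm = (c0 + c1) / 2;
            transpose(r0, r1, c0, cm);
            transpose(r0, r1, cm, c1);
        }
    }

    int main(void)
    {
        for (int r = 0; r < N; r++)
            for (int c = 0; c < N; c++)
                A[r][c] = r * N + c;
        transpose(0, N, 0, N);
        printf("B[3][7] = %.0f (should equal A[7][3] = %.0f)\n", B[3][7], A[7][3]);
        return 0;
    }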
Windows Terminal seems to be typical of modern agile software: they do the MVP, then try to ladle on extra features, which are buggy.
As an example, the tab handling is a mess: e.g. you can't configure tabs so that title updates via escape codes are blocked and the close "x" is removed, even though the settings are there; the tab title just reverts to the profile name.
I suspect it would have been fine if they actually designed all the features from the start vs grafting things on ad-hoc.
I feel like the person writing this article has never worked in a professional environment: getting it right is hard, and you always have pressure to meet a deadline (for obvious reasons). But if you don't have a deadline, you can also end up in analysis/development paralysis, where you don't spend your time on the important stuff.
A lot of the projects that person is talking about ended up being pretty good a year after their launch: Cyberpunk 2077 is a pretty good open world game (if not the messiah some people got hyped for), Apple Maps is not bad these days (especially in California), the iPhone 4 is now considered pretty revolutionary, hell even Windows Vista was not that bad after all the patches!
There's a difference between "having bugs that we didn't catch" and "lets not look since our users will find them".
Also, I'm sure they are, which is why lots of companies will provide an optional beta version of their software that people can use. But there's a big difference between choosing to use beta software because you want to try it, and being forced into running a beta and acting as the QA team.
It’s typically in the eye of the beholder what “finished” means. With stuff like Cyberpunk 2077 I’m not at all fussed by there being bugs in a complicated video game. But most of the complaining about the game was feature requests. Maybe those elements make you like the thing less, but it’s not necessarily incomplete. You’re rarely gonna ship something perfect.
At some point with something like Apple Maps there were always going to be edge conditions where it would be bad; the problem is just too vast. And at some point you have to pull the trigger so that the developers stop trying to mitigate whatever they can think of and start fixing problems that users actually have. Those two things are never the same thing (which is really the trouble that things like "minimum viable product" and "YAGNI" are trying to get at). At some point you need to draw a line in the sand and decide that it is good enough to ship; then you generally get blindsided by a bunch of issues you never dreamed of, and all your priorities shift.
Isn't that a classic case of survivorship bias? Maybe their early release allowed these products to become fully baked much quicker due to shorter feedback cycles?
I just don't see how this is a big deal, if you have a high bar for quality then don't be an early adopter / buy something on the day it is released. You can always wait and see how something develops, the outcome is no different to you but the company benefits significantly by having feedback from real users for that entire time.
I see a problem with an industry that happily releases junk on the premise that it can always be fixed later. That's better than not fixing it, but junk was still sold as if it were actually a finished product.
The problem is the bar for quality has fallen through the floor and it has nothing to do with the amount of time it takes to develop something. If anything, it's likely to be of lesser quality the longer you wait because more web developers will have infused their shitty code by then.
This article has stated very clearly and succinctly something that a lot of knowledgeable people have been screaming about for a long time. Excellent work.
Unfortunately, I don't see a solution to it anytime soon, short of everything just falling apart.
If you look at the other side of the coin, you will see that we do have a bunch of tech that is very good these days, in far greater quantity than the failures.
IMO these types of failures are simply a percentage of the overall tech, and the reason they seem prominent is because there is just a huge race in tech over the past decade.
If you actually want to fix it and reduce that percentage, push to replace engineers with computers. A lot of engineering problems are search problems. Computers are better at search than humans.
> have a bunch of tech that is very good these days, in far greater quantity than the failures.
We have moved to products just being "good enough".
An example is telco equipment: phones always just worked in the past; you would never expect an outage. Ask most people who had office or home phones up until the 90s and most would say they never had a single instance of their phone not working.
Then we went to VoIP, and now softphones like MS Teams, and reliability is shit.
It's embarrassing that products like MS Teams require multiple retry attempts each week just to make a simple phone call.
I'm thinking mostly about software here, not tech in general. And I'm not saying that some things aren't better. But a lot of things are worse, and the trend worries me.
> We are allowing too much power to be concentrated in the hands of a few people, whose ultimate goal is this relentless pursuit of money, not a human project for society.
Swap "power" and "money" and I agree.
> We are allowing too much MONEY to be concentrated in the hands of a few people, whose ultimate goal is this relentless pursuit of POWER
First step: stop giving them your money. Second: take away their illegal or otherwise ill-gotten gains.
Do you have many examples of people who have lost significant power and stayed as rich as before? I'm sure there are some, but think it rather rare: usually, loss of power is accompanied by loss of money.
> But if they have the money, how could they lose the power in the first place? If money=power it should be impossible.
I don't see why this should be true. If money=power, surely that has nothing to do with whether you can lose it or not? Generally, when you have something, you can lose it sometimes, no?
Well, yes, in reality I completely agree. But if money is power, then someone with money can't lose power (otherwise money != power). So there have to be more variables to power than just money.
> Apple, known for supposedly delivering sleek products, launched the iPhone 4 in 2010, still under the tutelage of Steve Jobs. Within hours of release, users flooded the internet to complain about dropped connections and abrupt reductions in internet speed. The problem, according to Jobs, was with users holding the iPhone 4 with their left hand [8]. As this was a hardware issue, it was never really solved.
They did fix it with later versions of the iPhone 4 and with the iPhone 4S. Also, I remember that they made the "bumper" cases to alleviate the issue with the antenna: https://apple.fandom.com/wiki/IPhone_4_Bumper
I still peer pressure friends into putting cases on their phones. If you were using an iPhone 4 without a case you're just a madlad.
It was never a question if you needed a case, but if you could find one that wasn't ridiculous. For any phone there were maybe 3 manufacturers I liked, and they kept changing every few years.
You know what’s bullshit? Articles like these. They all mash together a big list of unrelated and cherry-picked problems and pile on complaints and cynicism.
With how long this piece is I honestly feel bad for the author for wasting so much time on the endeavor.
What purpose does an article that’s essentially a gigantic list of every recent corporate scandal serve?
On top of that, toss in the tech geezer’s “back in my day page load sizes were small and now everything is bloated” take we’ve all heard a thousand times. Like, holy shit, please stop: my phone has a 100 Mbps+ cellular connection, 8 GB of RAM, and 512 GB of storage; I don’t care that websites are 7 MB.
There’s nothing actionable about this monstrosity and all it can do is make you upset over a bunch of random shit you can’t control.
The author keeps saying capitalism is at fault but I’d argue the bigger issue is that the world economic system is not capitalist at all. The world economic system is debt based and not capital based. Debt is accumulated as if it were capital instead of saving for future investment. Essentially, the time order is reversed. Instead of sacrificing the present for the future, the future is sacrificed for the present. The entire economic system functions in this manner, and it perverts every signal. It encourages spending now, ROI now, and worry should never happen. Of course, this also means that debts eventually collapse and the system collapse wipes out all but the largest players, concentrating all wealth at the top. This will keep happening as long as the system is debt based.
Counterexample: Raspberry Pi. Little tech still has a future (but everything the author says about overhyped Big Tech outfits sounds pretty accurate).
Personally I think it comes down to an over-financialized economy. Raising taxes on corporations and individuals while providing loopholes in the form of writeoffs for re-investing profits in R & D and manufacturing facility upgrades etc. is one obvious solution.
I don't get Raspberry Pi. The $35 computer that's often out of stock, doesn't include a USB adapter or case, and is rarely found below $70. And the unfortunately complex boot process involving binary blobs. It's not as cheap & open as originally implied.
The price and stock issues are from current supply problems. I have dozens from when you could get them on sale. The lack of a case is so tinkerers can make their own, although there's an official one you can buy now too. Binary blobs are a valid complaint, but machine code will be easily reverse engineered by transformer models in a year or two.
I'm not sure that's a good counterexample? It hasn't been possible to actually buy a Raspberry Pi anywhere in something like a year, though the hype rolls on.
"Too big to fail" means everything fails. In my opinion we are on the precipice of civilization collapse. The reason is that people can no longer think logically and the tools they use are increasingly becoming a hinderence.
The problem crosses multiple domains, not just "Tech."
Basically we have reached a point as a civilization where failure and ineptitude are increasingly not penalized.
We optimized for the short term without caring about the long term (climate, sustainability, equality, prosperity). Now everyone and everything is in debt (money, natural resources, compassion); yeah, I could see it all falling apart.
Falling apart could result in a temporary and tragic setback, e.g. a period of stagnation and hardship, perhaps some world wars and some global population decline, but outright civilizational collapse is still avoidable.
We're living in the Golden Age: they with the gold (money actually) make all the rules.
As the article mentions, the emergence of quasi-monopolies constrains decisions at all levels in service of further maximizing financial gains. The 99.9% are mere pawns in this game.
Monopolies are the norm, I believe. The 20th century in the United States (from the Sherman Antitrust Act, 1890, to the changes in the Reagan era, 1984-ish) is an exceptional period.
For an example of what good software without bloat looks like, check out the https://nirsoft.net website. Small, absolutely wonderful working tools made by a single dev. It's the epitome of craftsmanship.
PS: Also the Sysinternals tools made by Mark Russinovich.
But I think the author really just means software failures. The truth is that we make software to just barely work, knowing we can fix it later. But apparently the author somehow hasn't been exposed to software ever.
Further, I think the author is angry and knows something isn't right in the world and is railing on the software industry. I think (hope) that's misplaced anger, but the author is probably right to be angry about . . . something.
This is an example of confirmation bias. It overlooks all the examples in which technology works and makes improvements to people's lives. Things like airline safety and medicine have made big advances.
But I think some of the bullshit is superfluous features that do not do anything or seem redundant. Why is there a warning if you try to turn the volume up too loud on Windows? So annoying. Or McAfee being preinstalled and then annoying popups (bloatware). In that sense, I agree with the author.
I find that I generally agree with the sentiment but putting Apple in this list is a little preposterous. Yes Apple Maps was terrible, but with continuous improvements, at the moment it is a good if not the best product. Outside of that, they have had very few bad products that they released (yes including the $1000 stand) in the last decade.
The butterfly keyboard was pretty terrible. They continued selling the model for years before finally removing it. Customers didn’t get any recourse until the recent lawsuit, but that only applies to certain US states. Apple still denies any wrongdoing.
Cathartic venting for some, perhaps, but not a very insightful piece. Rehashes various mostly wrong examples and comes to a predictably lame conclusion about “capitalism”.
I don't quite know what the author's conclusion is here. Capitalism is bad? Part of the problem in my mind is that consumers don't actually punish these behaviors. For example, Cyberpunk 2077 is massively profitable even though it was so shitty at launch. The same is true of many of the other examples in the list. So if companies are going to still make tons of money by cutting more and more corners, why wouldn't they?
Well, then maybe the problem is that, from an economic point of view, it is unsustainable to make decent software. However, when we analyze the increase in productivity compared to the increase in wages, we see that, for decades, the former has grown at a much faster pace than the latter. Currently, this gap is almost 50%.
Some of this can be explained by employer sponsored healthcare and other benefits for employees. Also, employees can capture this divergence by investing wages in index funds like the S&P 500. The divergence, which began in the early 80s, also tracks the start of the huge bull market. Profits that would go to employees are instead reflected in rapidly appreciating share prices and dividends. A lot more people today have IRAs compared to 40 years ago, representing significant wealth.
> Some of this can be explained by employer sponsored healthcare and other benefits for employees.
I don't see how. Those things aren't new, and aren't better than they used to be. Their impact applies to before "the divergence" at least as much as it applies now.
Technically yes, the working class can become capitalists by just saving their money. Practically, I don’t think I need to explain why that’s a dishonest and rude suggestion, at best. Just because anyone can become a shareholder in a company that squeezes its employees dry for profit doesn’t mean that that company isn’t a weapon in the class war.
From Wikipedia: Federal Reserve data indicates that as of Q4 2021, the top 1% of households in the United States held 32.3% of the country's wealth, while the bottom 50% held 2.6%.
Bezos increased his wealth by $100 billion over the last decade. If all that money was evenly distributed among Amazon's 1,500,000 employees instead, they would get like $6k more every year. That might be a nice bonus to the average Amazon employee making $30,000 a year, but it would hardly be revolutionary.
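(The back-of-the-envelope arithmetic behind that figure, taking the comment's own numbers at face value: $100,000,000,000 ÷ (1,500,000 employees × 10 years) ≈ $6,667 per employee per year, consistent with the "like $6k" estimate.)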
I don't think 1% of people holding 32% of the wealth is that surprising or exceptional. A lot of people are in the bottom 50% because they don't save any money at all. I think these stats give people the idea that if you were to somehow take all the wealth from the top 1% and give it to the bottom 50% that this would vastly improve their situation, when in reality it wouldn't make much of a difference and would fuck up economic incentive structures.
Fixed pie fallacy of wealth strikes again. Amazing how common this view of x% of people holding 'the wealth' is. Nothing similar existed before Jeff Bezos, Bill Gates, Mark Zuckerberg and the two Steves built their offerings; they got massively rich thanks to people worldwide choosing to pay them for their products & services (or voluntarily abandoning the open internet, with its multiple anonymous fora like this one, for Facebook's privacy-screwing walled garden).
> After the fiasco that was the release of Cyberpunk 2077 — a game estimated to cost more than 300 million dollars, including the performance of actor Keanu Reeves —, many people began to question how far the so-called late stage capitalism can go.
Really? Releasing a multi-year software project worked on by many, many people that was buggy but still more or less served its purpose is someone's idea of capitalism teetering on the precipice? If we truly believe "bad results for consumers" means capitalism itself is reaching its end stages, an earlier stage of capitalism featured people just brazenly selling pills with tapeworm eggs as a weight loss supplement and no real regulatory apparatus equipped to stop that, so what was that? "Late-stage capitalism" is becoming one of those buzzwords that doesn't mean much of anything.
I had to start over from the beginning 3 times due to completely main-plot-breaking bugs, and this was on versions 1.4+. I still had a side quest broken.
That is not "more or less" serving its purpose.
I played the game from beginning to end without any major incidents on launch. But even then it was eventually fixed and I still don't think this is a harbinger of the collapse of global capitalism.
Yeah, I think that is true as well. I had just upgraded my PC for the first time in many years and was looking for something to showcase what the hardware could do when I picked it up. All the bugs I encountered were pretty minor.
Also, more for future readers: I've heard the PC version has community mods that fix most of the worst bugs! Shame they couldn't be contributed back to CDPR's main branch.
> Capitalism has never put so much money in so few hands as it does nowadays, and it is just getting worse every year. There is less real competition in the market, as companies are buying each other, centralizing power.
I assume the author is talking about the US.
Genuine question: How does the software industry look in more non-capitalist countries? Is it flourishing without these issues. I bet their equivalent of Apple Maps was perfect on first release!
The software industry has issues, but I am sure the author has a causal link from capitalism to these issues coming in a second edition of this post that is thoroughly researched!
Yep, but that is all capitalism's fault! And we should change over to another economic system where their equivalent didn't even have data issues on day one.
It is hilarious, seeing as capitalism is firmly expected to meet the ever-changing and increasing needs of the luddites who sit opposite it. But capitalists can't make the same demands of capitalism's enemies.
Just hilarious to me to lay the blame for 'bad' software at the feet of capitalism when one of the 'purest' capitalist countries, the US, is so dominant in this sector worldwide.
Of the 100 largest tech companies, 62 are from the US. Next-highest China has 9. [0] Where's all the great software developed by non-capitalist economies?
Feel free to check out some rankings of this stuff by people who have thought about it methodically rather than just cherry-picking examples. The US is ranked 6th and 20th in the world by the Fraser Institute and the Heritage Foundation, respectively.
Of all the things we are bringing back from the past, why does communist propaganda have to be one of them? Why do people (smart people!) keep making almost exactly the same mistakes?
Modern socialists have clearly studied history, but how are they able to interpret it like this?
What if the only viable futures are communism or collapse?
Consider: as capitalism grows in complexity, the grasp of the invisible hand grows weaker, until it loses its grip.
Does the world really need a new iPhone every year when people can hardly afford rent?
Does making a monopoly-aspiring dog-walking app (Wag) to enrich a few sophomoric founders, or defraud a few more gullible investors, really constitute efficient rationing of scarce resources?
What if we are in the last stages of a faltering system?
Ah yes, communism. This time we will surely get it right because of our arrogance and hubris combined with total cluelessness about human nature and how spectacularly communism and its sibling socialism failed everywhere they were tried (And no, Scandinavian countries aren't 'socialist' just because of running a welfare program that happens to be funded by much heavier taxes on all their citizens than Americans could imagine)
"In the article Software Disenchantment, author Nikita Prokopov exposes how bad software has become commonplace in our daily lives [35]. From programs that get incredibly bigger and slower with each new version, offering little or nothing more in return, to the fact that they get worse as they are used, to the most complete chaos that is the software development process.
For example, when creating an empty (!) project in npm for web development more than 50,000 files are downloaded, taking about 350 MB. This house of cards is so fragile that it gives rise to situations that are, to say the least, embarrassing. In 2016, an npm library called Left-pad, downloaded about 2 million times a month, was removed from the repository by its author, resulting in many web projects around the world crashing, including large corporations, forcing npm to revert the deletion [36]. The library, which contained 10 trivial lines, could be implemented in a few minutes even by programming students."
Some, but not all, developers get defensive when the issue of "bloat" is raised.
https://news.ycombinator.com/item?id=34842863