An alternative cause for the Great Stagnation: the cargo cult company (shyamsankar.com)
110 points by galenmarchetti 9 months ago | 178 comments



The Costco screenshot is really interesting. The terminal apps of old were truly works of art and _incredibly_ fast. A non-technical worker wouldn't take long to understand the system and the keys/shortcuts to do something quickly. I remember having to sit with a few folks when we were looking at modernizing an app, watching them process a record or something with a few keystrokes and thinking "we'll never match this speed".

Web or even desktop apps these days really pale in comparison.


I edit text for a living. I've recently moved from a content management system to markdown files on my computer. It's insane how fast the latter feels. Everything just works instantly, and the software is built for power users.

I wonder how much productivity is lost looking at spinners on single-page apps, and clicking around web interfaces instead of modifying files.

I really feel that when I do my accounting. These 1-3 second delays between pages add up to a lot.


I'm an Asciidoc man myself, but yes, yes, yes and yes. I say that as someone who gets their paycheck because of these overblown CMS/CCMS[1] turds. At the moment I'm supporting a gigantic PTC integration, but virtually every single piece of functionality we're trying to establish could be done in Asciidoc and Visual Studio Code in seconds... and these are things that - if they ever work - will take Windchill entire days to perform, and never, ever accurately.

But the writers and managers want clicky buttons and drop-down menus, so here we all are. The article definitely struck a chord with me.

[1] Yes! Asciidoc is quite capable of acting like a CCMS... if that's something you genuinely need. Component Content is another culty idea, something that really shouldn't be used for very much - if anything - but management acts like it's the second coming. And the effups it can (and probably will) introduce are colossal.


My favourite part is doing a find/replace that would have been impossible before, and fixing dozens of pages in one go. Then I can triple check everything before I commit and push. Oh and it works offline on a really old MacBook.

Yesterday I was testing generating audio files for glossary pages. It took an hour, because it's just another function called by the static site generator: "For every word in this context, call the text-to-speech API". So elegant!
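Roughly, the shape of that hook (a toy sketch only - the glossary format, output paths, and the TTS call are stand-ins, not the actual generator or API):

    # Toy sketch of a "generate audio for every glossary term" build step.
    # The glossary format, output paths, and synthesize_speech() are hypothetical
    # stand-ins, not the actual site generator or TTS API described above.
    from pathlib import Path
    import json

    def synthesize_speech(text: str) -> bytes:
        """Placeholder for whatever text-to-speech API the build step calls."""
        raise NotImplementedError

    def build_glossary_audio(glossary_file: str, out_dir: str) -> None:
        out = Path(out_dir)
        out.mkdir(parents=True, exist_ok=True)
        entries = json.loads(Path(glossary_file).read_text())  # e.g. {"term": "definition", ...}
        for term in entries:                                    # "for every word in this context..."
            audio = synthesize_speech(term)                     # "...call the text-to-speech API"
            (out / f"{term}.mp3").write_bytes(audio)            # static asset picked up by the build

    # build_glossary_audio("glossary.json", "static/audio")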


> I really feel that when I do my accounting. These 1-3 second delays between pages add up to a lot.

It's funny you should mention that. The year before last was the first year I decided to itemize every single expenditure. I did it using QuickBooks, and it was a nightmare. It only took one time to decide never to do that again.

This year I itemized everything using beancount in emacs, and it's incredible how much easier it is to handle everything. Even having to download each bank period as a CSV and run it through a python script, I've just had a better time doing it and am able to more flexibly handle and analyze my finances.
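For the curious, that kind of CSV pass is tiny. Something like this, roughly (a toy sketch - the column names, account names, and date format are assumptions, not my actual bank's export or script):

    # Toy sketch of a bank-CSV-to-Beancount pass. The column names ("Date",
    # "Description", "Amount") and the account names are assumptions about the
    # export, not the actual script or bank format mentioned above.
    import csv

    def csv_to_beancount(csv_path: str, account: str = "Assets:Checking") -> str:
        entries = []
        with open(csv_path, newline="") as f:
            for row in csv.DictReader(f):
                amount = float(row["Amount"])  # assumes ISO dates and plain decimal amounts
                entries.append(
                    f'{row["Date"]} * "{row["Description"]}"\n'
                    f"  {account}              {amount:9.2f} USD\n"
                    f"  Expenses:Uncategorized {-amount:9.2f} USD\n"
                )
        return "\n".join(entries)

    # print(csv_to_beancount("checking-2023-01.csv"))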

I have yet to find out whether I can easily work with an accountant using OFX files exported from this setup, though.


I match business transactions to invoices every month. Then I invoice half a dozen clients.

This involves a lot of clicking around and a lot of network delays. They recently improved the UI but it's still slower than it should be.

The older I get, the more I understand graybeards and their command line workflows. Once it's committed to muscle memory, it's like a sharp knife in the hands of a chef.


One man's "lost productivity" is another's salary. I agree though, so many things have interactive interfaces when other modes of interaction would be preferable, not to mention that describing UI operations typically requires screenshots to understand what is going on.


> One man's "lost productivity" is another's salary

If that’s how we’re spinning it, forget the spinners and text editors altogether. Bring back the quill pen; surely it will restore many salaries!


I don't think quill pen manufacturing comes close to the jobs provided by modern software dev. One pen can do so many tasks!


Indeed, it adds up over the long term and across people, so it scales horizontally too: many human-hours gone, evaporated into the UI/UX abyss.


Have you used or considered an eink display?


I have a kindle and love it, but wouldn’t an eink display be too slow?


Why?


You said you edit text for a living, so I was curious if you ever tried one.

They're easier on the eyes. I've been looking to buy one but don't know anyone who has tried one firsthand.



I just wanted opinions from folks in this community, that’s all…


There have been systematic studies of this. UI productivity for people who have had time to learn the software peaked in the mid-1980s.

So, the industry traded being able to walk up and use a thing in the first 30 seconds for having it be painfully bad after the first 30 minutes.

People like to say, “oh, but power users are rare”. That’s only true if you count users that bounce after 29 minutes the same as users that use the software more than once.


You're zeroing in on the actual answer to "why did we replace these awesome high-speed text interfaces with slow GUIs?" Businesses aren't stupid or irrational: they switched because they moved to an employment model based on a disposable labor force, where workers wouldn't be expected to stick around long enough for training to be worth it, so it was more important to make an interface that was good enough for the endless revolving door of untrained users.


Yes and: then that approach got cargo culted into places where it doesn't work because e.g. the domain is too gnarly, because management heard that glanceable interfaces with high discoverability were the new industry best practice...


> People like to say, “oh, but power users are rare”

Huh, didn’t consider that power users being rare meant the software makes becoming a power user difficult.


Is it just 30 minutes? Isn't this just like the "bows are better than arquebuses" thing where the advantage of easy training and replaceability beats individual quality?


When someone uses a tool for several hours every day for years, you can afford quite a bit of training.

When you need to make a bunch of peasants into soldiers on short notice, the equation is different.


AI UIs can fix this. If you know what you want, you could just say it without learning anything about the doing part.

But it needs to be rock-solid or it'll become a glorified unreliable voice control.


commoditizing skill


Part of this is that Costco, unlike most retail outlets, treats their employees well and tries really hard to keep them around. For example, notice that the Costco nametags always have an "Employee since" on them, and in most of the stores I've been in you will even see some with the 1990s on their tags, and lots of people with 10+ years at Costco. They aren't doing the standard retail employee churn - typical retail stores have 100% turnover per year.

So Costco is willing to accept a slower-ramp up period in exchange for longer term speed and efficiency, whereas companies which churn their employees care far more about initial discoverability and getting someone to be moderately capable very quickly.

My wife, a decade ago, worked through the transition of her pharmacy from a local retail chain to one of the national chains that purchased it. The local chain, which was a reasonably pleasant place to work, had a computer system that was an ancient DOS app like the one Costco uses. When my wife would get a call from another employee on how to do something unusual in the system, it was lots of "now hit tab three times, then Alt-F10 and select the glorbaldall from the drop-down, and then two more tabs and then ctrl-shift-S and enter the frazzlematazz". The new company was hellish towards its employees and had a slick new web interface with lots of menu options, so you could eventually find what you needed without the keyboard, but you could never get to be as fast as my wife was on the older system.


We had some lightweight terminal POS when I worked at a grocery store as a teen. We were tracked on average items/minute we scanned. I was fast with those and could ring through a cart of produce quickly.

The stupid self-checkout things... clunky, inaccurate, awful touchscreen calibrations, flag needlessly for "help" automatically. Whoever wrote those systems should feel bad for what they produced. I can't imagine that in the long term, they've helped the store save money or save anyone time.

As a young kid, I learned to use an AS/400 system and got fast at it quickly. I did a ton of data entry for my father's company. It felt so accurate and fast in comparison to the UI garbage today.


Self checkout is an interesting study. On the one hand, you might get by with smaller labor costs because you need fewer cashiers. But the costs are less easy to quantify.

First, you need more POS stations, because customers are not as fast as trained/experienced cashiers/baggers. So you need more machines to serve the same volume. That cost might be worth the lower labor costs.

But those extra stations also consume space. Picture a Walmart that dedicates one entire edge of each store to a row of checkout lines. It's easy to imagine that 10% of their public floor space is devoted to that function. If it takes 50% longer to self-checkout, you might need 50% more floor space to handle the same volume of purchases. That's not completely accurate because stores don't convert all their checkout lines. But you get the point.

I have noticed that most places compensate for the space requirements by drastically compressing the space for a station. No more conveyor belts. No more funneling customers through a gauntlet of magazines and chewing gum. They might replace 4 checkout registers with 10 self-serve stations. In many ways this can be better for the customer but worse for the business. There's a lot of money to be made in all that merchandise immediately surrounding checkout stands. Even the endcaps on the aisles closest to the checkout lines are worth special attention for maximizing sales. Losing this hurts the business.

I'd be curious to see the results of rigorous studies. In the meantime, as a customer, I like the self checkout stations.

I'd be super curious to see things completely flipped on their head. I'd like to see a grocery store where you walk through aisles that have only one item per SKU on the shelves and you point your phone at the qr code to "buy" one. The store app lets you indicate how many of that item you want. In the back room you have robots feeding your items up to a boxing station. In the time it takes you to walk to your car after pushing "Finalize and Pay" your order is waiting at the will call loading dock as you drive through.

No shoplifting losses, dramatically lower security costs, better and cheaper inventory management... A long list of benefits. And of course an offsetting list of increased costs. For example: smaller sales floor requirements, huge loading dock requirements.


> I'd like to see a grocery store where you walk through aisles that have only one item per SKU on the shelves and you point your phone at the qr code to "buy" one. The store app lets you indicate how many of that item you want. In the back room you have robots feeding your items up to a boxing station.

In the US, there used to be a store like this back in the 80s-90s called Service Merchandise


My local Walmart took out 4 lanes that they couldn't staff anyway and put in 8 self checkouts. There is no reduction in staffed lanes and effectively no additional use of floor space.


If the system was good enough for that I assume there would be no need to walk the aisles physically. Just browse the high quality app and either delivery or arrange for pickup.

I mean, this is HN, why do people still go to the grocery store instead of using a delivery app? Price? Browsing an app is slow? Item availability? Time? The app sucks? Orders always come wrong?


How big is that pizza IRL, not just in a photo? What color is the orange juice? Which brand of bacon appears to have a higher meat-to-fat ratio?

The grocery store offers an old fashioned produce section so I can hand pick a pear that isn't bruised by a robot.

And all the reasons you list.

Part of the beauty, though, is that it becomes obvious to offer both. Developing the in-store system puts you 90% of the way to having shop-from-anywhere abilities. Developing shop-from-anywhere puts you 85% of the way to having in-store shopping without shopping carts.


Change of scenery. Run into the neighbors. See what's available IRL but not on-line (daily specials, etc.). Exercise. Fresh air --for various levels of "fresh". See what's new on the high street.


> I can't imagine that in the long term, they've helped the store save money or save anyone time.

They weren't supposed to. They were supposed to divert money away from checkout staff and towards the company making and servicing the terminals.

(Is my cynical streak showing? ;) )


Plus, at least the self-checkout software that Walmart uses runs in Java, on Windows, on underpowered hardware, yet still has to interface with the old POS system that's running inside of it. I haven't worked at a Walmart in a few years, but when I did, the self-checkout UI system would sometimes get out of sync with the POS, due to not accounting for all possible edge-case states the POS system can get into. In order to get the UI system out of such a state, a self-checkout employee would need to hope that the system wasn't too stuck to be able to log in with employee credentials, then scan a bar code to log in, then enter a UI mode that emulates the keyboard-and-LCD-driven POS system, then press ESC or something to get out of the unaccounted-for edge-case state. And this is all assuming the self-checkout employee has any knowledge whatsoever of the keyboard-and-LCD-driven POS system that's inside the self-checkout machine, which increasingly few employees have, because they've been using these UI wrappers for so long now. But when I first worked as a Walmart cashier in the late 00s/early 10s, nobody really had any trouble working the keyboard-and-LCD-driven POS systems—pretty much every cashier, whether teen or elderly, picked up how to use them fairly easily.

Modern self-checkout machines and touchscreen POSes are perfect examples of heaping layer upon layer of abstraction on top of a simple underlying system that really didn't need all those abstractions, but they sure look fancier, and apparently that's all that matters, productivity be damned.


Text-driven interfaces can be amazing. I worked at an auto parts store right out of high school. Our terminals for looking up parts were entirely text-based, and once you knew the commands and make/model/category shortcuts you could easily issue commands faster than the terminal could render the results. It was so efficient and productive. I was in a parts store last month, and their catalog software was GUI- and mouse-driven and just slow and un-ergonomic. Sure, you get images to accompany the parts list, but I'm not sure the tradeoff was worth it or even necessary. I want to believe fast, uncluttered CLIs can make a comeback.*

*That being said, I just realized I sound like the professor in my AutoCAD class who insisted that the way he learned AutoCAD (all keyboard commands) was better as we all happily clicked around.


Yes, we've turned terminal/pc based apps that accomplished focused tasks in a very efficient manner, without a mouse, into 14 clicks on a web ui, each click having a 1-10s request/response loop.


But it's not always true that it's more efficient. At the biggest electronics chains in Norway they have this ancient terminal setup as well. And as a customer I hate it. It takes forever for them to do any little thing. Sometimes they have to ask for help. Buying a phone or something you can wait 5 minutes while they type furiously.


There is a HUGE difference between a user interface designed for professionals who live inside the application their entire working day, pro-sumers who are heavy but occasional users of an app, and consumers who rarely encounter the app or even only use it once.

These all need to be designed and built differently.


But my point is that some of these are expert users, and still a simple action takes them minutes. Just because it's an old-school terminal interface doesn't automatically make it golden.


Ahhh ... so your point is that you can write a horrible system in any format?

Sure, of course you can.


No, that's not my point. My point is that just because it's an old-school terminal text UI doesn't automatically make it a good expert system. It just has the look of one.


Buuuut it’s running on microservices now!


What do you expect when modern UX is about engagement, not efficient transactions?


At one consulting gig, I came close to doubling the productivity of an internal operations team by turning a multi-step wizard into a single, condensed page.

Their design consultant whined about how the lack of white space made the interface "unreadable and, hence, unusable" but, when I left, the ops team took me out to a bar for all I could drink and their manager drove me home.


My mom used to work customer service at an insurance company. They had a mainframe app that they logged customer and incident data in. It was all mashing F keys to get to different screens typing in and tabbing between fields. As soon as she memorized which F key brought up which screen, it was easy for her to use and navigate.

Then they "upgraded" to an all-new, all-shiny Java enterprise app. And she was completely at sixes and sevens.

One of the things I learned about the corporate world was this: Management reserves the right to give you conflicting goals which must all be fulfilled as a condition of employment. It is the duty of the line worker to resolve the conflict to the business's satisfaction.

In my mom's case the conflicting goals were "find the time to train on the new app" and "do NOT, under pain of termination, log into the system before hours, after hours, or during lunch or breaks".

It was only a few months after her retirement that the dementia started setting in. I blame that stupid insurance company, their stupid app, and their stupid policies for breaking my mom's brain. She spent the last few years of her career feeling like an incompetent idiot for not being able to do an immiserating job which, for all its warts, she could do just fine before. And once that was over and her pension was on lock, her brain just said "that's it. I'm done."

I don't have empirical evidence to know that this is the case. But it feels that way. And this is why I get assmad every time some BigTechCo completely flips the table and changes the UI on some app of theirs, "just 'cause".

(Fuck Instagram and their little popups: "We've moved/hidden that feature behind another menu. Check out the features we want you to use this month, because fuck what you want to do, we have KPIs to hit!")


Standard applications are optimized for discoverability, familiarity, ease of use and looks. Those are appropriate dimensions to optimize for in many cases, but not for heavily used apps.

Great games have amazing interfaces -- they let you do very complex tasks very quickly. They often take longer to learn than the standard GUI app, but so does the Costco text app, and it's a trade off worth making. Can that trickle down to the 8-hour-a-day type apps like the Costco screenshot?


It did trickle down to things like the Costco screenshot, in the form of the CUA standard. Here’s a nice writeup of the history:

https://www.theregister.com/2024/01/24/rise_and_fall_of_cua/


There is a really interesting idea here for future makers of power-user software: Can we harness peoples' familiarity with games and gaming to make the user interfaces of the future? Everything is trying to look like Microsoft word today, but perhaps the design example for the next UIs for professional software should be StarCraft or WoW or Baldur's Gate instead.



In a former life I worked in a job where we used an interface like that. The learning curve was steep for the first couple weeks, but once you got used to it and developed muscle memory, it was soooo fast and just effortless.

They were in the process of replacing it with a web application when I left. Nobody I talked to could explain why.


Just to point out because I'm not sure it's obvious nowadays, but you could reproduce those interfaces perfectly well on both desktop and web applications. You may have a few dozen ms more of latency on key presses, but you can make every other aspect much faster.

And the desktop version would probably take less than half the development time of the original terminal app, although the web version would probably take longer. And it would automatically come with a lot of modern goodies like real databases and networks that don't fail.

People just don't do it.


Looks a lot like CICS, which I spent a fair fraction of the nineties and aughts with. I helped the integrator team trying to get all the functionality into <REDACTED NAME OF GIGANTIC ZILLION DOLLAR ERP>, but the truths of the article held; zillion dollar ERP couldn't support the process. Obviously, the solution was more consultants, but the final judgement was always that we were doing business wrong. I left the company before this resolved, but as it turned out, those who remained did not have long. The company was effectively eliminated in a reorg before the new ERP could be used... nine years after the project started. If it had been implemented, the entire factory floor and maintenance depot would have had to revert to paper... incurring an unthinkable cost in scanning those records back into Zillion Dollar ERP.

It's far, far, far from just ERP either. In publications standards, we see this all the time, and as a proportion of cost/value[1], it's even worse than the ERP disasters.

[1] Of course, that might be because no one really has a way to quantify how much documentation is actually worth.


When I was first a retail pharmacist in 2013, we were on a DOS-based system. It was fantastic. Fast, fully operable by keyboard only, etc... All the new systems are slow, require mice, scanners, fingerprint readers, etc...


About 20 years ago I worked for a bank that processed checks (checks would come in from businesses and they would be sorted and reconciled here). The check processing machine was a 26-bay IBM check sorter/document reader (the IBM 2956). The machine was nearly 30 years old at that point, yet it was new to the company, cost $250k, and needed weekly maintenance from an IBM tech. They retired that machine finally 6 years ago.


Incredibly fast on hardware that would be too slow to run the slow web/desktop apps of today.

Today, "modern" means slow, with a large attack surface; often broken on arrival and thus requiring constant "updates".


Yes, it's like the difference between adding a task in Jira vs in some todo.txt file.

Slowness of the interface aside, you have to go through the whole ceremony just to jot down a simple "don't forget this" note.


At Goldman Sachs, on every trading and quant desktop you will see a blue Turbo Pascal ANSI graphics display. That display is the most powerful risk management system ever built, SecDB. It’s been derided for its lack of GUI elements for nearly 30 years. However, behind the blue screen is one of the most advanced computing systems around: an instantaneous deployment and rollback, wave-front monorepo system built around a topologically evaluated, memoizing graph of evaluation that provides a high-level abstraction of all data and models ever created at Goldman, seamlessly stitched together into a total view of everything in the financial world - all stocks, bonds, derivatives, loans, options, airplane parts receivable contracts.
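For anyone wondering what a "memoizing graph of evaluation" means in miniature, here is a toy sketch (it has nothing to do with SecDB's actual internals, just the general idea of nodes that cache values and recompute only along the invalidated path):

    # Toy dependency graph with memoization: each node caches its value and is
    # re-evaluated only if one of its inputs has been invalidated. Purely an
    # illustration of the general idea, not anything like SecDB's actual design.
    class Node:
        def __init__(self, fn, *inputs):
            self.fn = fn
            self.inputs = inputs
            self.dependents = []
            self._cache = None
            self._dirty = True
            for node in inputs:
                node.dependents.append(self)

        def set(self, value):
            """Overwrite a leaf node (raw market data, trade terms, ...)."""
            self.fn = lambda: value
            self._invalidate()

        def _invalidate(self):
            if not self._dirty:              # propagate the dirty "wavefront" upward
                self._dirty = True
                for d in self.dependents:
                    d._invalidate()

        def value(self):
            if self._dirty:                  # recompute only what actually changed
                self._cache = self.fn(*(n.value() for n in self.inputs))
                self._dirty = False
            return self._cache

    # spot = Node(lambda: 100.0); qty = Node(lambda: 5)
    # position = Node(lambda s, q: s * q, spot, qty)
    # position.value()                  # 500.0
    # spot.set(101.0); position.value() # 505.0 -- only the affected path re-ran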

People who judge software by the chrome and how recently 1.0 was released are the epitome of fools.

Anyone who thinks the world hasn’t progressed since the 1970’s and they personally have the secret for success are either delusional or grifting you.


I had never heard of it—unsurprisingly—but wow: "The SecDb ecosystem is composed of over 10,000 databases supporting over 2.5 billion connections, receiving 164 TB of messages, and serving around 8 PB of data."[1]

[1] https://developer.gs.com/blog/posts/secdb-observability-jour...


> The problem isn’t that enterprise software doesn’t work - it is that you didn’t make it work for you. The only measure that counts, hidden behind all those layers of abstraction, is the final outcome - decision advantage. This is often viewed as an implementation detail (the second 80%), and implementation is often outsourced to experts (COVID should have also taught us all something about “experts“). The industry of consultants who themselves have never run businesses but tell cargo cult companies how to execute implementation further obfuscates the lack of productivity gains.

This article is all over the place. It's written in a very confident and attention-grabbing style but lacks clarity and concreteness - especially so for an article criticizing abstraction!

The quote above is the closest I could find to an interesting thesis. I would restate the argument as:

1. SaaS companies optimize for revenue

2. The buyers of SaaS are undiscerning and will buy convoluted SaaS products even if the software doesn't actually improve worker productivity

3. Therefore economic growth has slowed because so much talent goes into SaaS which builds products that no one needs.

It's an interesting argument and I'm sure there's some truth to it. But surely the article would be better if it focused on supplying evidence for the incentive mismatch in #2 and showing that the magnitude is large enough to cause #3, rather than throwing out ambiguous and condescending claims about Covid, consultants, cargo cults. The core concept of "decision advantage" is not even defined, and seems to be borrowed from military terminology.


interesting idea + overblown expression = advertisement


The core premise of this article is convincing: A lot of our software solutions are running to stand still. Occasionally I'll see a video from long, long ago (e.g. https://www.youtube.com/watch?v=ED3jbNCCjok from an episode aired in 1983) and I'll marvel at how little we've actually changed.

Having said that, it would be awesome if everyone stopped cargo culting on the backs of cargo cult analogies. The number of blog entries and articles describing South Pacific cargo cults, and using it to describe any and all follow-the-leader, copy-paste behaviours is...well it's painfully ironic.


> it's painfully ironic

It was successful in drawing my attention initially, but by focusing on that so much it leaves me thinking about how the article arguably isn’t describing a cargo cult. It ultimately comes across as slightly clunky sales copy for Palantir.


You're describing bandwagoning. Cargo cult is misattribution of cause, not follow the leader.


Bandwagoning and cargo cult behavior are largely synonymous in practice. Most cases of the former could be described in a trite blog piece by talking about some South Pacific tribe during WW2.

Everyone getting into AI? Well, you see there was a tribe in the South Pacific that saw planes bring supplies... People adopting agile methodologies? Well the planes would bring supplies and they thought if they just copied the behaviours...

People (and people in the aggregate, in the form of orgs) try to copy what they see the "winners" doing. In this case, an uncountable number of viral blog posts have somehow used some South Pacific tribe as the basis for some profound lesson, exponentially yielding another crop of bloggers talking about cargo cults, rinse and repeat.


> well it's painfully ironic

How's it ironic?


Seems the words "rent seeking" should appear more in this discussion. "B2B SaaS" companies may well be cargo cults; but the blessing they're dancing to secure, the goal they seek, is "passive income".


One of the things that made us dominate our competitors in the time I've been here was that our boss switched to a SaaS business model early, while our competitors held on to the "buy a new version every 2-3 years" model.

Accounting really loved the predictable expense, and it was easy for us to scale price with activity, so our customers paid less during slow months or as user count fluctuated.

We even had a competitor which was about our size nearly go bankrupt, as their customers weren't interested in the .Net rewrite they had spent 2 years on, opting to keep the old version rather than buying the new, but functionally very similar, version.

So it's not clear cut IMHO.


Their heroes are places like Oracle, and now Uber: orgs that get to just the right level of dependency where people and businesses just sigh and accept whatever's necessary.

Eventually, some country is going to realize a piece of software is essentially a utility, and that letting some private company keep a sector of the economy in its grasp prevents progress, and it'll be nationalized.


So China. If you're right it will be interesting to watch. Interesting to see how other countries respond.


China comes at this from the wrong angle.

Everything in China was already nationalized, and they've been trying to grant some, mostly fictitious, freedom.

We're talking about socialism and European compromises.

It's really weird that you think we're talking about China.


I thought so because, as you say, China has nationalized banks and the airline industry — because it views them as important components of the nation.

I assumed we were not talking about the U.S. (and to a lesser degree European countries) since nationalization of the private industry is anathema to these countries. To be sure though I know nothing about nationalization sentiments in Europe these days.


You are confused; they didn't "nationalize" banks.

They were never independent to begin with. You have a naive comparison and are forcing it uncritically into this conversation.


Damn that private sector for inventing things that create progress. Let's pretend it's actually stopping progress when it creates progress.


>> Damn that private sector for inventing things that create progress. Let's pretend it's actually stopping progress when it creates progress.

The software may be beneficial, but the siphoning off of "profit" only benefits the company owners. It's friction. That's not to say it's bad.

I once had a conversation with a reasonably successful investor. I tried to point out that he didn't contribute anything and got lectured on how hard he works to pick good investments. I really had to spell it out - the money comes from a transaction between the company and its customers. The customer gets something and the company gets money. The money pays for the employees who make the things or provide the service or whatever. Profit goes to the owners/investors, but it really does nothing for the people who actually provide something or pay for that something. There was a time when a company issued shares to raise money, and those investors actually did something for the company. Every trader since then is simply trying to profit on what those before them did while offering zero value in the actual transactions making money for the company.

I'm not saying that's bad, it's how the system works today and it seems to work reasonably well. But let's not pretend the "profit" isn't friction between the parties involved in the transactions that lead to that profit.


u seriously arguing oracle as a source of progress? "embrace, extinguish and extort" Oracle?


> u seriously

This is highly ironic.

I was replying to your second paragraph. But on Oracle: I don't like Oracle very much, but you'd be wrong to think that the jump into relational databases on commodity hardware wasn't massively facilitated by Oracle. That we aren't all renting mainframes in the cloud right now is in large part down to Oracle's advances in the last quarter of the 20th century. Oracle is still the world's third-largest tech company. That means what it does, even today, is extremely valuable.


>> Oracle is still the world's third-largest tech company. That means what it does, even today, is extremely valuable.

Not going to refute that, but what is that value really? Database software is readily available for free, so there's actually very little value in just the software right? Is it the stability of the product? Support? Just the big name behind it? Those are probably part of it.


If you're creating a bank in 2024 from scratch, maybe you could now rely on OSS. But you'd probably rather still have a big name that has great expertise in what you want to do across all types of hardware who can come in and make sure things work. And that's from the perspective of not having the cost of changing. If you have to move off Oracle, which provides all these things, you'd have to spend a fortune and still be taking a higher risk.


"Rent-seeking is distinguished in theory from profit-seeking, in which entities seek to extract value by engaging in mutually beneficial transactions.[8] Profit-seeking in this sense is the creation of wealth, while rent-seeking is "profiteering" by using social institutions, such as the power of the state, to redistribute wealth among different groups without creating new wealth."

https://en.wikipedia.org/wiki/Rent-seeking


> but the blessing they're dancing to secure, the goal they seek, is "passive income"

That's not rent seeking.


>> That's not rent seeking.

SaaS companies are quite literally charging rent for the use of software. Presumably that software does something of value, so they call it a "service" so people can feel like they're getting something for their money.


That's true if you re-define rent to mean payment for a service...


>> That's true if you re-define rent to mean payment for a service...

You're paying for the use of something (software) not a service. I can rent a U-Haul by the day (rent) or I can pay for the use of some software by the day (rent). Calling the use of software a "service" was just a fancy way to disguise what's really happening to make it sound more acceptable.


Rent seeking is literally not that, though. The term has a definition, and if there's one thing you can't really accuse those corporations of, it's extracting value without any investment of their own. It's the total opposite; just look at head counts, the amount of money getting spent, etc. That alone is basically the opposite of rent seeking, where we expect to see wealth extraction for minimum investment/productivity (and thus headcount)/cost.

More to the point, rent seeking involves taking money for something that provides no additional value in return, usually because you have a monopoly on something that isn't related to the service you charge for.

For example, you own the land so you charge for passage, without building a road or giving back anything or investing any of that money back to create wealth. That makes almost 0 sense for most software corporations. Sure, they might not be profitable, but SaaS is basically the opposite, since you get the service or a value out of the service directly, and if you don't, there's no reason to pay for it.

I can't think of a lot of cases where a SaaS product is more or less "forced" on you to the point where you continue paying for something that doesn't provide you value. Rent seeking isn't profit seeking.


> SaaS companies are quite literally charging rent for the use of software

That's not rent seeking either. Look up rent seeking and see if it means "charging for the thing you sell or lease". I don't think you'll find that.


You're right, I looked it up. SaaS software is essentially rented, but that's not the same thing as "rent seeking" by the company. I think that's an unfortunate name for the behavior. The examples I found sound like getting a legislated flow of money for doing nothing. In that light, lobbying for tax reductions would be rent seeking? A penny saved is a penny earned - for nothing.


You're talking as though tax is a right of some sort. You shouldn't have to justify keeping your money. Government should justify why it needs to take it, or it's little different from the mafia knocking on your door.


> I think that's an unfortunate name for the bahavior.

It made sense when it was coined; it's not like that was the day before yesterday. (Can't recall for sure, but I'm guessing Adam Smith, in The Wealth of Nations.)


> Satya goes on to argue that AI has the potential to restore productivity growth.

And Satya has a big investment he needs paid off. Had Satya's statements come from an economist or some other neutral source, they might be a little more believable.

> It is symptomatic of an engineer who doesn’t understand the underlying problem and tries to pattern match to get at the solution.

Pattern matching is absolutely the correct thing to do for most situations because it is rare that anything new is necessary. Almost all of engineering is a reflection of business processes and business has the same mode: a new business process is most of the time pieces of old processes rearranged.

> But when the abstraction isn’t tethered to empirical results or anything measurable and falsifiable, the abstraction is just an obfuscation, a cargo cult ritual.

That's not a problem with the abstraction that's a problem with observability. Abstractions are insanely useful both pragmatically and otherwise. O11y is often the last thing considered, if it's even implemented at all.

I'll also go on to say that it's interesting that the stagnation problem is framed in such a way. Software is a tool, not a product. It's going to take a product or class of products to fundamentally change the direction of growth and innovation. For all we know, AI might even be it, contrary to the author's contention.


I’ve been thinking about this a bit lately because I’m taking an economics class, but the pursuit of maximal profit doesn’t make as much sense to me as it used to. Specifically, what actually happens, in economic terms, to profit? Nothing good, it seems.

So maybe the “cargo cult” is the ruthless pursuit of profit, as opposed to regular reinvestment in the firm. Companies are incentivized to optimize productivity rather than grow it, as every dollar saved is a dollar to take out of the economic cycle and put into the investor’s pocket.

Didn’t corporate tax rates plummet at approximately the same rate that computer technology exploded? That seems to me to be important, to the point that I’m not sure we’ve got the right culprit here.


All of the current tech giants reinvested most profits to grow themselves, so your core argument seems totally off. If anything is wrong today it is that there is too little profit seeking, not too much, lack of profit seeking is what sets the last few decades apart from the majority of the last century.

They seek growth instead of profit, and run unprofitable until then and burn investment cash hoping for a big payout, and when that big payout doesn't happen you get the big crash and layoffs...


But what even happens to profit? It disappears from the cycle. The value created is lost because it’s no longer used to improve the firm that created the value. It’s redistributed to various yacht companies and the like.

Additionally the profit incentive is in direct conflict with the productivity goal, as additional profit may not arise as a result of additional productivity, given the cost of pursuit.

And while yes, big tech firms may have pursued growth, correctly so, many existing firms did not, and I didn’t think we were limiting our discussion to the tech industry, and we can be sure those tech firms have since shifted gears.


> But what even happens to profit?

It gets invested into companies that don't make a profit yet. Where did you think investment capital comes from? So profit is the company paying it forward: the company lived off investments at the start and later pays that forward, in the form of profits, to build new companies via investments.

> It’s redistributed to various yacht companies and the like.

Only a tiny part of profits goes to luxury consumption, rich people generally don't consume a lot relative to how much they have, otherwise they wouldn't continue to get richer.


Profits don’t pay interest, interest payments are expenses, and those payments can be what goes into new firms as funding.

An entrepreneur provides a resource, I wouldn’t deny that, but every other resource has a price. Why does entrepreneurial effort get to be uncapped? Shouldn’t it have a price too, a specific value, tied to what it gives a business?


Investment capital isn't a loan; it is a profit-sharing agreement, so you don't pay interest, you pay dividends to the investors when you make profits.

Nobody loans money to risky startups, that sort of agreement isn't a good fit for risky ventures since the risk is high and the payout is low. Profit sharing is how they can get money to start making profits.


But it is a loan; you get funds up front, with an agreement of payment later. In a broad sense that is the very definition of a loan.

The problem here is the agreement is uncapped, leading to things like a preference for additional profits over additional productivity.


> But it is a loan; you get funds up front, with an agreement of payment later. In a broad sense that is the very definition of a loan.

No, you don't promise to pay anything back, you promise to give them a share of the profits, no profits means you don't have to pay anything. When you take a loan you promise to pay it back, profit sharing doesn't come with such a promise.

> The problem here is the agreement is uncapped, leading to things like a preference for additional profits over additional productivity.

It is uncapped because otherwise nobody would give you any money. If you want people's money, you need to give them a good reason to give it to you, and sharing a fraction of the profits you make is the only thing you can give them to make it worth it to them to give you money.

If you have a very stable business idea with no risks, then some people would be willing to give you a loan. But that doesn't cover most startups. Instead you have to sell a share of your profits.


Firstly, you just keep describing a loan and then demanding it’s not a loan. A distinction without a difference. Secondly, people take out loans all the time for small businesses. Thirdly, whatever you want to call it is not a boolean endeavor; investment wouldn’t drop to zero if a limit to profits were introduced, it would drop proportionately to the cap.

The question was how to spur productivity growth, and I think an answer involves looking at the conflicting nature of profit motives. I don’t know what the answer is, but I think looking there will be more fruitful than blaming technology, based on my limited exposure to classical economics.


> Firstly, you just keep describing a loan and then demanding it’s not a loan

Selling a part of your future profits is not a loan. That is your made up definition.

It is impossible to discuss with you if you change the meaning of words like this to suit your own views.


Sorry mate, have a good day!


> The value created is lost because it’s no longer used to improve the firm that created the value. It’s redistributed to various yacht companies and the like.

It's not "redistributed". There aren't parental figures just apportioning money as they see fit. People are spending money on what they find valuable. Partly that's on the firm. Partly that's on salaries, particularly in places where creating competitors is easy and the incentives are good for people to want to take that risk. Partly on maintenance and the like. Then a chunk of the next lot must be given to the government or directors will be locked in jail. Finally, out of what's left, comes the money you're talking about. That can be reinvested, or pulled out. It doesn't "disappear".


It disappears from the industry that created the value, the industry arguably best poised to increase productivity, was my point. Yachts and fancy cars are part of the economy too, but they’re not the best place for resources to go if you want to increase innovation and productivity.


Yes they are. Things are often invented in fancy cars that get market-tested (at the rich customer's expense) and ones that are popular come to the mass market once the kinks have been ironed out. Also, people who work on yachts buy cars.


Yachts and fancy cars are the best industry in the world for resources to go in order to maximally spur overall economic innovation and productivity?


> Yachts and fancy cars are the best industry in the world for resources to go

You're doing a trick of pretending that an example that's been successfully countered wasn't just an example, but the only thing in question. Obviously CEOs spend on various things, as do we all.

> in order to maximally spur overall economic innovation and productivity

If I buy a Mars Bar I'm not putting my money into what's going to maximally spur overall economic innovation and productivity. You're creating a ludicrously high bar for their spending, that you're now pretending is all on yachts and cars, to clear - one which most people's spending also wouldn't clear, if we even knew what would clear it.


> So maybe the “cargo cult” is the ruthless pursuit of profit, as opposed to constant reinvestment in the firm.

It’s an old story, an individual or group of individuals becomes greedy and consumes enough resources to finally destabilize their society. A new society is born, people are afforded a new opportunity, for whatever reason a person or a small group of people gains an advantage, and then the cycle repeats. I think the question is how to shift incentives to break the cycle, while getting buy in from the stakeholders that are currently running it.

There have been some ideas about how to do this. One suggestion is changing quarterly reporting to annual reporting for public companies, but then this limits visibility into public companies even further and creates a new set of problems. Then there's the idea of getting rid of public companies, but I'm not sure that would actually result in the outcome people imagine. Finally there's talk of a wealth tax, which could work but really is more of a stick than a carrot, and has been met with a lot of hostility. So we have what we have and hopefully someone figures it out before the cycle restarts.


> Specifically, what actually happens, in economic terms, to profit?

In an economic sense, profit occurs when an enterprise produces goods & services more valuable than its input costs.

Such economic profit is always desirable- whether it’s made by a private corporation, state owned enterprise or government department.

An economically unprofitable enterprise is a burden to its customers, owners or both.


What about an enterprise in economic balance, making nearly zero profit because it shared surplus revenue with the workers, even after all of the interest payments on the loans given out by the investors (can’t forget the investors, of course)?


I would describe such an enterprise as profit making, if the sharing is truly voluntary.

The persistent existence of surplus revenue implies the excess of resource production over resource consumption i.e profit.

However, if this sharing is involuntary, it's likely the true cost of labour is higher than it first appears. For example, bankers' bonuses and engineers' stock options are important parts of labour cost for the relevant enterprises.


There can be multiple balance points.

In the one you describe, then the employees would profit more, unless they too have minimum costs equal to revenue.

Fortunately, humans have real minimum operational costs close to zero, and can live for a long time off a handful of dollars a day.


That's still profit. It's just the profit is paid as dividends to workers.


That would actually be an expense, reducing profits by intentionally increasing labor costs.

In economic terms profit would be reduced.


No, not if it's dividends, as I mentioned. Dividends are paid out of profits.


But it's not dividends, it's increased salary, so it's still not net profit, it's pre-operational profit, which nobody looks at due to the importance of paying for labor.


I said this:

> That's still profit. It's just the profit is paid as dividends to workers.

How is that not dividends?


Because dividends are paid to shareholders, not for labor. It's not net profit if it's not accounting for the cost of labor.


Workers can have shares in a business, which is the hypothetical I was going for originally.


Weird that in this growth-obsessed world there has been no actual growth for 50 years, once you take inflation into account.

Makes you wonder what significant financial and economic changes happened in the early ‘70s to cause that.


> the pursuit of maximal profit doesn’t make as much sense to me as it used to.

Alternatively, accounting hasn't kept up with the realities of the world. Profit today isn't measurable just in terms of the exchange of tangible items. Actors also seek to profit by way of other intangible things like, most notably, attention; something old-school accounting doesn't account for.

Maximal profit makes as much sense as ever, but maximal profit as it shows up on the books using traditional accounting methods may not.


> Specifically, what actually happens, in economic terms, to profit? Nothing good, it seems.

Uhh . . . first it pays the bills allowing the company to keep running, including your salary. Then the rest goes to shareholders, where it does things like power your 401(k) or if you're really old, your pension fund. Which then allows you to do nice things like retire someday.


Profit doesn't pay for salary. Salaries are an expense to the company. Profit is what's left over after all of that and is what the company can give back to the shareholders through dividends and stock buybacks.

I would guess that in a profitless world there would be something different than a 401k for retirement. Humans can exist without profits.


> Humans can exist without profits.

Of course - and they did for a long time. But they couldn't have anything that couldn't be scratched out of the soil.


Consider profit as what happens when the demand for your product is strong enough to warrant reinvestment.


I don't think it's fair to say that "nothing good" comes from profit. Aren't new businesses created from profit? Investors make a profit in one business and then invest in others. That in turn creates jobs and opportunities.


That’s fair but is that the only possible way for new businesses to arise? Couldn’t the same be accomplished with the income received from the sale of resources? The revenue would still be there, the money would just be more widely distributed amongst the owners of the resources that businesses purchase to make more of their goods and services.


Tech over-promised and under-delivered. The article presents a compelling argument that resonates deeply with the current state of the U.S. economy and its technological focus. Tyler Cowen's notion of the Great Stagnation, alongside the critique of modern tech's superficial advancements, underscores a critical misstep in our economic development strategy. The emphasis on software and tech, while neglecting the foundational sectors of manufacturing and engineering, has led us into a productivity paradox where technological advancements do not translate into real economic growth or improved living standards for the average citizen.

The analogy of cargo cult companies vividly illustrates the pitfalls of prioritizing abstractions and financial metrics over tangible solutions and innovations. This approach has not only stifled genuine innovation but has also diverted attention and resources away from sectors that are crucial for sustainable economic growth and competitiveness on the global stage.


> Tech over-promised and under-delivered.

Seems like you are criticizing profit seeking rather than tech.


That would be the piece relevant to GDP and economic stagnation right? Or at least a strong link I would think.


Well, with regard to profit-making, yeah, tech promised and delivered — short term profit anyway.


> Satya Nadella at Davos in January 2024 said: “...inflation adjusted, there is no economic growth in the world, I would say and that's a pretty disappointing state. In fact, the developed world may have negative economic growth...PCs were the last time, when actual, economic growth came about, right? So, the last time it showed up in productivity stats were when PCs became ubiquitous.”

It feels good to hear someone up high say the quiet part I’ve suspected may be true out loud.


The real cult is the one where "productivity" is considered a measurable and worthwhile goal.

The official definitions make no sense. "The ratio between volume of inputs and outputs."

What does that even mean? More stuff? More clicks? More eyeballs? More sales? More disposable income? More wealth disparity? (That last one being the real meaning, IMO.)

The subtext seems to be that OP believes CEOs are visionaries and he'll repeat whatever they say uncritically. Which is a good example of a cargo cult attitude.

It turns out Nadella and the writer are dead wrong, even if you use the official economic definitions.

The writer didn't bother to check this. He believed a simplistic narrative that appealed to his preconceptions without bothering to check the ground truth.

https://www.researchgate.net/figure/Illustration-on-Real-GDP...


> The real cult is the one where "productivity" is considered a measurable and worthwhile goal.

How could you *possibly* say this!?!

It is literally the definition of not having to break one's back to feed one's family!

The problem is, we have allowed our money supply to become manipulated so that inflation captures the productivity increases for those controlling the money supply instead of those producing the output.

I post this wherever I see this argument, as someone else has done the work to CLEARLY show where the output goes:

https://wtfhappenedin1971.com/


Indeed, for one thing there was also a pandemic (an exogenous shock), which most certainly changes ideas of productivity and how economies are measured. I am not aware of economists who are using more advanced AI in their analysis -- does anyone know of cutting-edge economists using much more modern tools for inspecting economies?


Well, it is easy to see that is an incorrect statement.

GDP and wages can both easily be adjusted for inflation.

We can then measure and see that with inflation, while GDP continued to grow in real terms, all median worker wage increases were offset by inflation.

https://wtfhappenedin1971.com/

What does this mean? It means that in 1971, creating a purely fiat currency with an unrestricted ability to devalue it gave those in charge the ability to move all productivity gains from the wage earner to the capital owner.

And as the data points out, at no point has there been even a smidgen of release of the pressure on the wage earner's throat to allow them at those productivity gains.

Any income gain a "median" worker can ever hope to achieve is more and more exclusively tied to not earning that income via a wage.

This basically defines where our housing and other speculation bubbles are coming from, and shows that people fundamentally know this and are trying to escape the rat race of being employees if they ever hope to build wealth.


It might be that PCs were the last of the huge productivity boosts… but saying there’s been “negative economic growth” since the advent of the PC or thereabouts is a pretty fringe take.

It would necessarily imply basically every serious attempt at measuring economies, and valuing the activities taking place therein against inflation, is fundamentally, terribly wrong, by absolutely massive amounts. Now, those are hard tasks and there are many opportunities for flaws and biases, but many people around the world have devoted serious efforts to understanding and correcting for them, and the consensus is very much Not That.


I don't understand that quote. Isn't GDP growth stated with an adjustment for inflation?


Starts with a rant against "cargo cult software" and ends with an endorsement of cult-of-personality management-by-CEO.

Perhaps the two are more closely related than OP realises?


I think the bigger lesson here is that you absolutely need executive leadership that understands what truly matters.

And I think "cult of personality" CEOs typically end up as the outliers here (on both sides of the spectrum) as they're better at withstanding pressure from external forces (Wall Street, dumb industry trends). But the flip-side is if they've got a garbage business ideology or are frauds they're just ruining lives and setting money on fire.


Yes - I think this is one of the major takeaways. A good CEO matters a lot and bad or mediocre ones ruin companies.


I think a lot of this exogenous growth stuff is just mysticism for people eager for a secular religion.

https://www.employamerica.org/blog/it-wasnt-ai-how-fiscal-su... the endogenous growth story isn't as "fun" but much simpler: until we 'run out of workers', there is no reason to make anything more productive, and tech will be, at best, messing around with useless vanity stuff.


> until we 'run out of workers', there is no reason to make anything more productive

Labor force participation took a major hit during the pandemic worldwide and has not recovered; population growth is limited, and workers are harder to find than they’ve been. Obviously there remain opportunities to use workers overseas, but barriers to that remain or are even rising (usually political barriers - free trade is not popular right now - and even ocean shipping is hit by the near-war conditions of the Middle East).

Sounds like a prime time to make things more productive, at least in most western economies?


Our society isn't really structured to do things that require many years of research, so we're either making things incrementally better or just going after low-hanging fruit. We need different incentives than profit.


Except things are not getting even incrementally better in the aggregate. Say there is a 5% improvement in hardware performance this year; it will immediately vanish as shitty programmers write shit code that can now be 5% sloppier.

I can’t say there has been any new tech that has made me excited or changed my life. The best I can come up with is turning my lights on and off with my phone, but even that is old AF already.


Good point. But it seems you have to go back to the Pharaohs to find a time when we (well, slaves anyway) did those things for reasons other than profit.


Sure, we’ve been doing things wrong for thousands of years. Doesn't mean we can't correct that.


I don’t know what you are talking about. The world around us is nothing but the fruit of research.


Of course. I am looking at things like nuclear fusion that are perpetually underfunded. It would be an incredible technological advancement, but it is an enormous undertaking that industry is largely unwilling to participate in because there are no short-term profits to be had. So it falls on governments to pick up the slack, but most of our energy as a society is in industry.

Think about the old meme that the smartest people are focused on increasing ad click-through rates. Imagine a world where they could instead be focused on something useful.


Computers made everyone their own secretary, then gave them terrible tools to do it. So now people earning 5-10x the salary of a secretary can spend 60% of their time maladministering their work.

"Computers"


It’s not that productivity has stagnated. It is that much of our output is now intangible. We do put out terabytes of memes and Instagram pics and TikToks every day. For good or bad, that's what people like nowadays, and that is where the money goes. If we count entertainment and bingeing as legitimate economic utilities, we are in the midst of the craziest economic boom ever.


I like this point. The inflection point in the graph at the top of the post seems too strong to be "real" IMPO; it seems more like an accounting problem, as you say. I did a little digging, and it does seem like this is a pretty legitimate criticism: https://en.wikipedia.org/wiki/Productivity_paradox#2000_to_2...


Seems like quite a stretch to suggest broad, long-term economic stagnation is due to ineffective use of software.

It also doesn't really make sense that "great leaders" is a solution.

There's also a final recommendation to "exclusively work backwards from decision advantage," which might be good advice depending on what they actually mean (well, not the "exclusively" part, but I don't think that's meant literally), but it doesn't really have to do with great leaders.

I think this must be a marketing piece by a management consultant (it's packed full of tropes and cliches)... and it could kind of appeal to some C*O who is confounded by the technology their company uses, letting them imagine themselves as a "great leader" by making more efficient use of software by "working backwards from decision advantage."


> (COVID should have also taught us all something about “experts“)

That wearing a mask and getting vaccinated was good advice?

That the internet is full of malicious people trying to discredit experts giving sensible advice just for their own profit and fame?

Maybe I'm remembering the pandemic differently than the author.

> To say the quiet part out loud: the OODA loop is all about the human. Technology serves humans. Instead, we have cargo cult companies such that the mere acquisition of architecturally-ordained technology is perceived to be the solution.

The first time I came across this basic thesis was back in maybe 2000-ish, or a few years later. There was a great article about an author who was whiteboarding out a solution, and the architect wrote this phrase: "Configuration data will be stored in XML" (the title of the article).


Re: the COVID part. The conclusion leaped to is likely inaccurate and not what he meant by the ambiguous "experts" remark.

Their company was involved more in the supply and distribution of COVID vaccines, based on collection of COVID metrics data. The experts he may be referring to are the ones who were painting an inaccurate picture of COVID case distribution. Political messaging overshadowed the reality of case distributions by region: COVID cases were reported inaccurately due to a lack of pipelines and procedures for data collection.

In this scenario, the "experts" would be individuals ignoring the reality of the data collected and making decisions based on other motivations.


Yeah, I guess the "experts" in quotes checks out from that perspective.


> This has always been true: The steam engine didn’t matter until it was put into a ship and locomotive

Sure, but had steam engines not been put over coal mines, they wouldn’t have been refined to the point of propelling themselves [1]. It didn’t matter to contemporaneous productivity; it definitely mattered to future productivity.

[1] https://acoup.blog/2022/08/26/collections-why-no-roman-indus...


Sorry, this is too ambitious and overstated. A thesis about a four-decade-long, all-reaching economic phenomenon needs much more research and evidence to be taken seriously.


Absolutely. Claiming you’ve found a solution to this problem with such a shaky premise is actually inviting others to join your own cargo cult.


I'm with the author on the critical eschewing of abstraction, but I don't think that leads so clearly to the strong-leader form. It suggests we need to be wiser technologists: understanding why we are making abstractions and what the downsides are, and not simply ignoring dangers that are hard to address within our chosen abstraction. Organizationally, that should mean more, not less, independence for people at every level who understand the boundaries where systems break down.


First question: where can we go from where we are? What do people desire that they cannot have? I am not going to suggest that there is nothing left to be invented, but I find it quite hard to come up with things that would have a significant impact. The one big thing that comes to mind is health - cancer treatment, organ replacement, and so on.


Less labor and more leisure time seems like the biggest winner for lots of people right now. I have all the commodities I could ever genuinely want, but for all my nifty gadgets, I'm still giving up around a quarter of my life in exchange for a salary and I barely have time to actually use them. If we could divert resources from generating new/improved commodities or shareholder value and instead allow people more leisure time, I believe we'd see a lot of knock-on social benefits.


> Technology serves humans. Instead, we have cargo cult companies such that the mere acquisition of architecturally-ordained technology is perceived to be the solution

Sounds right on the money

And of course it has to be the latest bells and whistles, without questioning why it is even there, or how it can serve the company and not the other way around.


>PCs were the last time, when actual, economic growth came about, right?

Uhh... smartphones? Shale oil? EVs? AI? Those alone are at a significantly higher (inflation-adjusted) market cap than the entire S&P 500 in the 90s.

I don't buy this argument at all. We have seen massive real growth in the last 20 years.


Errr, SaaS caused the Great Stagnation. Or non-text UIs. Or consulting firms. I’m confused!

And this is just factually incorrect:

> Intel’s road to ruin was when the CFO became the CEO and managed the business by the numbers and abstractions.


I really could have used a concrete example here. Author seems like they have a very clear idea of what they're talking about but kind of failed to communicate the vision.


> Satya goes on to argue that AI has the potential to restore productivity growth.

And I would argue investment in infrastructure (I know, hand-wavy term) would do better.


Let’s just say he is too much of an interested party for his opinion to have any credibility. Would he not say it is the case? He bet his company on it. He is paid to say so.


To be sure, a hammer salesman pointing out that nails will save us.


The Great Stagnation is obviously a consequence of worker protection and environmental protection laws. Give me an exemption to both and I will turn this economy into a rocketship.

Everyone knows this is it. But everyone likes worker protections and environmental regulations so we pretend it’s not. But we know.

Remove those rules and TSMC’s AZ plant will be done yesterday and SpaceX’s Starship will fly more often.

We can dance around it but this is reality. All this other stuff is just window dressing.


This theme of progress has been on my mind for decades.

My conclusion, while simplistic, points to an unfortunate mismatch between (1) our personal/subjective expectations of “progress”, (2) business management dogma, and (3) the shift from “enabling-progress” to “distribution-progress”.

1. What I mean by subjective expectations is that, as the article refers to, many of us (techies) carry expectations of what progress must look like: the flying cars, spaceships, and robot friends. These are images put into our minds; some of them are based on pure fantasy, others on former scientific trends and their extrapolations, and others again come from former and current influencers using them to manipulate us. An initial step toward better grasping whether we are stagnating is to look at the “possibility space” of what is fundamentally possible within the principles of known science, and see where we are dragging our feet. If we are missing progress in obvious areas, it may come down to economics, but most likely it is due to the lack of transitional/incremental investment opportunities with a sufficiently small risk/capital step size to allow for progress (on this, see (2)). We are not likely to get anti-matter worm-drives as the next iteration of SpaceX’s Starship; the step size is simply too risky even if the physical principles are understood.

2. Since the mid-70s, the general business trend has been to make all business decisions based on NPV/ROI, thereby assuming that we have sufficient knowledge about the outcome of an effort to estimate whether it is worth doing. Likewise, the frameworks of risk management are seldom able to manage uncertainty of imagination (i.e. the “unknown unknowns”), forcing a decision bias towards investments with known risks that can be managed. Combined, these management practices make it basically impossible to allocate capital towards fundamental R&D, which has no certain outcomes but whose ROI can have an asymmetric upside by opening up a completely new paradigm for business and progress (a toy sketch of this bias follows after point 3). On top of this, “Venture” Capital is not and has never been free from this type of management: the ventures are known (whaling, SoMe, or SaaS), and the risk taken is that of successfully capturing the market. They are not vessels of progress, but of distribution. We could also put our faith in academic institutions and government labs to bring about the future, but they too have largely been funded according to the same model, with a different incentive structure (think grants, citations, recognition) that also values controlled risk and predictable outcomes. As an aside, I remember one of the most productive professors at my old institute saying to apply for research funding for an already concluded project, and then use the money for the next one. That way he always had a perfect success rate, and could match the research project to budget, time, and scope.

3. What I mean by the two types of progress is that we perceive the invention of telephony as progress in the systemic sense: now “it is”. But having a telephone in every home, that availability, is certainly also progress, since it makes it useful: “I have it”. Doubly so for network-effect products like the phone, where “you’ll have it too” matters.

Of these types of progress, “enabling” naturally must precede “distribution”, but as a society we can gain great wealth from the latter, ignoring the former for quite a long time before we realize something is “missing”. I think that is what has happened, from the factors above, over the last 4-5 decades. Computers and IT have driven massive distribution-progress, blinding us to the fact that enabling-progress was dwindling. And now, when the economy can’t hide the fact much longer, we start to discover the lack of “progress” in the subjective sense.
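Here is the toy NPV sketch mentioned in point 2 (Python, invented numbers only): under standard discounting and probability-weighting, a modest incremental project clears the hurdle while a moonshot with a far larger but distant and uncertain payoff does not.

    # Toy NPV comparison with invented numbers, illustrating why NPV/ROI-driven
    # capital allocation tends to starve long-horizon, uncertain R&D.

    def npv(cashflows, rate):
        # Net present value of yearly cashflows, discounted at `rate`.
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

    rate = 0.10

    # Incremental project: modest, near-term, well-understood payoffs.
    incremental = [-100] + [30] * 5

    # Fundamental R&D: huge but distant payoff that only succeeds 10% of the
    # time, so the upside is weighted by its probability (expected value).
    p_success = 0.10
    research = [-100] + [0] * 14 + [p_success * 3000]

    print(round(npv(incremental, rate), 1))  # ~ +13.7
    print(round(npv(research, rate), 1))     # ~ -28.2
    # The moonshot looks like the worse bet, even though its raw upside
    # dwarfs the incremental project.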


So, if it's so bad, the more productive companies will take over - like Costco.


I denigrate the term SaaS. Do any other industries have this weird an acronym for what they innovate on? It could even be called nBA, if acronyms had to be used for everything ("no Brick Apps")! Imagine how SaaS sounds to somebody not in tech! And how arrogant of Satya to just completely write off smartphones as an economic growth engine; his mind is stuck in the 70s!

I would say technology has always followed from institutions with heavy funding, which had the time and money to do this kind of innovation. If there is no incentive for those institutions to innovate - say, for example, DARPA invents a Neuralink-like machine to help soldiers be at peak performance, but it turns out it did not really help them and hurt them more - they would not do it, and so the landscape for the tech industry to imitate and develop a prototype off that tech is non-existent. Elon Musk tried to change the path away from relying on those institutions, but it is just not practical in terms of time and money for a publicly traded company in the capitalistic, RoI-hungry stock market. One would argue that even the funding those institutions get is from tax money, but once that money is gone, it is gone; they ain't asking for interest and principal on it!

Maybe it is time for the stock market to adapt and create an environment where the interested public can give away some portion of their money, like crowdfunding, and choose which individuals are eligible to use that money to advance innovation - similar to democracy, but this time with peer review from experts in the field. This would work for economic growth as a whole because it is not individualistic; everyone is working towards the goal, similar to what John Nash says in A Beautiful Mind: the best result comes in a group when everyone does what's best for himself and the group! Adam Smith was wrong!


Not sure how apt the cargo cult metaphor is in this case, but the article seems to strike in the general target area of our current malaise.

Interesting observation about a complete lack of economic growth (as corrected for inflation), but I'm not convinced that the Service as a Service industry is to blame; I think it's more of a symptom than the cause.

In this (some would say terminal) stage of capitalism in the West, we no longer focus on creating value -- the main objective seems to be extracting it instead. If every organization's goal is to turn their capital into more capital, what they actually do to achieve it is incidental. One could argue that a company of five thousand paper pushers milling around performing ineffective rituals is at least not causing any harm on balance (as compared to, say, arms manufacturers). Nevertheless, the current trend is to reap everything and sow nothing - somehow without the foresight to acknowledge the absolutely desolate desert this is creating.

One observation that caught my attention was the mention of the OODA loop. Access to information is a prerequisite for power - even if you don't have leadership privileges, being well informed empowers individual decision-making. All this corporate obfuscation disempowers people at all layers of the hierarchy. It makes for really ineffective organizations - sadly, this is absolutely acceptable now, because they don't need to be effective at anything other than temporarily grouping resources as an entity in the big consolidation shell game.

This has a lot to do with the scale of an organization; I think once a company grows beyond a double-digit headcount, things go downhill [1]

1: https://news.ycombinator.com/item?id=39172505 (I guess I like to harp on this subject)


I also have an ECE degree and came to similar conclusions in 2001 after the Dot Bomb and 9/11. I've been awake to the eventualities we're facing today around becoming slaves to technology (that widens wealth inequality) for basically my whole career.

But I disagree on one point: abstractions are not the problem. The lack of abstraction is what leads to opinionated frameworks like Ruby on Rails that obfuscate what's going on through magic like syntactic sugar without formally documenting the underlying abstractions. I work mostly in Laravel, which suffers from similar issues, but thankfully admits its flaws because it's written shamelessly in PHP. Frontend frameworks seem to mostly require the developer to drink the Kool-Aid because they don't even mention the determinism issues inherent to JavaScript's async model and the various syntax hacks to emulate classes that diverge from its original conceptual simplicity. The same thing happened to PHP 5 when design-by-committee pass-by-reference classes were added, which broke its copy-on-write semantics and made it just like every other object-oriented Frankenstein language.

No, the answer is to flip systems of control on their head and start talking to people at the bottom.

For example, Apple may never fix its external display support. I can't rely on my MacBook to wake up monitors, so sometimes it takes several minutes of plug and pray and even having to restart to get them to turn on. Or the other day, I discovered that mouse keys can't be enabled if the trackpad clicker is broken and tap to click is disabled. When you push command-option-(fn?)-f5 (a horrific keyboard shortcut), it brings up the modal to enable mouse keys, and you can tab through the options in an attempt to click the checkbox, but pressing return just closes the window. So a mouse is required to bypass the mouse. An astonishing oversight that was never fixed FOR YEARS.

At modern tech companies, there's nobody at the wheel, and there are no adults in the room. Yet FAANG has almost $1 trillion PER COMPANY. We get so mad at the government for squandering its trillions of our tax dollars on maintaining colonialism around the world, but meanwhile tech companies have vacuumed up all available capital so that we subsist on driving Uber to make rent.

I mean, I'm comfortable calling this. I think it's over, and we're into clear and present danger scenarios. I'm just so tired.


Want to see an explanation that perfectly aligns? https://wtfhappenedin1971.com/


Abstractions obscure connections between hidden information.


I think tech has nothing to do with it; it's just neoliberalism and the financialization of the economy.

Basically the idea that your problems are your own, and society (or the government) has no responsibility to help people who are struggling for some reason.

This, coupled with the ability to put debt burdens onto people one way or another, enabled a lot of rent-seeking in the economy.


It starts with us, the people.

Today the two most popular articles are about JavaScript tools to generate a color palette...

I mean, aren't there more interesting things to talk about?

I submitted this, and it got ignored: [1] (Chinese scientists make breakthrough in BCI-assisted rehabilitation trial, 'showing higher safety than Musk's Telepathy')

Transhumanism and gene selection/editing should be on everyone's mind; they'll shape the next civilization.

The "great stagnation" is a political choice, it leads to interesting stuff happening elsewhere, while your group is left eating the junk food, it's an interesting paradox, almost like a bubble or a jail

1 - https://www.globaltimes.cn/page/202402/1306503.shtml


at my place of work, we have a lot of files (in the business sense, not the computer sense), and we need to manage them. we used to use this piece of software that we paid another company to host and manage, and it was a little clunky-looking because it was all written in Java, but it worked great. the software did everything we wanted it to, from importing scanned documents, to allowing us to make forms that can flow through workflows we defined. it was great, except that it required the use of Internet Explorer and Java, but we stuck with it... until the company who made the software was acquired by another company, which is making a similar kind of software, except browser-based... and a billion times more terrible.

this new software can be self-hosted or hosted on the company's servers, and we opted to try the self-hosted option out. it's been a complete nightmare. users have to wait 30 seconds or more after clicking "Submit" on a form before it gets submitted. form flows regularly get hung up or vanish completely.

right now, the software is using 24.2GB of RAM in IIS, and another 26.8GB for SQL Server, despite being used by only a few dozen users simultaneously—there is absolutely no excuse for this, in anything resembling a sane universe... but this is what we're stuck with for the time being. it's horrible. and the company who makes the software suggests we would get better performance if we went with their hosted option instead of self-hosting—as if that could somehow magically make this piece of utter shit run any better at all whatsoever.

but this is where things are right now with software: engineers who have no idea how to engineer things are building layers and layers of abstractions atop each other without giving a single shit about how well it all runs, or how many resources it consumes, or whether the entire functionality of the software you're building could be expressed several orders of magnitude more simply by merely stepping back and reevaluating intended functionality and expressing it in code more directly than a billion layers of C# abstractions heaped on top of each other.

the funny (but sad) part is, many users, including some of my coworkers, think this is all completely normal, that such simple functionality must be difficult enough for modern computers to run, because their expectations have been set so low by contemporary software. people just don't care if a form takes 30 seconds to submit—maybe if it takes a full minute, then something might be wrong, but 30 seconds? to submit a form to an intranet web server? perfectly acceptable, why would that be a problem, they say.

I wish I could be more optimistic about the future of software but at this point it looks like we're kinda just fucked—nobody cares about making good software anymore, and most people have no idea how good, fast, and useful software could be if it was just written a slightly different way than we're taught in college.



