I would go even further and argue that the vast majority of businesses will never need to think about distributed systems. Modern hardware makes them irrelevant to all but the most niche of applications.
I had a longer comment elsewhere, but to me this says that the distribution is happening somewhere, and that companies have to decide how much they want, or care, to control it.
No. The issue is whether you NEED to, not whether you want to.
10 to 15 years ago you could argue, however implausibly, that hardware constraints made vertical scaling impossible and forced you into a distributed architecture. Subsequent improvements in hardware performance mean that in 2025 vertical scaling is perfectly acceptable in nearly all areas, relegating distributed architecture to the most niche and marginal applications. The type of applications that the vast majority of businesses will never encounter.
There is essentially no tooling for this, and vendors all default to distributed patterns. Either you directly control the scaling or you're relinquishing it.
The department saw more need for storage than Kubernetes compute so that's what we're growing. Nowadays you can get storage machines with 1 PB in them.
Yeah, that's an interesting question, because it sounds like a ton of data vs not enough compute. But, aside from this all being in a SAN or large storage array:
The larger Supermicro or Quanta storage servers can easily handle 36 HDDs each, or even more.
So with just 16 of those with 36x24TB disks, that meets the ~14PB capacity mark, leaving 44 remaining nodes for other compute tasks, load balancing, NVMe clusters, etc.
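For anyone sanity-checking the arithmetic, here's a quick back-of-envelope in Python (raw capacity only; it ignores RAID/erasure-coding overhead and TB-vs-TiB slack):

    # Back-of-envelope check of the capacity claim above.
    nodes = 16           # storage servers
    disks_per_node = 36  # HDDs per chassis
    tb_per_disk = 24     # TB per HDD

    raw_tb = nodes * disks_per_node * tb_per_disk
    print(f"{raw_tb} TB = {raw_tb / 1000:.1f} PB")  # 13824 TB = 13.8 PB, i.e. ~14 PB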
"Culturally significant" is the wrong metric, and shows that you don't really understand why people watch what they watch.
People watch all sorts of things, from all different time periods, because they enjoy them. Sometimes those things are "culturally significant", but I'd expect that's not the most common case. Sometimes those things are B-movies from the '70s or brain-candy sitcoms from the '90s.
The premise is that there is so much good AI content that if you just pick something you enjoy, no other criteria, 90% of the time it'll be an AI work.
The only people who would be watching a significant amount of older work are people who have a reason beyond simple enjoyment.
The back catalogue will have a few scattered gems that you can find amongst the sea of mass media that appealed to its audience at the time. Most of that content no longer relates or makes sense to us. There's also a massive load of dreck and garbage.
People should be realistic about this instead of being emotionally invested against AI, as the news media has tried to sway them to be. It's just a tool, and artists are starting to use it productively.
It’s slightly more nuanced than that. Investment banks and consultancy companies are really interested in graduates who are smart and articulate. The nature of their degree is not that relevant.
I know an extremely clever young woman who graduated in performing arts. She had no difficulty getting a job at a top-tier consultancy company. This company was far more interested in her than in some mediocre plodder who hacked his way through a comp sci degree.
At the moment on LinkedIn there are about 15k results for jobs containing the “software engineer” keyword in the UK, compared to just over 3k for the “biology” keyword.
Despite this, 56k people graduated in biological sciences but 24k people graduated in computer science.
(Graduation data is from 2019/20 so may have changed slightly, but unlikely enough to move the needle)
The vast majority of jobs on LinkedIn are fake. Even so, your figures confirm my claim: 24k graduates versus 15k jobs. The supply is greater than the demand.
As a sibling comment has pointed out, there are too many software engineers in the UK.
Only because there are too many software engineers, not because AI will replace those jobs. Experienced software engineers are still required for successful businesses.
The vast majority of businesses will never need more than a single-node architecture. Hardware advances are continually increasing that percentage.
Spark and its modern counterpart Databricks are essentially obsolete for these organizations. Whatever justification they may have had in the past is no longer true.
I’ve recently shut down several in-house Spark clusters and replaced them with single nodes.
In addition to the simplicity of the design and the reduction in cost, there was a massive increase in performance. I expect this will become more common in the future, leaving distributed architecture to a small and increasingly niche group.
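To make the single-node pattern concrete (the comment doesn't name the replacement stack, so this is purely an illustrative sketch: DuckDB, the file path, and the columns are my own assumptions, not what the poster actually ran):

    # Hypothetical single-node replacement for a Spark aggregation job.
    # DuckDB, the path, and the schema here are illustrative assumptions.
    import duckdb

    # One process, one machine: scan partitioned Parquet straight off local
    # disk and aggregate, where previously a cluster did the same work.
    result = duckdb.sql("""
        SELECT customer_id, SUM(amount) AS total
        FROM read_parquet('/data/sales/*.parquet')
        GROUP BY customer_id
    """).df()
    print(result.head())

On a big box with fast NVMe, an engine like this will happily saturate every core without paying cluster coordination or shuffle overhead, which is consistent with the performance jump described above.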
The market can remain depressed for longer than you can remain solvent.
We should be encouraging people to look at alternative careers to tech. Life after tech.
We should also be making it clear to students that, while there are exciting things happening in tech, this is not going to translate into large-scale demand for people.
Large parts of technology are mature, indeed moribund. This is not a message that the technology industry wants to hear.
It’s extraordinary how frequently companies discuss the cost of a bad hire and never consider the opportunity cost of a no-hire.
Companies that keep waiting for Mr. Right are really saying that the opportunity cost of not completing their project is very low. In other words, it's not really that important at all.
On the contrary. "Not completing the project" is not an option—if they don't hire someone to fill a vacancy on the team, the rest of the team will just be expected to work extra hours to keep up.
Oh, not with overtime—you're salaried, remember? (Alternate version: Oh, no, you can't actually log the extra hours; we don't have the budget for overtime, and I, the manager, can't be seen asking for more money, or it would affect my bonus!)
And you'd better step up and work those hours. You want to be seen as a team player, right?
> "Not completing the project" is not an option—if they don't hire someone to fill a vacancy on the team, the rest of the team will just be expected to work extra hours to keep up.
And that's the opportunity cost we don't talk about. The cost isn't "we slow down on a project because of a bad hire". It's "demoralized/burned-out engineers quit to the point where the deadline is impossible to reach". You can't force overtime on engineers who leave and take their institutional knowledge with them.
There are also a lot of fake job postings that serve as a sort of carrot to overworked engineers, a "promise that more help is coming". Which is just as disingenuous to existing employees as it is to applicants.