The 'survey' referenced, in which 25% of respondents say it took 12+ months to find a new job, has only 55 responses. Two things worth noting:
- This is on Blind, which means respondents would have registered with an employer email address, so these are not entry-level candidates (unless they had an internship, or they registered their own domain name to sign up with)
- Since it's on Blind, there are likely more people focused on their next career move than if it were on, say, Hacker News, so people looking for a job are probably overrepresented in the 'dataset'. The sample is also skewed towards U.S. users (according to a site called 'semrush', 62% of Blind users are from the U.S.), and the U.S. is, by many accounts, the country hit hardest by offshoring in the new remote-work landscape, since salaries there have historically been higher than anywhere else
Not to say AI isn't a contributing factor, but I think the supposition in the article headline should be taken with a grain of salt
If you're serious about proving an AI correlation, you have to do better than two data points over one year. The leading hypothesis is that low interest rates led to overhiring in tech, so now that interest rates are up companies are cutting excess jobs.
To rule that out and give credence to an AI hypothesis, you need to go back several years to before the peak in the tech growth rate. We need a series of trend lines, not just a comparison of one year to another.
Also, your selection of job titles is pretty small and doesn't account for other possible changes. The most obvious one: where are the full-stack web engineers? From the data in TFA it's impossible to tell whether companies are simply consolidating their specialized frontend/backend roles into generalist roles.
In the general population, there would be a very satisfying schadenfreude in watching the techies make themselves unemployed by their own hubris. I think that's part of the desire for the story to be true.
The other part is that those selling AI want to believe it is as big as the introduction of the internet, and not just another hype train like blockchain.
Any evidence that makes the current hotness seem like a tectonic shift is attractive to those types.
As a developer I've tried to use Copilot and ChatGPT as much as I can to improve my performance, but even with GPT-4 I've found that for my workflow most of the leverage they give falls into:
* scaffolding projects, like unit tests or basic wireframes
* writing or proofreading e-mail
* understanding cryptic legacy code that uses old practices
* saving me a visit to some framework's documentation
But I find it hard to believe it's the reason behind layoffs, as the ceiling of their capabilities is pretty low once things get complex, and in some cases they can even have diminishing or negative returns. Examples are hallucinated APIs, or code that isn't compatible with the language version I'm using. It becomes like a slot machine: hoping the next prompt will give a good result.
I think the main advantage of the current iteration of AI is that it's like having the old Google back, before SEO and paid articles contaminated the web with useless first results.
It would be difficult to prove that AI as a tool is the direct cause of differences in the job market. It would be equally difficult to prove that interest rates are directly responsible for the same differences.
Both the above comment and the article are making claims about a system that is complex enough that it would be difficult to prove any causal link.
That said, I believe that interest rates are probably a much greater factor in any net decrease in tech jobs than the proliferation of LLMs (which is really what has changed).
That, and how the IRS changed the rules for tax purposes: under the amended Section 174, you can't write off an engineer's salary in the same year it was paid; it now has to be amortized over five years.
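To make the cash-flow impact concrete, here's a minimal sketch of the change, assuming straight-line amortization of domestic R&D costs over five years with a half-year convention (the figures are illustrative, not tax advice):

```python
# Hypothetical example: $1M of domestic engineer salaries.
# Pre-2022: fully deductible in the year paid.
# Post-2022 (amended Section 174): amortized over 5 years; with a
# half-year convention only 10% is deductible in the first year.
salaries = 1_000_000
years = 5
annual = salaries / years  # straight-line share per full year

# Half a year's share in year 1 and year 6, full share in between.
schedule = [annual / 2] + [annual] * (years - 1) + [annual / 2]

print(schedule[0])  # first-year deduction: 100000.0, vs 1000000.0 before
```

So a company that previously zeroed out $1M of taxable income with engineering salaries now deducts only $100k of it up front, which makes each engineer noticeably more expensive on a post-tax basis.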
As others are saying, it is ridiculous to attribute short-term changes in the job market, which are affected by lots of things (interest rates, inflation, overhiring during COVID, etc.), to "the rise of AI". LLM tools at best can improve the productivity of a human developer by writing boilerplate code for them, and we are not in any way at the point where that improvement means "we now need only one backend engineer where before we would have two".
>for that productivity improvement to reach the point of "we now need only one backend engineer where before we would have two" is not in any way upon us.
We're nowhere near a 50% productivity gain, I agree. But we're getting close to "we had 10, now we only need 9". In the case of the Indian bodyshops that most non-tech F500s outsource to, we may be at the point of "we had 10, now we only need 6-7".
In the case of bodyshops, they've been a drag on productivity IME. Zero would be better than any, before or after AI. It's only the McKinsey types, always eager to espouse the value of their own decisions, who like to pretend they saved money by outsourcing.
The AI tools might increase developer productivity due to smart assist, better search, etc. But then developers go and play with AI features with the saved time so we break even. :-)
Larger tech companies are likely betting on being able to replace a lot of coding jobs with ML. Additionally, I think it's likely that we're already training AI to replace engineers with code-generation tools. That is, code-generation tools offer an instant feedback loop on "Is this model able to write acceptable code?"
I recently did a study on how AI might be affecting the demand of software engineers.
I took a dataset of 20M job postings from big companies and small startups. I then used an ML classifier to classify each engineering job by the type of engineer it belonged to (i.e. backend, ML, data, mobile, etc.). In addition, I extracted the skills listed in each job using entity extraction. I indexed all of this into an Elasticsearch cluster, where I ran aggregation queries to see how the # of jobs changed for each type of engineer over the past year.
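The querying step can be sketched roughly as follows; the index fields (`engineer_type`, `posted_at`) and the counts are assumptions for illustration, not the actual schema:

```python
# Sketch of the year-over-year comparison: count postings of one
# engineer type in a date range, then compare two 12-month windows.
def job_count_query(engineer_type: str, start: str, end: str) -> dict:
    """Build an Elasticsearch query body that counts matching postings."""
    return {
        "size": 0,  # we only need the total hit count, not the documents
        "query": {
            "bool": {
                "filter": [
                    {"term": {"engineer_type": engineer_type}},
                    {"range": {"posted_at": {"gte": start, "lt": end}}},
                ]
            }
        },
    }

def yoy_change(current: int, previous: int) -> float:
    """Year-over-year percentage change in posting counts."""
    return (current - previous) / previous * 100

# e.g. hypothetical backend counts: 100k postings falling to 88k
print(round(yoy_change(88_000, 100_000), 1))  # -12.0
```

Each query body would be sent to the cluster's `_search` endpoint once per engineer type and window, and the resulting totals fed into `yoy_change`.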
Here were my findings:
1. The # of jobs for ML engineers and research scientists increased 70% from a year ago
2. The # of jobs for frontend, mobile and data engineers all dropped 25% from a year ago, while backend engineers did a bit better, dropping 12%
4. Data scientist and security engineer jobs did better still, both declining only 5% or so
5. There seemed to be no correlation between tech layoffs and the demand for more AI engineers/talent.
I published my findings in more detail in the article. Of course, correlation != causation, but I'm just trying to make sense of the data as best I can. Everything I wrote is still just an opinion, but one backed by some data :)
Big Internet tech has been close to saturation for a long time, we've been in a zero-interest-rate environment for a long time, and a lot of companies grossly overhired during COVID in the vain hope that the behavior changes were permanent. Shrinking the sector at this point is a natural response.
GenAI finally showing potential for real value at the same time is not totally a coincidence; people who want high-yield investment opportunities need something to chase and it's natural to hype up whatever is available. Crypto was clearly on the downswing and GenAI was new and had potential. So of course the sector is going to get flooded with investment cash, both inside of companies and in the startup ecosystem.
Probably. My background is a PhD in the physical sciences + high performance computing, where I’ve pivoted to applying those techniques to machine learning. Depending on the current tech hype cycle, this background matches various job titles: data scientist, applied scientist, research scientist, quantitative engineer, machine learning engineer, research software engineer, scientific software engineer, etc.
I typically go with whichever likely matches the most LinkedIn search filters of the day. Unfortunately, I can’t bring myself to put “AI” in my job title (despite the potential career benefits) because it just feels kind of icky. Maybe for a role that performs fundamental AGI research, but that’s about it.
numbers always make things more fun. especially opinions.
the final point tracks with subjective pulse checks in several places, which go roughly as follows.
“ai” isn’t at a point where it can replace developers; in fact it isn’t consistently good enough to augment them.
the squeeze is due to economic trends and market conditions. “ai” was seen as a potential solution. it is not snake oil, but not yet a panacea. so instead, time for everyone to get more efficient. do more with less. get 9h out of 8h days.
Anecdotal, but we are in an AI freeze. I was, emphasis on was, using Copilot for over a quarter for development augmentation and seeing good velocity gains. Now that it is all over the news (AI, not our organization), our organization has asked us to stop using everything except one free search provider until they can come up with organizational rules. It is like walking through hip-deep mud. The difference between this and the dot-com surge of '98-'00 is that it happened within 6 months instead of 3-5 years.
AI is hot. It seems like every company wants to sprinkle on some 'AI magic' to pump up their valuations.
Cheap-money-fueled SaaS startups are dying, leading to reductions in demand for other roles. Larger, more established SaaS companies are shedding employees added during the pandemic too.