If you are a "leader" trying to hire a "skilled" AI worker, you are probably an idiot. It's literally a brand new field! - you are choosing the most limp-wristed, back-seat response possible. The only skill you are going to hire for is the ability to lie on a resume.
Unfortunately, I think this myopia extends to the entire breadth of the "skills gap". Any executive complaining is kind of admitting they lack the skills to actually manage humans.
What actual leaders do is the same thing they have always done - hire smart, enthusiastic people and give them the resources they need. The whole point is to recognize potential talent, not just read the tag, slot people into thankless positions, and then complain when they don't fit. But this requires trust, flexibility, trial and error, and actual knowledge. And certainly no executive would want to be treated that way themselves.
But this kind of leadership is hard and requires admitting failure occasionally - so for some reason spending 3 months cross-training a programmer to be an IT Operations specialist is seen as inferior to waiting 9 months to overpay someone new.
If you are a "leader" trying to hire a "skilled" AI worker you are probably trying just trying to buy the ability to say "AI" a lot in your press releases. Save your money, just say "AI" a lot, you'll be better off than your competitors - trying to shoehorn The New Clippy™ into every corner of their app. To wit:
> It showed that 40% of 2,000 C-level executives surveyed plan to use genAI tools such as ChatGPT to cover critical skills shortages through the automation of tasks.
And even if you find a skilled AI worker, they'll just try to convince you not to build a chatbot. So much golf time wasted!
> “If there’s a desire to delegate critical activities and functions to genAI, it is essential that senior management first develops a deeper understanding of the data management processes, including what data can and cannot be used to train these systems,” said David Emm, Kaspersky’s principal security researcher.
This reminds me of every tech craze before it. I remember when executives thought they could just 'buy' a website and be done with all that bother. So we got stuck with shitty online banking for most of a decade.
AI is most certainly not a new field. NNs are decades old; it's only modern LLM transformer tech that is new.
Perhaps look at a primer on the evolution from feed-forward NNs to RNNs and transformers.
There's also GOFAI, genetic algorithms, pattern-matching algorithms (image recognition is a subset of NN work), rules-based algorithms (e.g. Bayesian), and others.
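For anyone who wants a concrete feel for the lineage mentioned above, here is a minimal sketch (plain numpy, made-up layer sizes, purely illustrative) of the difference between a feed-forward layer and a recurrent step; transformers then drop the recurrence and attend over the whole sequence instead.

```python
# Illustrative only: a single feed-forward layer vs. a single RNN step,
# with arbitrary small dimensions.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 4))   # input -> hidden weights
U = rng.normal(size=(8, 8))   # hidden -> hidden (recurrent) weights
b = np.zeros(8)

def feed_forward(x):
    # Output depends only on the current input.
    return np.tanh(W @ x + b)

def rnn_step(x, h):
    # Output also depends on the hidden state carried over from earlier
    # inputs, which is what lets RNNs model sequences.
    return np.tanh(W @ x + U @ h + b)

x = rng.normal(size=4)
h = np.zeros(8)
print(feed_forward(x).shape, rnn_step(x, h).shape)  # (8,) (8,)
```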
As a tech lead dev, I hire ML specialists and most can't even tell me what GPT stands for in ChatGPT. Some actually have the full background I mention above, but there are very few of these, and even fewer understand the mathematics rather than just how to use Langchain or AWS services.
So it's true there is a skills shortage, but the applicant pool is heavily polluted by people jumping on the AI bandwagon to try and get a lucrative, high-paying job, and this is probably what you are seeing.
Read the article: they're not hiring for "AI" as you think of it - literally "Gen AI" is listed as a skill they want to hire for. You're potentially misreading the room here; the types complaining are the kind of executive described in this tweet:
The sad part is that for most of these jobs you actually do only need to know how to use Langchain or AWS services (ok, maybe not Langchain). The mathematics part is generally just for the interviews.
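To the parent's point, the bar at a lot of these roles really is "can you wire a prompt to a hosted model". A minimal sketch of what that looks like with LangChain's expression language - assuming the langchain-core and langchain-openai packages and an OPENAI_API_KEY in the environment; the model name and prompt are placeholders:

```python
# Minimal sketch: prompt -> hosted chat model -> plain string output.
# Assumes `pip install langchain-core langchain-openai` and OPENAI_API_KEY set;
# the model name and ticket text are placeholders.
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

prompt = ChatPromptTemplate.from_template(
    "Summarize this support ticket in one sentence: {ticket}"
)
llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)  # placeholder model name
chain = prompt | llm | StrOutputParser()

print(chain.invoke({"ticket": "Customer can't reset their password after the 2FA change."}))
```

The hard parts - data governance, evaluation, knowing what you're allowed to feed the model - are exactly what the Kaspersky quote above is pointing at.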
> With AI ‘sucking the air out of almost all non-AI investments in the whole tech world,’ companies are cutting what they believe are unnecessary jobs
Not enough AI on your CV. Is there a php or perl scripting AI yet? I imagine a future of lots of little AIs that know their own out-of-fashion domain talking to a bigger AI. Maybe layers and layers of AIs. I wonder if they'll have meetings or chats.
Maybe we can give them ambition to self-automate or create new layers.
Going by the article, I guess that would be because the applicants don't have the skills that are in demand. The top one is "Business AI skills (i.e. prompt engineers, low/no code developers, business analysts)". Far fewer respondents cited a gap in "Software development skills", which I guess is what most applicants here have.
And it's certainly possible that CS/programming/etc. largely remains a solid engineering (or engineering-adjacent) career, but that some of the geography- and big-tech-specific compensation is mostly an anomaly.
Causality is the other way—the job market being in the toilet is causing the skills shortage. Why don't companies staff up, then?
Yeah, good question. I think they're trying to see how long they can ride along on "what we've got now is good enough", and when they find out they can't, it's going to hurt.
"Along with AI skills, skills in IT ops and cloud development are severely lacking, IDC reports"
Salaries are flat, layoffs are rampant, job openings are in free fall across multiple countries... there is no skills shortage, except maybe in specific areas of ML, and even there not across the board.
Pretty skeptical of this given how many recent CS grads there are (and yeah, I'm aware of the usual complaint of "but they can't pass a basic coding interview!!", but most of the ones I've seen seem pretty competent).
A basic coding interview doesn't really cover what companies are looking for. There's a lot of talk about devops and cloud development, which means they're looking for people who are already familiar with building cloud-native software, rather than just having general software development ability.
At my day job, we actually spend a significant amount of time working with juniors to get them up to speed on deploying to Kubernetes (writing manifests/Helm files, building cloud-ready configurations, handling autoscaling, etc.). The company I work at sees a lot of value in investing senior engineer time into teaching new grads the ropes, but I feel like a lot of companies would rather just hire people who already know how to do this stuff.
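For flavor, this is roughly the shape of the config we walk juniors through - a hedged sketch with hypothetical names, image, and numbers, written as a Python script only so it's self-contained; on a real team it would live in Helm templates or plain YAML:

```python
# Illustrative only: a Deployment with explicit resource requests/limits plus
# an autoscaling/v2 HorizontalPodAutoscaler targeting it. All names, the image,
# and the numbers are made up.
import yaml  # pip install pyyaml

APP, NS = "example-api", "default"
IMAGE = "registry.example.com/example-api:1.0.0"  # hypothetical image

deployment = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": APP, "namespace": NS, "labels": {"app": APP}},
    "spec": {
        "replicas": 2,
        "selector": {"matchLabels": {"app": APP}},
        "template": {
            "metadata": {"labels": {"app": APP}},
            "spec": {
                "containers": [{
                    "name": APP,
                    "image": IMAGE,
                    "ports": [{"containerPort": 8080}],
                    # Requests matter here: the HPA's CPU utilization target
                    # below is measured against the request.
                    "resources": {
                        "requests": {"cpu": "250m", "memory": "256Mi"},
                        "limits": {"cpu": "1", "memory": "512Mi"},
                    },
                }],
            },
        },
    },
}

hpa = {
    "apiVersion": "autoscaling/v2",
    "kind": "HorizontalPodAutoscaler",
    "metadata": {"name": APP, "namespace": NS},
    "spec": {
        "scaleTargetRef": {"apiVersion": "apps/v1", "kind": "Deployment", "name": APP},
        "minReplicas": 2,
        "maxReplicas": 10,
        "metrics": [{
            "type": "Resource",
            "resource": {
                "name": "cpu",
                "target": {"type": "Utilization", "averageUtilization": 70},
            },
        }],
    },
}

# Emit something you could pipe into `kubectl apply -f -`.
print(yaml.safe_dump_all([deployment, hpa], sort_keys=False))
```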
I don't really know how you'd usefully learn any of that unless you already needed to figure it out at some point while on the job.
I'm not a junior - maybe intermediate-to-senior, depending on how one sees it - but I'm definitely seeing these requirements in a ton of jobs, and I don't have them, since I've largely been working frontend in the last few teams I've been a part of.
I know that I need to pivot back to more of a full-stack kind of situation, and I'm working on rebuilding those backend skills, but at this point it feels like I've already missed the boat.
Honestly, in the past year or two I have been fairly impressed with the recent grads I have worked with. On the other hand, I've run across several people who got laid off from one of the big tech companies, and I am shocked at how little they know. Things that should take a week or two, such as setting up an API endpoint to return records or consuming one of our APIs, end up taking them 4+ months to complete.
Is there a word for "creating an air of certainty around that which is otherwise mostly speculation?" This entire article is chock full of speculation.
On HN's front page: a complaint and long discussion about leetcode interviews. Why do companies interview for something that is only an infinitesimal fraction of the job they're hiring for?
So if you're suffering "a critical tech skills shortage", maybe consider stopping using leetcode as an interview technique?
I might be dumb here, but I don't understand the chart of "most important IT skills". What are the percentages of? Why do they add up to much more than 100%?
This honestly makes no sense given the current job market and constant firings. I wish there were some way to normalise what "shortage" orgs think they have against how much they're willing to pay for people with such skills.
Within 2 years, 90% of organizations will suffer a shortage in supply of people willing to be micro-managed for the price they are willing to pay, in the open-office locations they are willing to put forward.
I hate that word in this type of context. I consider it to be mostly meaningless, yet it also leads to decisions like eliminating cost centers such as IT, security, and customer support, because those aren't seen as creating value.
Maybe one factor in why this happens is the expectation that senior engineers and above do less hands-on work and more management and planning, which leads to an overall degradation of technical skills because engineers don't keep specializing in their domain for their entire career.
In other words, the tech skills shortage is self-inflicted, with the industry pushing engineers to treat "business value" as the number one priority instead of mastering their technical skills throughout their career.
Companies always think they have a skills shortage. They can also only onboard (and offboard) a finite number of people, and they cannot, as a viable business proposition, simply outbid every other company for talent.
Early reports indicate that the NHS cyber attack happened because non-technical managers refused to install or upgrade a security patch, believing it was unnecessary. So err yeah...