
Deep learning job postings have collapsed in the past six months - bpesquet
https://twitter.com/fchollet/status/1300137812872765440
======
eric_b
I've worked in lots of big corps as a consultant. Every one raced to harness
the power of "big data" ~7 years ago. They couldn't hire or spend money fast
enough. And for their investment they (mostly) got nothing. The few that
managed to bludgeon their map/reduce clusters into submission and get
actionable insights discovered... they paid more to get those insights than
they were worth!

I think this same thing is happening with ML. It was a hiring bonanza. Every
big corp wanted to get an ML/AI strategy in place. They were forcing ML into
places it didn't (and may never) belong. This "recession" is mostly
COVID-related, I think - but companies will discover that ML is (for the vast
majority) a shiny object with no discernible ROI. Like Big Data, I think we'll
see a few companies execute well and actually get some value, while most will
just jump to the next shiny thing in a year or two.

~~~
apohn
"Like Big Data, I think we'll see a few companies execute well and actually
get some value, while most will just jump to the next shiny thing in a year or
two."

Here's another aspect - in many places nobody listens to the actual people
doing the work. In my last job I was hired to lead a Data Science team and to
help the company get value out of Stats/ML/AI/DL/Buzzword. And I (and my
team) were promptly overridden on every decision about which projects and
expectations were realistic and which were not. I left, as did everybody else
who reported
to me, and we were replaced by people who would make really good BS slides
that showed what upper management wanted to see. A year after that the whole
initiative was cancelled.

Back in 2000 I was in a similar position with a small company jumping on the
internet as their next business model. Lots of nonsense and one horrible web
based business later, the company failed.

It's the same story over and over again. Some winners, a lot of losers, many
from self-inflicted wounds.

~~~
mrtksn
If you think about it, that's the natural outcome. Why? Because people in
corporations don't have an incentive to benefit the business, but to advance
their careers - and that's done by meeting the goals for their position and
helping their higher-ups advance their careers too.

So essentially, you have a system where people spend other people's resources
for a living, and their success is judged by making the link above them in
the chain happy. Especially in large companies, it's easy to have a
disconnect from the product, because people at the top specialise in topics
that have nothing to do with the product. If the people at the top want this
shiny new thing that the press and everyone else says is the next big thing,
you'd better give them the shiny new thing if you want a smooth career. In
publicly traded companies this is even more prevalent, because the people who
buy and sell the stock are even more disconnected from the product and tied
to the buzzwords.

The more technically minded people, who have a hunch about the tech, miss the
point of the organisation they are in and get very frustrated. It's probably
the reason why startups can be much more fulfilling for deeply technical
people.

~~~
apohn
>If you think about it, that's the natural outcome. Why? Because people in
corporations don't have an incentive to benefit the business, but to advance
their careers - and that's done by meeting the goals for their position and
helping their higher-ups advance their careers too.

This is one of the reasons I roll my eyes whenever I read something like
"McKinsey says 75% of Big Data/AI/Buzzword projects do not deliver any value."
What's the baseline here? How many of those projects failed and/or delivered
zero value simply because they were destined to fail from the start?

~~~
bonoboTP
> because of silly management decisions?

The whole point is, from their point of view those decisions are rational.
It's much more lucrative from their (managers') personal point of view to
develop a smoke-and-mirrors, looks-good-on-PPT AI project. To be safe from
risk, don't give the AI people too much responsibility, let them "do stuff",
who cares, the point is we can now say we are an AI-driven company on the
brochures, and we have something to report up to upper management. When they
ask "are we also doing this deep learning thing? It's important nowadays!" we
say "Of course, we have a team working on it, here's a PPT!". An actual AI
project would have much bigger risks and uncertainty. I as a manager may be
blamed for messing up real company processes if we actually rely on the AI. If
it's just there but doesn't actually do anything, it's a net win for me.

Note how this is not how things run when there are real goals that can be
directly improved through ML/AI and it shows up immediately on the bottom
line - like ad and recommendation optimization at YouTube or Netflix, or
core product value like at Tesla, etc.

The bullshit powerpoint AI with frustrated and confused engineers happens in
companies where the connection is less direct and everyone only has a nebulous
idea of what they would even want out of the AI system (extract valuable
business knowledge!).

~~~
huffmsa
I think the problem at a lot of places has been wanting "appealing" ML/AI
solutions. The kind you write papers about and put on PowerPoints.

The useful AI/ML isn't glamorous, it's quite boring and ugly. Things like spam
detection, image labeling, event parsing, text classification.

It's hard to get a big, shiny model into direct user facing systems.

~~~
bonoboTP
What would you categorize as shiny in this case? "Spam detection, image
labeling, event parsing, text classification" can be implemented in lots of
ways, simple as well as shiny.

Either way I don't think it matters too much because people can't really tell
simple from shiny as long as the buzzword bullet points are there.

The point is rather that the job of the data science team is to deliver
prestige to the manager, not to deliver data science solutions to actual
practical problems. It's enough if they work on toy data and show "promising
results", with percentages and impressive, serious-looking charts and numbers
on the PowerPoint slides.

I've heard from many data scientists in such situations that they don't get
any input on what they should actually do, so they make up their own
questions and their own tasks to model, which often have nothing to do with
actual business value; they toy around with their models, produce accuracy
percentages, and that's enough.

------
AznHisoka
According to data from Revealera.com, if you normalize the data, the % of job
openings that mention 'deep learning' has actually remained stable YoY:
[https://i.imgur.com/sDoKwD0.png](https://i.imgur.com/sDoKwD0.png)

* Revealera.com crawls job openings from over 10,000 company websites and analyzes them for technology trends for hedge funds.

~~~
james412
I think the fact that the original tweet was not normalized in this
incredibly obvious way is at least one valid reason companies could use
fewer deep learning folks.

~~~
ipsum2
The original tweet is by one of the authors of TensorFlow (specifically
Keras, a large section of the API for v2) - if that's any indication of the
quality of the framework.

------
simonw
Something I've learned: when non-engineers ask for an AI or ML implementation,
they almost certainly don't understand the difference between that and an
"algorithmic" solution.

If you solve "trending products" by building a SQL statement that e.g. selects
items with the largest increase of purchases this month in comparison to the
same month a year ago, that's still "AI" to them.
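That "algorithmic" solution really is just a few lines. A hypothetical sketch in Python (the product names and purchase counts are made up for illustration; a real version would be the SQL statement described above, run against the orders table):

```python
# "Trending products" without any ML: rank items by the increase in
# purchases this month versus the same calendar month a year ago.

def trending_products(this_month, same_month_last_year, top_n=3):
    """Return the top_n products with the largest increase in purchases."""
    growth = {}
    for product, count in this_month.items():
        previous = same_month_last_year.get(product, 0)
        growth[product] = count - previous
    # Sort product names by their growth, biggest increase first.
    return sorted(growth, key=growth.get, reverse=True)[:top_n]

# Hypothetical purchase counts for two comparable months.
august_2020 = {"widget": 120, "gadget": 80, "gizmo": 300}
august_2019 = {"widget": 100, "gadget": 10, "gizmo": 290}

print(trending_products(august_2020, august_2019))
# ['gadget', 'widget', 'gizmo']
```

To a non-engineer asking for "AI-powered trending", this output is indistinguishable from a model's.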

Knowing this can save you a lot of wasted time.

~~~
jon_richards
Any sufficiently misunderstood algorithm is indistinguishable from AI.

~~~
xmprt
In my AI class in college, we learned about first order logic. To me it didn't
seem like we were really learning AI but I couldn't quite put my finger on it.
I guess it's because it made too much sense so in my mind it couldn't be AI.

~~~
jldugger
This is basically a form of the AI effect[1]:

> The AI effect occurs when onlookers discount the behavior of an artificial
> intelligence program by arguing that it is not real intelligence.

[1]:
[https://en.wikipedia.org/wiki/AI_effect](https://en.wikipedia.org/wiki/AI_effect)

~~~
harerazer
Ah yes, the AI version of the no true scotsman fallacy

------
ineedasername
Most data-related problems - that is, extracting knowledge from data - simply
don't benefit from Deep Learning.

In my experience, what many organizations lack is simple but high-quality
"Business Analytics": Reporting & dashboards are developed that look good but
jam too much information together. It is often the wrong information:

Something is requested, and the developer builds exactly what was asked for.
The problem is that it wasn't what was _needed_, because the person making
the request couldn't articulate the question in terms the developer would
understand. The request says "Give me X & Y" when the real question is "I
want to understand the impact of Y on X". The person gets X & Y, looks at it
every day in their dashboard, and never sees much that is useful. The initial
request should always be the start of a conversation, but that often doesn't
happen. A common result is people in departments spending tons of time in
Excel sorting, counting, making pivot tables, etc., when all of that could be
automated.
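That Excel busywork is usually a one-off script. A hedged sketch in plain Python (the ticket rows, the `region`/`status` fields, and the `pivot_count` helper are all invented for illustration; in practice this is a pandas `pivot_table` one-liner):

```python
from collections import defaultdict

def pivot_count(rows, row_key, col_key):
    """Cross-tabulate rows: counts by (row_key, col_key), like a pivot table."""
    table = defaultdict(lambda: defaultdict(int))
    for row in rows:
        table[row[row_key]][row[col_key]] += 1
    return {r: dict(cols) for r, cols in table.items()}

# Hypothetical ticket data that would otherwise be counted by hand in Excel.
tickets = [
    {"region": "EMEA", "status": "open"},
    {"region": "EMEA", "status": "closed"},
    {"region": "APAC", "status": "open"},
    {"region": "EMEA", "status": "open"},
]

print(pivot_count(tickets, "region", "status"))
# {'EMEA': {'open': 2, 'closed': 1}, 'APAC': {'open': 1}}
```

Scheduled to run daily against the real data source, this replaces the manual sort-count-pivot loop entirely.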

This is part of the reason why companies often go looking for some new "silver
bullet" to solve their data problems. They don't have the basics down, and
don't understand the data problems well enough to seek out a solution.

~~~
momokoko
I think we’re starting to see peak managerialism. The latest wave in stats has
shown more than anything that a significant shortfall in basic statistics
knowledge makes it almost impossible to make good decisions with vast amounts
of data.

Without the skillsets to work with and then understand that data, they are
forced into a long process of asking for the data to be put into reports and
dashboards, and then, once they finally get them, either fixating on the
limited metrics provided while being oblivious to context not in front of
them, or being forced to start another long iteration to adjust those reports
and dashboards.

We’ve gone almost 30 years believing management was the sole skill required
to manage teams and companies, but dealing with the new era of data is
starting to show the limits.

------
The_rationalist
I've been observing the state of the art on most NLP tasks for many years. In
2018 and 2019 there was huge progress each year on most tasks; 2020, except
for a few tasks, has mostly stagnated... NLP accuracy is generally not
production-ready, but the pace of progress was quick enough to justify huge
hopes. The root cause of the evil: nobody has built upon the state-of-the-art
pre-trained language model, XLNet, while there are hundreds of variants of
BERT, just because Google is behind it. If XLNet were owned by Google, 2020
would have been different. I also believe that pre-trained language models
have reached a plateau and we need new original ideas, such as bringing
variational autoencoders to NLP and using meta-optimizers such as Ranger.

The most pathetic part is that many major NLP tasks still have an old
BERT-based SOTA just because nobody cared to _use_ (not even improve) XLNet
on them, which is an absolute shame. On many major tasks we could trivially
win several percentage points of accuracy, but nobody qualified bothered to
do it. Where does the money go, then? To many NIH papers, I guess.

There are also not enough synergies: there are many interesting ideas that
just need to be combined, and I think there's not enough funding for that -
it's not exciting enough...

I pray for 2021 to be a better year for AI; otherwise it will be evidence of
a new AI winter.

~~~
bratao
I do not agree with this. I work heavily with NLP models for production in
the legal domain (where my baseline is that an 8GB 1080 must predict more
than 1000 words/sec). This year was when our team glued together enough
pieces of Deep Learning to outperform our previous statistical/old-ML
pipeline, which had been optimized for years.

Little things compound, such as optimizers (Ranger/Adahessian), better RNNs
(IndRNN, Linear Transformers, Hopfield networks) and techniques (caching
everywhere, TorchScript, gradient accumulation training).
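Of the techniques listed, gradient accumulation is the easiest to show in isolation: sum gradients over several micro-batches, then apply one optimizer step. For plain SGD this matches a single large-batch step exactly while using less memory per forward pass. A framework-free sketch with a one-parameter model (the data points are made up; this is not the commenter's pipeline):

```python
# Gradient accumulation: for a linear model y = w * x with squared loss,
# summing per-micro-batch gradients before one update is equivalent to a
# single full-batch gradient step (for plain SGD, no momentum/Adam state).

def grad(w, batch):
    """Gradient of sum((w*x - y)^2) over a batch of (x, y) pairs."""
    return sum(2 * (w * x - y) * x for x, y in batch)

data = [(1.0, 2.0), (2.0, 4.1), (3.0, 5.9), (4.0, 8.2)]
w, lr = 0.0, 0.01

# One full-batch step.
w_full = w - lr * grad(w, data)

# The same step, accumulated over micro-batches of size 2.
accumulated = 0.0
for i in range(0, len(data), 2):
    accumulated += grad(w, data[i:i + 2])
w_accum = w - lr * accumulated

assert abs(w_full - w_accum) < 1e-9
print(w_full)
```

With adaptive optimizers the equivalence is only approximate, since optimizer state updates once per accumulated step rather than per micro-batch.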

~~~
maxlamb
Interesting. What's the main goal(s) of your NLP models?

~~~
bratao
We work on multiple models, all related to legal proceedings and lawsuits,
such as:

\- Structuring Judicial Federal Register texts

\- Identifying entities in legal texts (citations to laws, other lawsuits)

\- Predicting the time to completion, risk, and amount due of a lawsuit

\- Classifying judicial proceedings for non-lawyers

~~~
grumple
How accurate is your prediction of time/risk/amount? How useful is identifying
entities or classifying proceedings?

------
EForEndeavour
While this sounds plausible and has a lot of "prior" credibility coming from
someone as central to deep learning as François Chollet, I'd love to see
corroborating signal in actual job-posting data, from LinkedIn, Indeed,
GlassDoor, etc. Backing up this kind of claim with data is especially
important given the fact that the pandemic is disrupting all job sectors to
varying degrees.

As you can imagine, searching Google for "linkedin job posting data" doesn't
work so great. The closest supporting data I could find is this July report on
the blog of a recruiting firm named Burtch Works [1]. They searched LinkedIn
daily for data scientist job postings (so not specifically deep learning) and
observed that the number of postings crashed between late March and early May
to 40% of their March value, and have held steady up to mid-June, where the
report data period ends.

There's also this Glassdoor Economic Research report [2], which seems to draw
heavily from US Bureau of Labor Statistics data available in interactive
charts [3]. The most relevant bit in there is that the "information" sector
(which includes their definitions of "tech" and "media") has not yet started
an upward recovery in job postings, as of July.

[1] [https://www.burtchworks.com/2020/06/16/linkedin-data-
scienti...](https://www.burtchworks.com/2020/06/16/linkedin-data-scientist-
job-postings-stabilizing-is-recovery-around-the-corner/)

[2] [https://www.glassdoor.com/research/july-2020-bls-jobs-
report...](https://www.glassdoor.com/research/july-2020-bls-jobs-report/)

[3] [https://www.bls.gov/charts/employment-
situation/employment-l...](https://www.bls.gov/charts/employment-
situation/employment-levels-by-industry.htm)

~~~
deepGem
Here are some data points from March. [https://towardsdatascience.com/whats-
happened-to-the-data-sc...](https://towardsdatascience.com/whats-happened-to-
the-data-science-job-market-in-the-past-month-88c748a4cd25)

~~~
EForEndeavour
I actually found this, but decided not to post it because it only captures the
first few weeks of post-crisis patterns, and doesn't contextualize any of the
deep-learning-specific job losses against the broader job market, which as we
all know was doing the same thing, directionally. It would be really cool to
get an updated report of that level of detail from the author, who seems
active on Twitter
([https://twitter.com/neutronsneurons](https://twitter.com/neutronsneurons)),
but not Medium: that April job report is his latest article.

------
supergeek133
I feel like it was also a classic case of running before we could crawl.
Jumping from A to Z before we could go from 0 to 1.

I work at a residential IoT company; there are quite a few really valid use
cases for Big Data and even ML (think predictive failure).

We hired more than one expensive data scientist in the past few years, and
had big strategies more than once. But at the end of the day it's still
"hard" to answer a question like "if I give you a MAC address, give me the
runtime for the last 6 months".

We're trying to shoot for the moon, when all I've ever asked for is an API
that shows me the indoor temp for a particular device over a long period.

~~~
pbourke
Everyone wants to fire up Tensorflow, Keras and PyTorch these days. Fewer
people want to work in Airflow and SSIS, spend days tuning ETL, etc. This is
the domain of data engineering, which bridges software engineering and data
science with a dash of devops. I’ve been working in this field for a couple of
years and it’s clear to me that data engineering is a necessary foundation and
impact multiplier for data science.

~~~
jnwatson
Don't forget data cleaning. A huge issue I've seen is just getting sufficient
data of a high enough quality.

Also, (for supervised classification problems) labelling is a big problem.

It is almost as if we need a "data janitor" title.

~~~
ajb
Phht you don't want to call it data janitor; no-one good will want that title.
At least call it Data Integrity Engineer or something reasonably high-status.

~~~
anthuswilliams
At my company we have just created a position called Data Steward.

------
joelthelion
Meh, only for people who bought into the hype without real use cases. Which I
agree may be numerous.

In my company though, we've been applying DL with great success for a few
years now, and there are at least five years of work remaining. And that's not
spending any time doing research or anything fancy: just picking the low-
hanging fruit.

~~~
freyr
I think many companies have real problems, but find that DL ends up being a
poor solution in practice for various reasons.

You need not only real use cases, but use cases that happen to fit well with
DL's trade-offs and limitations. I think many companies hired with very
unrealistic expectations here.

------
bane
I'm managing some teams right now that do a mix of high-end ML stuff and more
prosaic solutions. The ML team is smart, and pretty fast with what they do,
but they tend to (as many comments here have mentioned) focus on delivering
only PhD-level work. This translates into taking simple problems and trying
to deorbit the ISS through a wormhole onto them rather than just getting
something in place that answers the problem.

In conjunction with this, it turns out 99% of the problems the customer is
facing, despite their belief to the contrary, aren't solved best with ML, but
with good old fashioned engineering.

In cases where the problem can be approached either way, the ML approach
typically takes much longer, is much harder to accomplish, has more
engineering challenges to get it into production, and the early ramp-up stages
around data collecting, cleaning and labeling are often almost impossible to
surmount.

All that being said, there are some things that are only really solvable with
some ML techniques, and that's where the discipline shines.

One final challenge is that a lot of data scientists and ML people seem to
think that if a problem isn't being solved with a standard ML or DL algorithm
then it _isn't_ ML, even if it has all the characteristics of an ML problem.
The gatekeeping in the field is horrendous, and I suspect it comes from
people who don't have strong CS backgrounds wrapping themselves too tightly
in their hard-earned knowledge rather than taking an expansive view of what
can solve these problems.

~~~
danielscrubs
Get your math and your domain knowledge straight and you can do a lot with
little. Lots of programmers want to be ML engineers because the prestige is
higher, since the field normally takes in PhDs. The big problem is hype:
people are throwing AI at everything as... garbage marketing. It's at the
point where if you say you use AI in your software's title, I know you suck,
because you aren't focusing on solving a problem, you're focusing on being
cool, and that will never end well.

------
softwaredoug
There's a lot of what I call "model fetishism" in machine learning.

Instead of focusing our energies on the infrastructure and the quality of the
data around machine learning, there's an eagerness to take bad data to very
high-end models. I've seen it again and again at different companies, almost
always with disastrous consequences.

A lot of these companies would do better to invest in engineering and domain
expertise around the problem than worry about the type of model they're using
to solve the problem (which usually comes later, once the other supporting
maturity pieces are in place)

~~~
actusual
This is why my interview question focuses on applying linear regression to a
complex domain. It weeds out an enormous number of candidates.

There are 5 ML models that we maintain where I work, and none of them is more
complicated than linear regression or random forests. Convincing me to use
something more complex would take an enormous amount of evidence. Domain
knowledge is king.

------
arcanus
This is an anecdote with no data. And the entire global economy is in a
recession, so the fact that deep learning might have fewer job postings isn't
particularly notable.

I'll note that, anecdotally, the megacorps remain as interested in and as
active in hiring for ML as ever.

~~~
arvindch
He's now posted a follow-up analysis of LinkedIn Job postings:
[https://twitter.com/fchollet/status/1300417952211034112?s=20](https://twitter.com/fchollet/status/1300417952211034112?s=20)

~~~
nibnalin
Would be interesting to see this dip relative to other tech subfields like
javascript/react or even data science and other such keywords. Does anyone
know of a public LinkedIn dataset?

The author disables tweet replies so I'm not sure where they get their numbers
from.

------
occamrazor
Missing in the original chart/data: have ML/DL job postings decreased more or
less than other comparable job categories (programming, business analyst,
etc.)?

~~~
mritchie712
Great point. A lesser one: is looking for PyTorch and TF even the right
measure?

~~~
siliconvalley1
In his tweet I thought he made it clear he wasn't predicting an AI-specific
slowdown but a universal recession due to COVID?

~~~
proverbialbunny
He's wrong, and not using the best data for such an assertion.

Data science jobs are not slowing down, though they're not really increasing
either.

In comparison, since 2016, software engineering jobs revolving around
building up systems for data scientists have increased sixfold - maybe even
more since I last looked.

------
dcolkitt
99% of the time you don't need a deep recurrent neural network with an
attention-based transformer. Most times, you just need a bare-bones logistic
regression with some carefully cleansed data and thoughtful, domain-aware
feature engineering.
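As a concrete illustration of that baseline, here is a bare-bones logistic regression trained by gradient descent in plain Python. The "domain-aware feature" (failed-login ratio) and the six data points are invented for the sketch; in practice you'd reach for scikit-learn, but nothing about the method needs it:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logreg(xs, ys, lr=0.5, epochs=2000):
    """Fit w, b for P(y=1|x) = sigmoid(w*x + b) by stochastic gradient descent."""
    w = b = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = sigmoid(w * x + b)
            # Gradient of the log-loss for a single example is (p - y) * x.
            w -= lr * (p - y) * x
            b -= lr * (p - y)
    return w, b

# Hypothetical domain-aware feature: ratio of failed logins to total logins,
# predicting whether an account is compromised.
xs = [0.05, 0.10, 0.15, 0.70, 0.80, 0.95]
ys = [0, 0, 0, 1, 1, 1]

w, b = train_logreg(xs, ys)
predictions = [1 if sigmoid(w * x + b) > 0.5 else 0 for x in xs]
print(predictions)  # the data is separable, so all six are classified correctly
```

The model is one weight and one bias; all the leverage is in choosing the feature, which is exactly the point being made above.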

Yes, you're not going to achieve state-of-the-art performance with logistic
regression. But for most problems, the difference between SOTA and even
simple models is not nearly as large as you might think. And even if you're
cargo-culting SOTA techniques, it's probably not going to work unless you're
at an org with an 8-digit R&D budget.

------
tomhallett
I know very little about the DL/ML space, but as a full-stack engineer it
feels like most companies have tried to replicate what FAANG companies do
(heavy investment in data/ml) when the cost/benefit simply isn't there.

Small companies need to frame the problem as:

1) Do we have a problem where the solution is discrete and already solved by
an existing ML/DL model/architecture?

2) Can we have one of our existing engineers (or a short-term contractor) do
transfer learning to slightly tweak that model to our specific problem/data?

Once that "problem" actually turns into multiple "machine learning problems",
or an "oh, we just need to do this one novel thing", they will probably need
to bail, because it'll be too hard/expensive and the most likely outcome will
be no meaningful progress.

Said another way: can we expect an engineer to get a fastai model up and
running very quickly for our problem? If so, great - if not, then bail.

ie: the solution for most companies will be having 1 part-time "citizen data
scientist" [1] on your engineering team.

[1]: [https://www.datarobot.com/wiki/citizen-data-
scientist/](https://www.datarobot.com/wiki/citizen-data-scientist/)

------
kovac
The way I see it, only those companies that had already been using a data
oriented approach to business can really reap the benefits of ML. From a
company's point of view, ML/AI should be a natural evolution of an existing
tool set to better solve problems they have been trying to solve in the past
using deterministic methods and then statistical methods, etc. Any other
project that is diving right into ML is likely to fail because

1\. There's no clear problem statement. They have never formulated one, and
are now trying to bolt ML onto their decision-making.

2\. They don't have well catalogued data for engineers/scientists to work with
because they never tried to do rigorous analysis of data before ML became a
thing.

3\. Managers have no idea how to deal with data driven insights. What if the
results are completely unintuitive to them? Are they going to change their
processes abruptly? What if the results are aligned with what they have been
always doing? Is it worth paying for something that they have been doing
intuitively for decades?

I'm not a data scientist. But the biggest complaint I hear from my colleagues
is that they lack data to train models.

~~~
scollet
Yeah, you really shouldn't conform data to the problem. It's more an emergent
silver gun than a constructed silver bullet.

------
lm28469
Isn't it the same pattern every 10 years or so for "AI"-related tech? Some
people hype tech X as a game changer - tech X turns out to be way less
amazing than advertised - investors bail out - tech X dies - rinse and
repeat.

[https://en.wikipedia.org/wiki/AI_winter](https://en.wikipedia.org/wiki/AI_winter)

~~~
rjtavares
This is more akin to the Internet bubble than the previous AI winter. The
technology is valuable for business, but the hype is huge and companies aren't
ready for it yet.

------
nutanc
AI has a business problem.

Very few businesses I know actually have a deep learning problem. But they
want a deep learning solution. Lest they get left out of the hype train.

~~~
rjtavares
Blockbuster didn't have an Internet problem.

~~~
discreteevent
Dentistry didn't have a sledgehammer problem and, after all these years, it
still doesn't.

------
calebkaiser
"This is evident in particular in deep learning job postings, which collapsed
in the past 6 months."

Have they? Specifically, have they "collapsed" relative to the average decline
in job listings mid-pandemic?

------
fnbr
I am a DL researcher at a top industry lab.

I'm completely unsurprised by this. Regularly, at lunch, I'll ask my coworkers
if they know of any DL applications that are making O($billions), and no one
knows any outside of FAANG.

FAANG is making an insane amount of money due to DL. Outside of them though, I
don't know who's making money here. When I was interviewing for jobs, there
were a ton of startups that were trying to do things with DL that would have
been better done with a few if statements and a random forest, and that had a
total market size in the millions.

I think that, eventually, there'll be a market for this stuff, but I'm not
convinced that it's anywhere near being widespread.

I was also a consultant before my current role. The vast majority of non-tech
firms don't have their data in well organized + cleaned databases. Just moving
from a mess of Excel sheets to Python scripts + SQL databases would have made
a HUGE difference to the vast majority of clients I worked with, but even that
was too big of a transformation.

Basically, everyone with the sophistication to take advantage of DL/ML already
has the in-house expertise to do it. There's almost no one in the intersection
of "Could make $$$ doing DL" && "Has the technical infrastructure to integrate
DL".

------
ur-whale
That may be true in the research arena (where Mr Chollet works), but I don't
think that's the case in terms of where deep learning is actually applied in
industry, nor will it be the case for years to come IMO.

It's just that much of what needed to be invented has been invented, and now
it's time to apply it everywhere it can be applied, which is a great many
places.

------
insomniacity
Some context, for those unfamiliar:
[https://en.wikipedia.org/wiki/AI_winter](https://en.wikipedia.org/wiki/AI_winter)

~~~
cochne
The poster explicitly states he does not think this is indicative of an AI
winter.

~~~
the-dude
Mentioning context does not mean the OP assumes equivalence.

It is context.

------
jungletime
I've been using voice commands on my Android phone in situations where I
can't use my hands. Most often all I want to do is:

1\. Start and stop a podcast.

2\. Play music

3\. Ask for the time

The phone understands me, but then Android breaks the flow, so I have to use
my hands.

1\. It will ask me to unlock the phone first. I have gloves and a mask on; it
won't recognize my face, and my gloves don't register touches. Why do I have
to unlock the phone to play music in the first place?

2\. It gets confused about which app to play the music/podcast on. It wants
to open the YouTube app, or Spotify, and so on...

3\. Not consistent. I can say the same thing, and sometimes it will do one
thing, and another thing the next time.

4\. If I'm playing a video and I want to show it full screen, I have to
maximize it and touch the screen. Why can't it play full screen by default?

~~~
physicsguy
I have a similar experience with my Google Home; it can play Netflix but
can't work out BBC iPlayer most of the time. And many times, if I ask it to
play music, it'll give an error saying it can't play on YouTube Music because
I don't have a subscription, even though the default music player in my
account is Spotify.

------
mijail
My favorite joke on this is "The answer is deep learning, now what's the
problem?"

~~~
proverbialbunny
labeled data

------
Kednicma
It's not exactly a great year for extrapolating trends about what people are
doing with their time. I wonder how much of this is 2020-specific and not just
due to the natural cycle of AI winters.

~~~
hprotagonist
at least some is pure 2020. we want to hire, we can’t right now.

~~~
abrichr
Why not? I would have thought it was a buyer’s market now with all the
layoffs.

~~~
freeone3000
There's tons of layoffs because businesses are doing _really badly_. Current
cashflow may not support another developer. Future cashflow doesn't look that
great in any B2C market, either, and the B2B markets will start to look slim
pickings not too far after that.

~~~
abrichr
I’m not sure I follow. My question was directed toward OP, whose company is
hiring, which presumably means they are doing well. Can you please clarify?

------
Ericson2314
Finally! Big companies need to realize they must understand what they are
doing with technology to get any value out of it.

They've long resisted that, of course, but I'm pretty sure half the
popularity of deep learning was that it leveled the playing field, making
engineers as ignorant of the inner workings of their creations as the middle
managers.

May the middle-manager-fication of work, and the acceptance of ignorance that
goes with it, fail.

\-----

Then again, I do prefer it when many of those old moronic companies flounder,
so maybe it's a bad thing that they're wising up.

------
SomeoneFromCA
Deep Learning has become mainstream. The place I work at actually uses 2
unrelated products based on NNs.

------
poorman
I imagine this correlates to the "blockchain" postings.

------
not2b
I would have expected a comparison to job postings in general: how do deep
learning job postings compare to job postings for any kind of technical
position?

------
realradicalwash
Meanwhile, the academic job market, certainly in my area, ie
linguistics/computational linguistics, has collapsed, too. A colleague did a
similar and equally nice analysis here:
[https://twitter.com/ruipchaves/status/1279075251025043457](https://twitter.com/ruipchaves/status/1279075251025043457)

It's tough atm.

------
whoisjuan
Companies keep trying to add machine learning to everything they do, as if
that's going to solve all their problems or unlock new revenue streams.

80 or 90% of what companies are doing with machine learning results in systems
with a high computing cost that are clearly unprofitable if seen as revenue
impacting units. Many similar things can be achieved with low-level heuristics
that result in way smaller computing costs.

But nobody wants to do that anymore. There's nothing "sexy" or "cool" about
breaking down your problems and trying to create rule-based systems that
address them. Semantic software is not cool anymore; what became cool is this
super expensive black box that requires more computing power than regular
software. Companies have developed this bias for ML solutions because they
seem to have unlimited potential for solving problems, so it looks like a good
long-term investment. Everyone wants to take that bus.

Don't get me wrong. I love ML, but people use it for the stupidest things.
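As a toy sketch of the rule-based alternative described above (every field
name and threshold here is made up purely for illustration):

```python
# Hypothetical example: flagging risky transactions with simple rules
# instead of a trained model. Cheap to run, trivial to audit and explain.

def is_risky(txn: dict) -> bool:
    """Rule-based heuristic; all thresholds are invented for this sketch."""
    if txn["amount"] > 10_000:                      # unusually large transfer
        return True
    if txn["country"] not in txn["usual_countries"]:  # unfamiliar location
        return True
    if txn["hour"] < 5:                             # middle-of-the-night activity
        return True
    return False

print(is_risky({"amount": 50, "country": "US",
                "usual_countries": {"US"}, "hour": 14}))  # False
```

A handful of rules like this won't match a model's recall, but they cost
nothing to serve and anyone can read them.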

------
kfk
Data science and ML in big companies are pulling resources away from the real
value-add activities like proper data integrity, blending sources, and
improving speed and performance. Yes, Business Intelligence is not cool
anymore. Yes, I also call my team “data analytics”. But let’s not forget the
simple fact that “data driven” means we give people insights when and where
they need them. Insights could come from a SQL GROUP BY, ML, AI, or watching
the flight of birds, but they are still simply a data point for some human to
make a decision. That means we need to produce the insight, be able to
communicate it to people, and have the credibility for said people to actually
listen to what we are saying. Focusing on how we put that data point together
is irrelevant; focusing on hiring PhDs to do ML is most likely going to end in
failure, because PhDs are not predictive of great analytical skills.
Experience and things like SQL are much better predictors.

------
andrewprock
On the plus side, ML systems have become commoditized to the point that any
reasonably skilled software engineer can do the integration. From there, it
really comes down to understanding the product domain inside and out.

I have seen so many more projects derailed by a lack of domain knowledge than
I have seen for lack of technical understanding in algorithms.

------
spicyramen
Every company is of course very different, but I have seen that companies
understood that for Deep Learning you need a PyTorch or TF expert, or maybe
some other framework, and most of these experts already work at
Google/Facebook or other advanced companies (NVIDIA, Microsoft, Cruise, etc),
so hiring is very difficult and cost is high. Then you can start using regular
SQL and/or AutoML to get some insights, and for a large number of companies
that's enough. When there is so much complexity, such as in DL modeling,
there's little transparency, and management wants to understand things. After
COVID, time will tell, but my take is that only a few companies need DL.

------
atsushin
I'm currently a masters student and I'm rather glad I opted not to take a
specialized degree such as Machine Learning, taking on computer science
instead. All this discussion about DS, ML, AI (and even CS) becoming
oversaturated has made me rather wary and I worry that I'm choosing the wrong
'tracks' to study (currently doing ML and Cybersecurity as I genuinely am
interested in those fields). I won't be graduating until next year but I'm
forcing myself to be optimistic that the tech job market will be in a better
place by then.

------
x87678r
In general does anyone know if its a good time to look for a new dev job? I
was really going to move this year, but it seems sensible to wait. Just sucks
to see friends with RSUs going up in value so quickly.

~~~
flavor8
No harm in having a recruiter or two feed you opportunities on a regular basis
to interview at (just be up front with them that you're holding out for a
solid fit for your criteria). Better to have a job while interviewing than be
under pressure to accept the first half decent thing that comes along.

------
gdsdfe
For most companies ML is just part of the long term strategy, with covid
priorities have shifted from long term R&D to short term survival, so I don't
see anything out of the ordinary here

------
samfisher83
A lot of the C-suite folks aren't tech folks or even math folks. They want to
try to use deep learning to do prediction or get some insight when something
as simple as regression would have worked.

~~~
Barrin92
What particularly surprised me is how effective gradient boosting is in
practice. I've seen so many cases of real-world applications where just using
catboost or whatever worked ~95% as well, or even just as well, as some super
complicated deep learning approach, at a tenth of the cost.
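A minimal sketch of that "just use gradient boosting" baseline on synthetic
data, using scikit-learn's GradientBoostingClassifier as a stand-in for
catboost:

```python
# Toy tabular-classification baseline: fit a gradient-boosted ensemble
# on synthetic data and report held-out accuracy.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = GradientBoostingClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.2f}")
```

No GPUs, no serving infrastructure, a few seconds of CPU time; on most
tabular problems this is the bar a deep model has to beat.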

~~~
disgruntledphd2
To be fair, if you're willing to write code to perform feature engineering for
you, you can often replace the complicated boosting approach with a much
simpler regression model.

Turtles all the way down, I guess.
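As an illustration of the point above, a hypothetical case where one
engineered interaction feature lets plain logistic regression handle a
nonlinear (XOR-like) target:

```python
# Feature engineering in place of a boosted ensemble: degree-2 polynomial
# features add the x0*x1 interaction term, which makes this XOR-like
# target linearly separable for a simple logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)   # label = sign of the product

model = make_pipeline(PolynomialFeatures(degree=2), StandardScaler(),
                      LogisticRegression())
model.fit(X, y)
print(f"train accuracy: {model.score(X, y):.2f}")
```

Without the interaction term a linear model is helpless here; with it, the
"complicated" model buys you nothing.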

------
tanilama
Deep Learning has been so commoditized and compartmentalized over the past 5
years that I now think an average SDE with some basic understanding of it can
do a reasonable job in applications.

------
dboreham
There will always be Snake Oil salesmen, and hence Snake Oil.

------
camoverride
I don't think anyone should freak out when they see a tweet like this: deep
learning is just one particularly trendy part of ML, which is just one piece
of data science, which is just one job title in the "working with data" career
space. I think that most people with backgrounds or interests in DL are very
well equipped to participate in the (ever more important) data science world.

------
alpineidyll3
Booms imply crashes. Anyone who is surprised at this couldn't be smart enough
to be a good machine learning engineer.

------
code4tee
No question ML is powerful and can do great things. Also no question a lot of
companies were just throwing money at stuff for fear of being seen as behind
in this space. When the going gets tough, such vanity efforts are the first
things to go.

Teams adding measurable value for their companies should be fine but others
might not be.

------
astrea
In my industry (research), we still have a strong line of business. Some
commercial clients have killed their contracts with us to save money during
the COVID era, but government contracts are still going strong. In areas where
there's a clear use case I think there is still work to go around.

------
darepublic
My belief in an AI breakthrough is so strong that I would invite another AI
winter to try to play catch up

~~~
mac01021
What is your belief based on?

------
ponker
The graph means very little without a comparison line of “all programming
jobs” and/or “all jobs.”

------
m0zg
Out of curiosity: are there job postings that did not "collapse" over the past
six months?

------
Traubenfuchs
Good riddance. The majority of it is snakeoil, relabeling and "smoke and
mirrors". A lot of smart or lucky people made a lot of money, a lot of dumb
people with power over money lost... probably insignificant amounts of it.

------
emmap21
ML/DL is at the exploratory phase for most companies, so I'm not surprised by
this post. Nevertheless, it also opens new opportunities in other domains and
new kinds of business based on data. I have no doubt.

------
ISL
Is there a LinkedIn tool that allows you to make similar trend plots as shown
in the Twitter thread, or has the author been archiving the data over time?

------
rch
Unless you're doing ML/DL/etc _research_ then what you're really doing is
engineering, like always.

------
make3
the fact that he doesn't allow people to reply to his tweets while making
data-less claims like this is really a problem

~~~
itg
He labels anyone who criticizes him as a troll. Unfortunately he is a public
figure in the ML space and does have his share of trolls, but doesn't take too
well to even well thought out replies.

~~~
make3
he's so French, in the worst way possible. I say that as a French person
myself

~~~
eanzenberg
Also his analysis is shoddy. He shows an absolute decrease in DL job postings
since COVID hit and claims that DL is in decline, irrespective of whether
other fields like SWE are in a similar decline. Utterly surprised by the
analysis given the data.

------
hankchinaski
covid has certainly sped up the transition to the "plateau" state in the
ML/DL/AI hype cycle

------
dgellow
Is that a worldwide trend, or is it based on US data? That's not clearly
stated in the tweet.

------
MattGaiser
How does that compare to job postings overall? Those would have fallen off a
cliff as well.

------
phre4k
If you ever talked to one of the self proclaimed 'AI experts,' you know why.

------
magwa101
Sufficient DL frameworks are now in the cloud and it is mostly an engineering
problem.

------
SrslyJosh
I guess nobody's model... _puts on sunglasses_ ...predicted this event.

------
pts_
I have seen ML and big data crowd out remote openings though.

------
bitxbit
And yet data center spend has gone through the roof. Why?

------
arthurcolle
Why was this headline changed?

------
booleanbetrayal
I believe this to be an obvious sign that the Singularity has already occurred.

------
recursivedoubts
memento mori:
[https://en.wikipedia.org/wiki/AI_winter](https://en.wikipedia.org/wiki/AI_winter)

------
rahimiali
Citation needed.

------
eanzenberg
This needs to be normalized against the overall job posting collapse in the
past 6 months, unless you expect DL jobs to grow while everything shrinks? I'm
somewhat surprised by the analysis from someone who's “data driven.” I mean,
he even says as much in the twitter thread:

“To be clear, I think this is an economic recession indicator, _not_ the start
of a new AI winter.”

So, looks like he discovered an economic recession.

~~~
AznHisoka
If you normalize the data, there is absolutely 0% change in the # of job
openings for deep learning:
[https://i.imgur.com/sDoKwD0.png](https://i.imgur.com/sDoKwD0.png)

