
Your AI skills are worth less than you think - szopa
https://medium.com/@szopa/your-ai-skills-are-worth-less-than-you-think-e4b5640adb4f
======
cjlars
I had the same moment of realization a few years ago when my workplace decided
to hold a Kaggle-style competition to select an ML consultancy to prioritize
incoming prospective customers. The consultants had given us their sales pitch
about how they had the best algorithms, credentialed specialists, proprietary
data and so forth. Meanwhile I was just an Excel Monkey who convinced his boss
to let me teach myself ML on the company clock.

So off I go to pull my data and spend about half a day building a script to
score the leads with a randomforest package, literally teaching myself the
syntax as I go. A couple weeks later, the scores come back and I'm second of
seven, beating six companies with multi-million dollar funding rounds.

My secret? Nothing really. I knew the data well, so I created a few features
that the outside consultants missed, which evidently made up for my lack of
ML skills. Turns out the 'proprietary data' they were touting was basically
just census data, and none of their engineers had bothered to think through
the real-world situation on the ground; they just fed the data into their
systems raw.
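
To make the approach concrete, here's a rough sketch of that kind of
lead-scoring script: a couple of hand-built domain features plus an
off-the-shelf random forest. All field names, thresholds, and data here are
invented for illustration; this is not the commenter's actual code.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 500

# Synthetic stand-ins for raw CRM export fields (all invented).
company_size = rng.integers(1, 5000, n)
days_since_contact = rng.integers(0, 365, n)
pages_viewed = rng.integers(0, 40, n)

# Domain-informed features: the kind of thing someone who knows the
# business can add, and raw third-party data won't contain.
is_recent = (days_since_contact < 30).astype(int)
engagement = pages_viewed / (1 + days_since_contact)

X = np.column_stack([company_size, days_since_contact,
                     pages_viewed, is_recent, engagement])
# Fake labels loosely tied to engagement, just so the demo trains.
y = (engagement + rng.normal(0, 0.05, n) > 0.1).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)
scores = model.predict_proba(X)[:, 1]   # one conversion score per lead
ranked = np.argsort(scores)[::-1]       # best prospects first
```

Half a day of this kind of work is plausible precisely because the library
does all the modeling; the features are the only part that needs thought.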

I think there are some really interesting opportunities for people who are
working on the cutting edge of ML with things like self-driving cars, AlphaGo,
and others, but there are an awful lot of business problems that can be solved
with 'good enough' solutions built from relatively simple, undergrad-level
algorithms and good domain knowledge.

~~~
shoo
well done. good on your workplace for setting up a competition to benchmark
competing solutions & good on you for learning how to do it yourself.

> there are an awful lot of business problems that can be solved with 'good
> enough' solutions built from relatively simple, undergrad-level algorithms
> and good domain knowledge

i agree. also, the subsystem making predictions is but one part of the overall
system, and the performance of the overall system is often most impacted by
the weakest link in the chain. this might not be the predictions, it might be
data quality, or how well the system can execute on the output of the
predictions.

in some sense this is freeing --- it can be incredibly challenging - or
perhaps impossible - to make "optimal" decisions, but in a lot of places
there's value in just doing a good enough job.

------
Ragib_Zaman
I have a background in mathematics and am currently studying Machine Learning
in a Master's program. Soon after I started, I came to the same conclusion as
the author of the article. It is becoming very easy for someone with basic
coding ability and shaky mathematical foundations to do some online courses,
emulate a few tutorials, or maybe even sign up for a bootcamp and, voilà! Six
months later they are more than competent enough to quickly put together ML
solutions that come close to the 'optimal' solution requiring much more
in-depth knowledge and engineering time. Close enough that the quality and
amount of data and the feature engineering (basically, domain knowledge
applied through general problem-solving skills rather than ML skill) matter
far more than the details of any model.
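
A toy illustration of that last point, on entirely synthetic data: the same
simple model, given one well-chosen feature, beats itself on the raw inputs.
The target here (sign agreement between two inputs) is invented to make the
effect obvious.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
x1 = rng.normal(size=1000)
x2 = rng.normal(size=1000)
# XOR-like target: positive when the two inputs agree in sign.
# Not linearly separable in the raw features; trivially separable
# in their product.
y = (x1 * x2 > 0).astype(int)

raw = np.column_stack([x1, x2])
engineered = (x1 * x2).reshape(-1, 1)   # one domain-informed feature

acc_raw = LogisticRegression().fit(raw, y).score(raw, y)
acc_eng = LogisticRegression().fit(engineered, y).score(engineered, y)
```

No amount of tuning the model on the raw features closes that gap; knowing
which feature to construct does.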

This is why, when I start applying for jobs soon, I plan to frame my skill
set as that of a quantitative problem solver, not an ML specialist (as I had
thought I would when I started this Master's degree). Soon the value of being
able to build a good ML model will decline, but hopefully the skill set of
those with proper training and understanding of this field (mathematical
ability, computer science/programming ability, and problem-solving ability)
will still be valuable.

------
headcanon
If you define "AI skills" as being able to take some cleaned data and plug it
into some Python library to solve some pre-defined objective a la kaggle, then
sure, that's no more or less easy than any other kind of data transformation.
The difficult part is coming up with said data and objectives, being able to
ask the right questions, and integrating it into an actual monetizable
product. There is still a huge long tail of applications that these classes of
algorithms promise, even if the number of PhDs optimizing these algorithms
against ImageNet or MNIST plateaus.

I see "AI skills" as more of a form of literacy as opposed to python-notebook-
jockeying. That is, having a general knowledge of what these algorithms are
good for, being able to recognize when and how a classifier or a regressor
could help, and compiling the data necessary to accomplish it.

I work at an "AI" startup as an engineer, mostly on the applications side. We
have a human analyst team that annotates our data for us, with the idea that
we will eventually supplement that team with a classifier that will help us
scale the operation with a similar team size. Data quality is certainly a
challenge, and a lot of that is because the in-house app that our analysts use
isn't very flexible, and they have to keep track of much of their work with
Excel spreadsheets, which adds a lot of mental overhead to their jobs.
Additionally, a lot of the work is necessarily subjective, and the analysts
have to move quickly through the data to annotate it, which leaves room for
error. I realized that by improving the UX of our internal apps with good ol'
application design/engineering, we can improve our data quality, and thereby
improve our model. I consider that an "AI skill" even if I never have to touch
a python notebook to accomplish it.

------
Eridrus
So I got started working on ML in 2015 as well, and it has gotten far easier,
but the thing that has gotten easier is the engineering side of things, not
the ML bits.

And tbh, "take the latest model and tune it on your data" has gotten easier,
but not trivial, since the code people release for the latest model is what I
would call "research quality" code. Some labs will release production-ready
code, but it's pretty rare. Particularly as more labs pick up PyTorch and you
realise you can't afford to run that in prod.

I think the author is right about basically everything else he says, but I
also don't think it's an argument against picking up these skills, since
they're going to be foundational to basically everything going forward IMO. So
while there won't be a huge premium for ML engineers, it will be a necessary
skill set if you want to work on challenging problems.

And at some point the curve for adding data flattens out, and it becomes
cheaper to hire engineers to think than subject matter experts to label data.

------
corporateguy55
I’ve noticed this as well. I would argue that ML might be easier than regular
programming. Oftentimes it’s clear-cut which model to use, there are enough
examples online to follow, and the amount of code is minimal. ML is no
different from any other programming domain that has a good framework: it
quickly becomes plug and play.
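
For what it's worth, the "plug and play" claim is easy to demonstrate with
scikit-learn's bundled iris dataset: a working classifier really is a handful
of lines (this is just the stock library workflow, nothing more).

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# The entire "pipeline": load data, split, fit, score.
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)   # held-out accuracy
```

Of course, the iris data arrives pre-cleaned with a pre-defined objective,
which is exactly the part that's hard in practice.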

------
thatoneuser
This seems more like a reflection of how cutthroat tech startups are than
anything specific to AI.

------
guard0g
Totally agree. It's all about the ML data and RL experiences: not the
quantity but the quality.

