
Deep Learning Is Eating Software - gk1
https://petewarden.com/2017/11/13/deep-learning-is-eating-software/
======
thisisit
Some of my work is related to traditional BI tools, and not a day goes by
without me hearing someone, especially management, talk about applying machine
learning in some way.

Now, while I can see some aspects that could be improved by using ML, many
people don't understand the implications of such systems. A good example came
last year. The BI system had 200+ reports, with the average user having access
to maybe 20 reports based on their department - finance, sales, etc. There was
a big push from management to "improve" the user experience with ML. "Build a
recommendation system," they said. No cost-benefit analysis was done for using
a complex ML-based recommendation system on 20 reports - it just sounded cool
and in line with the hype.

Then there is the question of user involvement. I believe there is surely a
case to be made for problems that involve software engineers, like the ones
mentioned in the post - search ranking, data center energy usage, etc. But for
things that require non-software engineers, I'm a bit doubtful. This is
because, as the post puts it: "This doesn’t require the same technical skills
as traditional programming, but it _does need a deep knowledge of the problem
domain._ " And engineers cannot be expected to have deep knowledge of every
business function. If they get too drunk on the ML Kool-Aid, and some of them
do, the end result will be a mess.

------
sytelus
Deep learning is certainly eating conferences, funding and PhDs. And that
wasn't a bad thing until everyone got focused on generating another random
architecture that yields another 2% improvement on their favorite dataset so
their paper gets through.

~~~
niyazpk
>> another 2% improvement

2% is HUGE at this point, at least on the datasets that I am familiar with -
ImageNet, MS-COCO, PascalVOC, etc. At this point, any modification or strategy
that gets you a 2% improvement is noteworthy, and I know that people on my
team are looking forward to techniques that will give us these improvements.

~~~
skierscott
> 2% is HUGE at this point

Hell, on MNIST 0.14% is huge. Geoff Hinton created an entirely new
architecture (capsule networks) to get 0.25% error, which is far better than
the baseline 0.39% error [1].

To be fair, he did reduce the relative error by about 36%.

[1]:[https://arxiv.org/pdf/1710.09829.pdf](https://arxiv.org/pdf/1710.09829.pdf)
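Just arithmetic, nothing from the paper beyond the two error rates, but the relative reduction checks out:

```python
# Quick check of the relative error reduction cited above:
baseline = 0.39  # baseline test error on MNIST (%)
capsnet = 0.25   # capsule network test error (%)
reduction = (baseline - capsnet) / baseline
print(f"{reduction:.1%}")  # → 35.9%
```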

------
carterehsmith
Sure. Can we see an example of Deep Learning used to create a regular CRUD
app?

~~~
fny
I wouldn't be surprised to see deep learning optimize a GraphQL backend by
indexing and caching queries as needed.

~~~
carterehsmith
Indexing and caching are optimizations. Who did the real job that was being
optimized?

If my account balance is $5, and I buy those yummy Cheese Swirls for $2, I
expect the balance to be $3 afterward. Deep learning will not help here;
frankly, it can only get this simple calculation wrong. So... no, deep
learning is not eating software. BTW, deep learning IS software, so this is
all just headline-grabbing.

~~~
fny
Deep learning will allow software to infer how your comment maps to a CRUD
operation, and instead of having someone code up a backend and think through a
schema, it'll happen on the fly.

So yes, I suspect deep learning (or ML more generally) will eat up a lot of
the CRUD, glue code, and repetitive data-pipeline-related crap I deal with
daily.

I yearn for the day when I can tell my db "store this" and "give me that"
without having to think of the umpteen* data stores that back it all.

* No joke. We have a real-time feed, a few warehouses, and external data sources we need to wrangle to build applications. From my perspective, some statistical machine could easily do that with a bit of human help.

If you think we should rearchitect our infrastructure, you're right! But
doesn't that seem like something a computer could be good at?

------
Derbasti
Not every problem can be solved by matrix multiplication and gradient descent.
In particular, many problems require _exact_ solutions, not just good
approximations (think aerospace, science, or finance). And in science
especially, many solutions require _insight_ , which ML often can't provide.
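To illustrate the exact-versus-approximate point with a toy one-variable problem (my own made-up example, not from the article): gradient descent walks toward the exact answer but, after finitely many steps, only ever reaches an approximation of it.

```python
# Toy example: minimize f(x) = (x - 3)^2, whose exact minimum is x = 3.
x = 0.0
lr = 0.1  # learning rate
for _ in range(100):
    grad = 2 * (x - 3)  # df/dx
    x -= lr * grad
print(x)  # very close to 3, but still only an approximation
```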

------
halfnibble
After reading this, is machine learning just the coded version of what you do
in Calculus II? Analyzing a scatter plot and trying different equations to get
the correlation coefficient close to 1 so you can predict the next value?
Because that would seem to me to have limited usefulness.

~~~
joepanda
What you describe sounds like regression, which is a powerful technique with
varied applications, but it is not the same as deep learning. Concepts from
calculus do play an important role in ML, notably in the gradient descent
algorithm used to adjust weights. If you're interested in learning more, I
recommend the 3blue1brown series on YouTube.
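As a minimal sketch of that idea (a made-up toy dataset, nothing from the video series): fitting y = w·x + b by gradient descent, with calculus supplying the gradients used to adjust the weights.

```python
# Fit y = w*x + b to a few points by gradient descent on mean squared error.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]  # roughly y = 2x

w, b, lr = 0.0, 0.0, 0.01
for _ in range(5000):
    # Gradients of the mean squared error with respect to w and b:
    gw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    gb = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    w -= lr * gw
    b -= lr * gb

print(round(w, 2), round(b, 2))  # → 1.94 0.15 (the least-squares fit)
```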

