
One More Reason Not to Be Scared of Deep Learning - amplifier_khan
http://www.lab41.org/one-more-reason-not-to-be-scared-of-deep-learning/
======
EwanG
Quick Summary - Not having a lot of data to work with won't affect your
results as much as you might think, and so shouldn't prevent you from trying
out Deep Learning.

~~~
joe_the_user
The thing about that is it might depend on the data set you work with.

I have the impression that machine learning is kind of torn between
accomplishing things that humans can do but machines can't, and trying to
extract useful information from inherently uninformative data sets.

I can see that recommendation engines are something a lot of companies want,
but when recommendations come only from prior history, perhaps there are hard
limits on how much any pattern finder can deduce. No doubt one can refine the
problem and do better, but the reason that adding more data doesn't get you
that much improvement in this case is that you hit diminishing returns on
your data.

------
foobaruser
The article reminded me of a paper that described an NN that was able to learn
with just a few examples. However, I'm not able to find that paper in my notes.

Does anyone remember which paper it is?

~~~
T-A
You may be thinking of this: [https://www.technologyreview.com/s/544376/this-
ai-algorithm-...](https://www.technologyreview.com/s/544376/this-ai-algorithm-
learns-simple-tasks-as-fast-as-we-do/)

~~~
nl
The paper is here:
[http://cims.nyu.edu/~brenden/LakeEtAl2015Science.pdf](http://cims.nyu.edu/~brenden/LakeEtAl2015Science.pdf)

This isn't deep learning (or a neural network at all). However, it is an
extremely interesting approach.

Most of the previous "low data" deep learning approaches I've seen are broadly
based on the approach of "Zero-Shot Learning Through Cross-Modal
Transfer"[1].

That's not really "low data" in the sense that it still needs lots of data for
the initial training, but it can then learn new things from very few examples.

[1] [http://papers.nips.cc/paper/5027-zero-shot-learning-
through-...](http://papers.nips.cc/paper/5027-zero-shot-learning-through-
cross-modal-transfer)
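
For a rough sense of the cross-modal idea, here's a toy sketch of the general
technique (not the paper's actual model or code; the class names and vectors
below are made up): images get mapped into a word-embedding space learned from
text, so a class with no training images can still be predicted via its word
vector.

    import numpy as np

    # Word vectors from a text model; "truck" has no training images,
    # but it still has a word vector, so it can still be predicted.
    word_vecs = {
        "cat":   np.array([0.9, 0.1, 0.0]),
        "dog":   np.array([0.8, 0.2, 0.1]),
        "truck": np.array([0.0, 0.1, 0.9]),
    }

    def classify_zero_shot(image_embedding, word_vecs):
        # Pick the class whose word vector is most cosine-similar to the
        # image's embedding (produced by a mapper trained on seen classes).
        def cos(a, b):
            return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
        return max(word_vecs, key=lambda c: cos(image_embedding, word_vecs[c]))

    print(classify_zero_shot(np.array([0.05, 0.1, 0.85]), word_vecs))  # "truck"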

~~~
sgt101
You might also be interested in things like the Anglican language from Oxford.
[http://www.robots.ox.ac.uk/~fwood/anglican/](http://www.robots.ox.ac.uk/~fwood/anglican/)

~~~
stan_rogers
We do not presume to come to this Thine output trusting in our own
correctness, but in Thy manifold and great Processors. Print, we beseech Thee,
the content of Thy variable X, according to Thy promises made unto mankind
through Thy servant Alan, in whose name we ask. Amen.

------
partycoder
One reason to be scared is that it's a significant improvement.

Only a couple of years ago, this was probably the state of the art:
[http://www.cs.nyu.edu/~yann/research/norb/](http://www.cs.nyu.edu/~yann/research/norb/)

You can see how much progress has been made since then.

Yes, some people have been able to craft synthetic images (so-called
adversarial examples) that fool deep-learning-based classifiers.
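
(For illustration, one well-known recipe for generating such fooling inputs,
though not necessarily the one being referred to here, is the fast gradient
sign method of Goodfellow et al. A minimal PyTorch sketch, assuming `model` is
any differentiable classifier:)

    import torch
    import torch.nn.functional as F

    def fgsm_attack(model, x, y, eps=0.01):
        # Fast gradient sign method: nudge each input value by +/- eps in
        # the direction that increases the loss on the true label y.
        x = x.clone().detach().requires_grad_(True)
        loss = F.cross_entropy(model(x), y)
        loss.backward()
        return (x + eps * x.grad.sign()).detach()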

Even humans are easily fooled at a young age:
[https://www.youtube.com/watch?v=gnArvcWaH6I](https://www.youtube.com/watch?v=gnArvcWaH6I)

Scientists will understand why and find a way around it eventually.

The set of tasks we call "human intelligence tasks" is shrinking at a really
aggressive pace, and that is scary.

~~~
burkaman
This article is encouraging people to use deep learning. It's for people who
are scared to try it, not scared of it.

~~~
partycoder
The site seemed to be unavailable when I first tried to access it. My bad.

~~~
argonaut
There really needs to be an HN rule against commenting before reading (at
least part of) the article...

~~~
Houshalter
That's not entirely fair when sites are unavailable. And especially annoying
is when sites are behind paywalls. Sometimes I use an old mobile device, and
most news sites cause the browser to crash. So I just read the comments
instead.

Though leaving a top level comment based just on the title is a little
extreme. Even if you know the subject, you have no idea what the argument or
information in the article is. I wouldn't mind replying to someone else's
comment though.

------
sgt101
Yes, but why?

Why does image learning require more data than text for deep networks? (if it
does)

What does that mean?

Also amused by the idea of leaving out layers in deep models - perhaps we
could call these new models "neural networks"?

------
argonaut
You should balance the classes. That's an enormous flaw right there.

(The original Crepe paper, like virtually every other machine learning paper,
balances the classes.)
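
For anyone unfamiliar: "balancing" just means making each class contribute
roughly equally during training. A minimal sketch of one common approach,
random undersampling of the larger classes (X and y are placeholder arrays):

    import numpy as np

    def balance_by_undersampling(X, y, seed=0):
        # Randomly keep only as many samples of each class as the rarest
        # class has, so no single class dominates training.
        rng = np.random.default_rng(seed)
        classes, counts = np.unique(y, return_counts=True)
        n = counts.min()
        keep = np.concatenate([
            rng.choice(np.where(y == c)[0], size=n, replace=False)
            for c in classes
        ])
        rng.shuffle(keep)
        return X[keep], y[keep]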

------
PhasmaFelis
I've read the article and skimmed the Wikipedia page, and I still don't know
what deep learning is.

~~~
Houshalter
Basically just neural networks. The significant difference is that in the past
neural networks were "shallow", typically with only one hidden layer (as, in
effect, were most other machine learning algorithms). Now people are building
neural networks with hundreds of layers and millions of parameters.

If you aren't familiar with neural networks, I think this is a good
introduction that goes into a lot of detail:
[https://karpathy.github.io/neuralnets/](https://karpathy.github.io/neuralnets/)
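
To make "stacking layers" concrete, here's a toy forward pass in plain numpy
(made-up layer sizes, untrained weights; the tutorial above covers how
training actually works):

    import numpy as np

    rng = np.random.default_rng(0)
    sizes = [784, 256, 128, 64, 10]  # input -> three hidden layers -> output
    layers = [(rng.normal(0, 0.01, (m, n)), np.zeros(n))
              for m, n in zip(sizes[:-1], sizes[1:])]

    def forward(x, layers):
        # Depth just means repeatedly applying (linear map + nonlinearity),
        # so each layer computes features of the previous layer's features.
        for W, b in layers[:-1]:
            x = np.maximum(0.0, x @ W + b)  # ReLU nonlinearity
        W, b = layers[-1]
        return x @ W + b  # raw class scores

    print(forward(rng.normal(size=784), layers).shape)  # (10,)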

~~~
radarsat1
This is the usual definition, which assumes a neural network because it's a
conveniently general model with a well-understood training method. However, to
me, deep learning is
a more general concept: several layers to learn features, with a final
classification or regression layer; where "layer" is some statistical learning
process. The key idea is that features can be learned in a hierarchical way,
not so much that they are NN layers, which is an implementation detail.

For example, several layers of high-dimensional k-means unsupervised learning
followed by a simple linear classification stage can perform very well.
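
A single-layer sketch of that idea: k-means distances serve as learned
features, with a linear classifier on top. (The data here is random
placeholder noise; real pipelines add patch extraction, whitening, and
pooling, and can stack several such feature layers.)

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X_train, y_train = rng.normal(size=(500, 20)), rng.integers(0, 2, 500)
    X_test = rng.normal(size=(100, 20))

    # Unsupervised feature layer: distance of each sample to every centroid.
    km = KMeans(n_clusters=50, n_init=10, random_state=0).fit(X_train)

    # Simple linear classification stage on top of the learned features.
    clf = LogisticRegression(max_iter=1000).fit(km.transform(X_train), y_train)
    pred = clf.predict(km.transform(X_test))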

------
toisanji
Bad title.

