
Deep Learning: What, Why and Applications - troopkevin
http://aiehive.com/deep-learning-applications/
======
nedsma
The cached content:
[http://webcache.googleusercontent.com/search?q=cache:http://...](http://webcache.googleusercontent.com/search?q=cache:http://aiehive.com/deep-learning-applications/)

------
aibottle
If you already know a bit about Machine Learning don't read this, you will
gain nothing. If you don't already know about Machine Learning don't read
this, you will not learn anything here. I have no idea how this can be
trending. Over-simplified, generalized bs.

~~~
signa11
> If you already know a bit about Machine Learning don't read this, you will
> gain nothing. If you don't already know about Machine Learning don't read
> this, you will not learn anything here. I have no idea how this can be
> trending. Over-simplified, generalized bs.

indeed. much better off reading this instead:
[http://karpathy.github.io/neuralnets/](http://karpathy.github.io/neuralnets/)

~~~
k__
For me the difficulty ramps up too fast after "Strategy 1".

First they come with

    
    
        f(x,y) = xy
    

Well, sounds easy.

then suddenly:

    
    
        df(x,y)/dx = (f(x+h,y)-f(x,y))/h
    

wait what?!

~~~
Aeolos
This is essential knowledge you can get from a number of different places:
first semester of university, any sort of 3d or image/signal processing, or
just plain old wikipedia.

You could still use a deep learning framework as a black box without knowing
how to differentiate a function. However, you'd have trouble following any
relevant blog post / scientific paper, understanding _how_ it works or
developing any relevant new algorithm.
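That quoted formula is just the finite-difference approximation of the partial
derivative. A minimal Python sketch (the step size `h` is an assumed small
constant, not something from the article):

```python
def f(x, y):
    return x * y

def dfdx(x, y, h=1e-6):
    # (f(x+h, y) - f(x, y)) / h approximates df/dx as h -> 0.
    # For f(x, y) = x*y the true partial derivative is simply y.
    return (f(x + h, y) - f(x, y)) / h

print(dfdx(3.0, 4.0))  # close to y = 4.0
```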

~~~
k__
Okay, I'm sorry.

The article is well written; my brain just turns off when I read higher math.

------
BinRoo
>> Figure-2 shows that performance of deep learning is much better than non-
deep learning algorithm.

This is a nonsensical over-generalization.

>> In addition, it is automatically do the feature extraction.

At the cost of interpretability. Let's not even mention the dreadful nights of
tweaking parameters (such as dropout probability, activation function, network
architecture, learning rate, optimization function, various pre-processing
tricks, pre-training to warm-start, convolution parameters, maxpool
parameters, and so much more).

~~~
rawnlq
I know nothing about deep learning but I'm curious. Why can't tweaking
parameters be automated? Why is human intuition necessary here?

~~~
visarga
> Why can't tweaking parameters be automated?

Not only can it be done, it even works by random trial. Sophisticated
optimization techniques are only about 2x faster, so if you have cheap GPUs
you can run 2x more trials and still get your hyperparameters fine-tuned.

If you want to apply optimization, you can use one of the popular libraries
like hyperopt and MOE.
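A minimal sketch of random-trial hyperparameter search. The objective here is
a toy surrogate standing in for the real train-and-validate step, and the
parameter names and ranges are assumptions for illustration, not from the
thread:

```python
import random

random.seed(0)  # for reproducibility

def validation_loss(lr, dropout):
    # Hypothetical stand-in for training a model and returning its
    # validation error; pretend the optimum is lr=0.01, dropout=0.5.
    return (lr - 0.01) ** 2 + (dropout - 0.5) ** 2

best = None
for _ in range(100):
    lr = 10 ** random.uniform(-4, -1)   # sample learning rate log-uniformly
    dropout = random.uniform(0.0, 0.9)  # sample dropout probability uniformly
    loss = validation_loss(lr, dropout)
    if best is None or loss < best[0]:
        best = (loss, lr, dropout)

print(best)  # (best loss found, its lr, its dropout)
```

Libraries like hyperopt replace the random sampling loop with a smarter
search over the same kind of parameter space.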

~~~
kdelok
This really depends on what the hyperparameter landscape looks like. In
general, global optimisation is a very hard problem and doing it by random
sampling might take longer than the lifetime of the universe (cf Levinthal's
paradox for proteins).

------
samet
Slashdot/HN effect:

Resource Limit Is Reached

The website is temporarily unable to service your request as it exceeded
resource limit. Please try again later.

~~~
Bombthecat
Well, at least he set one. Imagine if this poor guy had needed to pay over
$1000 :)

