
Neural Networks Demystified - maxlambert
http://lumiverse.io/series/neural-networks-demystified
======
yxlx
I am very interested in this subject but I was unable to finish watching these
videos. The background music is incredibly distracting. I have a name for that
kind of music: I call it Silicon Valley Music, because it is the kind of music
used in a lot of startup product videos. The narrator's voice style is also
pretty much the same as the one used in those. I wanted to like these videos but
I did not feel that they had any value whatsoever. With all of these kinds
of videos, I feel like the producers want the audience to feel bliss or
learning, but they never actually deliver on that, so instead it seems
insincere and underhanded.

~~~
maroonblazer
I'm confused by your comment. You seem to be saying two different things:

1) The music and the choice of narrator are getting in the way of the
(presumably valuable) content.

2) The content itself, regardless of the choice of narrator or whether or not
there's music behind it, is not useful.

If it's the latter then why complain about the former?

~~~
cLeEOGPw
People feel discomfort while listening to these videos because they feel like
they are being persuaded by a weak young man to believe things he himself does
not believe. This is caused by:

1\. High-pitched voice. Whether it is modulated or not, the voice tells us that
the person we are listening to is of low social status, so we subconsciously
assign less significance to anything he says.

2\. Infomercial-like intonation. The guy may be a professional narrator, but his
intonation is the kind you would find in advertisements where someone tries to
sell you something really worthless. So people again subconsciously tend to
"categorize" this kind of information as unwanted and are used to filtering it
out. The effect is strengthened by the choice of music.

3\. The narrator clearly has no understanding (or gives that impression) of
the subject he is told to talk about. This is subconsciously felt by people
through his intonation and the way he puts emphasis on random words, and that
translates to two things: it again reminds us of infomercials (meaningless
talking to occupy time), and of malicious persuasion: he wants me to believe in
something he himself does not believe (actually does not understand).

This all adds up to our brain signaling that something is wrong; hence the
complaints. There is more to education than content.

------
rayalez
Hey, everyone! I'm the founder of lumiverse.io, it's pretty incredible to see
our website on the front page of HN!

I want lumiverse to become an awesome community where people can discover and
discuss great educational videos.

We've launched only recently and the site is still in active development; I'm
improving it every day. If you have any feedback, please let me know =)

(Also feel free to contact me at raymestalez@gmail.com)

~~~
iammyIP
nice work, i am interested in this kind of information, but i am no big fan of
the edutainment video format, i prefer text or a university lecture. some
feedback about the first video:

\- why the music? should this be information or an awfully mixed guitar rap
song with the worst flow ever? i can't stand the use of elevator music in
these kinds of videos and also regard this kind of video production as an act
of disrespect towards music in general.

\- why not use more descriptive variable names, e.g. hours and score instead
of x and y? the mystification of these kinds of things comes partially from
generalised abstraction and undescriptive variable names. since you already
use a real-world example, why not reflect it in the variable names?

\- less speed and more pauses would in general do good for demystification of
such a topic; the typical 10-second-attention-span youtube-edutainment-video
style might not really fit here

\- the drawings and the general flow are nice and well done
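To illustrate the variable-naming point, here is a minimal sketch. The hours-studied/test-score example follows the series' setup, but the exact array values and the names `hours_sleep_study` and `test_score` are illustrative, not taken from the videos' code:

```python
import numpy as np

# Generic names obscure what the data means:
x = np.array([[3, 5], [5, 1], [10, 2]], dtype=float)
y = np.array([[75], [82], [93]], dtype=float)

# Descriptive names tie the code back to the real-world example:
hours_sleep_study = np.array([[3, 5], [5, 1], [10, 2]], dtype=float)
test_score = np.array([[75], [82], [93]], dtype=float)

# Even normalization reads more clearly with real names:
hours_sleep_study /= np.max(hours_sleep_study, axis=0)  # scale each column to [0, 1]
test_score /= 100  # maximum possible test score is 100
```

The arrays are identical either way; the only difference is that the second version documents itself.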

~~~
adenadel
This website curates content, so you would have to go to the content creator
with this sort of input
([http://www.welchlabs.com/](http://www.welchlabs.com/)).

------
max_
Every beginner in neural networks should probably start with this and follow
with Karpathy's
[http://karpathy.github.io/neuralnets](http://karpathy.github.io/neuralnets)
and maybe later on
[http://neuralnetworksanddeeplearning.com/](http://neuralnetworksanddeeplearning.com/)

~~~
IshKebab
Also [http://colah.github.com/](http://colah.github.com/)

~~~
max_
I didn't see this, thanx!!

------
elsherbini
These are also available on Stephen Welch's youtube channel[1]. He uploaded
the last one in this series in January 2015.

[1]:
[https://www.youtube.com/playlist?list=PLiaHhY2iBX9hdHaRr6b7X...](https://www.youtube.com/playlist?list=PLiaHhY2iBX9hdHaRr6b7XevZtgZRa1PoU)

~~~
kqr2
Per the guidelines:
[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)

    
    
      Please submit the original source. If a post reports on 
      something found on another site, submit the latter.
    

The youtube URL you cited would probably be a better source link.

~~~
cbennett
Yes, it was misleading; I thought all these videos were original content of
that site. In fact, they are original content produced by Welch Labs:
[http://www.welchlabs.com/](http://www.welchlabs.com/)

------
0vermorrow
My initial feedback would be that the music is way too loud and that I really
like that you used Python to show how to construct the bits and pieces.

------
protomyth
Nice. One suggestion: please ditch the music; it's distracting and doesn't play
well with the speaker's voice.

------
jordigh
Why have neural networks become synonymous with backprop networks? Is it
because those are the most successful? What happened to bidirectional
associative memory and Kohonen maps? Does anyone take the biological
inspiration of neural networks seriously anymore?

~~~
limsup
People are mostly interested in what is successful, and backprop/gradient
descent (with certain specific architectures) is what has been dropping error
rates on all sorts of tasks in recent years.
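As a toy illustration of the gradient-descent loop that drives backprop (a minimal sketch; the one-weight model and all constants here are made up for illustration):

```python
import numpy as np

# Fit y = w*x by gradient descent on mean squared error.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 100)
y = 3.0 * x  # the true weight is 3

w = 0.0
learning_rate = 0.1
for _ in range(200):
    error = w * x - y               # prediction residuals
    grad = 2 * np.mean(error * x)   # d/dw of mean squared error
    w -= learning_rate * grad       # step downhill along the gradient

# w converges toward the true weight, 3.0
```

Backprop is this same idea, with the gradient computed layer by layer via the chain rule instead of by hand.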

------
cjcenizal
This is really great, but like a few other people, I found the music
incredibly distracting. At first I thought I had left my Spotify on!

------
V-2
At the risk of not sounding very clever, I have to admit it's way too fast for
me

~~~
cJ0th
These (excellent) videos made me realize that picking up AI as a hobby (which
I intended) would not be a good idea in my case. There are simply too many
different domains of knowledge I would have to learn about to be able to
finally do something slightly useful.

I neither have the time nor the brain to do this. Fascinating stuff, though.

~~~
V-2
My thoughts exactly. There's even too much meta-knowledge here :) Without
some solid background, it's not something you can casually pick up as a hobby.
But yes, as a programmer (not dealing with anything AI-related; right now
it's middle-of-the-road mobile apps) I am very impressed by what deep learning
proves to be capable of.

------
Achshar
Man, the second video gets steep, quickly.

~~~
goshx
That was my impression too. There should be a video to demystify that second
video :)

~~~
slagfart
Yes! Where can I turn to for this?

Is there a course that teaches me just the maths that I need to understand
this?

------
pjdorrell
Plan A: The material is difficult, so present it very slowly and carefully.

Plan B: To avoid boredom, present the material as quickly and densely as
possible, with lots of constantly changing detail, with simultaneous visual,
audio and even some light background music.

These videos are a bit like "Hitchhiker's Guide to the Galaxy" meets "Khan
Academy".

Some people will like them, and some won't. I like them.

------
therobot24
no one can ever claim that there is a shortage of tutorials with regard to
neural nets (and deep learning)

~~~
pmalynin
Tutorials - sure. Actual rigorous analysis - not at all.

~~~
ivan_ah
Well, this is pretty good
[http://neuralnetworksanddeeplearning.com/](http://neuralnetworksanddeeplearning.com/)

Also these lecture notes
[https://github.com/joanbruna/stat212b](https://github.com/joanbruna/stat212b)

~~~
pmalynin
No no, I mean the theoretical analysis of neural networks. It's true that some
authors attempt to do that; in fact, even famous researchers attempt it.
However, their arguments and analyses often break down in the general case, and
what their papers boil down to is that neural networks are good at modeling
functions that they are good at modeling.

In fact the state of the field of machine learning is essentially the state of
mathematics before Cauchy and Weierstrass.

------
rdlecler1
To demystify NNs further we need to stop graphically representing spurious
interactions. If you can perturb or remove a link w_ij between two neurons
i and j without changing the network's behavior, then that interaction is
spurious and shouldn't be represented in the graphical network representation.
Doing this iteratively, you can start to better appreciate that neural networks
are computational circuits that use threshold functions instead of logic gates.
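A hedged sketch of the pruning idea being described (the tiny weight matrix and the magnitude threshold are made up for illustration; real pruning criteria measure the effect on the output, not just weight magnitude):

```python
import numpy as np

# A 3x4 weight matrix; pretend training drove a few weights to (near) zero.
W = np.array([[0.8, 1e-4, -1.2,  0.5],
              [0.0,  0.9,  0.3, -0.7],
              [1.1, -0.4,  0.2, -1e-5]])

# If removing a link barely changes the network, treat it as spurious
# and drop it from the drawn graph.
threshold = 1e-3
mask = np.abs(W) >= threshold
pruned = W * mask

print(f"kept {mask.sum()} of {mask.size} links")  # prints "kept 9 of 12 links"
```

The graph drawn from `pruned` shows only the links that actually carry the computation, which is closer to reading a circuit diagram.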

~~~
soared
To me, 'demystifying' really means making something simpler to understand.
Yours sounds like technical pedantry that, while true (idk?), is meaningless
to anyone who actually needs a neural net demystified.

~~~
rdlecler1
Um, no. Show an electrical engineer the circuit diagram of an 8-bit adder and
they'll know the function right away. The function of the NN is similarly
determined by the topological circuitry. In fact, the function can often be
preserved when you represent these networks as Boolean networks, but all that's
concealed when we don't remove spurious connections, and it starts to feel like
weird voodoo mathemagic. This is not an opinion; I've published on this and
have seen scientists get confused because of it.

------
pawelwentpawel
It's a great short introduction, really worth looking through the code
examples that they have on github too -
[https://github.com/stephencwelch/Neural-Networks-Demystified](https://github.com/stephencwelch/Neural-Networks-Demystified)

There is certainly no shortage of new tutorials bubbling up on neural nets.
One of my favorites -
[https://www.cs.ox.ac.uk/people/nando.defreitas/machinelearni...](https://www.cs.ox.ac.uk/people/nando.defreitas/machinelearning/)

------
masthead
People who are having a difficult time concentrating can clone the git repo,
which has IPython (Jupyter) notebooks.

Link to the installation of Anaconda (by Continuum):
[https://www.continuum.io/downloads](https://www.continuum.io/downloads)

Link to the Git Repo: [https://github.com/stephencwelch/Neural-Networks-Demystified](https://github.com/stephencwelch/Neural-Networks-Demystified)

------
JD557
In case the author of the videos is here, do you have plans to add some videos
about Convolutional NNs and Recurrent NNs?

I know a lot of developers like myself that know about traditional NNs, but
are not familiar with those two.

~~~
colah3
Not the author, but I wrote an article introducing conv nets you might find
helpful: [http://colah.github.io/posts/2014-07-Conv-Nets-Modular/](http://colah.github.io/posts/2014-07-Conv-Nets-Modular/)

I also have an article on RNNs, although it's focused on explaining a special
version, called an LSTM: [http://colah.github.io/posts/2015-08-Understanding-LSTMs/](http://colah.github.io/posts/2015-08-Understanding-LSTMs/)

If you have experience with functional programming, you might find this a nice
way to think about Conv Nets/RNNs/etc:
[http://colah.github.io/posts/2015-09-NN-Types-FP/](http://colah.github.io/posts/2015-09-NN-Types-FP/)

~~~
wedesoft
Thanks. Very nice articles.

------
txprog
I liked it! I've already done the first 3; the music didn't bother me, nor did
the voice (I'm not a native English speaker at all). As for the concepts
explained, it's true that some equations are a little bit out of reach for
someone who has less mathematical background. But I understand the concepts so
far, which are well explained.

I'm not sure I would be able to just write (or decide on) the equations for the
neural network. On what criteria can you decide whether this will work better
or not? What is your key to making a decision?

------
birdwatcher9
get rid of the music and find a speaker who uses less sibilance

------
rjcrystal
I saw some of the videos and I agree with the suggestions provided here, but I
really like the simple way of explaining and the less mathematical, more
programmatic approach. It'd be awesome if you could make some neural networks
with CUDA or another library like TensorFlow or Theano. Good luck.

------
coherentpony
Cool. A word of advice, though: the 'scalar product' of two vectors is a
scalar, not a vector. In the back-propagation video you misspoke there.

Otherwise, good job.
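For reference, a quick check that the scalar (dot) product of two vectors really does yield a scalar (the example vectors are arbitrary):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

s = np.dot(a, b)   # 1*4 + 2*5 + 3*6 = 32.0
print(s)           # a single number, not a vector
print(np.ndim(s))  # 0: a zero-dimensional (scalar) result
```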

------
TheAwesomeA
Great videos!

Does somebody know a source for a nice data analytics/machine learning
taxonomy or something similar (grouped by the class of problems the different
methods solve)?

------
BinaryIdiot
This looks really interesting. Thanks! I'll save these to start watching
later.

------
alexjv89
Awesome video! Crisp and to the point.

------
dubmax123
great content, most annoying music ever!

