
Capsule Networks Tutorial [video] - isp
https://www.youtube.com/watch?v=pPN8d0E3900
======
ntenenz
When Hinton approves, you know you've done well...

[https://www.reddit.com/r/MachineLearning/comments/7ew7ba/d_c...](https://www.reddit.com/r/MachineLearning/comments/7ew7ba/d_capsule_networks_capsnets_tutorial/dq8yc9p/)

~~~
isp
For anyone who hasn't watched the video: this comment is on topic and
certainly relevant, because Hinton himself invented Capsule Networks.
[https://www.wired.com/story/googles-ai-wizard-unveils-a-new-...](https://www.wired.com/story/googles-ai-wizard-unveils-a-new-twist-on-neural-networks/)

------
cloverich
I think this is the same author who published "Hands-On Machine Learning with
Scikit-Learn and TensorFlow...". The quality of the book (thus far) is so high
that I immediately started Googling the author, assuming he must be well
known. I did not learn much, but I can at least say the book is fantastic.

[1]: [https://www.amazon.com/Hands-Machine-Learning-Scikit-Learn-T...](https://www.amazon.com/Hands-Machine-Learning-Scikit-Learn-TensorFlow/dp/1491962291)

~~~
colmvp
I second that book. Well worth the small price and very accessible to read.

~~~
isp
Yes, same author. He has a picture of his book at the end of the video
(21:40). With these recommendations, and after that video, I am going to buy &
read his book.

------
Mmrnmhrm
Nice video; however, instead of riding the arXiv hype train, could we wait
until peer review has analyzed the paper?

If someone other than Hinton presented a YADLA (Yet Another Deep Learning
Architecture) that does not achieve state-of-the-art performance on the
standard datasets, it would not be very well received.

~~~
nabla9
/user/geoffhinton 1 year ago

> Over the last three years at Google I have put a huge amount of work into
> trying to get an impressive result with capsule-based neural networks. I
> haven't yet succeeded. That's the problem with basic research. There is no
> guarantee that ideas will work even if they seem very promising. Probably
> the best results so far are in Tijmen Tieleman's PhD thesis. But it took 17
> years after Terry Sejnowski and I invented the Boltzmann machine learning
> algorithm before I found a version of it that worked efficiently. If you
> really believe in an idea you just have to keep trying.

[https://www.reddit.com/r/MachineLearning/comments/4w6tsv/ama...](https://www.reddit.com/r/MachineLearning/comments/4w6tsv/ama_we_are_the_google_brain_team_wed_love_to/d6dmw6f/)

Most of the deep learning papers published are just exploring and
incrementally building upon the ideas the 'Canadian Mafia' (Hinton, LeCun, and
Bengio) discovered years ago. At some point this 'idea space' will be fully
explored and understood, and we will hit a wall just like before. Let's hope
that people doing basic research can find new breakthroughs in less than 17
years.

~~~
Mmrnmhrm
I'm not saying to discard this research. I'm suggesting to wait until it is
peer-reviewed and published before jumping on it.

To me, the capsule concept seems reasonable, and I have my personal opinion
about its strengths and flaws. But my opinion hardly matters.

I expect peer reviewers at NIPS to have a better understanding than I have,
and I trust them to filter and refine this idea, rather than trusting the
research just because of the name that signs the paper.

To me, although it has its flaws, the _double-blind_, _peer-reviewed_ process
is important.

~~~
nabla9
The paper has already been accepted to NIPS 2017. Poster session is Tue Dec
5th 06:30 - 10:30 PM @ Pacific Ballroom #94

[https://papers.nips.cc/paper/6975-dynamic-routing-between-ca...](https://papers.nips.cc/paper/6975-dynamic-routing-between-capsules.pdf)

------
dnautics
Two impressions:

1. When I saw the original Hinton proposal of capsule networks, I thought it
was kind of halfway to a Hofstadter-style cognitive machine from his work on
"conceptual slippages". Now that I understand it more, I am more confident in
my assessment.

2. I think that implementations are going to be hamstrung by the clunky
nature of TensorFlow's architecture... Did anyone else feel this?

~~~
halflings
What exactly is clunky about TensorFlow's architecture?

~~~
dnautics
For starters, the problem that you have to separate the definition of the
computational graph from the actual execution of the function (there are
separate declarative and imperative stages).

This problem is not insurmountable. Something like this would be really cool:

[https://www.youtube.com/watch?v=ijI0BLf-AH0](https://www.youtube.com/watch?v=ijI0BLf-AH0)
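
To illustrate the two-stage pattern being complained about, here is a toy,
pure-Python sketch (this is not TensorFlow itself; all names here are made up
for illustration). You first build a symbolic graph declaratively, and only a
separate "run" step makes values actually flow through it:

```python
# Toy sketch of the declare-then-run pattern: a symbolic graph
# is built first, and computation happens only in a later step.

class Node:
    """A symbolic node in the computational graph (no values yet)."""
    def __init__(self, op, inputs=()):
        self.op, self.inputs = op, inputs

def placeholder():
    return Node("placeholder")

def add(a, b):
    return Node("add", (a, b))

def mul(a, b):
    return Node("mul", (a, b))

def run(node, feed):
    # Imperative stage: only here do concrete values flow.
    if node.op == "placeholder":
        return feed[node]
    vals = [run(n, feed) for n in node.inputs]
    return vals[0] + vals[1] if node.op == "add" else vals[0] * vals[1]

# Declarative stage: nothing is computed yet.
x, y = placeholder(), placeholder()
out = add(mul(x, x), y)  # symbolic expression x*x + y

print(run(out, {x: 3.0, y: 4.0}))  # 13.0
```

The friction the parent describes is that errors and debugging happen at the
`run` stage, far from where the graph was declared.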

------
isp
I found this video to be much easier to follow than previous posts focusing
on intuition, e.g.,
[https://news.ycombinator.com/item?id=15690121](https://news.ycombinator.com/item?id=15690121)

------
mycat
How does it compare with spiking neural networks? Both use vectors (but in
different ways) to encode more information.

------
georgehm
The author answers some questions in the comments as well. Worth checking out!

------
tw1010
I predict capsule networks will not have nearly as big an impact on the ML
community as many think they will. Why? Because the main reason they exist is
to address performance issues in really advanced, cutting-edge models. But
that is not what drives upvotes here and on Reddit. The failure of capsule
networks to pick up steam, together with the continued popularity of GANs,
is, I think, a signal that the main reason ML is still trendy and in vogue is
that the subject, AI, tickles the imagination of engineers, yet five years
after ML started to become popular, it is still not as big in actual
practical engineering systems as the outside public might think.

~~~
chillee
I think you have a misconception of what capsule networks are. They are not
intended to address "performance issues in really advanced models"; they are
intended as another paradigm in deep learning that Geoff Hinton thinks has a
lot of promise.

I also don't know what you mean by "the failure of capsule networks to pick up
steam". The paper literally came out a month ago. It's too early to say
whether it'll "pick up steam" or not.

I also don't understand what you mean by "the continued popularity of GANs"
showing anything.

