
Technical Book on Deep Learning - Anon84
https://github.com/tomepel/Technical_Book_DL
======
tw1010
I love living in 2017, when stuff like this is just created out of the blue,
making my life just a little bit easier, and at almost no cost.

~~~
corporateslave2
The cost is the impermanence of knowledge capital

~~~
rlanday
Yeah, everyone's machine learning Ph.D.s are useless now because you spent an
afternoon reading a 100 page PDF.

~~~
corporateslave3
You would be surprised. Deep learning usurped a ton of knowledge about SVMs
and such, and then TensorFlow/Keras made it easy to write and deploy good
models with little knowledge.

Clearly it's more complex than that, but this is true for a good number of use
cases.

------
Cacti
Might help if you made the actual PDF more prominent. It took me a few minutes
just to figure out where, you know, the book actually was. Change the
README.md or move the child tex/pdf/etc. into sub-dirs or something.

~~~
randcraw
I still can't find the actual book. Too bad.

~~~
colmvp
The name of the file is White_book.pdf. I found it by downloading the repo,
and sorting by file size.

------
gyom
It's cool to see that much dedication. It's useful when people take the time
to summarize knowledge in a book to serve as reference.

But ... I have the feeling that the author, who is relatively new to the field
(by his own admission), expanded a lot of formulas and made certain parts of
the theory more complicated than it should be.

Look around page 60. There are formulas with six summation signs in front of
them, with all kinds of little indices floating around. What about page 37?

In a way, the whole point about the chain rule (and software libraries that
implement it) is that you can stay in "math world" to do the reasoning, and
not think about the job of managing the computation.

Same idea with expressing as much as possible in terms of linear algebra
primitives. Matrix multiplication is easier to understand when it's not broken
apart into sums whose indices you have to track.
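To make the contrast concrete (my own toy illustration, not from the book): the same matrix product written with every index managed by hand, and then as a single linear-algebra primitive. The function name `matmul_indices` is mine.

```python
import numpy as np

def matmul_indices(A, B):
    """The index-heavy view: C[i, j] = sum_k A[i, k] * B[k, j],
    with every loop variable tracked explicitly."""
    n, m = A.shape
    m2, p = B.shape
    assert m == m2, "inner dimensions must agree"
    C = np.zeros((n, p))
    for i in range(n):
        for j in range(p):
            for k in range(m):
                C[i, j] += A[i, k] * B[k, j]
    return C

# The "math world" view: one primitive, no indices to track.
A = np.arange(12.0).reshape(3, 4)
B = np.arange(20.0).reshape(4, 5)
assert np.allclose(matmul_indices(A, B), A @ B)
```

Both compute the same thing; the argument above is about which form is easier to reason with.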

~~~
ssivark
See author's note excerpted below; that was an explicit goal of the project.

> This work has no benefit nor added value to the deep learning topic on its
> own. It is just the reformulation of ideas of brighter researchers to fit a
> peculiar mindset: the one of preferring formulas with ten indices but where
> one knows precisely what one is manipulating rather than (in my opinion
> sometimes opaque) matrix formulations where the dimension of the objects are
> rarely if ever specified.

-- I think that having those things written out explicitly is of great help
to those not fully comfortable with formal manipulations. It is particularly
useful when implementing those operations in low-level code. I say this even
though I personally find Einstein notation [1] the most convenient.

[1]:
[https://en.wikipedia.org/wiki/Einstein_notation](https://en.wikipedia.org/wiki/Einstein_notation)
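For readers who haven't met it: Einstein notation drops the summation signs and sums implicitly over repeated indices. NumPy's `einsum` makes the correspondence concrete (a small illustration of my own, not from the book):

```python
import numpy as np

A = np.arange(6.0).reshape(2, 3)
B = np.arange(12.0).reshape(3, 4)

# Matrix product C_ij = A_ik B_kj: the repeated index k is summed implicitly.
C = np.einsum('ik,kj->ij', A, B)
assert np.allclose(C, A @ B)

# Trace: tr(M) = M_ii, the repeated index i summed implicitly.
M = np.arange(9.0).reshape(3, 3)
assert np.einsum('ii->', M) == np.trace(M)
```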

~~~
gyom
I had not caught that note from the author. Thanks for pointing it out.

------
mendeza
How should one utilize the knowledge in the book? When do derivations play a
role in deep learning? I am thinking it helps make decisions on what layers
will help or not, or explain why this architecture improves some baseline. I
would love to hear anyone's thoughts on the matter!

------
md8
arxiv link
[https://arxiv.org/abs/1709.01412](https://arxiv.org/abs/1709.01412)

------
Cozumel
Link to actual book:
[https://github.com/tomepel/Technical_Book_DL/raw/master/Whit...](https://github.com/tomepel/Technical_Book_DL/raw/master/White_book.pdf)

------
vladf
Warning: font is Palatino

~~~
laichzeit0
Ok I know about Comic Sans and Papyrus but what's the deal with Palatino?

~~~
vladf
Its math characters are nearly illegible. The font itself is OK, but there's a
reason it's rarely used for math.

~~~
tomepel
I am the author of the document. My aesthetic sense is pretty limited, as you
have observed. I like the mathpazo font, but if you don't share my weird
tastes, you can just download the repo, comment out the \usepackage[sc]{mathpazo}
line in formatAndDefs.tex, and you're good to go.
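Concretely, the edit the author describes would look something like this in formatAndDefs.tex (a sketch; I haven't reproduced the rest of the file):

```latex
% Comment out this line to drop Palatino math and fall back
% to the default Computer Modern fonts:
% \usepackage[sc]{mathpazo}
```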

