
PyTorch internals - stablemap
http://blog.ezyang.com/2019/05/pytorch-internals/
======
alexrigler
Great to see this here! Unfortunately I didn't get to record it but Edward
presented these slides wonderfully at the PyTorch NYC meetup [1]. If anyone is
in New York we're trying to host monthly events. We're mostly focused on
technical deep dives. All are welcome!

[1] https://www.meetup.com/PyTorch-NYC/

~~~
m_ke
Yeah Edward was really amazing. It was the best talk that I've seen in a
while.

------
barbecue_sauce
Not a machine learning guy, but are the internals of the popular frameworks
(TensorFlow, PyTorch, etc.) vastly different? Are they mostly different in
algorithm implementations, or are there more philosophical differences?

~~~
p1esk
Short answer: they started out conceptually different (static vs dynamic
graph), but seem to be converging lately (no pun intended!)
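To make the static-vs-dynamic distinction concrete, here is a minimal sketch (assuming PyTorch is installed) of what "dynamic graph" means in practice: the graph is built as ordinary Python executes, so native control flow just works, with no special graph-conditional ops.

```python
# Hedged sketch, assuming PyTorch is available: in a dynamic
# ("define-by-run") framework, ops are recorded as Python runs.
import torch

def f(x):
    # A plain Python branch decides the computation at run time;
    # each call can take a different path through the "graph".
    if x.sum() > 0:
        return x * 2
    return -x

print(f(torch.tensor([1.0, 2.0])))   # sum > 0: tensor([2., 4.])
print(f(torch.tensor([-3.0, 1.0])))  # sum <= 0: tensor([3., -1.])
```

In a classic static-graph framework you would instead declare the whole graph up front (using a graph-level conditional op) and then feed data through it; TF 2.x's eager mode and PyTorch's tracing/scripting are the convergence the parent comment describes.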

~~~
amelius
As in PT moving to TF, or TF moving to PT, or both?

~~~
m0zg
More TF moving to PT, IMO. A dynamic graph is the superior paradigm for research
(where the users of these frameworks tend to spend the vast majority of their
time), and it can be traced and exported for inference as needed.

------
foobiekr
This is wonderfully written.

~~~
asavinov
And also wonderfully visualized! Does anybody know what tool can be used to
produce such figures?

~~~
stevesimmons
This blog post suggests he uses a Surface Book 2:

http://blog.ezyang.com/2019/03/microsoft-surface-book-2/

