PyTorch internals (ezyang.com)
179 points by stablemap 6 months ago | 10 comments



Great to see this here! Unfortunately I didn't get to record it, but Edward presented these slides wonderfully at the PyTorch NYC meetup [1]. If anyone is in New York, we're trying to host monthly events. We're mostly focused on technical deep dives. All are welcome!

[1] https://www.meetup.com/PyTorch-NYC/


Yeah, Edward was really amazing. It was the best talk I've seen in a while.


Not a machine learning guy, but are the internals of the popular frameworks (TensorFlow, PyTorch, etc.) vastly different? Are they mostly different in algorithm implementations, or are there more philosophical differences?


Short answer: they started out conceptually different (static vs dynamic graph), but seem to be converging lately (no pun intended!)
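For a concrete sense of what "dynamic graph" means on the PyTorch side, here's a minimal sketch (the function and values are just illustrative): the graph is recorded as ordinary Python executes, so data-dependent control flow works directly.

    import torch

    # The graph is built while Python runs, so ordinary control
    # flow on tensor values just works.
    def f(x):
        if x.sum() > 0:        # decided at runtime, per input
            return x * 2
        return x - 1

    x = torch.randn(3, requires_grad=True)
    y = f(x).sum()
    y.backward()               # autograd recorded whichever branch actually ran
    print(x.grad)              # all 2s or all 1s, depending on the branch taken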


As in PT moving to TF, or TF moving to PT, or both?


More TF moving to PT, IMO. Dynamic graph is a superior paradigm for research (where the users of these frameworks tend to spend the vast majority of their time), and it can be traced and exported for inference as needed.
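Roughly, the trace-and-export path looks like this (a rough sketch; the toy model and filename are just placeholders): torch.jit.trace runs the model once on an example input, records the ops, and gives you a static artifact you can save and load for inference without the original Python class.

    import torch
    import torch.nn as nn

    # Trace an eager (dynamic) model into a static, serializable artifact.
    model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2)).eval()
    example = torch.randn(1, 4)

    traced = torch.jit.trace(model, example)  # runs once, records the executed ops
    traced.save("model_traced.pt")            # loadable later via torch.jit.load
    print(traced(example))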


This is wonderfully written.


And also wonderfully visualized! Does anybody know what tool can be used to produce such figures?


This blog post suggests he uses a Surface Book 2:

http://blog.ezyang.com/2019/03/microsoft-surface-book-2/


An iPad Pro with Notability would do the trick.



