
PyTorch: An Imperative Style, High-Performance Deep Learning Library [pdf] - stablemap
https://arxiv.org/abs/1912.01703
======
jspisak
Finally the solution to all of your PyTorch citation problems! :)
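For reference, a citation sketch based on the arXiv listing linked above (the canonical entry lives in the repo's CITATION file, so treat the key and field values here as assumptions; the author list is abbreviated):

```bibtex
@article{paszke2019pytorch,
  title   = {PyTorch: An Imperative Style, High-Performance Deep Learning Library},
  author  = {Paszke, Adam and others},
  journal = {arXiv preprint arXiv:1912.01703},
  year    = {2019}
}
```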

~~~
crazypyro
Relevant Github issue:
[https://github.com/pytorch/pytorch/issues/4126](https://github.com/pytorch/pytorch/issues/4126)

~~~
godelski
It is surprising to me that they don't put this in the README.

(I do know it's in the root directory[0]; it's just common practice to have it
in the README too, to be stupidly obvious.)

[0]
[https://github.com/pytorch/pytorch/blob/master/CITATION](https://github.com/pytorch/pytorch/blob/master/CITATION)

------
scythe
Man, as a Lua fan it’s kinda sad to see so much interest in a project whose
main goal is just to not use Lua.

I guess academics like familiarity and Lua insistently refuses to be like
other languages (arrays and maps in one type, 1-based arrays, nonstandard
builtin patterns, etc).

~~~
uoaei
Academics simply don't have the time to learn languages that do not have
substantial ecosystems and relative ease-of-use.

~~~
scythe
It’s true - there have been attempts to fix it, but nobody has created
something other people want to use. The Torch project largely replaced all of
the then-popular Lua packages — wxLua was dropped for Torch’s internal qtLua,
the Lua concurrency libraries (Lanes and luaproc) were ignored in favor of
ZeroMQ, LPeg and Lua patterns were generally less popular than PCRE and RE2
bindings, _et cetera_. Maybe Torch is to blame (NIH syndrome), maybe the Lua
packages weren’t up to the task, maybe communication within the community is
too hard (Lua lacks centralized discussion channels where experienced users
are regularly active), but in the end, Lua didn’t come away looking good here.

Learning a new language wasn’t too hard when that language was Python, after
all.

~~~
cfusting
We'd all be happier writing math; writing code is just a nuisance.

~~~
uoaei
The Julia programming language's development started explicitly to address
this sentiment.

------
amrrs
If you want to read it online:
[https://www.arxiv-vanity.com/papers/1912.01703/](https://www.arxiv-vanity.com/papers/1912.01703/)

~~~
wjn0
ah, yes, why squint at a PDF when you can squint at LaTeX compile errors
instead

------
zapnuk
Is there a typo in Listing 1?

The forward function of the conv net should use:

t3 = self.fc(t2)

instead of:

t3 = self.fc(t1)

AFAIK the nn.functional.relu function is NOT in-place by default. [1]

[1]
[https://pytorch.org/docs/stable/nn.functional.html](https://pytorch.org/docs/stable/nn.functional.html)
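A runnable sketch of what the corrected forward looks like. The layer shapes here are illustrative assumptions, not the paper's actual parameters; only the t1/t2/t3 data flow follows Listing 1 (with the fc consuming t2, not t1):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        # Illustrative shapes; Listing 1's real layer parameters may differ.
        self.conv = nn.Conv2d(1, 4, kernel_size=3)
        self.fc = nn.Linear(4 * 26 * 26, 10)

    def forward(self, x):
        t1 = self.conv(x)
        t2 = F.relu(t1)               # not in-place by default: t1 is untouched
        t3 = self.fc(t2.flatten(1))   # must consume t2, not t1
        return t3

net = Net()
out = net(torch.zeros(2, 1, 28, 28))
print(out.shape)  # torch.Size([2, 10])
```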

~~~
MiroF
yes that's a typo

------
foxes
It's a bit funny to call it imperative when, at the end of the day, the
objective is to get something where you have very little insight into what the
neural net is doing to the state.
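That said, the "imperative" in the title refers to eager, line-by-line execution: each op runs as soon as it's called, so intermediate state is at least inspectable with ordinary Python. A minimal sketch using standard torch calls (nothing here is taken from the paper):

```python
import torch

# Each line executes immediately (eager/imperative style), so intermediate
# tensors can be printed or debugged like any other Python value.
x = torch.ones(3, requires_grad=True)
y = x * 2
print(y)        # inspectable right here, mid-"graph"
z = y.sum()
z.backward()    # autograd still works over the ops recorded eagerly
print(x.grad)   # tensor([2., 2., 2.])
```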

