
Tensorflow User Experience - luu
https://nostalgebraist.tumblr.com/post/189464877164/attention-conservation-notice-machine-learning
======
tanilama
TensorFlow is proof that some people at Google don't understand software
engineering: they can't even decide what interface they want to expose or
hide.

It is a disaster.

~~~
gradpanda123
It’s not that they don’t know how to be engineers, it’s that politically they
all want to own their “thing” in TF.

This type of exchange is pretty much what happens all the time:
[https://github.com/williamFalcon/pytorch-lightning/issues/35](https://github.com/williamFalcon/pytorch-lightning/issues/35)

There is no reason why two things can’t be merged together for better UX and
functionality, but two humans have egos, want to lead projects, and be in
charge, so things remain separate.

~~~
tanilama
I don't think that disputes what I have just said.

Software engineering isn't just about putting stuff out there; it also
involves careful planning and a roadmap around abstractions.

For a project as high-profile as TF, if they can't plainly say that having
many parallel implementations of essentially the same functionality is
confusing, hurts the developer experience, and ultimately harms the project
itself, then that only explains the disaster as a result of mismanagement; it
doesn't undo the damage.

~~~
streetcat1
Not in a land grab. TF is crucial to Google's dominance in AI.

------
coolness
Agreed wholeheartedly. PyTorch is so much better at user experience. This
tweet by Andrej Karpathy summarizes my thoughts when I switched
[https://twitter.com/karpathy/status/868178954032513024?lang=...](https://twitter.com/karpathy/status/868178954032513024?lang=en)

------
chillee
Funnily enough, the Tensor2Tensor library he mentions, which replaced the old
deprecated layer, is now also deprecated.

[https://github.com/tensorflow/tensor2tensor](https://github.com/tensorflow/tensor2tensor)

> It is now in maintenance mode — we keep it running and welcome bug-fixes,
> but encourage users to use the successor library Trax.

~~~
p1esk
What?? It’s literally on my todo list of new things to learn!

------
dustintran
Hello. I'm the person that was linked to in that GitHub issue!

I sympathize with the post's frustration. The TF tutorials on the official
website are well-written. But they mostly cover basic features, and as a
recent Reddit thread described
([https://old.reddit.com/r/MachineLearning/comments/e4pxqp/d_i...](https://old.reddit.com/r/MachineLearning/comments/e4pxqp/d_ive_been_switching_over_from_pytorch_to_tf_20/)),
the support ecosystem is lacking: Stack Overflow answers and blog posts are
out of date due to all the software churn. I'm not a TF engineer, but as
someone with experience designing libraries on top of TF, even I find myself
sifting through Stack Overflow and blog post code to find the new best
practices.

Regarding Bayesian layers, it's actually a NeurIPS paper this year
([https://papers.nips.cc/paper/9607-bayesian-layers-a-module-for-neural-network-uncertainty](https://papers.nips.cc/paper/9607-bayesian-layers-a-module-for-neural-network-uncertainty)).
I worked on an early prototype in TensorFlow Probability but ended up
abandoning the design as I found it inflexible in practice. The solution is
the NeurIPS paper, and it's experimental: there are no promises of stability
(in fact, we even moved the code from Tensor2Tensor to another repository
([https://github.com/google/edward2/](https://github.com/google/edward2/)),
which has yet to have an official package release!).

Software for uncertainty models is more on the research fringe, and this
should be made clearer in official TensorFlow solutions building on these
designs.

------
jlebar
I really like the UX of JAX,
[https://github.com/google/jax](https://github.com/google/jax)

The API is numpy plus like four functions. That's the beginning and end. It
does JIT compilation under the hood, so it can run quite fast.
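To make the "numpy plus a few transformations" description concrete, here is a minimal sketch using the real `jax.grad` and `jax.jit` transformations; the loss function itself is just an invented example:

```python
# Sketch of the core JAX surface: numpy-style code plus a handful of
# transformations (grad, jit, vmap). The loss function is made up for
# illustration; jax.grad and jax.jit are the actual API.
import jax
import jax.numpy as jnp

def loss(w, x, y):
    # Plain numpy-style math; no sessions, graphs, or layer classes required.
    pred = jnp.dot(x, w)
    return jnp.mean((pred - y) ** 2)

grad_loss = jax.grad(loss)      # reverse-mode autodiff w.r.t. the first arg
fast_grad = jax.jit(grad_loss)  # JIT-compile via XLA under the hood

w = jnp.zeros(3)
x = jnp.ones((4, 3))
y = jnp.ones(4)
g = fast_grad(w, x, y)
print(g.shape)  # (3,)
```

For this data, every component of the gradient is -2.0: the prediction is 0 everywhere, so the mean-squared-error gradient is 0.5 * xᵀ(pred - y).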

~~~
nihilest
That's not a fair comparison, because JAX does not really provide high-level
libraries (and for the ones it does provide, it introduces many more than
those four functions). For example, there must be at least 4 different
neural network libraries in JAX (e.g.,
[https://github.com/google/jax/blob/master/jax/experimental/s...](https://github.com/google/jax/blob/master/jax/experimental/stax.py),
[https://github.com/google/trax](https://github.com/google/trax)).

~~~
jlebar
Sure, but OP's point is that TensorFlow's built-in libraries suck _and you're
forced to use them_. They are confusing, change all the time, etc. etc.

Since JAX's API is lower-level, you can choose a neural network API that
doesn't have the problems of TF's API. Or worst case, you make it yourself;
you can see that the higher-level APIs provided by JAX are all very simple.
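As a rough sketch of how simple such a higher-level API can be: in the stax style, a "layer" is just an `(init, apply)` pair of plain functions. The helper names below (`Dense`, `Serial`) are invented for illustration, not the actual stax API:

```python
# A toy stax-style NN API: each layer is an (init_fn, apply_fn) pair, and
# combinators just compose those pairs. Hypothetical names, real jax calls.
import jax
import jax.numpy as jnp

def Dense(out_dim):
    def init(rng, in_dim):
        k1, _ = jax.random.split(rng)
        W = jax.random.normal(k1, (in_dim, out_dim)) * 0.01
        b = jnp.zeros(out_dim)
        return (W, b)
    def apply(params, x):
        W, b = params
        return jnp.dot(x, W) + b
    return init, apply

def Serial(*layers):
    inits, applies = zip(*layers)
    def init(rng, in_dim):
        params = []
        for init_fn in inits:
            rng, sub = jax.random.split(rng)
            p = init_fn(sub, in_dim)
            in_dim = p[0].shape[1]  # next layer's input size = this W's cols
            params.append(p)
        return params
    def apply(params, x):
        for p, apply_fn in zip(params, applies):
            x = apply_fn(p, x)
        return x
    return init, apply

init, apply = Serial(Dense(16), Dense(1))
params = init(jax.random.PRNGKey(0), 8)
out = apply(params, jnp.ones((4, 8)))
print(out.shape)  # (4, 1)
```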

------
Nimitz14
Haha, this really nailed several pain points of TF. Laughed and nodded at the
"there are no expert users, instead there are experts of 2017 TF and experts
of 2019 TF" line.

------
antipaul
“I just want to make a neural net with something that doesn’t remind me of
Microsoft Office. Is that too much to ask?”

PyTorch?

------
ddtaylor
Yup. Google gave us TPU usage for free and we gave up on using it.

Also, everything you think or do is deprecated. This comment has been
deprecated.

~~~
jspisak
We have been working with G on TPU support for PyTorch. Have you tried it out?
[https://github.com/pytorch/xla](https://github.com/pytorch/xla)

------
solveit
In _completely unrelated news_ , I noticed that a recent Google paper (AugMix)
used PyTorch.

------
make3
Most of the field is moving to PyTorch because of what the author is
describing. PyTorch is super nice by all accounts.

------
pavlov
_> "Actually, you know what it reminds me of, in some ways? With the profusion
of backwards-incompatible wheel-reinventing features, and the hard-won
platform-specific knowledge you just know will be out of date in two years?"_

Every JavaScript web UI framework ever?

------
m0zg
If PyTorch had something like TensorFlow Lite, TF would probably be dead
already. I certainly wouldn't need it. At the moment, though, TFLite is the
most sane option for inference on phones and other ARM devices.

~~~
jspisak
PyTorch Mobile is a start and is available for iOS and Android. Given that
folks like PFN and Microsoft are (or will be) heavy contributors, I would
expect support for more devices to broaden. Have you tried it out yet? No
need for a separate set of op semantics or a separate framework... :)
[https://pytorch.org/mobile/home/](https://pytorch.org/mobile/home/)

~~~
m0zg
Anything that can't use mobile GPU (or DSP/TPU for quantized inference) is
pretty useless IMO, because it's just not energy efficient enough to be
practical in a battery powered device, even if it's fast enough.

~~~
Nimitz14
Once PyTorch is updated to use XNNPACK (being worked on right now), I think it
should be fine to use. That plus QNNPACK makes inference quite low on power
usage in my (admittedly limited, having just integrated XNNPACK) experience.

~~~
m0zg
As a rule, CPU burns at least 5x the energy per FLOP. So no, CPU is not a
viable option on mobile if you need to do inference constantly. For "every now
and then" cases, sure.

~~~
Nimitz14
Interesting, thanks.

------
rsokl
If anyone is interested in a dead-simple numpy-based autograd library:
[https://mygrad.readthedocs.io/en/latest/](https://mygrad.readthedocs.io/en/latest/)

I created this for educational purposes, but it is quite robust, simple, well-
tested, and well-documented. It also includes neural network style operations
like N-dimensional convolutions and pooling.

Plus backprop through all variations of einsum :)
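For anyone wondering what a numpy-based autograd does under the hood, here is a toy reverse-mode sketch. Everything in it (the `Var` class and its methods) is invented for illustration, and MyGrad itself is far more general, but the core idea is the same: record each operation's inputs and local derivatives, then walk backwards applying the chain rule.

```python
# Toy reverse-mode automatic differentiation on scalars.
class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        # parents: (parent_var, local_derivative) pairs for the chain rule
        self.parents = parents

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, seed=1.0):
        # Accumulate d(output)/d(self), then propagate via the chain rule.
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

x = Var(3.0)
y = Var(4.0)
z = x * y + x          # dz/dx = y + 1 = 5, dz/dy = x = 3
z.backward()
print(x.grad, y.grad)  # 5.0 3.0
```

Real libraries do the backward pass over a topologically sorted tape rather than by naive recursion, but the accumulation logic is the same.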

~~~
needinfoadiff
Hello! Could you please let me know a good place to start learning about
automatic differentiation?

------
rjsw
Is TensorFlow good? I'm still working through trying to get Bazel to build
it.

~~~
mark_l_watson
Why build it yourself? Just pip install it into a fresh Python environment.

~~~
ganstyles
Conda works really well for this too, all in one. Aside from that, I've set
up a requirements.txt that I just reuse in every project: copy it to my
working directory, build, and stay in the virtual environment.

------
batmansmk
TensorFlow's developer-experience issues are due to how fast deep learning
algorithms were evolving when it was designed.

Now that the invariants are known, better abstractions have been designed,
such as Keras.

~~~
wdroz
The first Keras version was released on 27 March 2015.

The first TensorFlow version was released on 9 November 2015.

Both had to follow the evolution of DL, but Keras was developer-friendly from
the beginning.

~~~
jeffshek
That's a little unfair, no? Supporting TF's API is much easier than building
TF itself for Google (and then releasing it to the public).

