
Does TensorFlow Suffer from the Second-System Effect? - ajschumacher
http://planspace.org/20170311-does_tensorflow_suffer_from_the_second_system_effect/
======
jkred5
I don't know that I'd consider TF a second system to DistBelief alone so much
as a second system to the family of Caffe, Theano, and Torch. Those
predecessors each got some subset of the design elements right, IMO.

For instance, symbolic differentiation and graph compilation are things Theano
got right, and Caffe and Theano both made switching between GPU and CPU easy.
Torch, for its part, has a simple scripted feel through Lua.
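What symbolic differentiation buys you can be sketched in a few lines of plain
Python — a toy expression graph with a per-node gradient rule (hypothetical
classes for illustration, nothing like Theano's actual implementation):

```python
# Toy symbolic differentiation: build an expression tree, then derive
# the gradient as a new expression tree rather than a numeric value.

class Var:
    def __init__(self, name): self.name = name
    def __add__(self, other): return Add(self, other)
    def __mul__(self, other): return Mul(self, other)
    def grad(self, wrt): return Const(1.0) if self is wrt else Const(0.0)
    def eval(self, env): return env[self.name]

class Const(Var):
    def __init__(self, value): self.value = value
    def grad(self, wrt): return Const(0.0)
    def eval(self, env): return self.value

class Add(Var):
    def __init__(self, a, b): self.a, self.b = a, b
    def grad(self, wrt): return self.a.grad(wrt) + self.b.grad(wrt)
    def eval(self, env): return self.a.eval(env) + self.b.eval(env)

class Mul(Var):
    def __init__(self, a, b): self.a, self.b = a, b
    def grad(self, wrt):  # product rule, applied symbolically
        return self.a.grad(wrt) * self.b + self.a * self.b.grad(wrt)
    def eval(self, env): return self.a.eval(env) * self.b.eval(env)

x = Var('x')
y = x * x + x        # d/dx (x^2 + x) = 2x + 1
dy = y.grad(x)       # a new expression tree, built before any numbers exist
print(dy.eval({'x': 3.0}))  # 7.0
```

The point is that `dy` is itself a graph, so a framework can optimize or
compile it for different hardware before evaluating it — the design Theano
pioneered and TF adopted.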

But none of them seem as deliberately designed as TensorFlow. Theano has a
sprawling, "academic-feeling" codebase with many contributors, and can be
opaque. Caffe focused mostly on making feedforward convolutional nets simple;
it doesn't have symbolic differentiation, and pycaffe doesn't seem to support
everything easily. Torch picked Lua rather than Python, which seems to have
made it annoying for some to pick up and debug. Theano is probably the easiest
framework for mashing up techniques like LSTMs with convnets, but none of them
do it very well, IMO.

TF seems to get right the big things that worked in prior frameworks: symbolic
differentiation, graph compilation, the ability to combine intermediate
results from other networks, and very useful debugging tools in TensorBoard
and graph inspection. The core argument here seems to be that many won't find
the ability to run specific parts of the computation graph on different
hardware useful. That may be true, but it can also largely be ignored. It is
annoying that TF defaults to allocating all GPU memory to itself
([http://stackoverflow.com/questions/34199233/how-to-prevent-t...](http://stackoverflow.com/questions/34199233/how-to-prevent-tensorflow-from-allocating-the-totality-of-a-gpu-memory)),
for instance, but by and large I think the core design of a computation graph
that can be spread across hardware is a net win, even if a lot of people won't
appreciate that until a few years from now.
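For reference, the workaround from that Stack Overflow thread is a small
session-config fragment (TF 1.x-era `ConfigProto` API; the memory-fraction
line is the alternative knob mentioned in the same thread):

```python
import tensorflow as tf  # TF 1.x-era API

# Allocate GPU memory on demand instead of grabbing it all up front.
config = tf.ConfigProto()
config.gpu_options.allow_growth = True
# Alternatively, cap the process at a fraction of total GPU memory:
# config.gpu_options.per_process_gpu_memory_fraction = 0.4

sess = tf.Session(config=config)
```

So the default is annoying, but it's a two-line fix rather than a design flaw.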

