
TensorFlow 101 - sidcool
https://mubaris.com/2017-10-21/tensorflow-101
======
shadowmint
I'm pretty sure this is already well covered in
[https://www.tensorflow.org/programmers_guide/](https://www.tensorflow.org/programmers_guide/)

I don't think reading this will give you any understanding of what tensorflow
is useful for, or how to do it.

The steps of writing a tensorflow program are always like the 'how to draw an
owl' meme: first you define some simple tensors... then just do the rest of it.

Step 1: Define tensors and inputs. OK!

Step 2: Linear regression. OK! (useless, but sure)

Step 3: Generate high resolution cat videos from a corpus of dog videos. Uh...
????
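To be fair, steps 1 and 2 really are that easy. Here's a rough sketch of the 'define the trainable values, then fit by gradient descent' loop that tensorflow automates for you - in plain numpy, so the data, constants, and variable names here are my own for illustration, not anything from the article:

```python
import numpy as np

# Toy data for y = 3x + 2 (slope/intercept chosen arbitrarily for this sketch)
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=100)
y = 3.0 * x + 2.0 + rng.normal(scale=0.01, size=100)

# Step 1: define the trainable "tensors" (here, just two plain scalars)
w, b = 0.0, 0.0

# Step 2: linear regression -- gradient descent on mean squared error,
# which is what a tf optimizer would be doing behind the scenes
learning_rate = 0.1
for _ in range(500):
    err = (w * x + b) - y
    w -= learning_rate * 2.0 * np.mean(err * x)
    b -= learning_rate * 2.0 * np.mean(err)

print(round(w, 2), round(b, 2))  # converges close to 3.0 and 2.0
```

Step 3, of course, is where the sketch stops helping.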

There are plenty of very good simple introductions to tensorflow.

The tensorflow tutorials themselves try to drop people in 'from the top' with
high level practical examples, so we're good on that front too.

What's missing is a middle ground of 'and then do something practical but
simple'.

You know what the 'best practice' advice for working with a GAN is?

Find someone else's implementation, copy it, tweak the hyperparameters, and
change the input.

This is why I recommend people learn keras, not tensorflow: it isn't super
production-ready and practical, but it _will_ let you learn to build and test
models easily.
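For what I mean by 'build and test models easily': a small keras model is basically just a list of layers. (The layer sizes below are arbitrary placeholders I picked for the sketch, not anything canonical.)

```python
from tensorflow import keras  # at the time, standalone `import keras` worked too

# A minimal fully-connected model: 4 inputs -> 8 hidden units -> 1 output
model = keras.Sequential([
    keras.Input(shape=(4,)),
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(optimizer="sgd", loss="mse")
```

From there it's `model.fit(x, y)` and `model.evaluate(x, y)` - that's the whole loop, which is exactly why it's good for learning.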

...and if there's one 'programmers guide' to tensorflow, it's exactly that:

You don't just 'build' a tensorflow model and then, problem solved, off you go.

Nope, you're going to be going back and tweaking and changing and randomly
trying different stuff over and over again until you stumble into a 'good
enough' solution to run with.

...and that solution: it might almost work for some other, similar domains...
but it probably doesn't generalize. You'll probably have to do the whole thing
from scratch again.

Machine learning. Fun times.

~~~
mark_l_watson
Good points. One comment: I use GANs at work. While useful, they can be
difficult to train. Sometimes the loss function for the combined discriminator
and generator does not decrease with training as much as you would like, but
the generator used on its own is still useful. Goodfellow, inventor of the GAN,
spends only about one page of his long deep learning book on GANs. RNNs also
make good generators and are easier to train. Anyway, GANs may not be good for
practical how-to tutorials.

Edit: good advice on using Keras. Keras is ‘understandable’ in the sense that
reading the code for Keras itself is useful, and Francois Chollet, creator of
Keras, has a fantastic new book out - which I strongly recommend.

------
partycoder
I like the "AI for humans" series, by Jeff Heaton. It's a series of books plus
some code examples that can be found here:

[https://github.com/jeffheaton/aifh](https://github.com/jeffheaton/aifh)

For python, no libraries other than numpy are used in the examples - which is
not production-friendly, but it offers insight into the concepts, which I
think is a good start.

~~~
scarlac
Which libraries would you recommend for production use instead of numpy?

~~~
partycoder
I would rather say in addition to numpy. Tensorflow and the more recent CNTK
may be good candidates for a production environment. Keras is a simplified
interface to those.

Other libraries that people have used include PyTorch and scikit-learn. But
right now Tensorflow and CNTK might be among the fastest/most scalable.

------
conceptoriented
From this (and many other) tutorials it is not clear whether tensors in
tensorflow are true mathematical tensors (that is, having covariant and
contravariant indices) or just multidimensional arrays. The name Tensorflow
and its terminology suggest that Tensorflow manipulates mathematical tensors,
for example:

    Scalars -> Vectors -> Matrices -> Tensors (really tensors?)

but what you actually see are multidimensional arrays. It is of course not a
big problem, but it could probably be clarified somewhere, at least in small
print, to avoid ambiguity. Or are Tensorflow objects true tensors indeed?
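For what it's worth, in tensorflow (as in numpy) that progression just counts array dimensions. A quick check - numpy used here purely for brevity, but `tf.constant(...)` carries the same kind of rank/shape information:

```python
import numpy as np

scalar = np.array(5.0)           # rank 0: a single number
vector = np.array([1.0, 2.0])    # rank 1
matrix = np.eye(2)               # rank 2
tensor = np.zeros((2, 3, 4))     # rank 3: "tensor" here just means higher ndim

print(scalar.ndim, vector.ndim, matrix.ndim, tensor.ndim)  # prints: 0 1 2 3
```

Nothing about covariant or contravariant indices is tracked anywhere in that.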

~~~
dragandj
Even if they were (which I doubt), I haven't found a clear and informed
description of how exactly they relate to the tensors found in math/physics
literature. I agree with your view that they look more like nd-arrays.

~~~
zazen
I was surprised when I first saw the word "tensor" being thrown around by
computer scientists to apparently mean just multi-dimensional array. But then
I thought, well, "vector" is very widely used - including by mathematicians -
to mean simply an nx1 or 1xn array, rather than an object which transforms a
certain way under coordinate transformations. So in the same way, I suppose we
really might as well use "tensor" to mean "just" a multi-dimensional array of
numbers, in contexts where coordinate transformations aren't important.
Mathematical physics can continue to use the other definition where necessary,
just as it does for vectors.
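For reference, the 'transforms a certain way' definition being contrasted here: a tensor with one contravariant and one covariant index changes under a coordinate change x -> x' as

```latex
T'^{i}{}_{j} = \frac{\partial x'^{i}}{\partial x^{k}}
               \frac{\partial x^{l}}{\partial x'^{j}} \, T^{k}{}_{l}
```

whereas a bare multi-dimensional array carries no transformation law at all - it's just indexed storage.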

~~~
dragandj
The trouble with that approach is that in CS, tensors are mostly used in
machine learning, which is _very_ math-dependent. So you read in a textbook or
a paper that something can be done elegantly using some linear algebra
operation, or some transformation on a tensor, and you're delighted, because
your library claims to be tensor-based - but then, when you try to code it:
whoops. You thought you had tensor support, but all you've got is a
multidimensional array memory layout...

------
scarlac
As someone who went through the official Tensorflow 'get started' and many
mini tutorials on the net, I found this article of no help. It introduces a
multitude of libraries, math notations, ML concepts, and lots of (random?)
undocumented constants.

There's a big leap from the relatively simple concepts to the relatively
complex example for what is supposed to be a "101". I would still recommend
that people start with
[https://www.tensorflow.org/get_started/get_started](https://www.tensorflow.org/get_started/get_started)

