
TensorFlow Example: Fit a straight line - jostmey
https://github.com/jostmey/NakedTensor/blob/master/serial.py
======
mockery
Can someone explain what on earth the type of 'error' is? It starts off as 0.0
but then gets +='d with expressions containing tf.Variables, so I assume it
turns into some object that is basically a function? Why not write it as an
actual function? Are they doing automatic differentiation for their gradient
descent, or something like that?

(Grumble grumble something about dynamic and/or implicit type systems.)

~~~
mockery
Ok so looking into this a bit more, that's exactly what they're doing. In
fact, TensorFlow is linked as an example on the wiki page for automatic
differentiation
[https://en.wikipedia.org/wiki/Automatic_differentiation]

"TensorFlow™ is an open source software library for numerical computation
using data flow graphs. Nodes in the graph represent mathematical operations,
while the graph edges represent the multidimensional data arrays (tensors)
communicated between them."
[https://www.tensorflow.org/]

In retrospect the name "Tensor Flow" makes a lot more sense now - I had only
ever seen it in the context of machine learning, and assumed it was pretty
specific to that domain.
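The mechanism behind the `error += ...` puzzle above can be sketched in plain Python (this is an illustrative toy, not TensorFlow's actual implementation; the `Node` class and its methods are hypothetical). Overloaded operators return graph-node objects, and `0.0 + node` dispatches to `node.__radd__`, so the `+=` rebinds `error` from a float to a node that records the expression for later differentiation:

```python
class Node:
    """Toy expression-graph node supporting + and *, with backprop."""

    def __init__(self, value, grad_fn=lambda g: None):
        self.value = value      # forward value
        self.grad = 0.0         # accumulated d(error)/d(self)
        self.grad_fn = grad_fn  # propagates gradient to inputs

    def _lift(self, other):
        return other if isinstance(other, Node) else Node(float(other))

    def __add__(self, other):
        other = self._lift(other)
        out = Node(self.value + other.value)

        def grad_fn(g):
            self.grad += g
            other.grad += g
            self.grad_fn(g)
            other.grad_fn(g)

        out.grad_fn = grad_fn
        return out

    # Handles `0.0 + node`, which is exactly what `error += ...` triggers
    # when `error` is still a plain float.
    __radd__ = __add__

    def __mul__(self, other):
        other = self._lift(other)
        out = Node(self.value * other.value)

        def grad_fn(g):
            self.grad += g * other.value
            other.grad += g * self.value
            self.grad_fn(g * other.value)
            other.grad_fn(g * self.value)

        out.grad_fn = grad_fn
        return out

    __rmul__ = __mul__


m = Node(2.0)
error = 0.0
error += (m * 3.0 + 1.0) * (m * 3.0 + 1.0)  # (3m + 1)^2 at m = 2

print(type(error).__name__)  # Node -- no longer a float
error.grad_fn(1.0)           # backpropagate d(error)/d(error) = 1
print(error.value, m.grad)   # 49.0, and d/dm (3m+1)^2 = 6*(3m+1) = 42.0
```

This is why `error` can be fed to an optimizer: it is not a number but a recorded computation whose gradient with respect to each tf.Variable can be computed automatically.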

------
hughes
Odd mix of (I presume) "bare bones" and "rock bottom" in the title.
Alternative might be "Rock Bones"?

~~~
mholt
I think it's a pun on the repo name, "NakedTensor"

------
rrggrr
Can someone ELI5 this example?

~~~
IMTDb
This example builds a really simple tensorflow model that is able to predict
the result of a function y = f(x), where f is a straight line.

The tensorflow example predicts the result of this function for any x without
knowing the function itself, only some example points (points for which the
model knows both x and y).

This is a "Hello World" in the area of AI, adapted for the tensorflow library.
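For what it's worth, the underlying idea can be shown without TensorFlow at all (this plain-Python sketch is not the repo's code; the sample points and learning rate are made up for illustration). We fit y = m*x + b to example points by gradient descent on the summed squared error, which is what the TensorFlow graph does for us automatically:

```python
# Example points generated from the (unknown to the model) line y = 2x + 1.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]

m, b = 0.0, 0.0       # initial guesses for slope and intercept
learning_rate = 0.01

for _ in range(5000):
    # Hand-derived gradients of error = sum((m*x + b - y)^2)
    # with respect to m and b.
    grad_m = sum(2 * (m * x + b - y) * x for x, y in zip(xs, ys))
    grad_b = sum(2 * (m * x + b - y) for x, y in zip(xs, ys))
    m -= learning_rate * grad_m
    b -= learning_rate * grad_b

print(round(m, 3), round(b, 3))  # -> 2.0 1.0
```

The TensorFlow version differs mainly in that you never write `grad_m` and `grad_b` by hand: the library derives them from the graph.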

------
dia80
I just don't see the point of all these simple introductions to some ML topic.
If you want to do a good job it is hard. If you need a simple introduction and
don't just dig in and persevere with the harder stuff, you probably aren't
going to succeed in whatever ML task you aspire to.

~~~
chriswarbo
I upvoted because I'm quite familiar with ML theory and like to read the
latest tweaks, etc. but I've never used a tool like TensorFlow. Papers, blog
posts, etc. usually give diagrams and/or fully-fledged code snippets. That's
great for the high-level overview, and low-level nitty-gritty, respectively.

Yet minimal code examples like this let me see straight away what TensorFlow
programs look like, without having to distinguish between fundamental aspects
and snippet-specific ones.

My only remark would be to put a comment at the top, something like "Fit a
straight line, of the form y=m*x+b".

I guessed that the "m" and "b" variables were referring to these common
usages, but was wary of this assumption until line 14.

For all I know, "m" and "b" could be common parameter names for some
TensorFlow config or something :)

~~~
dgacmu
Another suggestion: 'operation' would be better named 'train_step' -- it's
more descriptive and eliminates the need for the comment at the end.

Also, it's best practice and common tensorflow idiom to do:

    init = tf.initialize_all_variables()
    ...
    with tf.Session() as session:
        session.run(init)

Not just for style, but because creating new graph nodes after creating the
session requires tearing down and then re-setting-up some of the internal
session stuff. It's much faster to move the initialize_all_variables()
creation outside of the session.

