Question
A very simple example in TensorFlow: minimize (x + 1)^2, where x is a scalar. The code is:
import tensorflow as tf
x = tf.Variable(initial_value=3.0)
add = tf.add(x, 1)
y = tf.square(add)
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01)
train = optimizer.minimize(y)
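What `optimizer.minimize(y)` does on each `train` step can be sketched in plain Python (no TensorFlow; the starting value 3.0 and learning rate 0.01 match the snippet above — this is an illustration of the update rule, not TensorFlow's actual implementation):

```python
# Gradient descent on y = (x + 1)^2 by hand:
# each step is x <- x - lr * dy/dx, with dy/dx = 2 * (x + 1).
x = 3.0   # same initial_value as the tf.Variable above
lr = 0.01  # same learning_rate as the optimizer above

for _ in range(500):
    grad = 2 * (x + 1)  # derivative of (x + 1)^2
    x -= lr * grad

# x approaches -1, the minimizer of (x + 1)^2
print(x)
```

Note that the update rule only ever evaluates 2*(x + 1); the squared value itself is never needed, which is exactly the point the answer below makes about the graph.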
Then write the graph to disk:
graph = tf.get_default_graph()
writer = tf.summary.FileWriter("some/dir/to/write/events")
writer.add_graph(graph=graph)
Finally, visualize it in TensorBoard; it looks like this:
The question is: why is the node "Add" connected to the gradients? Since I am trying to minimize y, I would expect the node "Square" to be connected instead. Is this a bug? Can anyone explain it?
Answer 1:
There is no bug involved. You just need to understand what a gradient is and know how to compute one yourself. Here ((x+1)^2)' = 2*(x+1), which means that you do not need to calculate (x+1)^2 to calculate the gradient — only the output of the Add node, (x+1), appears in the derivative. If you zoom in on the gradient part, you will see that it computed the gradient of your square and figured out which part of the graph is needed there:
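To see that 2*(x+1) really is the derivative — and that it uses the Add output but never the Square output — a quick central finite-difference check in plain Python (no TensorFlow needed) confirms it at the snippet's starting point x = 3.0:

```python
# Numerically verify d/dx (x + 1)^2 = 2 * (x + 1) at x = 3.0.
def f(x):
    return (x + 1) ** 2

x = 3.0
h = 1e-6

# Central finite difference: (f(x + h) - f(x - h)) / (2h)
numeric = (f(x + h) - f(x - h)) / (2 * h)

# Analytic derivative: note it only needs (x + 1), the Add output,
# never the squared value produced by the Square node.
analytic = 2 * (x + 1)

print(numeric, analytic)  # both approximately 8.0
```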
Here is a more interesting and more intuitive example:
import tensorflow as tf
x = tf.Variable(initial_value=3.0)
y = tf.cos(x)
train = tf.train.GradientDescentOptimizer(learning_rate=0.01).minimize(y)
with tf.Session() as sess:
    writer = tf.summary.FileWriter('logs', sess.graph)
    writer.close()
You should know that cos(x)' = -sin(x), which means that only x is needed to calculate the gradient — the value of cos(x) itself never appears in it. And this is what you see in the graph:
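The same finite-difference check works here (plain Python, no TensorFlow): the derivative -sin(x) depends only on x, not on cos(x), which is why the gradient subgraph connects to x alone.

```python
import math

# Numerically verify d/dx cos(x) = -sin(x) at x = 3.0
# (the same initial_value as the tf.Variable above).
x = 3.0
h = 1e-6

# Central finite difference of cos
numeric = (math.cos(x + h) - math.cos(x - h)) / (2 * h)

# Analytic derivative: needs only x itself, not cos(x)
analytic = -math.sin(x)

print(numeric, analytic)  # both approximately -0.1411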
Source: https://stackoverflow.com/questions/44342432/is-gradient-in-the-tensorflows-graph-calculated-incorrectly