Question
I keep getting "AttributeError: Tensor.name is meaningless when eager execution is enabled." with TensorFlow 2.0. Is this supposed to work?
import tensorflow as tf
import numpy as np

x = tf.constant(3.0)
with tf.GradientTape() as t:
    t.watch(x)
    y = (x - 10) ** 2
    opt = tf.optimizers.Adam()
    opt.minimize(lambda: y, var_list=[x])
Answer 1:
Inside the tape you should only compute the forward pass; the optimizer and the minimize call are not part of the forward pass, so you have to remove them from the with block.
Moreover, if you want to use the optimizer's minimize method, you don't need a tf.GradientTape object at all: just define the forward pass (the loss computation) as a function, and the optimizer will create the tape and minimize the function for you.
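For example, a minimal sketch of that pattern (note it requires a tf.Variable rather than a constant, since minimize only updates trainable variables):

import tensorflow as tf

x = tf.Variable(3.0)  # minimize() only updates trainable variables
opt = tf.optimizers.Adam()
# The loss is passed as a zero-argument callable; the optimizer
# records the tape and applies the gradients in one step.
opt.minimize(lambda: (x - 10) ** 2, var_list=[x])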
However, since you want to differentiate with respect to a constant and not a variable, you have to use a tf.GradientTape, watch the constant explicitly, and compute the gradients manually:
import tensorflow as tf

x = tf.constant(3.0)
with tf.GradientTape() as t:
    t.watch(x)  # constants are not watched automatically
    y = (x - 10) ** 2
grads = t.gradient(y, [x])
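Here grads[0] evaluates to -14.0, since dy/dx = 2(x - 10) = -14 at x = 3:

print(grads[0])  # tf.Tensor(-14.0, shape=(), dtype=float32)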
Of course, you can't then apply the gradients:
opt = tf.optimizers.Adam()
opt.apply_gradients(zip(grads, [x]))
since x is not a trainable variable but a constant (the apply_gradients call will raise an exception).
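If you do need the update to actually run, a minimal sketch of the working variant replaces the constant with a variable (variables are watched by the tape automatically, so t.watch is no longer needed):

import tensorflow as tf

x = tf.Variable(3.0)  # a trainable variable instead of a constant
opt = tf.optimizers.Adam()
with tf.GradientTape() as t:
    y = (x - 10) ** 2  # forward pass only, inside the tape
grads = t.gradient(y, [x])
opt.apply_gradients(zip(grads, [x]))  # succeeds: x is now a Variable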
Source: https://stackoverflow.com/questions/55552538/tensorflow-2-0-attributeerror-tensor-name-is-meaningless-when-eager-execution