Can't access TensorFlow Adam optimizer namespace


Question


I'm trying to learn about GANs and I'm working through the example here.

The code below, which uses the Adam optimizer, gives me this error:

"ValueError: Variable d_w1/Adam/ does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=None in VarScope?"

I'm using TF 1.1.0

d_loss_real = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(logits=Dx, labels=tf.fill([batch_size, 1], 0.9)))
d_loss_fake = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(logits=Dg, labels=tf.zeros_like(Dg)))
d_loss = d_loss_real + d_loss_fake

tvars = tf.trainable_variables()

d_vars = [var for var in tvars if 'd_' in var.name]
g_vars = [var for var in tvars if 'g_' in var.name]



# Train the discriminator
# Increasing from 0.001 in GitHub version
with tf.variable_scope(tf.get_variable_scope(), reuse=False) as scope:

    # Next, we specify our two optimizers. In today’s era of deep learning, Adam seems to be the
    # best SGD optimizer as it utilizes adaptive learning rates and momentum. 
    # We call Adam's minimize function and also specify the variables that we want it to update.
    d_trainer_real = tf.train.AdamOptimizer(0.0001).minimize(d_loss_real, var_list=d_vars)
    d_trainer_fake = tf.train.AdamOptimizer(0.0001).minimize(d_loss_fake, var_list=d_vars)

I think the Adam optimizer puts its variables into its own namespace, but for some reason they aren't initialized. I do call global_variables_initializer later in the code, as can be seen on the GitHub page. I've been checking the documentation; I think it may be related to needing some kind of reuse_variables() call in there, but I'm not sure.

Any help much appreciated.


Answer 1:


Your ValueError is caused by creating new variables while variable_scope.reuse == True.

Adam creates variables when you call its minimize function, in order to store the momentum accumulators for each trainable variable in your graph.
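
As a minimal sketch of what minimize does behind the scenes (the variable name d_w1 is taken from your error message; the exact slot names may differ slightly by TF version):

import tensorflow as tf  # TF 1.x, as in the question

w = tf.get_variable("d_w1", shape=[2, 2])
loss = tf.reduce_sum(tf.square(w))

# minimize() makes Adam create its moment accumulators as extra variables
tf.train.AdamOptimizer(0.0001).minimize(loss, var_list=[w])

print([v.name for v in tf.global_variables()])
# expect something like:
# ['d_w1:0', 'beta1_power:0', 'beta2_power:0', 'd_w1/Adam:0', 'd_w1/Adam_1:0']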

Note that reuse=False DOES NOT work the way you expect here. Once the reuse state has been set to True it can never be changed back to False, and the reuse state is inherited by all sub-scopes.

# assuming the default scope's reuse was already set to True earlier,
# passing reuse=False here has no effect:
with tf.variable_scope(tf.get_variable_scope(), reuse=False) as scope:
    assert tf.get_variable_scope().reuse == True

My guess is that you set reuse to True somewhere before the code you posted, so the default variable_scope already has reuse == True. You then open a new variable_scope for Adam, but the new scope inherits the reuse state of the default scope. Adam therefore tries to create its variables while reuse == True, which raises the error.
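
Here is a minimal sketch that reproduces the problem (the variable name d_w1 is just illustrative):

import tensorflow as tf  # TF 1.x

w = tf.get_variable("d_w1", shape=[2, 2])
loss = tf.reduce_sum(tf.square(w))

# Setting the *default* scope to reuse is sticky:
tf.get_variable_scope().reuse_variables()

# reuse=False is effectively ignored here; the scope still inherits reuse == True,
# so Adam cannot create d_w1/Adam and raises the ValueError from the question.
with tf.variable_scope(tf.get_variable_scope(), reuse=False):
    train_op = tf.train.AdamOptimizer(0.0001).minimize(loss, var_list=[w])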

The solution is to set variable_scope.reuse=True only inside a sub-scope of the graph's default scope. That way the default scope's reuse stays False, and Adam.minimize will work.
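
For example, a minimal sketch of that structure (the scope name "disc" and the tiny discriminator are illustrative, not your actual code):

import tensorflow as tf  # TF 1.x

def discriminator(x):
    w = tf.get_variable("d_w1", shape=[1, 1])
    return tf.matmul(x, w)

x_real = tf.placeholder(tf.float32, [None, 1])
x_fake = tf.placeholder(tf.float32, [None, 1])

# reuse=True is set only on the sub-scope "disc", so the default
# (top-level) scope keeps reuse == False.
with tf.variable_scope("disc"):
    Dx = discriminator(x_real)
with tf.variable_scope("disc", reuse=True):
    Dg = discriminator(x_fake)

d_loss = tf.reduce_mean(Dx) - tf.reduce_mean(Dg)
d_vars = [v for v in tf.trainable_variables() if v.name.startswith("disc/")]

# Back in the default scope reuse is still False, so Adam can create its
# slot variables (disc/d_w1/Adam, ...) without raising the ValueError.
d_trainer = tf.train.AdamOptimizer(0.0001).minimize(d_loss, var_list=d_vars)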



Source: https://stackoverflow.com/questions/44440900/cant-access-tensorflow-adam-optimizer-namespace
