Can't access TensorFlow Adam optimizer namespace

Submitted by 亡梦爱人 on 2019-12-01 00:29:53

Your ValueError is caused by creating new variables inside a variable scope whose reuse flag is True (variable_scope.reuse == True).

Adam creates such variables when you call its minimize function: it allocates slot variables that store the moment estimates for every trainable variable in your graph.
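
For instance, here is a minimal TF 1.x sketch (the variable name is made up for illustration) showing where those slot variables come from:

import tensorflow as tf

w = tf.get_variable("w", shape=[2])
loss = tf.reduce_sum(tf.square(w))
opt = tf.train.AdamOptimizer(learning_rate=1e-3)
train_op = opt.minimize(loss)    # the slot variables are created here

print(opt.get_slot_names())      # ['m', 'v'] -- first and second moment slots
print(opt.get_slot(w, "m").name) # something like 'w/Adam:0', created under w's scope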

Note that passing reuse=False DOES NOT work the way you might expect. Once reuse has been set to True it can never be switched back to False, and the reuse state is inherited by all sub-scopes.

tf.get_variable_scope().reuse_variables()  # the default scope is now reuse=True
with tf.variable_scope(tf.get_variable_scope(), reuse=False) as scope:
    # reuse=False only means "inherit from the parent", so reuse stays True
    assert tf.get_variable_scope().reuse == True

I suspect you set reuse to True somewhere before the code you posted, so the default variable_scope has reuse == True. You then create a new variable_scope for Adam, but the new scope inherits the reuse state of the default scope. Adam therefore tries to create its slot variables while reuse == True, which raises the error.
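
Here is a minimal sketch that reproduces that failure mode under the assumption above (TF 1.x; the exact ValueError text varies by version):

import tensorflow as tf

w = tf.get_variable("w", shape=[2])
loss = tf.reduce_sum(tf.square(w))

tf.get_variable_scope().reuse_variables()  # reuse was flipped to True earlier in the code
# minimize() now calls get_variable() for Adam's slots while reuse == True and fails,
# with something like "ValueError: Variable w/Adam does not exist, or was not
# created with tf.get_variable()."
train_op = tf.train.AdamOptimizer(1e-3).minimize(loss)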

The solution is to open a sub-scope under the graph's default scope whenever you need reuse=True. The default scope then keeps reuse == False, and Adam.minimize can create its slot variables without complaint.
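
A minimal sketch of that arrangement (TF 1.x; the shared model, scope name, and shapes are made up for illustration):

import tensorflow as tf

def shared_model(x):
    w = tf.get_variable("w", shape=[4, 4])
    return tf.matmul(x, w)

x1 = tf.placeholder(tf.float32, [None, 4])
x2 = tf.placeholder(tf.float32, [None, 4])

# Confine reuse=True to a named sub-scope instead of flipping the default scope.
with tf.variable_scope("shared"):
    y1 = shared_model(x1)
with tf.variable_scope("shared", reuse=True):  # only this sub-scope reuses
    y2 = shared_model(x2)

assert tf.get_variable_scope().reuse == False  # the default scope is untouched

loss = tf.reduce_mean(tf.square(y1 - y2))
# Adam can now create its slot variables without hitting reuse == True.
train_op = tf.train.AdamOptimizer(1e-3).minimize(loss)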
