I created a trainable variable in a scope. Later, I entered the same scope, set the scope to reuse_variables, and used get_variable to retrieve the same variable, this time passing trainable=False.
After looking at the documentation and the code, I was not able to find a way to remove a Variable from the TRAINABLE_VARIABLES collection.
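A minimal sketch that reproduces what I am seeing (written with `tf.compat.v1` so it also runs under TensorFlow 2; on TensorFlow 1.x, plain `import tensorflow as tf` behaves the same):

```python
import tensorflow.compat.v1 as tf  # plain `import tensorflow as tf` on TF 1.x
tf.disable_v2_behavior()

with tf.variable_scope("scope"):
    w = tf.get_variable("weights", shape=[2, 2], trainable=True)

with tf.variable_scope("scope", reuse=True):
    # trainable=False is silently ignored: the existing variable is returned
    w_again = tf.get_variable("weights", trainable=False)

assert w is w_again
print(w in tf.trainable_variables())  # True
```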
- When tf.get_variable('weights', trainable=True) is called, the variable is added to the list of TRAINABLE_VARIABLES.
- If you later call tf.get_variable('weights', trainable=False), you get the same variable, but the argument trainable=False has no effect because the variable is already present in the list of TRAINABLE_VARIABLES (and there is no way to remove it from there).
- When calling the minimize method of the optimizer (see doc.), you can pass a var_list=[...] as argument with the variables you want to optimize.
For instance, if you want to freeze all the layers of VGG except the last two, you can pass the weights of the last two layers in var_list.
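A sketch of this, with a small toy network instead of VGG (the scope names "frozen" and "last_two" are made up for the example; `tf.compat.v1` is used so it also runs under TensorFlow 2):

```python
import tensorflow.compat.v1 as tf  # plain `import tensorflow as tf` on TF 1.x
tf.disable_v2_behavior()

x = tf.placeholder(tf.float32, [None, 4])
y = tf.placeholder(tf.float32, [None, 1])

# Layers to keep fixed live under one scope, the rest under another
with tf.variable_scope("frozen"):
    h = tf.layers.dense(x, 8)
with tf.variable_scope("last_two"):
    h = tf.layers.dense(h, 8)
    out = tf.layers.dense(h, 1)

loss = tf.reduce_mean(tf.square(out - y))

# Collect only the trainable variables of the last two layers...
train_vars = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope="last_two")
# ...and hand them to minimize; the "frozen" variables are never updated
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(loss, var_list=train_vars)
```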
You can use a tf.train.Saver() to save variables and restore them later (see this tutorial).
Save with saver.save(sess, "/path/to/dir/model.ckpt") and restore with saver.restore(sess, "/path/to/dir/model.ckpt"). Optionally, you can decide to save only some of the variables in your checkpoint file. See the doc for more info.
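A round-trip sketch (the checkpoint goes to a temporary directory here; `tf.compat.v1` is used so it also runs under TensorFlow 2):

```python
import os
import tempfile
import tensorflow.compat.v1 as tf  # plain `import tensorflow as tf` on TF 1.x
tf.disable_v2_behavior()

v = tf.get_variable("v", initializer=tf.constant(3.0))

# Saver() covers all variables by default; pass var_list=[...] to save a subset
saver = tf.train.Saver()
ckpt = os.path.join(tempfile.mkdtemp(), "model.ckpt")

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    saver.save(sess, ckpt)

with tf.Session() as sess:
    saver.restore(sess, ckpt)  # restored variables need no initializer
    restored = sess.run(v)
print(restored)  # 3.0
```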