Question
I am working with autoencoders. My checkpoint contains the complete state of the network (i.e. the encoder, decoder, optimizer, etc.). I want to fool around with the encodings, so in evaluation mode I would only need the decoder part of the network.
How can I read only a few specific variables from the existing checkpoint, so that I can reuse their values in another model?
Answer 1:
There's a list_variables method in checkpoint_utils.py which lets you see all saved variables.
However, for your use case it may be easier to restore with a Saver. If you know the names the variables had when you saved the checkpoint, you can create a new saver and tell it to load those saved values into new Variable objects (possibly with different names). This is used in the CIFAR example to selectively restore a subset of variables. See "Choosing which Variables to Save and Restore" in the how-to.
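For example, here is a minimal sketch of such a selective restore, assuming TF1-style graph code; the checkpoint names 'decoder/w' and 'decoder/b' and the shapes below are hypothetical and should be replaced with names and shapes reported by list_variables:
import tensorflow as tf

# New variables that will receive the saved decoder weights.
# Their shapes must match what is stored in the checkpoint.
new_w = tf.get_variable('new_decoder_w', shape=[128, 784])
new_b = tf.get_variable('new_decoder_b', shape=[784])

# Map checkpoint variable names (keys) to the new Variable objects (values).
restore_saver = tf.train.Saver({'decoder/w': new_w, 'decoder/b': new_b})

with tf.Session() as sess:
    restore_saver.restore(sess, 'path/to/ckpt')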
Answer 2:
Another way, which prints all checkpoint tensors (or just one, if specified) along with their contents:
from tensorflow.python.tools import inspect_checkpoint as inch
inch.print_tensors_in_checkpoint_file('path/to/ckpt', '', True)
"""
Args:
  file_name: Name of the checkpoint file.
  tensor_name: Name of the tensor in the checkpoint file to print.
  all_tensors: Boolean indicating whether to print all tensors.
"""
Either way, it will print the content of the tensors, not just their names.
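To inspect a single tensor instead of all of them, pass its name and set all_tensors to False; the name 'decoder/w' below is hypothetical and should be one reported by list_variables:
inch.print_tensors_in_checkpoint_file('path/to/ckpt', 'decoder/w', False)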
And, while we are at it, here is how to use checkpoint_utils.py (suggested by the previous answer):
from tensorflow.contrib.framework.python.framework import checkpoint_utils
var_list = checkpoint_utils.list_variables('./')
for v in var_list:
    print(v)
Answer 3:
You can view the saved variables in a .ckpt file using:
import tensorflow as tf
variables_in_checkpoint = tf.train.list_variables('path.ckpt')
print("Variables found in checkpoint file",variables_in_checkpoint)
Source: https://stackoverflow.com/questions/38944238/how-do-i-list-certain-variables-in-the-checkpoint