How to get weights in tf.layers.dense?

Asked by 悲哀的现实 · 2020-12-29 07:25 · 7 answers · 693 views

I want to plot the weights of tf.layers.dense as a TensorBoard histogram, but they do not show up in the parameters. How can I do that?

7 Answers
  • 2020-12-29 07:36

    I came across this problem and just solved it. The name you pass to tf.layers.dense is not necessarily the same as the prefix of the kernel's variable name: my output tensor was "dense_2/xxx" but its kernel was "dense_1/kernel:0". To make sure tf.get_variable works, set name=xxx in the tf.layers.dense call so the two share the same prefix. It works as in the demo below:

    l=tf.layers.dense(input_tf_xxx,300,name='ip1')
    with tf.variable_scope('ip1', reuse=True):
        w = tf.get_variable('kernel')
    

    By the way, my tf version is 1.3.
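    Tying this back to the original TensorBoard question, here is a minimal runnable sketch of the same pattern under TF2's compat.v1 API (the input size of 4 is made up), extended with the histogram summary:

    ```python
    import tensorflow.compat.v1 as tf

    tf.disable_eager_execution()

    # Hypothetical placeholder; the layer name 'ip1' matches the snippet above.
    input_tf_xxx = tf.placeholder(tf.float32, [None, 4])
    l = tf.layers.dense(input_tf_xxx, 300, name='ip1')

    # Reuse the scope 'ip1' to fetch the kernel variable by name.
    with tf.variable_scope('ip1', reuse=True):
        w = tf.get_variable('kernel')

    # Attach a histogram summary so the kernel shows up in TensorBoard.
    tf.summary.histogram('ip1/kernel', w)
    print(w.shape.as_list())  # [4, 300]
    ```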

  • 2020-12-29 07:39

    In TF2, a layer's weights attribute returns a list of length 2:

    weights_out[0] = the kernel (weight matrix)

    weights_out[1] = the bias vector

    Below, the first Dense layer (model.layers[0] is the input layer, which has no weights) has 50 units and an input size of 784:

    inputs = keras.Input(shape=(784,), name="digits")
    x = layers.Dense(50, activation="relu", name="dense_1")(inputs)
    x = layers.Dense(50, activation="relu", name="dense_2")(x)
    outputs = layers.Dense(10, activation="softmax", name="predictions")(x)
    
    model = keras.Model(inputs=inputs, outputs=outputs)
    model.compile(...)
    model.fit(...)
    
    kernel_weight = model.layers[1].weights[0]
    bias_weight = model.layers[1].weights[1]
    all_weight = model.layers[1].weights
    print(len(all_weight))                      #  2
    print(kernel_weight.shape)                  # (784, 50)
    print(bias_weight.shape)                    # (50,)
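    To actually get those kernels into TensorBoard histograms (the original question), one option is the Keras TensorBoard callback with histogram_freq set. A sketch, assuming a model like the one above; the optimizer, loss, log directory, and dummy training data here are all made up:

    ```python
    import numpy as np
    from tensorflow import keras
    from tensorflow.keras import layers

    inputs = keras.Input(shape=(784,), name="digits")
    x = layers.Dense(50, activation="relu", name="dense_1")(inputs)
    outputs = layers.Dense(10, activation="softmax", name="predictions")(x)
    model = keras.Model(inputs=inputs, outputs=outputs)
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

    # histogram_freq=1 writes weight histograms for every layer each epoch.
    tb = keras.callbacks.TensorBoard(log_dir="./logs", histogram_freq=1)

    x_train = np.random.rand(64, 784).astype("float32")  # dummy data
    y_train = np.random.randint(0, 10, size=(64,))
    model.fit(x_train, y_train, epochs=1, callbacks=[tb], verbose=0)

    print(model.layers[1].weights[0].shape)  # kernel: (784, 50)
    ```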
    
  • 2020-12-29 07:40

    The weights are added as a variable named kernel, so you could use

    import os
    x = tf.layers.dense(...)
    weights = tf.get_default_graph().get_tensor_by_name(
      os.path.split(x.name)[0] + '/kernel:0')
    

    You can obviously replace tf.get_default_graph() with any other graph you are working in.
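    This name-splitting trick still works under TF2's compat.v1 API. A runnable sketch with made-up sizes:

    ```python
    import os
    import tensorflow.compat.v1 as tf

    tf.disable_eager_execution()

    inp = tf.placeholder(tf.float32, [None, 8])  # hypothetical input size
    x = tf.layers.dense(inp, 5)

    # x.name is something like "dense/BiasAdd:0"; its directory part is the
    # layer scope, so appending "/kernel:0" names the weight tensor.
    weights = tf.get_default_graph().get_tensor_by_name(
        os.path.split(x.name)[0] + '/kernel:0')
    print(weights.name)  # dense/kernel:0
    ```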

  • 2020-12-29 07:42

    Is there anything wrong with

    model.get_weights()
    

    After I create a model, compile it, and run fit, this function returns a list of NumPy arrays containing the weights for me.
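    Nothing wrong with it — get_weights returns a list of NumPy arrays, one per variable (kernel, then bias, for each layer). A minimal sketch with made-up sizes:

    ```python
    from tensorflow import keras
    from tensorflow.keras import layers

    # A single Dense layer mapping 4 inputs to 3 outputs.
    model = keras.Sequential([keras.Input(shape=(4,)), layers.Dense(3)])

    w = model.get_weights()  # list of NumPy arrays: [kernel, bias]
    print(len(w), w[0].shape, w[1].shape)  # 2 (4, 3) (3,)
    ```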

  • 2020-12-29 07:49

    I am going crazy with TensorFlow.

    I run this:

    sess.run(x.kernel)

    after training, and I get the weights. (Note that x here must be a tf.layers.Dense layer object; the tensor returned by the functional tf.layers.dense has no kernel attribute.)

    Comes from the properties described here.

    I am saying that I am going crazy because it seems there are a million slightly different ways to do something in tf, and that fragments the tutorials.
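    A runnable sketch of this approach under TF2's compat.v1 API (the sizes are made up), showing the Layer object whose kernel property is being read:

    ```python
    import tensorflow.compat.v1 as tf

    tf.disable_eager_execution()

    inp = tf.placeholder(tf.float32, [None, 4])  # hypothetical input size
    layer = tf.layers.Dense(3)  # the Layer *object*, not the functional form
    out = layer(inp)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        kernel_value = sess.run(layer.kernel)  # NumPy array of the weights

    print(kernel_value.shape)  # (4, 3)
    ```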

  • 2020-12-29 07:52

    The latest TensorFlow layers API creates all of its variables with tf.get_variable. This ensures that if you want to reuse a variable, you can simply call tf.get_variable with the name of the variable you wish to obtain.

    In the case of a tf.layers.dense, the variable is created as: layer_name/kernel. So, you can obtain the variable by saying:

    with tf.variable_scope("layer_name", reuse=True):
        # Do not specify the shape here, or it will confuse
        # TensorFlow into creating a new variable.
        weights = tf.get_variable("kernel")
    

    [Edit]: Newer versions of TensorFlow have both functional and object-oriented interfaces to the layers API. If you need the layers only for computation, the functional API is a good choice; those function names start with a lowercase letter, for instance tf.layers.dense(...). The layer objects start with a capital letter, e.g. tf.layers.Dense(...). Once you have a handle to such a layer object, you can use all of its functionality. To obtain the weights, just use obj.trainable_weights; this returns a list of all the trainable variables found in that layer's scope.
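    The object-oriented route in modern Keras looks like this (a sketch with made-up sizes; keras.layers.Dense is the successor of the tf.layers.Dense object):

    ```python
    from tensorflow import keras
    from tensorflow.keras import layers

    layer = layers.Dense(6)
    layer.build(input_shape=(None, 4))  # creates the kernel and bias

    print(len(layer.trainable_weights))  # 2: kernel and bias
    print([tuple(w.shape) for w in layer.trainable_weights])
    ```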
