Why do we name variables in TensorFlow?

佛祖请我去吃肉 2020-12-24 01:53

In some places I have seen variables created with a name, and sometimes without one. For example:

# With name
var = tf.Variable(0, name="counter")

# Without name
var = tf.Variable(0)
4 Answers
  •  暖寄归人
    2020-12-24 02:03

    The name parameter is optional (you can create variables and constants with or without it), and the Python variable you use in your program does not depend on it. Names are helpful in a couple of places:

    When you want to save or restore your variables (you can save them to a binary file after the computation). From the docs:

    By default, it uses the value of the Variable.name property for each variable

    import tensorflow as tf

    matrix_1 = tf.Variable([[1, 2], [2, 3]], name="v1")
    matrix_2 = tf.Variable([[3, 4], [5, 6]], name="v2")
    init = tf.global_variables_initializer()

    saver = tf.train.Saver()

    sess = tf.Session()
    sess.run(init)
    # The checkpoint stores the variables under their graph names "v1" and "v2".
    save_path = saver.save(sess, "/model.ckpt")
    sess.close()
    

    Even though your Python variables are called matrix_1 and matrix_2, they are saved as v1 and v2 in the checkpoint file.
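
    To use that checkpoint later, the Saver matches variables by these names, not by the Python identifiers. Here is a minimal restore sketch, assuming the checkpoint written to "/model.ckpt" by the snippet above (the placeholder initial values are arbitrary, since restore overwrites them):

    import tensorflow as tf

    # The Python identifiers can be anything; only the names "v1"/"v2" must match the checkpoint.
    m1 = tf.Variable([[0, 0], [0, 0]], name="v1")
    m2 = tf.Variable([[0, 0], [0, 0]], name="v2")
    print(m1.name)  # "v1:0" -- the graph name, not the Python identifier

    saver = tf.train.Saver()
    with tf.Session() as sess:
        saver.restore(sess, "/model.ckpt")  # looks up variables by name
        print(sess.run(m1))                 # the values saved above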

    Names are also used in TensorBoard, to nicely label the nodes of the graph. You can even group them by using the same name scope:

    import tensorflow as tf
    
    # Ops created inside the scope get names prefixed with "hidden/",
    # so TensorBoard groups them under one collapsible node.
    with tf.name_scope('hidden') as scope:
      a = tf.constant(5, name='alpha')
      W = tf.Variable(tf.random_uniform([1, 2], -1.0, 1.0), name='weights')
      b = tf.Variable(tf.zeros([1]), name='biases')
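
    To actually see this grouping, you can write the graph to a log directory and open it in TensorBoard. A minimal sketch continuing from the snippet above (the log directory /tmp/tf_logs is just an example path):

    print(a.name)  # "hidden/alpha:0"  -- the scope prefix becomes part of the name
    print(W.name)  # "hidden/weights:0"

    # Dump the graph definition so TensorBoard can render the grouped "hidden" node.
    writer = tf.summary.FileWriter('/tmp/tf_logs', tf.get_default_graph())
    writer.close()

    Then run tensorboard --logdir /tmp/tf_logs and open the Graphs tab: alpha, weights and biases appear collapsed under a single hidden node.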
    
