Getting the current learning rate from a tf.train.AdamOptimizer

Asked by 清酒与你, 2020-12-05 00:11

I'd like to print out the learning rate for each training step of my neural network.

I know that Adam has an adaptive learning rate, but is there a way I can see this (for visualization)?

5 Answers
  •  眼角桃花
     2020-12-05 00:47

    Sung Kim's suggestion worked for me; my exact steps were:

    import tensorflow as tf

    lr = 0.1
    step_rate = 1000
    decay = 0.95

    # Global step counter, advanced by running the increment op below.
    global_step = tf.Variable(0, trainable=False)
    increment_global_step = tf.assign(global_step, global_step + 1)

    # Multiply the base rate by `decay` every `step_rate` steps.
    learning_rate = tf.train.exponential_decay(lr, global_step, step_rate, decay, staircase=True)

    optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate, epsilon=0.01)
    trainer = optimizer.minimize(loss_function)

    # Some code here

    # `optimizer._lr` is the learning-rate tensor the optimizer was built with
    # (`trainer` is the op returned by minimize() and has no `_lr` attribute).
    print('Learning rate: %f' % (sess.run(optimizer._lr)))
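
    Since `learning_rate` is itself a tensor, you can also fetch it directly each step. A minimal training-loop sketch, assuming a session `sess`, a feed dict `feed`, and a step count `num_steps` (these names are placeholders, not from the original answer):

    for step in range(num_steps):
        # Run one optimization step, advance the global step counter, and
        # read the current (decayed) learning rate in a single sess.run call.
        _, _, current_lr = sess.run(
            [trainer, increment_global_step, learning_rate],
            feed_dict=feed)
        print('Step %d, learning rate: %f' % (step, current_lr))

    Note that the decay only takes effect if the global step actually advances, so either run `increment_global_step` as above or pass `global_step=global_step` to `minimize()`, which increments it automatically.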
    
