I'd like to print out the learning rate for each training step of my neural network.
I know that Adam has an adaptive learning rate, but is there a way I can see this (for visualization)?
Sung Kim's suggestion worked for me; my exact steps were:
import tensorflow as tf  # TensorFlow 1.x

lr = 0.1
step_rate = 1000
decay = 0.95

global_step = tf.Variable(0, trainable=False)
increment_global_step = tf.assign(global_step, global_step + 1)
# Multiply the learning rate by `decay` every `step_rate` steps.
learning_rate = tf.train.exponential_decay(lr, global_step, step_rate, decay, staircase=True)

optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate, epsilon=0.01)
trainer = optimizer.minimize(loss_function)  # loss_function defined elsewhere

# Some code here: run `trainer` and `increment_global_step` each training
# step so the decay schedule actually advances.

# `_lr` lives on the optimizer (not on the op returned by minimize), so
# evaluate it from `optimizer`; it holds the decayed learning-rate tensor.
print('Learning rate: %f' % (sess.run(optimizer._lr)))
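
For completeness, here is a minimal self-contained sketch of the same idea, assuming TensorFlow 1.x and a dummy quadratic loss standing in for your real loss_function. Passing global_step directly to minimize() increments it automatically, which replaces the separate increment_global_step op, and evaluating the learning_rate tensor avoids reaching into the private _lr attribute:

import tensorflow as tf  # TensorFlow 1.x

# Dummy model: minimize (w - 3)^2; stands in for your real loss_function.
w = tf.Variable(5.0)
loss_function = tf.square(w - 3.0)

global_step = tf.Variable(0, trainable=False)
learning_rate = tf.train.exponential_decay(
    0.1, global_step, decay_steps=1000, decay_rate=0.95, staircase=True)

optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate, epsilon=0.01)
# global_step is incremented automatically on each call to the train op.
trainer = optimizer.minimize(loss_function, global_step=global_step)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(5):
        # Evaluating learning_rate directly is equivalent to optimizer._lr.
        _, lr_val = sess.run([trainer, learning_rate])
        print('Step %d, learning rate: %f' % (step, lr_val))

Note that this prints the scheduled learning rate; Adam's effective per-parameter step size additionally depends on its internal moment estimates, which are not exposed as a single scalar.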