Getting the current learning rate from a tf.train.AdamOptimizer

清酒与你 2020-12-05 00:11

I'd like to print out the learning rate for each training step of my nn.

I know that Adam has an adaptive learning rate, but is there a way I can see this (for visualization)?

5 Answers
  •  时光说笑
    2020-12-05 00:48

    In TensorFlow 2:

    optimizer = tf.keras.optimizers.Adagrad(learning_rate=0.1)  # or any other optimizer
    print(optimizer.learning_rate.numpy())  # or print(optimizer.lr.numpy())
    

    Note: This gives you the base learning rate. Refer to this answer for more details on adaptive learning rates.
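
    If you also want the rate printed at every training step, as the question asks, a Keras callback along these lines should work. The LearningRateLogger class below is an illustrative sketch rather than part of the original answer, and like the snippet above it reports the base (or scheduled) learning rate, not Adam's per-parameter effective step size.

    import tensorflow as tf

    # Hypothetical callback (an assumption, not from the answer above): logs the
    # optimizer's current base learning rate after every training batch. If a
    # LearningRateSchedule is attached, it is evaluated at the current step count.
    class LearningRateLogger(tf.keras.callbacks.Callback):
        def on_train_batch_end(self, batch, logs=None):
            lr = self.model.optimizer.learning_rate
            if isinstance(lr, tf.keras.optimizers.schedules.LearningRateSchedule):
                lr = lr(self.model.optimizer.iterations)
            print(f"batch {batch}: learning rate = {float(tf.keras.backend.get_value(lr)):.6g}")

    # Usage (model, x_train, y_train are placeholders):
    # model.fit(x_train, y_train, callbacks=[LearningRateLogger()])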
