The learning rate decay function tf.train.exponential_decay takes a decay_steps parameter. To decrease the learning rate every num_epochs, you would set decay_steps = num_epochs * steps_per_epoch, where steps_per_epoch is the number of training batches in one epoch. In the learning_rate below,
    learning_rate = tf.train.exponential_decay(starter_learning_rate, global_step,
                                               100000, 0.96, staircase=True)
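For instance, to decay the rate once every num_epochs epochs, you could build the op as in the following sketch (num_epochs and steps_per_epoch are assumed names for illustration, not part of the TensorFlow API):

    import tensorflow as tf

    # Assumed values for illustration:
    num_epochs = 10          # decay once every 10 epochs
    steps_per_epoch = 1000   # training-set size // batch size

    global_step = tf.Variable(0, trainable=False)
    starter_learning_rate = 0.1

    # With staircase=True the rate drops by a factor of 0.96
    # each time global_step crosses a multiple of decay_steps.
    learning_rate = tf.train.exponential_decay(
        starter_learning_rate, global_step,
        decay_steps=num_epochs * steps_per_epoch,
        decay_rate=0.96, staircase=True)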
The starter_learning_rate can then be changed after a desired number of epochs by defining a function like:
    def initial_learning_rate(epoch):
        if 0 <= epoch < 100:
            return 0.1
        elif 100 <= epoch < 200:
            return 0.05
        elif 200 <= epoch < 500:
            return 0.001
        else:
            return 0.001  # keep the last value so epochs >= 500 do not return None
You can then set starter_learning_rate inside the for loop over epochs:
    for epoch in range(epochs):  # epochs is the total number of epochs
        starter_learning_rate = initial_learning_rate(epoch)
        ...
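Keep in mind that in graph-mode TensorFlow 1.x the decay op is built once, so reassigning the Python variable starter_learning_rate after graph construction does nothing by itself. One way to make the per-epoch value take effect is to feed it through a placeholder; a minimal sketch, assuming loss, epochs, and steps_per_epoch already exist (data feeds omitted):

    import tensorflow as tf

    global_step = tf.Variable(0, trainable=False)
    starter_lr = tf.placeholder(tf.float32, shape=[])  # fed on every step

    learning_rate = tf.train.exponential_decay(starter_lr, global_step,
                                               100000, 0.96, staircase=True)
    # minimize() increments global_step on every training step.
    train_op = tf.train.GradientDescentOptimizer(learning_rate).minimize(
        loss, global_step=global_step)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for epoch in range(epochs):
            lr = initial_learning_rate(epoch)
            for _ in range(steps_per_epoch):
                sess.run(train_op, feed_dict={starter_lr: lr})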
Note
The global_step variable is not changed by this op; exponential_decay only reads it when computing

    decayed_learning_rate = starter_learning_rate *
                            decay_rate ^ (global_step / decay_steps)
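So global_step has to be advanced elsewhere, typically by passing it to the optimizer's minimize() call as above. A small sketch that instead advances it by hand with tf.assign_add, to show the staircase decay (expected values in comments assume the 100000 / 0.96 settings):

    import tensorflow as tf

    global_step = tf.Variable(0, trainable=False)
    learning_rate = tf.train.exponential_decay(0.1, global_step,
                                               100000, 0.96, staircase=True)
    advance = tf.assign_add(global_step, 100000)  # stand-in for 100000 optimizer steps

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        print(sess.run(learning_rate))  # 0.1      at step 0
        sess.run(advance)
        print(sess.run(learning_rate))  # ~0.096   at step 100000 (one decay)
        sess.run(advance)
        print(sess.run(learning_rate))  # ~0.09216 at step 200000 (two decays)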