TensorFlow: How to set learning rate decay based on epochs?

Asked 2021-01-15 20:52 by 予麋鹿 · 3 answers · 1736 views

The learning rate decay function tf.train.exponential_decay takes a decay_steps parameter. To decrease the learning rate every num_epochs, how should decay_steps be set?
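
To express decay_steps in epochs, multiply the number of batches per epoch by the number of epochs you want between decays. A minimal sketch of that conversion, where num_train_examples, batch_size, and num_epochs_per_decay are hypothetical values you would substitute:

    import tensorflow as tf

    # Hypothetical sizes -- substitute your own dataset/training values.
    num_train_examples = 50000
    batch_size = 128
    num_epochs_per_decay = 10

    # One optimizer step per batch, so this many steps in each epoch:
    steps_per_epoch = num_train_examples // batch_size

    # Decay the rate once every num_epochs_per_decay epochs.
    decay_steps = steps_per_epoch * num_epochs_per_decay

    global_step = tf.Variable(0, trainable=False)
    learning_rate = tf.train.exponential_decay(0.1, global_step,
                                               decay_steps, 0.96, staircase=True)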

3 Answers
  •  难免孤独
    2021-01-15 21:09

    In the learning_rate schedule below,

    starter_learning_rate = 0.1
    global_step = tf.Variable(0, trainable=False)
    learning_rate = tf.train.exponential_decay(starter_learning_rate, global_step,
                                               100000, 0.96, staircase=True)
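
    With staircase=True, global_step / decay_steps is an integer division, so the rate drops in discrete jumps once every decay_steps steps instead of decaying continuously.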
    

    starter_learning_rate can then be changed at the desired epochs by defining a function such as:

    def initial_learning_rate(epoch):
        # Piecewise-constant starting rate, chosen by epoch range.
        if 0 <= epoch < 100:
            return 0.1
        elif 100 <= epoch < 200:
            return 0.05
        else:
            # 0.001 from epoch 200 onward; the original returned None
            # past epoch 500, which would break the training loop.
            return 0.001
    

    Then set starter_learning_rate inside the loop over epochs:

    for epoch in range(epochs):  # epochs is the total number of epochs
        starter_learning_rate = initial_learning_rate(epoch)
        ...
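
    Note that in TF1 graph mode the graph is built once, so rebinding the Python name starter_learning_rate after construction does not affect an already-built learning_rate tensor; one way to actually apply the per-epoch rate is to feed it through a placeholder. A minimal sketch, assuming a hypothetical dummy loss and the initial_learning_rate function above:

    import tensorflow as tf

    # Hypothetical model: one trainable scalar with a dummy quadratic loss.
    w = tf.Variable(1.0)
    loss = tf.square(w)

    # Feed the epoch-dependent rate from Python on every step.
    lr = tf.placeholder(tf.float32, shape=[])
    train_op = tf.train.GradientDescentOptimizer(lr).minimize(loss)

    steps_per_epoch = 100  # hypothetical; batches per epoch in practice

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for epoch in range(300):
            current_lr = initial_learning_rate(epoch)
            for _ in range(steps_per_epoch):
                sess.run(train_op, feed_dict={lr: current_lr})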
    

    Note

    This approach only changes starter_learning_rate; the global_step variable in the decay formula is left untouched:

    decayed_learning_rate = starter_learning_rate *
                            decay_rate ^ (global_step / decay_steps)
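
    For example, with starter_learning_rate = 0.1, decay_rate = 0.96, decay_steps = 100000, and staircase=True, the decayed rate at global_step = 200000 is 0.1 * 0.96**2 ≈ 0.0922.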
    
