Tensorflow Object Detection: use Adam instead of RMSProp

Submitted by 孤街浪徒 on 2019-12-23 18:58:05

Question


I'm training a CNN with this [.config file][1]:

optimizer {
  rms_prop_optimizer: {
    learning_rate: {
      exponential_decay_learning_rate {
        initial_learning_rate: 0.004
        decay_steps: 800720
        decay_factor: 0.95
      }
    }
    momentum_optimizer_value: 0.9
    decay: 0.9
    epsilon: 1.0
  }
}
As you can see, rms_prop is used as the optimizer. What if I wanted to use Adam instead? How am I supposed to edit this file?
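For reference, the exponential_decay_learning_rate block above follows the standard TensorFlow exponential-decay formula. Here is a small illustrative sketch of what it computes, with the values copied from the config (this is plain Python for illustration, not the actual TF implementation):

```python
def exponential_decay(step, initial_lr=0.004, decay_steps=800720,
                      decay_factor=0.95, staircase=False):
    """Learning rate after `step` steps under exponential decay."""
    exponent = step // decay_steps if staircase else step / decay_steps
    return initial_lr * decay_factor ** exponent

print(exponential_decay(0))       # 0.004
print(exponential_decay(800720))  # ~0.0038, one full decay period
```

Note that with decay_steps set to 800720 the rate barely moves for a typical training run, which is why the schedule choice matters when swapping optimizers.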


Answer 1:


If I'm right, you're trying to use the object_detection model with a pre-trained network offered by TensorFlow, am I right? If you know a little programming, you can take a look at models/research/object_detection/builders/optimizer_builder.py and see which optimizers can be used and with which parameters. If you just want an out-of-the-box solution, this is how I did it:

optimizer {
    # momentum_optimizer {
    adam_optimizer: {
      learning_rate: {
        manual_step_learning_rate {
          initial_learning_rate: .0002
          schedule {
            step: 4500
            learning_rate: .0001
          }
          schedule {
            step: 7000
            learning_rate: .00008
          }
          schedule {
            step: 10000
            learning_rate: .00004
          }
        }
      }
      # momentum_optimizer_value: 0.9
    }
    use_moving_average: false
  }

In my (limited) experience I noticed that using the same learning_rate as with the momentum_optimizer makes learning too fast and/or leads to NaN losses, so I usually decrease it by a factor of 10 or more. I'm trying it just now. :)
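The dispatch that optimizer_builder.py performs can be sketched roughly as follows. This is a simplified, hypothetical stand-in (the dict and lambdas are illustrations, not the real TF classes), but it shows the key idea: the optimizer is selected by which oneof field is set in the .config proto, so switching from RMSProp to Adam is just a matter of changing that field name and its nested parameters.

```python
def build_optimizer(optimizer_config):
    """Pick an optimizer constructor based on which oneof field is set."""
    builders = {
        "rms_prop_optimizer": lambda cfg: ("RMSProp", cfg["learning_rate"]),
        "momentum_optimizer": lambda cfg: ("Momentum", cfg["learning_rate"]),
        "adam_optimizer":     lambda cfg: ("Adam", cfg["learning_rate"]),
    }
    # A proto oneof guarantees exactly one of these fields is populated.
    (name, cfg), = optimizer_config.items()
    if name not in builders:
        raise ValueError(f"Optimizer {name} not supported.")
    return builders[name](cfg)

print(build_optimizer({"adam_optimizer": {"learning_rate": 0.0002}}))
```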



Source: https://stackoverflow.com/questions/51915803/tensorflow-object-detection-use-adam-instead-of-rmsprop
