What does use_locking=True do in TensorFlow optimizers?

Submitted by 限于喜欢 on 2020-01-09 10:50:30

Question


Does it only protect against asynchronous updates or does it also cause other access to the variable to wait for the update? I'm using the same model for training and inference at the same time and want to make sure that inference is always done on a consistent model.


Answer 1:


Passing use_locking=True when creating a TensorFlow optimizer, or a variable assignment op, causes a lock to be acquired around the relevant updates to the variable. Other optimizers/assignments on the same variable also created with use_locking=True will be serialized.
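For concreteness, here is a minimal TF 1.x-style sketch (the variable, loss, and names are illustrative, not taken from the question) of two writers to the same variable that are both created with use_locking=True and are therefore serialized against each other:

    import tensorflow as tf  # TF 1.x-style API (tf.compat.v1.* under TF 2.x)

    # A shared variable that is updated from more than one place.
    w = tf.Variable(tf.zeros([10]), name="w")
    loss = tf.reduce_sum(tf.square(w - 1.0))

    # Optimizer created with use_locking=True: its updates to w acquire w's lock.
    opt = tf.train.GradientDescentOptimizer(learning_rate=0.1, use_locking=True)
    train_op = opt.minimize(loss)

    # A manual assignment to the same variable, also created with
    # use_locking=True, is serialized against the optimizer's updates.
    reset_op = tf.assign(w, tf.zeros([10]), use_locking=True)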

However, there are two caveats that you should bear in mind when using this option:

  • Reads of the variable are not performed under the lock, so it is possible to observe intermediate states and partially applied updates. Serializing reads requires additional coordination, such as that provided by tf.train.SyncReplicasOptimizer.

  • Writes (optimizers/assignments) to the same variable created with use_locking=False are still possible and will not acquire the lock. The programmer is responsible for ensuring that such unlocked writes do not occur (see the sketch after this list).
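The following sketch (again TF 1.x-style, with illustrative names) shows both caveats: a read op that never takes the lock, and a writer created with use_locking=False that bypasses it:

    import tensorflow as tf  # TF 1.x-style API

    w = tf.Variable(tf.zeros([10]), name="w")

    # A locked writer: serialized against other use_locking=True writers on w.
    locked_update = tf.assign_add(w, tf.ones([10]), use_locking=True)

    # Caveat 2: this writer was created with use_locking=False, so it does not
    # acquire the lock and can interleave with locked_update.
    unlocked_update = tf.assign(w, tf.zeros([10]), use_locking=False)

    # Caveat 1: reads never take the lock, so an inference op built on w can
    # observe a partially applied update unless reads are coordinated
    # separately (e.g. via tf.train.SyncReplicasOptimizer).
    inference = tf.reduce_sum(w)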



Source: https://stackoverflow.com/questions/39715915/what-does-use-locking-true-do-in-tensorflow-optimizers
