PyTorch equivalent features in TensorFlow?


Question


I was recently reading some PyTorch code and came across the loss.backward() and optimizer.step() functions. Are there any equivalents of these in TensorFlow/Keras?


Answer 1:


The loss.backward() equivalent in TensorFlow is tf.GradientTape(). TensorFlow provides the tf.GradientTape API for automatic differentiation, i.e. computing the gradient of a computation with respect to its input variables. TensorFlow "records" all operations executed inside the context of a tf.GradientTape onto a "tape". It then uses that tape and the gradients associated with each recorded operation to compute the gradients of the "recorded" computation using reverse-mode differentiation.
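Here is a minimal sketch of what that looks like in eager TensorFlow 2.x; the variable x and the computation y = x ** 2 are purely illustrative:

    import tensorflow as tf

    x = tf.Variable(3.0)

    with tf.GradientTape() as tape:
        # Every op on a watched variable inside this context is recorded
        y = x ** 2

    # Replay the tape in reverse to get dy/dx = 2 * x = 6.0
    dy_dx = tape.gradient(y, x)
    print(dy_dx)  # tf.Tensor(6.0, shape=(), dtype=float32)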

The optimizer.step() equivalent in TensorFlow is minimize(), which minimizes the loss by updating the variable list. Calling minimize() takes care of both computing the gradients and applying them to the variables.
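As a sketch, here is minimize() fitting a single toy weight with tf.keras.optimizers.SGD; the loss, learning rate, and iteration count are arbitrary choices for illustration:

    import tensorflow as tf

    w = tf.Variable(1.0)
    optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

    # In TF2, minimize() expects a zero-argument callable returning the loss
    loss_fn = lambda: (w - 2.0) ** 2

    for _ in range(50):
        # One call both computes gradients and applies the update,
        # like loss.backward() followed by optimizer.step() in PyTorch
        optimizer.minimize(loss_fn, var_list=[w])

    print(w.numpy())  # approaches 2.0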

If you want to process the gradients before applying them, you can instead use the optimizer in three steps (see the sketch after this list):

  1. Compute the gradients with tf.GradientTape.
  2. Process the gradients as you wish.
  3. Apply the processed gradients with apply_gradients().
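A minimal sketch of those three steps, using gradient clipping as one possible processing step (the loss and clip norm below are illustrative assumptions):

    import tensorflow as tf

    w = tf.Variable(1.0)
    optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

    # 1. Compute the gradients with tf.GradientTape
    with tf.GradientTape() as tape:
        loss = (w - 2.0) ** 2

    grads = tape.gradient(loss, [w])

    # 2. Process the gradients as you wish, e.g. clip each by norm
    grads = [tf.clip_by_norm(g, 1.0) for g in grads]

    # 3. Apply the processed gradients
    optimizer.apply_gradients(zip(grads, [w]))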

Hope this answers your question. Happy Learning.



Source: https://stackoverflow.com/questions/61623722/pytorch-equivalent-features-in-tensorflow
