pytorch - connection between loss.backward() and optimizer.step()

Asked by 谎友^ · 2020-12-23 13:04

Where is an explicit connection between the optimizer and the loss?

How does the optimizer know where to get the gradients of the loss without an explicit call like optimizer.step(loss)?

5 Answers
  •  情书的邮戳 · 2020-12-23 13:31

    When you call loss.backward(), all it does is compute the gradient of loss w.r.t. every parameter in the computation graph of loss that has requires_grad=True, and accumulate the result into each parameter's .grad attribute.
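    For example, a minimal sketch of that first step (the tiny linear model here is just an illustration, not something from the question):

        import torch
        import torch.nn as nn

        model = nn.Linear(3, 1)          # its weight and bias have requires_grad=True
        x = torch.randn(4, 3)
        loss = model(x).sum()            # any scalar built from the parameters

        print(model.weight.grad)         # None: nothing has been computed yet
        loss.backward()                  # autograd walks the graph behind `loss`
        print(model.weight.grad.shape)   # torch.Size([1, 3]): gradient now stored in .grad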

    optimizer.step() then updates every parameter it was given, using that parameter's .grad attribute. The explicit connection is made when the optimizer is constructed: you pass it the very same parameter tensors (e.g. torch.optim.SGD(model.parameters(), lr=0.01)), so the optimizer never needs to see the loss at all.
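    Putting the two calls together in a single training step (the model, data and learning rate below are made-up placeholders); note that zero_grad() is needed because backward() accumulates into .grad rather than overwriting it:

        import torch
        import torch.nn as nn

        model = nn.Linear(10, 1)
        criterion = nn.MSELoss()
        # The optimizer receives references to the model's parameters here;
        # this is its only link to anything the loss will later touch.
        optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

        x = torch.randn(32, 10)
        y = torch.randn(32, 1)

        optimizer.zero_grad()            # clear gradients left over from the previous step
        loss = criterion(model(x), y)    # forward pass builds the autograd graph
        loss.backward()                  # fills p.grad for every trainable parameter
        optimizer.step()                 # updates each parameter in-place from its p.grad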
