pytorch - connection between loss.backward() and optimizer.step()

谎友^ · 2020-12-23 13:04

Where is an explicit connection between the optimizer and the loss?

How does the optimizer know where to get the gradients of the loss with respect to the parameters it updates?

5 Answers
  •  小蘑菇 (OP) · 2020-12-23 13:24

    Short answer:

    loss.backward() # computes the gradient of the loss with respect to every parameter that has requires_grad=True. Those parameters can be any tensors defined in your code, such as h2h or i2h.

    optimizer.step() # updates those parameters according to the optimizer's update rule (the optimizer we defined earlier in the code), so that the loss (error) is driven toward a minimum.
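    The link between the two calls is the parameter tensors themselves: the optimizer is constructed with references to the model's parameters, loss.backward() writes each gradient into the corresponding parameter's .grad attribute, and optimizer.step() then reads those .grad fields to update the parameters. Here is a minimal sketch of that flow; the model, loss function, and data (model, loss_fn, x, y) are placeholders for illustration, not code from the original question:

    import torch

    # Toy setup: one linear layer and plain SGD.
    model = torch.nn.Linear(4, 1)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # optimizer stores references to the parameters
    loss_fn = torch.nn.MSELoss()

    x = torch.randn(8, 4)
    y = torch.randn(8, 1)

    optimizer.zero_grad()   # clear gradients left over from any previous step
    loss = loss_fn(model(x), y)
    loss.backward()         # autograd fills param.grad for every parameter with requires_grad=True
    optimizer.step()        # reads each param.grad of the parameters it was given and applies the SGD update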
