Where is the explicit connection between the optimizer and the loss? How does the optimizer know where to get the gradients of the loss with respect to the parameters?
Short answer: there is no direct link between the optimizer and the loss. They communicate only through the parameters' `.grad` attributes.

loss.backward()
# computes the gradient of the loss with respect to every tensor created with requires_grad=True, and stores it in that tensor's .grad attribute. The parameters can be any tensors defined in your code, such as h2h or i2h.

optimizer.step()
# the optimizer (which was handed those same parameter tensors when it was constructed) reads each parameter's .grad and updates the parameter according to its update rule, moving the loss (error) toward a minimum.