Manually update momentum terms in PyTorch optimizers

Asked by 我在风中等你 on 2021-02-11 07:34

The Adam optimizer has several terms that are used to add "momentum" to the gradient descent algorithm, making the step size for each variable adaptive:
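For reference, and assuming the question refers to the standard update rule from the original Adam paper (Kingma & Ba, 2015), the moment estimates and parameter update are:

m_t = \beta_1 m_{t-1} + (1 - \beta_1) g_t
v_t = \beta_2 v_{t-1} + (1 - \beta_2) g_t^2
\hat{m}_t = m_t / (1 - \beta_1^t), \quad \hat{v}_t = v_t / (1 - \beta_2^t)
\theta_t = \theta_{t-1} - \alpha \, \hat{m}_t / (\sqrt{\hat{v}_t} + \epsilon)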

Specifically, …
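The rest of the question is cut off, but assuming it asks how to inspect or overwrite these momentum terms by hand, a minimal sketch is shown below: torch.optim.Adam keeps them per parameter in optimizer.state under the keys "exp_avg" (first moment) and "exp_avg_sq" (second moment), which are populated after the first call to step().

import torch

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Run one backward/step pass so the Adam state is populated.
loss = model(torch.randn(8, 4)).sum()
loss.backward()
optimizer.step()

# Each parameter's Adam state holds its moment estimates as tensors.
for p in model.parameters():
    state = optimizer.state[p]
    exp_avg = state["exp_avg"]        # first moment estimate m_t
    exp_avg_sq = state["exp_avg_sq"]  # second moment estimate v_t
    # Illustrative manual update: zero out the momentum for this parameter.
    with torch.no_grad():
        exp_avg.zero_()

Because these are ordinary tensors, any in-place modification (zero_, copy_, mul_, etc.) is picked up by the next optimizer.step(); whether resetting them is a good idea depends on the use case.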
