How to print the “actual” learning rate in Adadelta in pytorch
Question: In short, I can't plot a learning-rate/epoch curve when using the Adadelta optimizer in PyTorch, because optimizer.param_groups[0]['lr'] always returns the same value. In detail: Adadelta "can dynamically adapt over time using only first order information and has minimal computational overhead beyond vanilla stochastic gradient descent" [1]. In PyTorch, the source code of Adadelta is here: https://pytorch.org/docs/stable/_modules/torch/optim/adadelta.html#Adadelta Since it requires no manual tuning of the learning rate
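The reason optimizer.param_groups[0]['lr'] never changes is that in Adadelta the lr hyperparameter is only a constant scale factor; the actual per-parameter step size comes from two running averages (of squared gradients and of squared updates) kept in the optimizer state. The pure-Python sketch below re-implements the Adadelta update rule for a single scalar parameter to illustrate this; it is an illustrative sketch following the update formulas in the PyTorch docs, not PyTorch's own code:

```python
import math

def adadelta_step(grad, state, lr=1.0, rho=0.9, eps=1e-6):
    """One Adadelta update for a scalar parameter.

    Returns (parameter_update, effective_lr). 'state' holds the two
    running averages that drive the adaptation.
    """
    # Running average of squared gradients: E[g^2]
    state["square_avg"] = rho * state["square_avg"] + (1 - rho) * grad * grad
    std = math.sqrt(state["square_avg"] + eps)
    # The adaptive factor uses the *previous* E[dx^2]; 'lr' itself never
    # changes -- all adaptation lives in the state buffers.
    effective_lr = lr * math.sqrt(state["acc_delta"] + eps) / std
    delta = math.sqrt(state["acc_delta"] + eps) / std * grad
    # Running average of squared updates: E[dx^2]
    state["acc_delta"] = rho * state["acc_delta"] + (1 - rho) * delta * delta
    return lr * delta, effective_lr

state = {"square_avg": 0.0, "acc_delta": 0.0}
for step, grad in enumerate([0.5, 0.4, 0.3], 1):
    update, eff_lr = adadelta_step(grad, state)
    print(f"step {step}: lr=1.0 (constant), effective lr={eff_lr:.6f}")
```

Each step prints a different effective learning rate even though lr stays at 1.0. In actual PyTorch, the analogous buffers live in optimizer.state per parameter (the Adadelta implementation calls them square_avg and acc_delta), so the same ratio lr * sqrt(acc_delta + eps) / sqrt(square_avg + eps) can be computed per parameter to plot an "actual" learning-rate curve.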