Adding L1/L2 regularization in PyTorch?

Backend · unresolved · 5 answers · 2111 views
南旧 2020-12-24 01:20

Is there any way I can add simple L1/L2 regularization in PyTorch? We can probably compute the regularized loss by simply adding the data_loss with the regularization loss.

5 Answers
  •  无人及你
    2020-12-24 02:02

    For L2 regularization,

    l2_lambda = 0.01
    l2_reg = torch.tensor(0.)
    for param in model.parameters():
        # Use out-of-place addition so autograd tracks the accumulation;
        # note torch.norm(param) returns ||param||, not the squared norm
        l2_reg = l2_reg + torch.norm(param)
    loss = loss + l2_lambda * l2_reg
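The question also asks about L1. A minimal sketch of the same pattern for an L1 penalty, using a hypothetical stand-in model and data so the snippet is self-contained:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in model and data, just to make the sketch runnable
model = nn.Linear(10, 1)
inputs = torch.randn(4, 10)
targets = torch.randn(4, 1)
loss = nn.functional.mse_loss(model(inputs), targets)

l1_lambda = 0.001
# L1 penalty: sum of absolute values of all parameters
l1_reg = sum(param.abs().sum() for param in model.parameters())
loss = loss + l1_lambda * l1_reg
loss.backward()  # gradients now include the L1 term
```

In practice you would usually penalize only the weight matrices and skip the biases.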
    

    References:

    • https://discuss.pytorch.org/t/how-does-one-implement-weight-regularization-l1-or-l2-manually-without-optimum/7951
    • http://pytorch.org/docs/master/torch.html?highlight=norm#torch.norm
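For plain L2 regularization on all parameters, PyTorch optimizers already support it through the `weight_decay` argument, which adds `weight_decay * w` to each parameter's gradient at every step (equivalent to an L2 penalty for vanilla SGD), so no manual loop is needed. A small self-contained sketch, with a hypothetical model and data:

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 1)  # hypothetical stand-in model
# weight_decay applies an L2 penalty (lambda = 0.01) inside the optimizer
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, weight_decay=0.01)

x = torch.randn(8, 10)
y = torch.randn(8, 1)
before = model.weight.detach().clone()

loss = nn.functional.mse_loss(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()  # update includes the decay term
```

Note that `weight_decay` applies to every parameter in the group, biases included; use separate parameter groups if you want to exclude them.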
