PyTorch equivalence for softmax_cross_entropy_with_logits

时间秒杀一切 submitted on 2021-01-26 03:48:38

Question


I was wondering if there is an equivalent loss function in PyTorch for TensorFlow's softmax_cross_entropy_with_logits.


Answer 1:


equivalent loss function in PyTorch for TensorFlow's softmax_cross_entropy_with_logits

It is torch.nn.functional.cross_entropy

It takes logits as inputs (and performs log_softmax internally). Here, "logits" are raw, unnormalized scores: they are not probabilities and can lie outside the [0, 1] interval.

Logits are simply the values that will later be converted to probabilities. If you look at the name of the TensorFlow function, you will notice it is a pleonasm: the with_logits part already implies that softmax will be applied.
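To illustrate the terminology, a minimal sketch: logits are arbitrary real-valued scores, and softmax is what maps them onto a probability distribution.

```python
import torch
import torch.nn.functional as F

# Logits: arbitrary real numbers, not restricted to [0, 1].
logits = torch.tensor([2.0, -1.0, 0.5])

# softmax turns them into a probability distribution:
# all entries are non-negative and they sum to 1.
probs = F.softmax(logits, dim=0)
assert torch.all(probs >= 0)
assert torch.isclose(probs.sum(), torch.tensor(1.0))
```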

In PyTorch the implementation looks like this:

loss = F.cross_entropy(x, target)

Which is equivalent to :

lp = F.log_softmax(x, dim=-1)
loss = F.nll_loss(lp, target)
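The claimed equivalence is easy to verify numerically (a minimal sketch with random logits):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
x = torch.randn(4, 5)                  # logits: 4 samples, 5 classes
target = torch.tensor([0, 2, 4, 1])    # integer class labels

# Fused version: cross_entropy applies log_softmax internally.
loss_ce = F.cross_entropy(x, target)

# Manual two-step version.
lp = F.log_softmax(x, dim=-1)
loss_nll = F.nll_loss(lp, target)

assert torch.isclose(loss_ce, loss_nll)
```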

It is not F.binary_cross_entropy_with_logits, because that function assumes multi-label classification:

F.sigmoid + F.binary_cross_entropy = F.binary_cross_entropy_with_logits

It is not torch.nn.functional.nll_loss either, because that function takes log-probabilities (the output of log_softmax()), not logits.
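Both distinctions can be checked directly (a minimal sketch; the exact values depend on the random seed):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
x = torch.randn(4, 3)

# Multi-label case: sigmoid + BCE equals the fused BCE-with-logits.
y = torch.randint(0, 2, (4, 3)).float()
fused = F.binary_cross_entropy_with_logits(x, y)
manual = F.binary_cross_entropy(torch.sigmoid(x), y)
assert torch.isclose(fused, manual)

# nll_loss on raw logits is NOT cross-entropy: it expects log-probabilities.
t = torch.tensor([0, 1, 2, 0])
assert not torch.isclose(F.nll_loss(x, t), F.cross_entropy(x, t))
```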




Answer 2:


Following the pointers in several threads, I ended up with the following conversion. I will post my solution here in case anyone else stumbles upon this thread. It is modified from here, and behaves as expected in this context.

# pred is the prediction with shape [H*W, C] (one row of logits per pixel)
# gt is the target with shape [H*W]
# idx is the boolean mask on H*W

# TensorFlow version
loss = tf.nn.sparse_softmax_cross_entropy_with_logits(
          logits=tf.boolean_mask(pred, idx),
          labels=tf.boolean_mask(gt, idx))

# PyTorch version
logp = torch.nn.functional.log_softmax(pred[idx], dim=1)
logpy = torch.gather(logp, 1, gt[idx].view(-1, 1))  # log-prob of the true class
loss = -logpy.mean()
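As a sanity check (a self-contained sketch, assuming pred is laid out as [N, C] so boolean row-masking works), the masked log_softmax/gather recipe above matches F.cross_entropy applied directly to the masked rows:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
N, C = 6, 4
pred = torch.randn(N, C)                 # one logit row per "pixel"
gt = torch.randint(0, C, (N,))           # integer class labels
idx = torch.tensor([True, False, True, True, False, True])

# Masked negative log-likelihood, as in the conversion above.
logp = F.log_softmax(pred[idx], dim=1)
logpy = torch.gather(logp, 1, gt[idx].view(-1, 1))
loss = -logpy.mean()

# It should match cross_entropy applied to just the masked rows.
assert torch.isclose(loss, F.cross_entropy(pred[idx], gt[idx]))
```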



Answer 3:


@Blade Here's the solution I came up with!

import torch
import torch.nn as nn
import torch.nn.functional as F


class masked_softmax_cross_entropy_loss(nn.Module):
    r"""my version of masked tf.nn.softmax_cross_entropy_with_logits"""
    def __init__(self, weight=None):
        super(masked_softmax_cross_entropy_loss, self).__init__()
        self.register_buffer('weight', weight)

    def forward(self, input, target, mask):
        if not target.is_same_size(input):
            raise ValueError("Target size ({}) must be the same as input size ({})".format(target.size(), input.size()))

        input = F.softmax(input, dim=1)
        # per-sample cross-entropy against the (one-hot or soft) target rows
        loss = -torch.sum(target * torch.log(input), 1)
        loss = torch.unsqueeze(loss, 1)
        # renormalize the mask so masking does not change the loss scale
        # (avoid the in-place /=, which would modify the caller's tensor)
        mask = mask / torch.mean(mask)
        mask = torch.unsqueeze(mask, 1)
        loss = torch.mul(loss, mask)
        return torch.mean(loss)
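Assuming one-hot target rows and a float mask vector, the forward pass above can be checked against plain cross-entropy; the sketch below re-implements it as a standalone function (masked_softmax_xent is a hypothetical helper for illustration, not part of the answer's class):

```python
import torch
import torch.nn.functional as F

def masked_softmax_xent(input, target, mask):
    # Same arithmetic as the module's forward pass above.
    probs = F.softmax(input, dim=1)
    loss = -torch.sum(target * torch.log(probs), dim=1, keepdim=True)
    mask = mask / torch.mean(mask)       # renormalize to preserve the loss scale
    return torch.mean(loss * mask.unsqueeze(1))

torch.manual_seed(0)
logits = torch.randn(5, 3)
ys = torch.tensor([0, 2, 1, 1, 0])
onehot = torch.eye(3)[ys]                # one-hot targets, same size as logits

# With an all-ones mask, the masked loss reduces to plain cross-entropy.
full_mask = torch.ones(5)
assert torch.isclose(masked_softmax_xent(logits, onehot, full_mask),
                     F.cross_entropy(logits, ys))
```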

Btw: I needed this loss function at the time (Sept 2017) because I was attempting to translate Thomas Kipf's GCN (see https://arxiv.org/abs/1609.02907) code from TensorFlow to PyTorch. However, I now notice that Kipf has done this himself (see https://github.com/tkipf/pygcn), and in his code, he simply uses the built-in PyTorch loss function, the negative log likelihood loss, i.e.

loss_train = F.nll_loss(output[idx_train], labels[idx_train])

Hope this helps.

~DV




Answer 4:


A solution

from thexp.calculate.tensor import onehot
from torch.nn import functional as F
import torch

logits = torch.rand([3, 10])
ys = torch.tensor([1, 2, 3])
targets = onehot(ys, 10)
# compare with a floating-point tolerance instead of exact equality
assert torch.isclose(F.cross_entropy(logits, ys),
                     -torch.mean(torch.sum(F.log_softmax(logits, dim=1) * targets, dim=1)))

onehot function:

def onehot(labels: torch.Tensor, label_num):
    return torch.zeros(labels.shape[0], label_num, device=labels.device).scatter_(1, labels.view(-1, 1), 1)


Source: https://stackoverflow.com/questions/46218566/pytorch-equivalence-for-softmax-cross-entropy-with-logits
