Cross Entropy in PyTorch

时光说笑 2020-12-13 02:47

I'm a bit confused by the cross entropy loss in PyTorch.

Considering this example:

import torch
import torch.nn as nn

# example reconstructed from the answers below:
# raw scores [0, 0, 0, 1], true class at index 3
output = torch.tensor([[0.0, 0.0, 0.0, 1.0]])
target = torch.tensor([3])

criterion = nn.CrossEntropyLoss()
print(criterion(output, target))  # tensor(0.7437)
3 Answers
  • 2020-12-13 03:05

    In your example you are treating the output [0, 0, 0, 1] as probabilities, as required by the mathematical definition of cross entropy. But PyTorch treats them as raw scores (logits) that don't need to sum to 1; they are first converted into probabilities, using the softmax function.

    So H(p, q) becomes:

    H(p, softmax(output))
    

    Translating the output [0, 0, 0, 1] into probabilities:

    softmax([0, 0, 0, 1]) = [0.1749, 0.1749, 0.1749, 0.4754]
    

    whence:

    -log(0.4754) = 0.7437
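    The two steps above can be checked directly in PyTorch (a minimal sketch, using the same values as this answer):

    ```python
    import torch

    # the raw output scores from the question
    output = torch.tensor([0.0, 0.0, 0.0, 1.0])

    # normalize into probabilities that sum to 1
    probs = torch.softmax(output, dim=0)
    print(probs)  # tensor([0.1749, 0.1749, 0.1749, 0.4754])

    # cross entropy picks out the probability of the true class (index 3)
    print(-torch.log(probs[3]))  # tensor(0.7437)
    ```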
    
  • 2020-12-13 03:06

    Your understanding is correct, but PyTorch doesn't compute cross entropy in that way. PyTorch uses the following formula:

    loss(x, class) = -log(exp(x[class]) / (\sum_j exp(x[j])))
                   = -x[class] + log(\sum_j exp(x[j]))
    

    Since in your scenario x = [0, 0, 0, 1] and class = 3, evaluating the above expression gives:

    loss(x, class) = -1 + log(exp(0) + exp(0) + exp(0) + exp(1))
                   = 0.7437
    

    PyTorch uses the natural logarithm.
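    The formula can also be verified with plain Python (a quick sketch using only the standard library; x and the class index are taken from this answer):

    ```python
    import math

    x = [0.0, 0.0, 0.0, 1.0]
    cls = 3  # "class" is a reserved word in Python

    # loss(x, class) = -x[class] + log(sum_j exp(x[j]))
    loss = -x[cls] + math.log(sum(math.exp(xj) for xj in x))
    print(round(loss, 4))  # 0.7437
    ```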

  • 2020-12-13 03:13

    I would like to add an important note, as this often leads to confusion.

    Softmax is not a loss function, nor is it really an activation function. It has a very specific task: It is used for multi-class classification to normalize the scores for the given classes. By doing so we get probabilities for each class that sum up to 1.

    Softmax is combined with Cross-Entropy-Loss to calculate the loss of a model.

    Unfortunately, because this combination is so common, it is often abbreviated. Some use the term Softmax-Loss, whereas PyTorch simply calls it CrossEntropyLoss.
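    The combination can be made explicit in PyTorch: nn.CrossEntropyLoss is equivalent to applying log_softmax followed by nn.NLLLoss (a small sketch, reusing the numbers from the question):

    ```python
    import torch
    import torch.nn as nn

    logits = torch.tensor([[0.0, 0.0, 0.0, 1.0]])
    target = torch.tensor([3])

    # combined: softmax normalization + cross entropy in one op
    ce = nn.CrossEntropyLoss()(logits, target)

    # explicit two-step version: log-softmax, then negative log likelihood
    nll = nn.NLLLoss()(torch.log_softmax(logits, dim=1), target)

    print(ce, nll)  # both are ≈ 0.7437
    ```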
