Should I use softmax as output when using cross entropy loss in pytorch?

Submitted on 2020-07-18 04:23:59

Question


I have a problem classifying the MNIST dataset in PyTorch with a fully connected deep neural net that has 2 hidden layers.

I want to use tanh as the activation in both hidden layers, and I believe I should use softmax at the output.

For the loss, I am choosing nn.CrossEntropyLoss() in PyTorch, which (as I have found out) does not accept one-hot encoded labels as targets, but instead expects a LongTensor of class indices.
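For illustration, here is a minimal sketch of the target format nn.CrossEntropyLoss() expects (the tensor names are hypothetical):

import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()
logits = torch.randn(3, 10)        # raw scores for 3 samples, 10 classes
targets = torch.tensor([4, 0, 9])  # LongTensor of class indices, not one-hot
loss = criterion(logits, targets)

# one-hot labels can be converted to indices with argmax
one_hot = torch.eye(10)[targets]
indices = one_hot.argmax(dim=1)    # identical to targets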

My model is built with nn.Sequential(), and when I use softmax at the end, it gives worse accuracy on the test data. Why?

import torch
import torch.nn as nn

inputs, n_hidden0, n_hidden1, out = 784, 128, 64, 10
n_epochs = 500
model = nn.Sequential(nn.Linear(inputs, n_hidden0, bias=True),
                      nn.Tanh(),
                      nn.Linear(n_hidden0, n_hidden1, bias=True),
                      nn.Tanh(),
                      nn.Linear(n_hidden1, out, bias=True),
                      nn.Softmax(dim=1)  # SHOULD THIS BE THERE?
                      )

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.5)

# X_train: (N, 784) float tensor of flattened images,
# Y_train: (N,) LongTensor of class indices
for epoch in range(n_epochs):
    y_pred = model(X_train)
    loss = criterion(y_pred, Y_train)
    print('epoch: ', epoch + 1, ' loss: ', loss.item())
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

Thanks for any help :)


Answer 1:


As stated in the torch.nn.CrossEntropyLoss() doc:

This criterion combines nn.LogSoftmax() and nn.NLLLoss() in one single class.

Therefore, you should not apply softmax before the loss: pass the raw logits from the last linear layer directly to nn.CrossEntropyLoss(). If you add a Softmax layer, the loss effectively computes a log-softmax of already-normalized probabilities, which compresses the outputs, weakens the gradients, and slows learning, which is why you see worse test accuracy.
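A minimal sketch of the corrected setup, assuming the same architecture (the input tensor here is illustrative): the model outputs raw logits, and softmax is applied only at inference time when probabilities are needed.

import torch
import torch.nn as nn

inputs, n_hidden0, n_hidden1, out = 784, 128, 64, 10

# Last layer outputs raw logits; nn.CrossEntropyLoss() applies
# log-softmax internally during training, so no Softmax layer here.
model = nn.Sequential(nn.Linear(inputs, n_hidden0, bias=True),
                      nn.Tanh(),
                      nn.Linear(n_hidden0, n_hidden1, bias=True),
                      nn.Tanh(),
                      nn.Linear(n_hidden1, out, bias=True))

# At inference, apply softmax explicitly if probabilities are needed;
# for class predictions alone, argmax over the logits suffices.
x = torch.randn(1, inputs)  # hypothetical input batch
probs = torch.softmax(model(x), dim=1)
pred = probs.argmax(dim=1)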



Source: https://stackoverflow.com/questions/55675345/should-i-use-softmax-as-output-when-using-cross-entropy-loss-in-pytorch
