PyTorch equivalence for softmax_cross_entropy_with_logits
Question: I was wondering if there is an equivalent loss function in PyTorch for TensorFlow's softmax_cross_entropy_with_logits.

Answer 1: The PyTorch equivalent of TensorFlow's softmax_cross_entropy_with_logits is torch.nn.functional.cross_entropy. It takes logits as input and applies log_softmax internally. Here, logits are simply raw scores that are not probabilities and can lie outside the [0, 1] interval; they are the values that will subsequently be converted to probabilities. If you consider the …
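Below is a minimal sketch of both call styles; the tensor shapes and values are made up for illustration. Passing class indices is the usual PyTorch usage, while passing per-row probability distributions (supported in PyTorch 1.10 and later) is closer to the TensorFlow function, which takes soft labels.

```python
import torch
import torch.nn.functional as F

# Raw, unnormalized scores (logits) for 2 samples over 3 classes.
logits = torch.tensor([[2.0, 0.5, -1.0],
                       [0.1, 1.5,  0.3]])

# Hard labels (class indices): the most common PyTorch usage.
hard_targets = torch.tensor([0, 1])
loss_hard = F.cross_entropy(logits, hard_targets)   # log_softmax + NLL internally

# Soft labels (a probability distribution per row), closer to
# tf.nn.softmax_cross_entropy_with_logits; requires PyTorch >= 1.10.
soft_targets = torch.tensor([[0.9, 0.1, 0.0],
                             [0.0, 0.8, 0.2]])
loss_soft = F.cross_entropy(logits, soft_targets)

print(loss_hard.item(), loss_soft.item())
```

In both cases the logits are passed in unchanged; do not apply softmax yourself before calling cross_entropy, or the softmax would effectively be applied twice.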