tf.nn.softmax_cross_entropy_with_logits and sequentially applying softmax and cross_entropy give different gradients

Front-end · Unresolved · 0 answers · 1689 views
长发绾君心 2020-12-17 06:26

I can't fully understand what tf.nn.softmax_cross_entropy_with_logits does under the hood. I have two versions of computing the cross-entropy loss after applying softmax: one uses the fused tf.nn.softmax_cross_entropy_with_logits directly on the logits, and the other applies softmax first and then computes the cross entropy, and they give different gradients.
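To make the comparison concrete, here is a minimal sketch (assuming TensorFlow 2.x in eager mode, with made-up toy logits and one-hot labels) of the two versions being compared: the fused op versus an explicit softmax followed by a hand-written cross entropy. In exact arithmetic both gradients equal softmax(logits) − labels, so any observed difference typically comes from the floating-point behaviour of the explicit version.

```python
import tensorflow as tf

# Hypothetical toy example: one sample, three classes (values are assumptions).
logits = tf.constant([[2.0, 1.0, 0.1]])
labels = tf.constant([[1.0, 0.0, 0.0]])  # one-hot target

# Version 1: fused op, numerically stable, gradient is softmax(logits) - labels.
with tf.GradientTape() as tape:
    tape.watch(logits)
    loss_fused = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)
grad_fused = tape.gradient(loss_fused, logits)

# Version 2: explicit softmax, then cross entropy computed by hand.
with tf.GradientTape() as tape:
    tape.watch(logits)
    probs = tf.nn.softmax(logits)
    loss_manual = -tf.reduce_sum(labels * tf.math.log(probs), axis=-1)
grad_manual = tape.gradient(loss_manual, logits)

print(grad_fused.numpy())   # analytically: softmax(logits) - labels
print(grad_manual.numpy())  # same in exact arithmetic; may differ numerically
```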
