How can I implement a weighted cross entropy loss in tensorflow using sparse_softmax_cross_entropy_with_logits

广开言路 · 2020-12-08 05:32

I am starting to use TensorFlow (coming from Caffe), and I am using the loss sparse_softmax_cross_entropy_with_logits. The function accepts integer labels like 0, 1, ..., C-1 rather than one-hot encodings. Is there a way to weight the loss depending on the class label?

3 Answers
  •  生来不讨喜 · 2020-12-08 06:14

    The class weights multiply the per-example losses rather than the logits, so the same approach works for sparse_softmax_cross_entropy_with_logits. Refer to this solution for "Loss function for class imbalanced binary classifier in Tensor flow"; a sketch of the idea follows below.
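    As a minimal sketch (TF 1.x style; num_classes, class_weights, logits, and labels below are hypothetical placeholders, not from the question):

        import tensorflow as tf

        num_classes = 3
        class_weights = tf.constant([1.0, 2.0, 0.5])  # hypothetical per-class weights
        logits = tf.placeholder(tf.float32, [None, num_classes])
        labels = tf.placeholder(tf.int32, [None])     # integer class ids in [0, num_classes)

        # Unweighted per-example loss, shape [batch_size].
        per_example_loss = tf.nn.sparse_softmax_cross_entropy_with_logits(
            labels=labels, logits=logits)

        # Pick each example's class weight by its label and scale its loss.
        weights = tf.gather(class_weights, labels)
        loss = tf.reduce_mean(weights * per_example_loss)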

    As a side note, you can pass weights directly into sparse_softmax_cross_entropy:

    tf.contrib.losses.sparse_softmax_cross_entropy(logits, labels, weight=1.0, scope=None)
    

    This method computes the cross-entropy loss internally via tf.nn.sparse_softmax_cross_entropy_with_logits.
    

    Weight acts as a coefficient for the loss: if a scalar is provided, the loss is simply scaled by that value; if weight is a tensor of shape [batch_size], each sample's loss is scaled by the corresponding element.
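    A minimal usage sketch under the same assumptions (TF 1.x with tf.contrib available; the tensors are hypothetical): build a [batch_size] weight tensor from the labels and pass it as weight.

        import tensorflow as tf

        class_weights = tf.constant([1.0, 2.0, 0.5])  # hypothetical per-class weights
        logits = tf.placeholder(tf.float32, [None, 3])
        labels = tf.placeholder(tf.int32, [None])

        # One weight per sample, looked up from its true class.
        per_sample_weights = tf.gather(class_weights, labels)
        loss = tf.contrib.losses.sparse_softmax_cross_entropy(
            logits, labels, weight=per_sample_weights)

    Note that in later TF 1.x releases a similar loss is available in core as tf.losses.sparse_softmax_cross_entropy, which takes labels first and names the argument weights.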
