How do I create a confusion matrix of predicted and ground-truth labels with TensorFlow?

天命终不由人 2020-12-15 01:58

I have implemented a neural network model for classification using TensorFlow, but I don't know how to draw a confusion matrix from the predicted and ground-truth labels.

3 Answers
  •  夕颜
     2020-12-15 02:24

    If you want to produce a confusion matrix, and then later precision and recall, you first need to get your counts of true positives, true negatives, false positives and false negatives. Here is how:

    For better readability, I wrote the code in a very verbose style.

    import tensorflow as tf  # TensorFlow 1.x graph-mode API (a session is used below)

    def evaluation(logits, labels):
        """Returns the four counts needed for precision, recall and F1 score."""

        # Step 1:
        # Create two boolean vectors that describe our labels
        is_label_one = tf.cast(labels, dtype=tf.bool)
        is_label_zero = tf.logical_not(is_label_one)
        # Imagine that labels = [0, 1]
        # Then
        # is_label_one  = [False, True]
        # is_label_zero = [True, False]

        # Step 2:
        # Get the correct and false prediction vectors.
        # correct_prediction is whatever correctness check you already use in your model.
        correct_prediction = tf.nn.in_top_k(logits, labels, 1, name="correct_answers")
        false_prediction = tf.logical_not(correct_prediction)

        # Step 3:
        # Get the four counts by combining the boolean vectors
        # TRUE POSITIVES: predicted correctly and the label is 1
        true_positives = tf.reduce_sum(tf.cast(tf.logical_and(correct_prediction, is_label_one), tf.int32))

        # FALSE POSITIVES: predicted incorrectly and the label is 0 (the prediction was 1)
        false_positives = tf.reduce_sum(tf.cast(tf.logical_and(false_prediction, is_label_zero), tf.int32))

        # TRUE NEGATIVES: predicted correctly and the label is 0
        true_negatives = tf.reduce_sum(tf.cast(tf.logical_and(correct_prediction, is_label_zero), tf.int32))

        # FALSE NEGATIVES: predicted incorrectly and the label is 1 (the prediction was 0)
        false_negatives = tf.reduce_sum(tf.cast(tf.logical_and(false_prediction, is_label_one), tf.int32))

        return true_positives, false_positives, true_negatives, false_negatives
    
    # Now you can do something like this in your session:
    
    true_positives, \
    false_positives, \
    true_negatives, \
    false_negatives = sess.run(evaluation(logits,labels), feed_dict=feed_dict)
    
    # you can print the confusion matrix using the 4 values from above, or get precision and recall:
    precision = float(true_positives) / float(true_positives+false_positives)
    recall = float(true_positives) / float(true_positives+false_negatives)
    
    # or F1 score:
    F1_score = 2 * ( precision * recall ) / ( precision+recall )
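    
    To actually lay these four counts out as a 2x2 confusion matrix, here is a minimal sketch (assuming the four values have already been fetched with sess.run as above, and that class 1 is treated as the "positive" class; the row/column layout is just one common convention). TensorFlow also provides tf.confusion_matrix(labels, predictions) (tf.math.confusion_matrix in TF 2.x) if you prefer to build the matrix directly from the label and prediction vectors.
    
    import numpy as np
    
    # Rows are the ground-truth class, columns the predicted class:
    # [[TN, FP],
    #  [FN, TP]]
    confusion_matrix = np.array([[true_negatives, false_positives],
                                 [false_negatives, true_positives]])
    print(confusion_matrix)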
    
