Specificity in scikit-learn

遇见更好的自我 2021-02-04 05:19

I need specificity for my classification, which is defined as: TN / (TN + FP)

I am writing a custom scorer function:

from sklearn.
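
A minimal sketch of such a scorer, assuming binary 0/1 labels and building on scikit-learn's confusion_matrix and make_scorer (the names specificity_score and specificity_scorer are just illustrative), might look like this:

    from sklearn.metrics import confusion_matrix, make_scorer

    def specificity_score(y_true, y_pred):
        # Specificity = TN / (TN + FP), i.e. recall of the negative (0) class.
        tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
        return tn / (tn + fp)

    # Wrap it so it can be passed via scoring= to cross_val_score / GridSearchCV.
    specificity_scorer = make_scorer(specificity_score)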


        
4 Answers
    暗喜 2021-02-04 05:23

    I appreciate this is an old question, but I thought I would mention that sklearn pretty much does this already (at least as of scikit-learn v0.21.2, but I'm confident it's been around for ages).

    As I understand it, 'specificity' is just a special case of 'recall'. Recall is calculated for the actual positive class (TP / [TP + FN]), whereas 'specificity' is the same type of calculation but for the actual negative class (TN / [TN + FP]).

    It really only makes sense to have such specific terminology for binary classification problems. For a multi-class classification problem it would be more convenient to talk about recall with respect to each class. There is no reason why you can't talk about recall in this way even when dealing with a binary classification problem (e.g. recall for class 0, recall for class 1).

    For example, recall tells us the proportion of patients who actually have cancer that are successfully diagnosed as having cancer. To generalize, you could say Class X recall tells us the proportion of samples actually belonging to Class X that are successfully predicted as belonging to Class X.

    Given this, you can use from sklearn.metrics import classification_report to produce a per-class report of the precision, recall, f1-score and support for each label/class (or a dictionary of the same, with output_dict=True). You can also rely on from sklearn.metrics import precision_recall_fscore_support, depending on your preference. See the scikit-learn metrics documentation.
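
    For instance, here is a small sketch (the labels below are made up) showing that the recall reported for class 0 is exactly the specificity:

        from sklearn.metrics import classification_report, recall_score

        y_true = [0, 0, 0, 1, 1, 1, 1]
        y_pred = [0, 1, 0, 1, 1, 0, 1]

        # Per-class precision/recall/f1/support; the recall row for label 0
        # is TN / (TN + FP), i.e. the specificity.
        print(classification_report(y_true, y_pred))

        # Or ask for the negative-class recall directly:
        print(recall_score(y_true, y_pred, pos_label=0))  # 2 / 3 here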
