Calculate residual deviance from scikit-learn logistic regression model

会有一股神秘感 · Submitted on 2019-12-10 18:03:52

Question


Is there any way to calculate the residual deviance of a scikit-learn logistic regression model? This is a standard output of R model summaries, but I couldn't find it anywhere in sklearn's documentation.


Answer 1:


Actually, you can. Deviance is closely related to cross entropy, which is available as sklearn.metrics.log_loss. Deviance is just 2*(loglikelihood_of_saturated_model - loglikelihood_of_fitted_model). Scikit-learn can (without larger tweaks) only handle classification of individual instances, so the log-likelihood of the saturated model is zero. The cross entropy returned by log_loss is the negative log-likelihood, so the deviance is simply

from sklearn import metrics

def deviance(X, y, model):
    # log_loss expects probabilities (not log-probabilities), and
    # normalize=False sums over samples rather than averaging,
    # which matches R's residual deviance.
    return 2 * metrics.log_loss(y, model.predict_proba(X), normalize=False)
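As a quick sanity check, here is a hypothetical usage sketch; the synthetic data and the large C (which roughly disables sklearn's default L2 regularization so the fit behaves more like R's glm) are my assumptions, not part of the original answer:

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Illustrative data; C=1e6 roughly turns off the default penalty.
X, y = make_classification(n_samples=200, n_features=5, random_state=0)
model = LogisticRegression(C=1e6).fit(X, y)
print(deviance(X, y, model))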

I know this is a very late answer, but I hope it helps anyway.




Answer 2:


You cannot do it in scikit-learn, but check out statsmodels' GLMResults (API).
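For reference, a minimal sketch of the statsmodels route, assuming a binary response (the synthetic data is illustrative); GLMResults exposes the residual deviance directly:

import statsmodels.api as sm
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=200, n_features=5, random_state=0)
X_const = sm.add_constant(X)  # statsmodels does not add an intercept by itself
result = sm.GLM(y, X_const, family=sm.families.Binomial()).fit()
print(result.deviance)       # residual deviance, as in R's summary(glm(...))
print(result.null_deviance)  # null deviance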



Source: https://stackoverflow.com/questions/50975774/calculate-residual-deviance-from-scikit-learn-logistic-regression-model
