Feature Importance with XGBClassifier

隐瞒了意图╮ 2020-12-14 10:03

Hopefully I'm reading this wrong, but in the XGBoost library documentation there is a note about extracting the feature importance attributes using feature_importances_.
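
For reference, the documented usage I have in mind looks roughly like this (the data below is just a placeholder):

    # Minimal sketch: feature_importances_ on a fitted XGBClassifier
    from sklearn.datasets import make_classification
    from xgboost import XGBClassifier

    X, y = make_classification(n_samples=100, n_features=5, random_state=0)

    model = XGBClassifier(n_estimators=10)
    model.fit(X, y)

    # One importance score per input feature
    print(model.feature_importances_)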

9 Answers
  •  自闭症患者
    2020-12-14 11:07

    Alternatives to the built-in feature importance can be:

    • permutation-based importance from scikit-learn (the permutation_importance method), as in the sketch below
    • importance based on Shapley values (the shap package)
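
    A minimal sketch of the permutation approach, assuming a fitted XGBClassifier (the dataset and split below are placeholders):

        # Permutation importance: shuffle each feature on held-out data
        # and measure how much the model's score drops.
        from sklearn.datasets import make_classification
        from sklearn.inspection import permutation_importance
        from sklearn.model_selection import train_test_split
        from xgboost import XGBClassifier

        X, y = make_classification(n_samples=500, n_features=8, random_state=0)
        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

        model = XGBClassifier(n_estimators=50).fit(X_train, y_train)

        result = permutation_importance(model, X_test, y_test,
                                        n_repeats=10, random_state=0)
        for i, mean in enumerate(result.importances_mean):
            print(f"feature {i}: {mean:.4f}")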

    I really like the shap package because it provides additional plots, for example:

    • Importance Plot
    • Summary Plot
    • Dependence Plot
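
    A short sketch of how these plots can be produced with shap (the model and data below are placeholders):

        # SHAP values for a tree model, then the three plot types above
        import shap
        from sklearn.datasets import make_classification
        from xgboost import XGBClassifier

        X, y = make_classification(n_samples=200, n_features=6, random_state=0)
        model = XGBClassifier(n_estimators=50).fit(X, y)

        explainer = shap.TreeExplainer(model)
        shap_values = explainer.shap_values(X)

        shap.summary_plot(shap_values, X, plot_type="bar")  # importance plot
        shap.summary_plot(shap_values, X)                   # summary plot
        shap.dependence_plot(0, shap_values, X)             # dependence plot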

    You can read about alternative ways to compute feature importance in XGBoost in this blog post of mine.
