How to get feature importance in xgboost?

情深已故 2020-12-13 07:00

I'm using xgboost to build a model and trying to find the importance of each feature using get_fscore(), but it returns {}.

Here is my training code:

11 Answers
  • 既然无缘 2020-12-13 07:35

    For anyone who comes across this issue while using xgb.XGBRegressor(): the workaround I'm using is to keep the data in a pandas.DataFrame() or numpy.array() rather than converting it to a DMatrix(). Also, I had to make sure the gamma parameter is not specified for the XGBRegressor.

        import pandas as pd

        # fit on raw arrays (not a DMatrix) so that feature_importances_ gets populated
        fit = alg.fit(dtrain[ft_cols].values, dtrain['y'].values)
        ft_weights = pd.DataFrame(fit.feature_importances_, columns=['weights'], index=ft_cols)

    After fitting the regressor, fit.feature_importances_ returns an array of weights which, I'm assuming, is in the same order as the feature columns of the pandas DataFrame.
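    A self-contained version of this workaround (the synthetic data, column names, and hyperparameters below are invented for illustration) might look like:

        import numpy as np
        import pandas as pd
        import xgboost as xgb

        # synthetic training frame; 'y' depends mostly on column 'a'
        ft_cols = ['a', 'b', 'c']
        dtrain = pd.DataFrame(np.random.rand(200, 3), columns=ft_cols)
        dtrain['y'] = 3 * dtrain['a'] + np.random.rand(200)

        alg = xgb.XGBRegressor(n_estimators=50, max_depth=3)  # note: no gamma specified
        fit = alg.fit(dtrain[ft_cols].values, dtrain['y'].values)

        # feature_importances_ lines up with the column order in ft_cols
        ft_weights = pd.DataFrame(fit.feature_importances_, columns=['weights'], index=ft_cols)
        print(ft_weights.sort_values('weights', ascending=False))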

    My current setup is Ubuntu 16.04, the Anaconda distribution, Python 3.6, xgboost 0.6, and scikit-learn 0.18.1.
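    Tying this back to the original question: in newer xgboost releases the sklearn wrapper exposes the underlying Booster via get_booster(), so the raw split counts that get_fscore() reports can also be read from a fitted XGBRegressor. A sketch (not tested against the 0.6 release mentioned above; the data is made up):

        import numpy as np
        import xgboost as xgb

        X = np.random.rand(200, 3)            # illustrative data
        y = 3 * X[:, 0] + np.random.rand(200)

        reg = xgb.XGBRegressor(n_estimators=50, max_depth=3).fit(X, y)

        # get_fscore() maps each feature to the number of times it was used in a
        # split; it comes back empty ({}) only if no tree ever split on a feature
        print(reg.get_booster().get_fscore())  # e.g. {'f0': 180, 'f1': 42, 'f2': 37}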
