I'm using xgboost to build a model and trying to find the importance of each feature using get_fscore(), but it returns {}. Here is my training code:
For anyone who comes across this issue while using xgb.XGBRegressor(): the workaround I'm using is to keep the data in a pandas.DataFrame() or numpy.array() rather than converting it to a dmatrix(). Also, I had to make sure the gamma parameter is not specified for the XGBRegressor.
fit = alg.fit(dtrain[ft_cols].values, dtrain['y'].values)
ft_weights = pd.DataFrame(fit.feature_importances_, columns=['weights'], index=ft_cols)
After fitting the regressor, fit.feature_importances_ returns an array of weights which I'm assuming is in the same order as the feature columns of the pandas DataFrame.
My current setup is Ubuntu 16.04, Anaconda distro, Python 3.6, xgboost 0.6, and scikit-learn 0.18.1.