I'm using xgboost to build a model and trying to find the importance of each feature using get_fscore(), but it returns {}.
Here is my training code:
Build the model with XGBoost first:
import numpy as np
from xgboost import XGBClassifier, plot_importance

model = XGBClassifier()
model.fit(train, label)
model.feature_importances_ returns an array, so we can sort its indices in descending order:
sorted_idx = np.argsort(model.feature_importances_)[::-1]
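To see what this line does, here is a minimal sketch on a hypothetical importance array (the values are made up for illustration): np.argsort gives ascending indices, and [::-1] reverses them so the most important feature comes first.

```python
import numpy as np

# Hypothetical importances for three features
importances = np.array([0.10, 0.55, 0.35])

# Ascending index order, then reversed to get descending importance
sorted_idx = np.argsort(importances)[::-1]
print(sorted_idx)  # [1 2 0]
```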
Then print each column name together with its importance, from highest to lowest (I assume the data was loaded with Pandas):
for index in sorted_idx:
    print([train.columns[index], model.feature_importances_[index]])
Furthermore, we can plot the importances with XGBoost's built-in function:
from matplotlib import pyplot

plot_importance(model, max_num_features=15)
pyplot.show()
Use max_num_features in plot_importance to limit the number of features shown.