Hopefully I'm reading this wrong, but in the XGBoost library documentation there is a note about extracting the feature importance attributes using `feature_importances_`.
Alternatives to the built-in feature importance can be:

- scikit-learn's `permutation_importance` method
- the `shap` package, which I really like because it provides additional plots
You can read about alternative ways to compute feature importance in XGBoost in this blog post of mine.