I already know that "xgboost.XGBRegressor is a Scikit-Learn Wrapper interface for XGBoost." But do they have any other differences?
@Maxim, as of xgboost 0.90 (or much before), these differences don't exist anymore, in that xgboost.XGBClassifier.fit now has both callbacks and an xgb_model parameter. What I find is different is evals_result, in that it has to be retrieved separately after fit (clf.evals_result()), and the resulting dict is different because it can't take advantage of the names of the evals in the watchlist (watchlist = [(d_train, 'train'), (d_valid, 'valid')]).
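
For what it's worth, here is a minimal sketch of that evals_result difference (the toy data, hyperparameters, and variable names are made up for illustration): the native API fills a dict keyed by the watchlist names, while the wrapper's clf.evals_result() keys the sets as validation_0, validation_1, and so on.

import numpy as np
import xgboost as xgb

# Hypothetical toy data, only to illustrate the two APIs.
rng = np.random.RandomState(0)
X_train, y_train = rng.rand(100, 5), rng.randint(0, 2, 100)
X_valid, y_valid = rng.rand(20, 5), rng.randint(0, 2, 20)

# Native API: the watchlist lets you name each eval set,
# and evals_result is filled in during training.
d_train = xgb.DMatrix(X_train, label=y_train)
d_valid = xgb.DMatrix(X_valid, label=y_valid)
watchlist = [(d_train, 'train'), (d_valid, 'valid')]
native_results = {}
bst = xgb.train({'objective': 'binary:logistic'}, d_train,
                num_boost_round=10, evals=watchlist,
                evals_result=native_results)
print(native_results.keys())   # dict_keys(['train', 'valid'])

# Scikit-Learn wrapper: eval_set takes no names; the metrics are
# retrieved after fit and keyed validation_0, validation_1, ...
clf = xgb.XGBClassifier(n_estimators=10, objective='binary:logistic')
clf.fit(X_train, y_train,
        eval_set=[(X_train, y_train), (X_valid, y_valid)],
        verbose=False)
sklearn_results = clf.evals_result()
print(sklearn_results.keys())  # dict_keys(['validation_0', 'validation_1'])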