What is the difference between xgb.train and xgb.XGBRegressor (or xgb.XGBClassifier)?

我寻月下人不归 2020-12-14 17:56

I already know "xgboost.XGBRegressor is a Scikit-Learn Wrapper interface for XGBoost."

But do they have any other differences?

3 Answers
  •  佛祖请我去吃肉
    2020-12-14 18:48

    In my opinion, the main difference is the training/prediction speed.

    For further reference, I will call xgboost.train the 'native_implementation' and XGBClassifier.fit the 'sklearn_wrapper'.
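    As a minimal sketch (synthetic data and illustrative parameters, not the benchmark setup below), the same training step looks like this through each interface:

        import numpy as np
        import xgboost as xgb
        from xgboost import XGBClassifier

        X = np.random.rand(1000, 10)
        y = np.random.randint(0, 2, size=1000)

        # native_implementation: data must be wrapped in a DMatrix first
        dtrain = xgb.DMatrix(X, label=y)
        params = {"objective": "binary:logistic", "max_depth": 3, "eta": 0.1}
        booster = xgb.train(params, dtrain, num_boost_round=100)

        # sklearn_wrapper: accepts the numpy arrays directly
        clf = XGBClassifier(n_estimators=100, max_depth=3, learning_rate=0.1)
        clf.fit(X, y)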

    I ran some benchmarks on a dataset of shape (240000, 348).

    Fit/train time: sklearn_wrapper = 89 seconds, native_implementation = 7 seconds

    Prediction time: sklearn_wrapper = 6 seconds, native_implementation = 3.5 milliseconds

    I believe this is because the sklearn_wrapper is designed to take pandas/numpy objects as input directly, whereas the native_implementation needs the input data to be converted into an xgboost.DMatrix object first.
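    As a rough illustration of that point (synthetic data again; the absolute timings are machine-dependent and only indicative), the two prediction paths differ like this:

        import time
        import numpy as np
        import xgboost as xgb
        from xgboost import XGBClassifier

        X = np.random.rand(10000, 50)
        y = np.random.randint(0, 2, size=10000)

        booster = xgb.train({"objective": "binary:logistic"},
                            xgb.DMatrix(X, label=y), num_boost_round=50)
        clf = XGBClassifier(n_estimators=50).fit(X, y)

        # native_implementation: the DMatrix conversion can be done once, up front
        dtest = xgb.DMatrix(X)
        t0 = time.perf_counter()
        booster.predict(dtest)
        print("native predict:", time.perf_counter() - t0)

        # sklearn_wrapper: takes numpy directly, but converts the input internally on each call
        t0 = time.perf_counter()
        clf.predict(X)
        print("wrapper predict:", time.perf_counter() - t0)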

    In addition, one can optimise n_estimators using the native_implementation (see the sketch below).
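    For example (a sketch; I am assuming "optimise n_estimators" means letting cross-validation with early stopping pick the number of boosting rounds), the native interface supports this directly via xgb.cv:

        import numpy as np
        import xgboost as xgb

        X = np.random.rand(2000, 20)
        y = np.random.randint(0, 2, size=2000)
        dtrain = xgb.DMatrix(X, label=y)
        params = {"objective": "binary:logistic", "eta": 0.1, "max_depth": 3}

        # Cross-validated training that stops adding rounds once the metric
        # stops improving; the number of rows returned is the tuned round count.
        cv_results = xgb.cv(params, dtrain, num_boost_round=500, nfold=5,
                            metrics="logloss", early_stopping_rounds=20, seed=42)
        best_rounds = len(cv_results)
        print("best num_boost_round:", best_rounds)

        # Retrain the final booster with the selected number of rounds
        booster = xgb.train(params, dtrain, num_boost_round=best_rounds)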
