How to run a prediction on GPU?

♀尐吖头ヾ submitted on 2020-01-03 03:46:09

Question


I am using h2o4gpu with an h2o4gpu.solvers.xgboost.RandomForestClassifier model. The parameters I have set are:

XGBClassifier(base_score=0.5, booster='gbtree', colsample_bylevel=1,
colsample_bytree=1.0, gamma=0, learning_rate=0.1, max_delta_step=0,
max_depth=8, min_child_weight=1, missing=nan, n_estimators=100,
n_gpus=1, n_jobs=-1, nthread=None, num_parallel_tree=1, num_round=1,
objective='binary:logistic', predictor='gpu_predictor',
random_state=123, reg_alpha=0, reg_lambda=1, scale_pos_weight=1,
seed=None, silent=False, subsample=1.0, tree_method='gpu_hist')

When I train this model and then predict, everything runs on the GPU.

However, when I save the model with pickle, load it back in another notebook, and run a prediction through predict_proba, everything runs on the CPU.
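As a sanity check, it helps to confirm that the GPU-related parameters actually survive the pickle round trip. The sketch below uses a hypothetical stand-in Model class (the real object would be h2o4gpu's XGBClassifier, which may not be installed here) to show that pickle serializes the whole object, constructor parameters included:

```python
import pickle

# Hypothetical stand-in for the trained classifier; the real object is
# h2o4gpu's XGBClassifier. Pickle serializes the entire object, so
# parameters such as predictor='gpu_predictor' survive the round trip.
class Model:
    def __init__(self, **params):
        self.params = params

    def get_params(self):
        return dict(self.params)

model = Model(predictor="gpu_predictor", tree_method="gpu_hist")

# Save in one notebook ...
blob = pickle.dumps(model)

# ... and load in another: the parameters come back unchanged, so a
# CPU-only prediction is the library's predict path, not lost settings.
restored = pickle.loads(blob)
print(restored.get_params()["predictor"])  # -> gpu_predictor
```

If the reloaded model still reports predictor='gpu_predictor' but predicts on the CPU, the behaviour comes from the prediction code path itself rather than from anything lost during serialization.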

Why is my prediction not running on the GPU?


Answer 1:


Predictions are meant to run on the CPU, so you don't need a GPU to actually use the model.



Source: https://stackoverflow.com/questions/50810039/how-to-run-a-prediction-on-gpu
