How do I free all memory on GPU in XGBoost?

Submitted by 旧城冷巷雨未停 on 2020-01-14 22:42:41

Question


Here is my code:

clf = xgb.XGBClassifier(
  tree_method = 'gpu_hist',
  gpu_id = 0,
  n_gpus = 4,
  random_state = 55,
  n_jobs = -1
)
clf.set_params(**params)
clf.fit(X_train, y_train, **fit_params)

I've read the answers on this question and this git issue but neither worked.

I tried to delete the booster in this way:

clf._Booster.__del__()
gc.collect()

It deletes the booster but doesn't completely free up GPU memory.

I guess it's the DMatrix that is still there, but I am not sure.

How can I free the whole memory?


Answer 1:


Well, I don't think there is a way to access the loaded DMatrix, because the fit function doesn't return it; you can check the source of fit on GitHub.

So I think the best way is to wrap the training in a Process and run it that way, like this:

from multiprocessing import Process

def fitting(args):
    # params, fit_params, X_train and y_train come from the enclosing scope
    clf = xgb.XGBClassifier(
        tree_method = 'gpu_hist',
        gpu_id = 0,
        n_gpus = 4,
        random_state = 55,
        n_jobs = -1
    )
    clf.set_params(**params)
    clf.fit(X_train, y_train, **fit_params)

    # save the model to disk here, e.g. clf.save_model(...)

# note: args must be a tuple, hence the trailing comma
fitting_process = Process(target=fitting, args=(args,))
fitting_process.start()
fitting_process.join()

# load the model from the disk here


Source: https://stackoverflow.com/questions/56298728/how-do-i-free-all-memory-on-gpu-in-xgboost
