How to reuse cross_validation_fold_assignment() with GBM in the H2O library with Python

Submitted by 倾然丶 夕夏残阳落幕 on 2021-01-28 05:09:10

Question


I run my model with the H2O library, using 5-fold cross-validation.

import h2o
from h2o.estimators.gbm import H2OGradientBoostingEstimator

model = H2OGradientBoostingEstimator(
    balance_classes=True,
    nfolds=5,
    keep_cross_validation_fold_assignment=True,
    seed=1234)
model.train(x=predictors, y=response, training_frame=data)
print('rmse: ', model.rmse(xval=True))
print('R2: ', model.r2(xval=True))
data_nfolds = model.cross_validation_fold_assignment()

I got the cross-validation fold assignment frame. I would like to reuse it for a new model with different parameters, such as ntrees or stopping_rounds, but I could not find how to do this in the documentation:

https://docs.h2o.ai/h2o/latest-stable/h2o-docs/data-science/algo-params/keep_cross_validation_fold_assignment.html
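The answer below reads the fold numbers back from a CSV file. A minimal sketch of one way that file could be produced from the fold assignment frame obtained above; this step is not in the original post, and the file name myfile_index.csv is only an illustration.

# Sketch (not from the original post): write the fold assignment frame to disk
# so it can be re-attached to a training frame later. The path is illustrative.
h2o.export_file(data_nfolds, path='myfile_index.csv', force=True)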


Answer 1:


I found the answer.

# Read the saved fold numbers back in and attach them to the training frame
nfolds_index = h2o.import_file('myfile_index.csv')
nfolds_index.set_names(["fold_numbers"])
data = data.cbind(nfolds_index)

# Reuse the same folds via fold_column instead of nfolds
model2 = H2OGradientBoostingEstimator(seed=1234)
model2.train(x=predictors, y=response, training_frame=data, fold_column="fold_numbers")
print('rmse: ', model2.rmse(xval=True))
print('R2: ', model2.r2(xval=True))
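As a side note, if the fold assignment frame is still in the current H2O session, the CSV round-trip can presumably be skipped by cbinding it directly. A minimal sketch, assuming data_nfolds is the frame returned by model.cross_validation_fold_assignment() in the question; the ntrees value is illustrative and not from the original post.

# Sketch: reuse the in-memory fold assignment frame directly, as an
# alternative to importing the CSV above.
fold_col = data_nfolds.names[0]            # fold column's name, not hard-coded here
data_with_folds = data.cbind(data_nfolds)
model3 = H2OGradientBoostingEstimator(ntrees=200, seed=1234)   # ntrees is illustrative
model3.train(x=predictors, y=response,
             training_frame=data_with_folds, fold_column=fold_col)
print('rmse: ', model3.rmse(xval=True))
print('R2: ', model3.r2(xval=True))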


Source: https://stackoverflow.com/questions/64790872/how-to-reuse-cross-validation-fold-assignment-with-gbm-in-h2o-library-with-pyt
