xgboost

XGBoost: sample_weight vs scale_pos_weight

Posted by 送分小仙女 on 2020-07-15 05:39:06
Question: I have a highly imbalanced dataset and am wondering where to account for the weights, so I am trying to understand the difference between the scale_pos_weight argument of XGBClassifier and the sample_weight parameter of the fit method. I would appreciate an intuitive explanation of the difference between the two, whether they can be used simultaneously, and how to choose between them. The documentation says that scale_pos_weight "controls the balance of positive and negative weights"…
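For reference, a minimal sketch of the two approaches (the data and the weight ratio below are made up purely for illustration):

import numpy as np
from xgboost import XGBClassifier

# Toy imbalanced data, illustrative only.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = (rng.random(1000) < 0.1).astype(int)   # roughly 10% positives

# Option 1: scale_pos_weight -- a single global factor applied to every positive row.
# A common heuristic is (number of negatives) / (number of positives).
ratio = float((y == 0).sum()) / max((y == 1).sum(), 1)
clf_a = XGBClassifier(scale_pos_weight=ratio)
clf_a.fit(X, y)

# Option 2: sample_weight -- an explicit per-row weight passed to fit(),
# which is more general: each row can carry its own weight.
weights = np.where(y == 1, ratio, 1.0)
clf_b = XGBClassifier()
clf_b.fit(X, y, sample_weight=weights)

As far as I understand, if both are supplied the effects multiply (a positive row's sample_weight is additionally scaled by scale_pos_weight), so in practice you normally pick one mechanism or the other.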

Process finished with exit code 137 in PyCharm

Posted by 折月煮酒 on 2020-07-04 20:00:55
Question: When I stop the script manually in PyCharm, the process finishes with exit code 137. But this time I didn't stop the script and still got exit code 137. What's the problem? The Python version is 3.6, and the process finished while running the xgboost.train() method. Answer 1: Exit code 137 means that your process was killed by SIGKILL (signal 9). In the case where you stopped it manually, there's your answer. If you didn't stop the script and still got this exit code, then the script was killed by your OS. In most of the…
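A minimal sketch of where the number 137 comes from (assuming a POSIX system, where a process killed by SIGKILL exits with status 128 + 9):

import os
import signal
import subprocess
import sys

# Start a child process that just sleeps, then kill it the way the OOM killer would.
proc = subprocess.Popen([sys.executable, "-c", "import time; time.sleep(60)"])
os.kill(proc.pid, signal.SIGKILL)
proc.wait()
print(proc.returncode)   # -9 from subprocess's point of view; a shell reports it as 137 (128 + 9)

A common trigger when xgboost.train() is involved is running out of RAM; training on a smaller sample, lowering max_depth, or using tree_method='hist' are typical ways to reduce memory pressure.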

Feature Importance for XGBoost in Sagemaker

Posted by 三世轮回 on 2020-06-27 12:51:27
Question: I have built an XGBoost model using Amazon SageMaker, but I am unable to find anything that helps me interpret the model and validate that it has learned the right dependencies. Normally we can get feature importance for XGBoost via the get_fscore() function in the Python API (https://xgboost.readthedocs.io/en/latest/python/python_api.html). I see nothing of that sort in the SageMaker API (https://sagemaker.readthedocs.io/en/stable/estimators.html). I know I can build my own model and then…
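One hedged workaround sketch: download the training job's model artifact from S3, unpack it, load the Booster locally, and call get_fscore() there. The bucket, key, and the file name inside the archive ("xgboost-model") are assumptions; check what your training job actually produced.

import pickle
import tarfile

import boto3
import xgboost as xgb

s3 = boto3.client("s3")
s3.download_file("my-bucket", "path/to/output/model.tar.gz", "model.tar.gz")

with tarfile.open("model.tar.gz") as tar:
    tar.extractall(path=".")

# Depending on the container version the artifact is either a pickled Booster
# or a file written by Booster.save_model(); try both ways of loading it.
try:
    with open("xgboost-model", "rb") as f:
        booster = pickle.load(f)
except Exception:
    booster = xgb.Booster()
    booster.load_model("xgboost-model")

print(booster.get_fscore())   # e.g. {'f0': 12, 'f3': 7, ...} -- split counts per feature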

Creating a Custom Objective Function for xgboost.XGBRegressor

Posted by 為{幸葍}努か on 2020-06-26 14:50:12
Question: I am relatively new to the ML/AI game in Python, and I'm currently working on implementing a custom objective function for XGBoost. My knowledge of derivatives is pretty rusty, so I've written a custom objective function, with a gradient and Hessian, that models the mean squared error loss run as the default objective in XGBRegressor, to make sure I am doing all of this correctly. The problem is, the results of the model (the error…
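For context, a minimal sketch of such a hand-rolled squared-error objective, assuming a recent xgboost where XGBRegressor accepts a callable objective of the form f(y_true, y_pred) -> (gradient, hessian); the data below is synthetic and purely illustrative:

import numpy as np
from xgboost import XGBRegressor

def squared_error_obj(y_true, y_pred):
    # XGBoost's built-in loss is 0.5 * (pred - label)^2,
    # so the gradient is (pred - label) and the Hessian is 1.
    grad = y_pred - y_true
    hess = np.ones_like(y_pred)
    return grad, hess

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = X @ np.array([1.0, -2.0, 0.5, 0.0]) + rng.normal(scale=0.1, size=500)

# Pin base_score so both models start from the same initial prediction.
custom = XGBRegressor(objective=squared_error_obj, n_estimators=50, base_score=0.5)
builtin = XGBRegressor(objective="reg:squarederror", n_estimators=50, base_score=0.5)
custom.fit(X, y)
builtin.fit(X, y)
print(np.abs(custom.predict(X) - builtin.predict(X)).max())   # should be close to zero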

What is the role of base_score in XGBoost multiclass classification?

Posted by 断了今生、忘了曾经 on 2020-06-17 09:33:40
Question: (Bounty note: jared_mamrot is looking for an answer from a reputable source, namely a reproducible example illustrating the application of base_score or base_margin to a multiclass XGBoost classification problem (softmax or softprob) using R.) I am trying to explore how XGBoost works for binary classification as well as for multi-class. In the binary case, I observed that base_score is…
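A Python sketch of the binary half of this (not the R example the bounty asks for): with zero boosting rounds the model outputs only its global bias, so the prediction equals base_score. It assumes xgboost accepts num_boost_round=0, which recent versions do.

import numpy as np
import xgboost as xgb

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = rng.integers(0, 2, size=50)

dtrain = xgb.DMatrix(X, label=y)
# With no trees, the model's output is just the bias coming from base_score.
booster = xgb.train({"objective": "binary:logistic", "base_score": 0.7},
                    dtrain, num_boost_round=0)
print(booster.predict(dtrain)[:3])                      # roughly [0.7, 0.7, 0.7]
print(booster.predict(dtrain, output_margin=True)[:3])  # roughly logit(0.7) ~= 0.847

For multi:softmax / multi:softprob, my understanding is that base_score adds the same offset to every class margin, which cancels out in the softmax, so observing any effect there requires per-class offsets supplied via base_margin instead.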

How to visualize an XGBoost tree from GridSearchCV output?

Posted by 大憨熊 on 2020-06-13 05:56:06
Question: I am using XGBRegressor to fit the model via GridSearchCV, and I want to visualize the trees. Here is the link I followed (in case this is a duplicate): how to plot a decision tree from gridsearchcv?

xgb = XGBRegressor(learning_rate=0.02, n_estimators=600, silent=True, nthread=1)
folds = 5
grid = GridSearchCV(estimator=xgb, param_grid=params, scoring='neg_mean_squared_error', n_jobs=4, verbose=3)
model = grid.fit(X_train, y_train)

Approach 1:
dot_data = tree.export_graphviz(model.best_estimator_, out_file=None…
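A sketch of the usual fix: use xgboost's own plotting helpers on the tuned estimator instead of sklearn's tree.export_graphviz, which only understands sklearn tree objects. It assumes model is the fitted GridSearchCV object from the snippet above and that matplotlib and graphviz are installed.

import matplotlib.pyplot as plt
import xgboost

best = model.best_estimator_                   # the tuned XGBRegressor

fig, ax = plt.subplots(figsize=(30, 15))
xgboost.plot_tree(best, num_trees=0, ax=ax)    # draw the first boosted tree
plt.show()

# Alternatively, export a graphviz object that can be rendered to a file:
dot = xgboost.to_graphviz(best, num_trees=0)
dot.render("xgb_tree_0")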
