adjust weights for predicted classes in xgboost in loss function

Submitted by 大城市里の小女人 on 2019-12-12 00:47:31

Question


Is it possible to adjust the weighted error for a given target? What I'm trying to do is weight the loss higher for rarer classes when predicting multiple classes.


Answer 1:


If you are using the core DMatrix data structure, you can set per-instance weights through the set_weight method:

set_weight(weight) Set weight of each instance.

Parameters: weight (array-like) – weight for each data point
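As a minimal sketch of how this applies to the multi-class case in the question: one common approach (an assumption here, not something the original answer spells out) is to give each instance the inverse frequency of its class, so rarer classes contribute more to the loss. The labels below are made-up example data.

```python
import numpy as np

# Hypothetical multi-class labels; in practice these are your training labels.
y = np.array([0, 0, 0, 0, 1, 1, 2])

# Inverse-frequency weighting: rarer classes get proportionally larger weights.
classes, counts = np.unique(y, return_counts=True)
class_weight = {c: len(y) / (len(classes) * n) for c, n in zip(classes, counts)}
sample_weight = np.array([class_weight[label] for label in y])

print(sample_weight.round(3))

# These per-instance weights would then be attached to the DMatrix, e.g.:
#   dtrain = xgboost.DMatrix(X, label=y, weight=sample_weight)
# or, after construction:
#   dtrain.set_weight(sample_weight)
```

Class 2 appears only once out of seven instances, so its single example ends up with the largest weight.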

While the documentation is quite sparse on this topic, I have found a reasonable answer that might be useful in this previous question: How is the parameter "weight" (DMatrix) used in the gradient boosting procedure (xgboost)?

Quoting it:

Instance Weight File

XGBoost supports giving each instance a weight to differentiate the importance of instances. For example, we can provide an instance weight file for the "train.txt" file in the example as below:

train.txt.weight

1
0.5
0.5
1
0.5

This means that XGBoost will put more emphasis on the first and fourth instances, that is to say the positive instances, while training. The configuration is similar to configuring group information: if the instance file name is "xxx", XGBoost will check whether there is a file named "xxx.weight" in the same directory and, if there is, will use those weights while training the model.
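The file convention above can be sketched with a few lines of standard-library Python; the file name and directory here are illustrative stand-ins for wherever your "train.txt" actually lives.

```python
import os
import tempfile

# Weights from the quoted example: one weight per line, in the same order
# as the instances in the accompanying "train.txt" libsvm file.
weights = [1, 0.5, 0.5, 1, 0.5]

# XGBoost's CLI looks for "<instance file>.weight" next to the instance file.
tmpdir = tempfile.mkdtemp()
weight_path = os.path.join(tmpdir, "train.txt.weight")
with open(weight_path, "w") as f:
    f.write("\n".join(str(w) for w in weights) + "\n")

# Reading it back shows the exact plain-text layout XGBoost expects.
with open(weight_path) as f:
    print(f.read())
```

Note that this side-car file is only consulted by the text-file input path; when constructing a DMatrix from in-memory arrays, you pass the weights directly via the weight argument instead.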

Hope it helps!



Source: https://stackoverflow.com/questions/42618491/adjust-weights-for-predicted-classes-in-xgboost-in-loss-function
