xgboost xgb.dump tree coefficient

Submitted by *爱你&永不变心* on 2019-12-07 09:43:40

Question


I have some sample code here:

library(xgboost)

# Train a small two-tree model on the built-in mushroom dataset
data(agaricus.train, package = 'xgboost')
train <- agaricus.train
bst <- xgboost(data = train$data, label = train$label, max_depth = 2,
               eta = 1, nthread = 2, nrounds = 2, objective = "binary:logistic")

# Dump the trees to a file, including per-node gain and cover statistics
xgb.dump(bst, 'xgb.model.dump', with_stats = TRUE)

After building the model, I print the dump out as:

booster[0]
0:[f28<-1.00136e-05] yes=1,no=2,missing=1,gain=4000.53,cover=1628.25
    1:[f55<-1.00136e-05] yes=3,no=4,missing=3,gain=1158.21,cover=924.5
        3:leaf=1.71218,cover=812
        4:leaf=-1.70044,cover=112.5
    2:[f108<-1.00136e-05] yes=5,no=6,missing=5,gain=198.174,cover=703.75
        5:leaf=-1.94071,cover=690.5
        6:leaf=1.85965,cover=13.25
booster[1]
0:[f59<-1.00136e-05] yes=1,no=2,missing=1,gain=832.545,cover=788.852
    1:[f28<-1.00136e-05] yes=3,no=4,missing=3,gain=569.725,cover=768.39
        3:leaf=0.784718,cover=458.937
        4:leaf=-0.96853,cover=309.453
    2:leaf=-6.23624,cover=20.4624

I have two questions:

  1. I understand that gradient boosted trees combine the results of these trees with some weighting coefficients. How can I get those coefficients?

  2. Just to clarify: the value predicted by each tree is the leaf = x value, isn't it?

Thank you.


Answer 1:


Combined answer for Q1 and Q2:

In xgboost the coefficient on every tree's leaf score is 1 (the learning rate eta is already folded into the dumped leaf values), so there is no separate weight to extract. For a given sample, simply sum the leaf scores of the leaves it falls into, one per tree. Let that sum be S. Then apply the logistic (two-class) function to it: Pr(label = 1) = 1 / (1 + exp(-S)).

For example, a sample that lands in leaf 3 of booster[0] and leaf 3 of booster[1] above gets S = 1.71218 + 0.784718 ≈ 2.497, so Pr(label = 1) = 1 / (1 + exp(-2.497)) ≈ 0.924.

I have verified this and use it in production systems.
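
To double-check this numerically, here is a minimal R sketch (my own addition, not from the original answer). It assumes the xgboost R package's predict(..., predleaf = TRUE) option, which returns the leaf index each row reaches in each tree, and the xgb.model.dt.tree() helper, whose Quality column holds the leaf value for leaf nodes; exact column names can vary across package versions.

library(xgboost)

# Retrain the same small model so this snippet is self-contained
data(agaricus.train, package = 'xgboost')
train <- agaricus.train
bst <- xgboost(data = train$data, label = train$label, max_depth = 2,
               eta = 1, nthread = 2, nrounds = 2, objective = "binary:logistic")

# Leaf index each row ends up in, one column per tree
leaf_idx <- predict(bst, train$data, predleaf = TRUE)

# Leaf scores from the model dump; Feature == "Leaf" marks leaf nodes
tree_dt <- xgb.model.dt.tree(model = bst)
leaves  <- tree_dt[tree_dt$Feature == "Leaf", ]

# Sum the leaf scores for the first training row across all trees
S <- sum(sapply(seq_len(ncol(leaf_idx)), function(t)
  leaves$Quality[leaves$Tree == t - 1 & leaves$Node == leaf_idx[1, t]]))

# The manual logistic transform should match predict()'s probability
# (loose tolerance because xgboost computes in float32 internally)
prob_manual <- 1 / (1 + exp(-S))
prob_xgb    <- predict(bst, train$data)[1]
all.equal(prob_manual, prob_xgb, tolerance = 1e-5)

If the two values agree up to floating-point tolerance, that confirms the coefficient on each tree is simply 1.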



Source: https://stackoverflow.com/questions/32589141/xgboost-xgb-dump-tree-coefficient
