How are the gradient and hessian of the logarithmic loss computed in the custom objective function example script in XGBoost's GitHub repository?


The log loss comes from the Bernoulli log-likelihood of a single example:

ℓ(y, p) = y·log(p) + (1 − y)·log(1 − p)

where y ∈ {0, 1} is the label and p = σ(x) = 1 / (1 + exp(−x)) is the predicted probability obtained by passing the raw score x through the sigmoid.

Taking the partial derivative with respect to the raw score x gives

∂ℓ/∂x = y − p

Since boosting minimizes the loss (the negative log-likelihood), the gradient that the objective must return is the negative of this, namely p − y.

A similar calculation gives the hessian: differentiating p − y once more with respect to x yields p·(1 − p).
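Putting this together, a custom objective in the spirit of XGBoost's custom objective demo can be sketched as follows. This is a minimal illustration, assuming the predictions passed to the objective are raw margins (not probabilities); the synthetic dataset, parameter values, and the name logregobj are used here for demonstration rather than copied verbatim from the repository.

```python
import numpy as np
import xgboost as xgb

def logregobj(preds, dtrain):
    """Custom binary log-loss objective: return per-example gradient and hessian."""
    labels = dtrain.get_label()
    # preds are raw margins, so map them to probabilities with the sigmoid
    p = 1.0 / (1.0 + np.exp(-preds))
    grad = p - labels        # gradient of the log loss w.r.t. the raw score
    hess = p * (1.0 - p)     # hessian of the log loss w.r.t. the raw score
    return grad, hess

# Small synthetic binary-classification problem, purely for illustration
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + rng.normal(scale=0.5, size=200) > 0).astype(int)
dtrain = xgb.DMatrix(X, label=y)

bst = xgb.train({"max_depth": 2, "eta": 0.3}, dtrain,
                num_boost_round=10, obj=logregobj)
```

Because the objective returns grad = p − y and hess = p·(1 − p), each boosting round fits the new tree against exactly the first- and second-order terms derived above.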
