Class Weight Syntax in Kernlab?

Submitted by 北战南征 on 2019-12-24 07:59:21

Question


Hi, I am trying out classification on an imbalanced dataset in R using the kernlab package. Since the class distribution is not 1:1, I am using the class.weights option in the ksvm() function call, but I see no difference in the classification results whether I add the weights or leave them out. So the question is: what is the correct syntax for declaring the class weights?

I am using the following function calls:

# Call with class weights: class.weights is a named numeric vector mapping each class label to its weight
model = ksvm(dummy[1:466], lab_tr, type='C-svc', kernel=pre, cross=10, C=10, prob.model=F,
             class.weights=c("Negative"=0.7, "Positive"=0.3))
# Call without class weights, for comparison
model = ksvm(dummy[1:466], lab_tr, type='C-svc', kernel=pre, cross=10, C=10, prob.model=F)

Can anyone please comment on this? Am I using the right syntax for adding the weights? I also discovered that if I use the weights together with prob.model=T, the ksvm function returns an error!
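
For reference, here is a self-contained toy version of the two calls above. The simulated data, the built-in rbfdot kernel, and the 400/66 class split are just stand-ins for my real dummy, lab_tr, and custom kernel pre:

library(kernlab)

set.seed(42)
x <- rbind(matrix(rnorm(400 * 2, mean = 0), ncol = 2),   # 400 "Negative" points
           matrix(rnorm( 66 * 2, mean = 1), ncol = 2))   #  66 "Positive" points
y <- factor(c(rep("Negative", 400), rep("Positive", 66)))

# With class weights (named vector: class label -> weight)
m_weighted   <- ksvm(x, y, type = "C-svc", kernel = "rbfdot", C = 10,
                     class.weights = c("Negative" = 0.7, "Positive" = 0.3))
# Without class weights
m_unweighted <- ksvm(x, y, type = "C-svc", kernel = "rbfdot", C = 10)

# Compare confusion tables on the training data
table(predict(m_weighted, x),   y)
table(predict(m_unweighted, x), y)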


Answer 1:


Your syntax is OK, but class weighting failing to change anything is a fairly common problem in machine learning. In a way, removing some objects from the bigger class is the only method guaranteed to work, though it can increase the error, and it has to be done in an intelligent way (in an SVM the potential support vectors should get priority; of course that raises the question of how to locate them).
You can also try boosting the weights beyond the simple length ratio, say ten-fold, and check whether that helps even a little, or whether it instead overshoots the imbalance to the other side.
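
As a rough sketch of both ideas on made-up toy data (the simulated x/y, the rbfdot kernel, and the exact weight values below are only illustrative; tune them on your own data):

library(kernlab)

# Same kind of toy data as in the question sketch: 400 "Negative" vs. 66 "Positive"
set.seed(42)
x <- rbind(matrix(rnorm(400 * 2, mean = 0), ncol = 2),
           matrix(rnorm( 66 * 2, mean = 1), ncol = 2))
y <- factor(c(rep("Negative", 400), rep("Positive", 66)))

# 1. Boost the minority-class weight well past the raw length ratio
freq <- table(y)
cw   <- setNames(as.numeric(max(freq) / freq), names(freq))  # inverse-frequency weights
cw["Positive"] <- cw["Positive"] * 10                        # deliberate overshoot, then tune down
m_boost <- ksvm(x, y, type = "C-svc", kernel = "rbfdot", C = 10, class.weights = cw)

# 2. Down-sample the majority class to the size of the minority class
keep   <- c(which(y == "Positive"),
            sample(which(y == "Negative"), sum(y == "Positive")))
m_down <- ksvm(x[keep, ], y[keep], type = "C-svc", kernel = "rbfdot", C = 10)

# Check what each does to the training confusion table
table(predict(m_boost, x), y)
table(predict(m_down,  x), y)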



Source: https://stackoverflow.com/questions/3282916/class-weight-syntax-in-kernlab
