kernel matrix computation outside SVM training in kernlab

Posted by 丶灬走出姿态 on 2019-12-06 06:14:31

Question


I was developing a new algorithm that generates a modified kernel matrix for training an SVM, and I encountered a strange problem.

For testing purposes I was comparing the SVM models learned via the kernelMatrix interface against those learned via the normal kernel interface. For example:

# Model with the kernel matrix computed inside ksvm
svp1 <- ksvm(x, y, type = "C-svc", kernel = vanilladot(), scaled = FALSE)
# Model with the kernel matrix computed outside ksvm
K <- kernelMatrix(vanilladot(), x)
svp2 <- ksvm(K, y, type = "C-svc")
identical(nSV(svp1), nSV(svp2))

Note that I have turned scaling off, since I am not sure how to apply scaling to a precomputed kernel matrix.
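One way to keep scaling while using the precomputed-matrix route (a sketch of my own, not from the original post) is to standardize x before computing the kernel matrix, since scaling is a transformation of the inputs rather than of K. For a linear kernel, kernelMatrix(vanilladot(), x) is just x %*% t(x), so base R suffices to illustrate the idea:

```r
# Toy data: 3 samples, 2 features
x <- matrix(c(1, 2, 3, 4, 5, 6), nrow = 3)

# Center and scale each feature column, as ksvm's scaled = TRUE would do
xs <- scale(x)

# Linear kernel matrix on the scaled data; for other kernels,
# kernelMatrix(kern, xs) would play the same role
K <- xs %*% t(xs)

# K can then be passed to ksvm(K, y, type = "C-svc") as before
```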

From my understanding, svp1 and svp2 should be the same model. However, I observed that this is not true for a few datasets, for example glass0 from KEEL.

What am I missing here?


Answer 1:


I think this has to do with the same issue posted here. kernlab appears to handle the ksvm calculation differently when you explicitly pass vanilladot(), because its class is 'vanillakernel' instead of 'kernel'.

If you define your own vanilladot kernel with a class of 'kernel' instead of 'vanillakernel', the code produces equivalent results for both:

# Linear kernel wrapped with class "kernel" (not "vanillakernel")
kfunction.k <- function() {
  k <- function(x, y) { crossprod(x, y) }
  class(k) <- "kernel"
  k
}
l <- 0.1; C <- 1 / (2 * l)

svp1 <- ksvm(x, y, type = "C-svc", kernel = kfunction.k(), scaled = FALSE)

K <- kernelMatrix(kfunction.k(), x)
svp2 <- ksvm(K, y, type = "C-svc", kernel = "matrix", scaled = FALSE)

identical(nSV(svp1), nSV(svp2))

It's worth noting that svp1 and svp2 are both different from their values in the original code because of this change.
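As a quick sanity check (my addition, base R only): the custom kernel above is just the ordinary inner product, so building the Gram matrix entry by entry with it reproduces the linear kernel matrix x %*% t(x):

```r
# The custom kernel is the plain inner product: k(x, y) = t(x) %*% y
k <- function(x, y) { crossprod(x, y) }
class(k) <- "kernel"

# Toy 3x2 data matrix
x <- matrix(c(1, 0, 0, 1, 1, 1), nrow = 3, byrow = TRUE)

# Gram matrix built entry by entry from k(); crossprod returns a
# 1x1 matrix for vectors, so coerce each entry to a scalar
K <- outer(1:nrow(x), 1:nrow(x),
           Vectorize(function(i, j) as.numeric(k(x[i, ], x[j, ]))))

all.equal(K, x %*% t(x))  # TRUE: identical to the linear kernel matrix
```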



Source: https://stackoverflow.com/questions/27525011/kernel-matrix-computation-outside-svm-training-in-kernlab
