svm

Implementing a linear, binary SVM (support vector machine)

Submitted by 假如想象 on 2019-12-20 08:44:34
Question: I want to implement a simple SVM classifier for high-dimensional binary data (text), for which I think a simple linear SVM is best. The reason for implementing it myself is basically that I want to learn how it works, so using a library is not what I want. The problem is that most tutorials go up to an equation that can be solved as a "quadratic problem", but they never show an actual algorithm! So could you point me either to a very simple implementation I could study, or (better
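A minimal working sketch of the missing "actual algorithm": the Pegasos-style stochastic subgradient method minimizes the primal hinge-loss objective directly, with no QP solver. All names, data, and parameter values here are illustrative, not from the question.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=500, seed=0):
    """Stochastic subgradient descent on the primal objective
    lam/2 * ||w||^2 + mean(max(0, 1 - y * (X @ w + b))).
    Labels y must be -1 or +1."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b, t = np.zeros(d), 0.0, 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)              # decaying step size
            if y[i] * (X[i] @ w + b) < 1:      # margin violated: hinge subgradient
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
                b += eta * y[i]                # bias kept unregularized
            else:                               # only the regularizer pulls on w
                w = (1 - eta * lam) * w
    return w, b

def predict(X, w, b):
    return np.sign(X @ w + b)

# Tiny made-up separable example:
X = np.array([[2., 2.], [3., 3.], [-2., -2.], [-3., -3.]])
y = np.array([1., 1., -1., -1.])
w, b = train_linear_svm(X, y)
```

For sparse text features, the same update works with scipy sparse rows; the per-step cost is proportional to the number of nonzeros, which is what makes this approach practical for high-dimensional text.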

How to Interpret Predict Result of SVM in R?

Submitted by 假装没事ソ on 2019-12-20 08:37:20
Question: I'm new to R and I'm using the e1071 package for SVM classification in R. I used the following code:

data <- loadNumerical()
model <- svm(data[,-ncol(data)], data[,ncol(data)], gamma=10)
print(predict(model, data[c(1:20),-ncol(data)]))

loadNumerical is for loading the data, which has the form below (the first 8 columns are inputs and the last column is the classification):

  [,1] [,2] [,3] [,4] [,5] [,6] [,7]      [,8] [,9]
1   39    1   -1   43   -1    1    0 0.9050497    0
2   23   -1   -1   30   -1   -1    0 1.6624974    1
3   50   -1   -1   49    1
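For intuition about what predict() returns, here is an analogous check in Python/scikit-learn (an assumed analogue, not the asker's e1071 code): the predicted labels are just the sign of the underlying decision values, and most SVM libraries, e1071 included, can expose those decision values alongside the labels.

```python
import numpy as np
from sklearn.svm import SVC

# Made-up toy data, two separable classes:
X = np.array([[0., 0.], [1., 1.], [2., 2.], [3., 3.]])
y = np.array([0, 0, 1, 1])

clf = SVC(kernel="linear").fit(X, y)
labels = clf.predict(X)             # hard class labels, what predict() prints
scores = clf.decision_function(X)   # signed margin values behind those labels
# positive score -> class 1, negative score -> class 0
assert all((s > 0) == (l == 1) for s, l in zip(scores, labels))
```

In e1071 the equivalent of scores is obtained with predict(..., decision.values=TRUE), and probability estimates require training with probability=TRUE.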

Test example set attributes should be equal to OR Superset of Training example set Rapidminer SVM

Submitted by 眉间皱痕 on 2019-12-20 06:04:15
Question: I am new to RapidMiner and am using SVM Linear in it. My model is as follows: I made a training example set consisting of 3552 examples and just 2 attributes; I do a nominal-to-numeric conversion, pass it through the SVM Linear model, and then connect the model output to Apply Model. This is fine. In the test example set, I have 735 examples with 2 attributes; I do the nominal-to-numeric conversion and then feed this converted example set to Apply Model. At this stage I am getting an error when

Hyperplane in SVM classifier

Submitted by 橙三吉。 on 2019-12-20 04:22:24
Question: I want to get a formula for the hyperplane in an SVM classifier, so I can estimate the probability of correct classification for each sample from its distance to the hyperplane. For simplicity, take MATLAB's own example:

load fisheriris
xdata = meas(51:end,3:4);
group = species(51:end);
svmStruct = svmtrain(xdata,group,'showplot',true);

This plots a case where the hyperplane is a line, and I want the formula for that line. The hyperplane can also have a messy shape! What can I do? Maybe there are other ways.
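A sketch of extracting the hyperplane formula and signed distances, using scikit-learn's linear SVM as a stand-in for MATLAB's svmtrain (the data and parameter values are made up): for a linear kernel the hyperplane is w.x + b = 0, and the distance of a sample to it is (w.x + b) / ||w||.

```python
import numpy as np
from sklearn.svm import SVC

# Made-up separable 2-D data with -1/+1 labels:
X = np.array([[1., 1.], [2., 0.], [0., 2.], [4., 4.], [5., 3.], [3., 5.]])
y = np.array([-1, -1, -1, 1, 1, 1])

clf = SVC(kernel="linear").fit(X, y)
w = clf.coef_[0]        # normal vector of the hyperplane
b = clf.intercept_[0]   # offset, so the hyperplane is w @ x + b == 0
# Signed distance of each sample to the hyperplane:
dist = (X @ w + b) / np.linalg.norm(w)
```

A common way to turn dist into a probability is Platt scaling (a logistic fit on the decision values); for a non-linear ("messy") kernel there is no single formula, but the same decision-value-to-probability route still applies.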

SVM: the optimization problem

Submitted by ぃ、小莉子 on 2019-12-20 00:51:03
Today, following the SVM lecture in the AI course, I effectively reviewed SVM once more, and it went much more smoothly this time. Many details are still unclear, but those belong to the next step of the plan; the current knowledge, plus targeted problems and hands-on practice, should be enough. Keep reviewing.
1. SVM modeling: linear classifiers and the optimal linear classifier; distance from a point to the hyperplane; the SVM objective function; simplifying the objective function.
2. Solving SVM via the dual problem: QP problems; the SVM dual problem; KKT conditions; the final SVM steps; solving for λ* with SMO.
3. SVM extensions: soft margin; kernel functions.
Source: CSDN Author: LotusQ Link: https://blog.csdn.net/qq_30057549/article/details/103618216
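For reference, the objective and dual in the outline above are, in the standard textbook form (not reproduced from the post itself):

```latex
% Hard-margin primal objective:
\min_{w,b}\ \frac{1}{2}\|w\|^2
\quad\text{s.t.}\quad y_i\,(w^\top x_i + b) \ge 1,\quad i = 1,\dots,n
% Dual problem (the QP that SMO solves for \lambda^*):
\max_{\lambda}\ \sum_{i=1}^{n}\lambda_i
  - \frac{1}{2}\sum_{i,j}\lambda_i\lambda_j\, y_i y_j\, x_i^\top x_j
\quad\text{s.t.}\quad \lambda_i \ge 0,\quad \sum_{i=1}^{n}\lambda_i y_i = 0
```

The soft-margin and kernel extensions in item 3 change only the constraint 0 ≤ λᵢ ≤ C and replace the inner product xᵢᵀxⱼ with a kernel K(xᵢ, xⱼ).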

Cannot understand plotting of decision boundary in SVM and LR

Submitted by 佐手、 on 2019-12-19 11:14:57
Question: For example, we have f(x) = x. How do we plot it? We take some x, calculate y, repeat, and then plot the chart point by point. Simple and clear. But I cannot understand plotting a decision boundary as clearly, because there is no y to plot, only x. Python code for the SVM:

h = .02  # step size in the mesh
Y = y
# we create an instance of SVM and fit our data. We do not scale our
# data since we want to plot the support vectors
C = 1.0  # SVM regularization parameter
svc = svm.SVC(kernel='linear'
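The usual answer, sketched below with made-up data: the decision boundary is not a function y = f(x) but the zero level set of the classifier's score. So instead of computing y from x, you evaluate the score at every point of a 2-D mesh and draw the contour where it crosses 0, which is what the snippet above is building toward.

```python
import numpy as np
from sklearn import svm

# Made-up 2-D training data, two separable classes:
X = np.array([[0., 0.], [1., 0.], [0., 1.], [3., 3.], [4., 3.], [3., 4.]])
y = np.array([0, 0, 0, 1, 1, 1])
clf = svm.SVC(kernel="linear", C=1.0).fit(X, y)

h = 0.02  # step size in the mesh, as in the question's snippet
xx, yy = np.meshgrid(np.arange(-1, 5, h), np.arange(-1, 5, h))
grid = np.c_[xx.ravel(), yy.ravel()]       # every (x1, x2) point of the mesh
Z = clf.decision_function(grid).reshape(xx.shape)  # score at each mesh point
# plt.contour(xx, yy, Z, levels=[0]) would then draw the boundary:
# the curve where the score changes sign.
```

The same trick works for logistic regression (contour the probability at 0.5) and for any kernel, which is why the scikit-learn examples all follow this meshgrid pattern.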

SciKit One-class SVM classifier training time increases exponentially with size of training data

Submitted by 孤街浪徒 on 2019-12-19 10:03:13
Question: I am using the Python SciKit OneClass SVM classifier to detect outliers in lines of text. The text is first converted to numerical features using bag of words and TF-IDF. When I train (fit) the classifier on my computer, the time seems to increase exponentially with the number of items in the training set. Number of items in the training data and training time taken: 10K: 1 sec, 15K: 2 sec, 20K: 8 sec, 25K: 12 sec, 30K: 16 sec, 45K: 44 sec. Is there anything I can do to reduce the time

Weights from linear SVM model (in R)?

Submitted by 心不动则不痛 on 2019-12-19 03:36:55
Question: Using kernlab I've trained a model with code like the following:

my.model <- ksvm(result ~ f1+f2+f3, data=gold, kernel="vanilladot")

Since it's a linear model, at run-time I prefer to compute the scores as a simple weighted sum of the feature values rather than using the full SVM machinery. How can I convert the model to something like this (some made-up weights here):

> c(.bias=-2.7, f1=0.35, f2=-0.24, f3=2.31)
.bias    f1    f2    f3
-2.70  0.35 -0.24  2.31

where .bias is the bias term and the rest
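The same conversion sketched in scikit-learn terms (an analogue, not kernlab; kernlab's coef() and b() play the roles of dual_coef_ and intercept_ here): for a linear kernel the dual solution collapses to an explicit weight vector w = Σᵢ αᵢ yᵢ xᵢ, so the score is just the weighted sum w.x + b.

```python
import numpy as np
from sklearn.svm import SVC

# Made-up separable data:
X = np.array([[1., 2.], [2., 1.], [3., 0.], [5., 5.], [6., 4.], [4., 6.]])
y = np.array([0, 0, 0, 1, 1, 1])
clf = SVC(kernel="linear").fit(X, y)

# dual_coef_ already stores alpha_i * y_i for each support vector,
# so the weight vector is a single matrix product:
w = clf.dual_coef_ @ clf.support_vectors_
b = clf.intercept_
# The plain weighted sum reproduces the full SVM's decision values:
assert np.allclose(X @ w.ravel() + b, clf.decision_function(X))
```

In kernlab the corresponding sum is colSums(coef(my.model)[[1]] * xmatrix(my.model)[[1]]) with bias -b(my.model); the point is identical: a linear SVM at run-time is just a dot product plus a constant.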

SVM Visualization in MATLAB

Submitted by 十年热恋 on 2019-12-19 01:27:09
Question: How do I visualize the SVM classification once I perform SVM training in MATLAB? So far, I have only trained the SVM with:

% Labels are -1 or 1
groundTruth = Ytrain;
d = xtrain;
model = svmtrain(groundTruth, d);

Answer 1: If you are using LIBSVM, you can plot the classification results:

% Labels are -1 or 1
groundTruth = Ytrain;
d = xtrain;
figure
% plot training data
hold on;
pos = find(groundTruth==1);
scatter(d(pos,1), d(pos,2), 'r')
pos = find(groundTruth==-1);
scatter(d(pos,1), d(pos,2), 'b')
%

Handwritten digit recognition with OpenCV (MFC, HOG, SVM)

Submitted by 只谈情不闲聊 on 2019-12-19 00:24:10
This is based on 秋风细雨's article: http://blog.csdn.net/candyforever/article/details/8564746. I spent some time writing the program; first, a look at the results: recognition is roughly correct in all the test cases. Now to the main topic. Because this program extracts HOG features and classifies them with an SVM, it helps to know some HOG basics; in particular, understanding how to compute the dimensionality of an image's HOG feature makes the program easier to follow. On HOG, see: http://gz-ricky.blogbus.com/logs/85326280.html and http://blog.csdn.net/raodotcong/article/details/6239431. For the download address of the handwritten digit (0-9) database and how to generate the xml file of the database's HOG features, see the reference blog linked at the top. I also provide an xml file I trained on this database, download address: http://pan.baidu.com/s/1qXSYp

Training the model:

#include <iostream>
#include <opencv2/opencv.hpp>
#include <fstream>
using namespace std;
using namespace cv;
int main()
{
    vector<string> img_path; // input file names
    vector<int> img_catg;    // class labels
    int
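Since the post highlights working out the HOG feature dimensionality, here is a small sketch of that calculation. The window/block/stride/cell/bin values are assumed, chosen to match common OpenCV-style settings for square digit images, not taken from the post's xml.

```python
def hog_dims(win, block, stride, cell, nbins):
    """Length of one HOG descriptor for a square win x win window:
    (blocks per row)^2 * (cells per block) * (orientation bins)."""
    blocks_per_row = (win - block) // stride + 1
    cells_per_block = (block // cell) ** 2
    return blocks_per_row ** 2 * cells_per_block * nbins

# For a square 64x64 digit image with 16x16 blocks, 8x8 stride,
# 8x8 cells, and 9 orientation bins:
print(hog_dims(win=64, block=16, stride=8, cell=8, nbins=9))  # 1764
```

This is the number that must match between the training xml and the feature vectors computed at prediction time; a mismatch here is the usual cause of OpenCV SVM dimension errors.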