SVM

Can we obtain hybrid algorithm for spam filtering from Naive Bayes & SVM?

自闭症网瘾萝莉.ら submitted on 2021-02-08 10:22:06
Question: I am developing a spam filtering application. I need suggestions regarding a hybrid algorithm built from Naive Bayes & SVM (e.g. based on feature vectors, probabilities). Any help is appreciated. Can we develop a hybrid algorithm from Naive Bayes & SVM? Answer 1: Not sure why you would want to merge these two specific methods, but you could use ensemble learning methods for that. EDIT: based on your comments, it seems you already have two independently trained classifiers, and would like to use them
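One way to realize the "ensemble learning" suggestion is scikit-learn's VotingClassifier. A minimal sketch, assuming a bag-of-words spam corpus (the toy texts and labels below are placeholders, not from the thread); LinearSVC has no predict_proba, so it is wrapped in CalibratedClassifierCV to make soft voting possible:

    # Sketch: soft-voting ensemble of Naive Bayes and a linear SVM.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.svm import LinearSVC
    from sklearn.calibration import CalibratedClassifierCV
    from sklearn.ensemble import VotingClassifier

    texts = ["win money now", "free prize claim now", "cheap meds online",
             "meeting at noon", "lunch tomorrow?", "see you at the gym"]
    y = [1, 1, 1, 0, 0, 0]                   # 1 = spam, 0 = ham
    vec = CountVectorizer()
    X = vec.fit_transform(texts)

    # LinearSVC has no predict_proba; calibration adds probability estimates
    svm = CalibratedClassifierCV(LinearSVC(), cv=3)
    hybrid = VotingClassifier([("nb", MultinomialNB()), ("svm", svm)],
                              voting="soft")  # average the two probabilities
    hybrid.fit(X, y)
    print(hybrid.predict_proba(vec.transform(["win a free prize"])))

Hard voting (voting="hard") would skip the calibration step, but then only the two predicted labels are combined, not their probabilities.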

Behavior of C in LinearSVC sklearn (scikit-learn)

时光毁灭记忆、已成空白 submitted on 2021-02-08 06:01:14
Question: First I create some toy data:

    n_samples = 20
    X = np.concatenate((np.random.normal(loc=2.0, scale=1.0, size=n_samples),
                        np.random.normal(loc=20.0, scale=1.0, size=n_samples),
                        [10])).reshape(-1, 1)
    y = np.concatenate((np.repeat(0, n_samples), np.repeat(1, n_samples + 1)))
    plt.scatter(X, y)

Below is the graph visualizing the data. Then I train a model with LinearSVC:

    from sklearn.svm import LinearSVC
    svm_lin = LinearSVC(C=1)
    svm_lin.fit(X, y)

My understanding of C is that if C is very big, then misclassifications
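To see the effect described in the question, one can fit the same toy data with a very small and a very large C and inspect where the decision boundary lands. A minimal sketch (the specific C values and the random seed are my own choices):

    import numpy as np
    from sklearn.svm import LinearSVC

    rng = np.random.default_rng(0)
    n_samples = 20
    X = np.concatenate((rng.normal(loc=2.0, scale=1.0, size=n_samples),
                        rng.normal(loc=20.0, scale=1.0, size=n_samples),
                        [10.0])).reshape(-1, 1)
    y = np.concatenate((np.repeat(0, n_samples), np.repeat(1, n_samples + 1)))

    for C in (0.01, 1000.0):
        svm_lin = LinearSVC(C=C, max_iter=100_000).fit(X, y)
        boundary = -svm_lin.intercept_[0] / svm_lin.coef_[0, 0]  # w*x + b = 0
        print(f"C={C}: boundary at x={boundary:.2f}, "
              f"x=10 predicted as {svm_lin.predict([[10.0]])[0]}")

With a large C the hinge-loss term dominates, so the boundary tends to shift toward the odd point at x=10 (labeled 1) to classify it correctly; with a small C the regularization term dominates and that point may simply be left on the wrong side.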

Is it possible to add a covariate (control for a variable of no interest) to an SVM model?

不问归期 submitted on 2021-02-08 05:48:24
Question: I'm very new to machine learning and Python, and I'm trying to build a model to predict patients (N=200) vs. controls (N=200) from structural neuroimaging data. After the initial preprocessing, where I reshaped the neuroimaging data into a 2D array, I built the following model:

    from sklearn.svm import SVC
    svc = SVC(C=1.0, kernel='linear')
    from sklearn.grid_search import GridSearchCV
    import numpy as np
    k_range = np.arange(0.1, 10, 0.1)
    param_grid = dict(C=k_range)
    grid = GridSearchCV(svc, param_grid
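One standard way to control for a nuisance covariate (e.g. age) in this setup is to residualize each feature against the covariate before training the SVM. A sketch under my own assumptions (the synthetic data stands in for the neuroimaging array; this is one common approach, not the only one):

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    n = 400
    covariate = rng.uniform(20, 80, size=n)                   # e.g. age in years
    y = np.repeat([0, 1], n // 2)                             # controls vs patients
    X = rng.normal(size=(n, 50)) + 0.05 * covariate[:, None]  # covariate leaks into features

    def residualize(X, covariate):
        # Remove the part of every feature column explained by the covariate
        c = covariate.reshape(-1, 1)
        reg = LinearRegression().fit(c, X)   # one linear fit per column
        return X - reg.predict(c)

    svc = SVC(C=1.0, kernel='linear').fit(residualize(X, covariate), y)

In practice the residualizing regression should be fit on the training fold only and then applied to the test fold, otherwise the cross-validation estimate is optimistic.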

R caret unusually slow when tuning SVM with linear kernel

家住魔仙堡 submitted on 2021-02-07 18:31:28
Question: I have observed very strange behavior when tuning SVM parameters with caret. When training a single model without tuning, an SVM with a radial basis kernel takes more time than an SVM with a linear kernel, which is expected. However, when tuning SVMs with both kernels over the same penalty grid, the SVM with the linear kernel takes substantially more time than the SVM with the radial basis kernel. This behavior can be easily reproduced on both Windows and Linux with R 3.2 and caret 6.0-47. Does anyone know why
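The thread concerns R's caret, but the same kind of comparison can be sketched in Python with scikit-learn to measure how tuning cost differs between the two kernels (a rough analogue, not the original R code; the data size and C grid are arbitrary choices):

    # Time a grid search over C for a linear-kernel vs. an RBF-kernel SVM.
    import time
    from sklearn.datasets import make_classification
    from sklearn.model_selection import GridSearchCV
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=500, n_features=20, random_state=0)
    grid = {"C": [0.01, 0.1, 1, 10, 100]}

    for kernel in ("linear", "rbf"):
        search = GridSearchCV(SVC(kernel=kernel), grid, cv=5)
        start = time.perf_counter()
        search.fit(X, y)
        print(kernel, round(time.perf_counter() - start, 2), "s")

One plausible contributor, at least with libsvm-based backends: with a linear kernel the optimizer can need far more iterations to converge at the large C values in the grid, so the tuning loop spends most of its time on those fits.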

Color and feature classification opencv

你离开我真会死。 submitted on 2021-02-07 10:18:04
Question: I am new to machine learning and currently working on a project. The project is about classifying images based on feature and color attributes. I have tried classifying images through feature extraction based on the example given in the OpenCV with Python by Example book (the dense-extractor SIFT descriptor technique to generate a codebook and train an SVM to classify the extracted features), but I haven't yet tried combining both feature and color attributes, since the images sampled were gray
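One common way to combine the two kinds of attributes is simply to concatenate a color histogram with the existing descriptor vector before training the SVM. A sketch under my own assumptions (HOG stands in here for the book's dense-SIFT codebook vector, and the random image is only a placeholder):

    import cv2
    import numpy as np

    hog = cv2.HOGDescriptor()                  # default 64x128 detection window

    def extract_features(bgr_image):
        # Color attribute: 8x8x8 BGR histogram, normalized and flattened
        hist = cv2.calcHist([bgr_image], [0, 1, 2], None,
                            [8, 8, 8], [0, 256, 0, 256, 0, 256])
        hist = cv2.normalize(hist, None).flatten()
        # Shape/texture attribute: HOG on the resized grayscale image
        gray = cv2.cvtColor(cv2.resize(bgr_image, (64, 128)), cv2.COLOR_BGR2GRAY)
        return np.concatenate([hist, hog.compute(gray).flatten()])

    demo = np.random.randint(0, 256, (128, 96, 3), dtype=np.uint8)
    print(extract_features(demo).shape)        # color bins + HOG dimensions

The concatenated vectors can then be fed to the same SVM training step as before, e.g. sklearn.svm.SVC(kernel='linear').fit(X, y).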

How to speed up sklearn SVR?

淺唱寂寞╮ submitted on 2021-02-07 03:21:35
Question: I am implementing SVR using the sklearn SVR package in Python. My sparse matrix is of size 146860 x 10202. I have divided it into various sub-matrices of size 2500 x 10202. For each sub-matrix, SVR fitting takes about 10 minutes. What could be the ways to speed up the process? Please suggest any different approach or different Python package for the same. Thanks! Answer 1: You can average the SVR sub-models' predictions. Alternatively you can try to fit a linear regression model on the output of
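A minimal sketch of the averaging idea from the answer (the chunk size, kernel, and synthetic data are assumptions for illustration):

    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)
    X = rng.normal(size=(3000, 20))
    y = X[:, 0] + 0.1 * rng.normal(size=3000)

    def fit_chunked_svr(X, y, chunk_size=1000):
        # One SVR per block of rows; each fit only sees chunk_size samples
        return [SVR(kernel="rbf").fit(X[i:i + chunk_size], y[i:i + chunk_size])
                for i in range(0, X.shape[0], chunk_size)]

    def predict_averaged(models, X_new):
        # Average the sub-models' predictions for each new sample
        return np.mean([m.predict(X_new) for m in models], axis=0)

    models = fit_chunked_svr(X, y)
    print(predict_averaged(models, X[:5]))

For a matrix this wide, sklearn.svm.LinearSVR (or SGDRegressor) is usually far faster than kernel SVR and is often worth trying before any chunking scheme.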

Does one-class svm provide a probability estimate?

大城市里の小女人 submitted on 2021-01-29 04:03:56
Question: I am using Libsvm for outlier detection (from Java), but I need a probability estimate, not just a label. I traced the code and found that this is not possible. In particular, in the function svm_predict_values(..) I see the following code:

    if(model.param.svm_type == svm_parameter.ONE_CLASS)
        return (sum>0)?1:-1;
    else
        return sum;

I understand that one-class SVM tries to estimate the support of some probability distribution given samples or data points from the "normal" class. Given a new data
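If switching libraries is an option, scikit-learn's OneClassSVM exposes the raw decision value (the sum in the Java snippet above) rather than only its sign, which can at least serve as a ranking score. A sketch; the sigmoid squashing at the end yields a probability-like number, not a calibrated probability:

    import numpy as np
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(200, 2))             # samples from the "normal" class
    oc = OneClassSVM(kernel="rbf", nu=0.05).fit(X_train)

    X_new = np.array([[0.0, 0.0], [5.0, 5.0]])      # an inlier and an outlier
    scores = oc.decision_function(X_new)            # > 0 inlier, < 0 outlier
    pseudo_prob = 1.0 / (1.0 + np.exp(-scores))     # ad-hoc, uncalibrated squashing
    print(scores, pseudo_prob)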

Does the decision function in scikit-learn return the true distance to the hyperplane?

孤人 submitted on 2021-01-28 09:00:53
Question: Does the decision function return the actual distance to the hyperplane for each sample, as stated here? Or do you have to do the extra calculation as shown here? Which method should be used? Answer 1: No, that's not the actual distance. Depending on the case, you may (linear kernel) or may not (non-linear kernel) be able to convert it to an actual distance. Here is another good explanation. No matter what, yes, you have to take that extra step if you want the actual distance. Source: https:/
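For the linear-kernel case the extra step is a single division: decision_function returns w·x + b, so dividing by ||w|| gives the signed geometric distance. A minimal sketch (the blob data is just for illustration):

    import numpy as np
    from sklearn.datasets import make_blobs
    from sklearn.svm import SVC

    X, y = make_blobs(n_samples=100, centers=2, random_state=0)
    clf = SVC(kernel="linear").fit(X, y)

    raw = clf.decision_function(X)              # w.x + b, unnormalized
    distance = raw / np.linalg.norm(clf.coef_)  # signed distance to the hyperplane
    print(raw[:3], distance[:3])

For non-linear kernels, w exists only implicitly in feature space, which is why no such conversion is available in general.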

How to implement .dat file for handwritten recognition using SVM in Python

萝らか妹 submitted on 2021-01-28 06:35:27
Question: I've been trying to train handwritten digits using an SVM based on the code in the OpenCV library. My training part is as follows:

    import cv2
    import numpy as np

    SZ = 20
    bin_n = 16
    svm_params = dict(kernel_type=cv2.SVM_LINEAR, svm_type=cv2.SVM_C_SVC,
                      C=2.67, gamma=5.383)
    affine_flags = cv2.WARP_INVERSE_MAP | cv2.INTER_LINEAR

    def deskew(img):
        m = cv2.moments(img)
        if abs(m['mu02']) < 1e-2:
            return img.copy()
        skew = m['mu11'] / m['mu02']
        M = np.float32([[1, skew, -0.5 * SZ * skew], [0, 1, 0]])
        img = cv2.warpAffine(img, M, (SZ, SZ), flags=affine_flags)
        return img
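Note that cv2.SVM and the svm_params dict belong to the OpenCV 2.x API. A sketch of the equivalent training, saving, and reloading of the .dat model file with the OpenCV 3/4 cv2.ml API (the random arrays are placeholders for the deskewed digit features):

    import cv2
    import numpy as np

    svm = cv2.ml.SVM_create()
    svm.setKernel(cv2.ml.SVM_LINEAR)
    svm.setType(cv2.ml.SVM_C_SVC)
    svm.setC(2.67)
    svm.setGamma(5.383)

    train_data = np.random.rand(100, 64).astype(np.float32)   # placeholder features
    labels = np.random.randint(0, 10, (100, 1)).astype(np.int32)

    svm.train(train_data, cv2.ml.ROW_SAMPLE, labels)
    svm.save("svm_data.dat")                  # writes the .dat model file

    svm2 = cv2.ml.SVM_load("svm_data.dat")    # reload it for recognition later
    _, result = svm2.predict(train_data[:5])
    print(result.ravel())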