regression

Time series prediction using support vector regression

时光怂恿深爱的人放手 submitted on 2019-12-05 11:07:34

I've been trying to implement a time series prediction tool using support vector regression in Python. I use the SVR module from scikit-learn for non-linear support vector regression. But I have a serious problem with predicting future events. The regression line fits the original function well (on the known data), but as soon as I want to predict future steps, it returns the value from the last known step. My code looks like this:

    import numpy as np
    from matplotlib import pyplot as plt
    from sklearn.svm import SVR

    X = np.arange(0, 100)
    Y = np.sin(X)
    svr_rbf = SVR(kernel='rbf', C=1e5, gamma=1e5)
    y…
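The likely cause is the feature representation: with an RBF kernel, an SVR trained on the raw time index cannot extrapolate, so any query outside the training range flattens out to a near-constant prediction. A common workaround is to reframe the problem autoregressively and predict each value from its recent lags. A minimal sketch of that idea (not the poster's code; the signal, lag count, and hyperparameters are illustrative assumptions):

    # Predict y[t] from its previous `lags` values instead of from the raw time
    # index, then roll the forecast forward one step at a time.
    import numpy as np
    from sklearn.svm import SVR

    t = np.arange(0, 100)
    y = np.sin(t * 0.1)                      # illustrative signal

    lags = 5
    X = np.column_stack([y[i:len(y) - lags + i] for i in range(lags)])
    target = y[lags:]

    model = SVR(kernel='rbf', C=10.0, gamma='scale').fit(X, target)

    # Each new prediction becomes an input lag for the next step.
    window = list(y[-lags:])
    forecast = []
    for _ in range(20):
        nxt = model.predict(np.array(window[-lags:]).reshape(1, -1))[0]
        forecast.append(nxt)
        window.append(nxt)
    print(forecast[:5])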

3D coordinates as the output of a Neural Network

爱⌒轻易说出口 submitted on 2019-12-05 07:48:33

Question: Neural networks are mostly used to classify, so the activation of a neuron in the output layer indicates the class of whatever you are classifying. Is it possible (and correct) to design a NN to output 3D coordinates? That is, three output neurons, each with values in a range such as [-1000.0, 1000.0].

Answer 1: Yes. You can use a neural network to perform linear regression, and more complicated types of regression, where the output layer has multiple nodes that can be interpreted as a 3-D…
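In practice this means giving the output layer a linear (identity) activation so the three outputs are unbounded real values, and training with a regression loss such as mean squared error. A minimal Keras sketch of that shape (layer sizes, feature count, and the target range are assumptions for illustration, not taken from the answer):

    # Three linear output units, one per coordinate; MSE loss for regression.
    import numpy as np
    from keras.models import Sequential
    from keras.layers import Dense

    X = np.random.rand(1000, 10)                    # 10 made-up input features
    y = np.random.uniform(-1000, 1000, (1000, 3))   # x/y/z coordinate targets

    model = Sequential([
        Dense(64, activation='relu', input_shape=(10,)),
        Dense(64, activation='relu'),
        Dense(3, activation='linear'),              # no squashing on the outputs
    ])
    model.compile(optimizer='adam', loss='mse')
    model.fit(X, y, epochs=5, batch_size=32, verbose=0)

If the targets really span [-1000, 1000], it usually also helps to scale them (e.g. divide by 1000) so the loss is better conditioned.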

Multi-level regression model on multiply imputed data set in R (Amelia, zelig, lme4)

*爱你&永不变心* submitted on 2019-12-05 07:28:32

I am trying to run a multi-level model on multiply imputed data (created with Amelia); the sample is a clustered sample with group = 24, N = 150.

    library("ZeligMultilevel")
    ML.model.0 <- zelig(dv~1 + tag(1|group), model="ls.mixed", data=a.out$imputations)
    summary(ML.model.0)

This code produces the following error:

    Error in object[[1]]$result$call : $ operator not defined for this S4 class

If I run an OLS regression instead, it works:

    model.0 <- zelig(dv~1, model="ls", data=a.out$imputations)
    m.0 <- coef(summary(model.0))
    print(m.0, digits = 2)
         Value Std. Error t-stat p-value
    [1,]    45       0.34…
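Whatever the cause of the S4 error, the underlying workflow is the same regardless of tooling: fit the same multilevel model on each imputed data set and then pool the results with Rubin's rules. Purely as a hedged cross-language illustration of that workflow (Python/statsmodels rather than a fix for the Zelig call; `imputations` is assumed to be a list of data frames with columns dv and group):

    # Fit a random-intercept model per imputation and pool with Rubin's rules.
    import numpy as np
    import statsmodels.formula.api as smf

    def pool_intercepts(imputations):
        estimates, variances = [], []
        for df in imputations:
            fit = smf.mixedlm("dv ~ 1", data=df, groups=df["group"]).fit()
            estimates.append(fit.params["Intercept"])
            variances.append(fit.bse["Intercept"] ** 2)
        m = len(estimates)
        qbar = np.mean(estimates)                 # pooled point estimate
        w = np.mean(variances)                    # within-imputation variance
        b = np.var(estimates, ddof=1)             # between-imputation variance
        total_se = np.sqrt(w + (1 + 1 / m) * b)   # Rubin's rules standard error
        return qbar, total_se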

Multi-variate regression using NumPy in Python?

夙愿已清 submitted on 2019-12-05 06:41:57

Question: Is it possible to perform multi-variate regression in Python using NumPy? The documentation here suggests that it is, but I cannot find any more details on the topic.

Answer 1: Yes, download this ( http://www.scipy.org/Cookbook/OLS?action=AttachFile&do=get&target=ols.0.2.py ) from http://www.scipy.org/Cookbook/OLS. Or you can install R and a Python-R link. R can do anything.

Answer 2: The webpage that you linked to mentions numpy.linalg.lstsq to find the vector x which minimizes |b - Ax|. Here is a…
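The lstsq route is the pure-NumPy answer: build a design matrix with one column per predictor plus a column of ones for the intercept, and solve the least-squares system directly. A minimal sketch of that (the synthetic data and coefficient values are illustrative, not from the answer):

    # Ordinary least squares with numpy.linalg.lstsq on synthetic data.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))                     # three predictors
    true_coef = np.array([1.5, -2.0, 0.5])
    y = X @ true_coef + 3.0 + rng.normal(scale=0.1, size=200)

    A = np.column_stack([np.ones(len(X)), X])         # intercept column first
    coef, residuals, rank, sv = np.linalg.lstsq(A, y, rcond=None)
    print(coef)   # approximately [3.0, 1.5, -2.0, 0.5]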

Scatter plot kernel smoothing: ksmooth() does not smooth my data at all

风流意气都作罢 submitted on 2019-12-05 05:59:37

Original question: I want to smooth my explanatory variable, something like the speed data of a vehicle, and then use these smoothed values. I have searched a lot and found nothing that directly answers this. I know how to calculate a kernel density estimate (density() or KernSmooth::bkde()), but I don't know how to calculate the smoothed values of speed.

Re-edited question: Thanks to @ZheyuanLi, I am able to better explain what I have and what I want to do, so I have re-edited my question as below. I have some speed measurements of a vehicle over time, stored as a data frame vehicle: t…
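The distinction that usually resolves this confusion: kernel density estimation describes the distribution of a single variable, whereas what is wanted here is kernel regression, a kernel-weighted average of speed against time, which is what scatter-plot smoothers like ksmooth() compute. As a hedged Python sketch of that idea (a simple Nadaraya-Watson estimator; the column names, bandwidth, and toy data are assumptions):

    # Smoothed speed at each grid point = Gaussian-kernel-weighted average of
    # the observed speeds near that time.
    import numpy as np

    def kernel_smooth(t, speed, grid, bandwidth=2.0):
        grid = np.asarray(grid, dtype=float)
        smoothed = np.empty_like(grid)
        for i, g in enumerate(grid):
            w = np.exp(-0.5 * ((t - g) / bandwidth) ** 2)   # Gaussian weights
            smoothed[i] = np.sum(w * speed) / np.sum(w)
        return smoothed

    t = np.linspace(0, 100, 200)
    speed = 50 + 10 * np.sin(t / 10) + np.random.normal(0, 3, t.size)
    speed_smooth = kernel_smooth(t, speed, grid=t)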

Drawing only boundaries of stat_smooth in ggplot2

心已入冬 submitted on 2019-12-05 05:13:27

When using stat_smooth() with geom_point, is there a way to remove the shaded fit region but still draw its outer bounds? I know I can remove the shaded region with something like:

    geom_point(aes(x=x, y=y)) + geom_stat(aes(x=x, y=y), alpha=0)

but how can I make the outer bounds of it (the outer curves) still visible as faint black lines?

agstudy: You can also use geom_ribbon with fill = NA.

    gg <- ggplot(mtcars, aes(qsec, wt)) +
      geom_point() +
      stat_smooth(alpha=0, method='loess')
    rib_data <- ggplot_build(gg)$data[[2]]
    ggplot(mtcars) +
      stat_smooth(aes(qsec, wt), alpha=0, method='loess') +
      geom_point(aes…
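The answer above is ggplot2-specific and its geom_ribbon(fill = NA) call is cut off here. Purely as a hedged cross-language illustration of the same visual effect (drawing only the boundary curves of the confidence band instead of a shaded region), here is a matplotlib/statsmodels sketch; the linear fit and 95% level are assumptions for the example:

    # Fit a model, compute the confidence band, and plot only its bounds.
    import numpy as np
    import matplotlib.pyplot as plt
    import statsmodels.api as sm

    x = np.linspace(0, 10, 50)
    y = 2 * x + np.random.normal(0, 2, x.size)

    X = sm.add_constant(x)
    fit = sm.OLS(y, X).fit()
    pred = fit.get_prediction(X).summary_frame(alpha=0.05)

    plt.scatter(x, y, s=10)
    plt.plot(x, pred["mean"], color="blue")
    plt.plot(x, pred["mean_ci_lower"], color="black", linewidth=0.5)  # lower bound
    plt.plot(x, pred["mean_ci_upper"], color="black", linewidth=0.5)  # upper bound
    plt.show()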

Knn Regression in R

好久不见. submitted on 2019-12-05 02:50:14

Question: I am investigating KNN regression methods and, later, kernel smoothing. I wish to demonstrate these methods using plots in R. I have generated a data set using the following code:

    x = runif(100, 0, pi)
    e = rnorm(100, 0, 0.1)
    y = sin(x) + e

I have been trying to follow the description of how to use "knn.reg" in section 9.2 here: https://daviddalpiaz.github.io/r4sl/k-nearest-neighbors.html#regression

    grid2 = data.frame(x)
    knn10 = FNN::knn.reg(train = x, test = grid2, y = y, k = 10)

My predicted values seem…
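One common pitfall with FNN::knn.reg, and a plausible cause of odd predictions, is passing train as a bare vector while test is a data frame. For comparison, the same setup translates almost line for line to scikit-learn, where both inputs must be 2-D arrays, which avoids that mismatch; this hedged Python sketch only illustrates the method, it is not the R fix:

    # KNN regression (k = 10) on the same sin(x) + noise setup.
    import numpy as np
    from sklearn.neighbors import KNeighborsRegressor

    rng = np.random.default_rng(1)
    x = rng.uniform(0, np.pi, 100)
    y = np.sin(x) + rng.normal(0, 0.1, 100)

    grid = np.linspace(0, np.pi, 200)
    knn10 = KNeighborsRegressor(n_neighbors=10).fit(x.reshape(-1, 1), y)
    pred = knn10.predict(grid.reshape(-1, 1))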

Python Keras cross_val_score Error

陌路散爱 submitted on 2019-12-05 01:53:49

I am trying to work through this little tutorial on Keras about regression: http://machinelearningmastery.com/regression-tutorial-keras-deep-learning-library-python/ Unfortunately I am running into an error I cannot fix. If I just copy and paste the code, I get the following error when running this snippet:

    import numpy
    import pandas
    from keras.models import Sequential
    from keras.layers import Dense
    from keras.wrappers.scikit_learn import KerasRegressor
    from sklearn.model_selection import cross_val_score
    from sklearn.model_selection import KFold
    from sklearn.preprocessing import StandardScaler
    from…
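The snippet and the actual error message are cut off above. For orientation, this is a hedged sketch of the pattern that tutorial uses: a Keras model-building function is wrapped in KerasRegressor so scikit-learn's cross_val_score can evaluate it. The layer sizes and placeholder data here are assumptions, not the tutorial's Boston housing setup:

    # Wrap a Keras model so scikit-learn can cross-validate it.
    import numpy as np
    from keras.models import Sequential
    from keras.layers import Dense
    from keras.wrappers.scikit_learn import KerasRegressor
    from sklearn.model_selection import cross_val_score, KFold

    def build_model():
        model = Sequential()
        model.add(Dense(13, input_dim=13, activation='relu'))
        model.add(Dense(1))                   # linear output for regression
        model.compile(loss='mean_squared_error', optimizer='adam')
        return model

    X = np.random.rand(100, 13)               # placeholder data, 13 features
    y = np.random.rand(100)

    estimator = KerasRegressor(build_fn=build_model, epochs=10, batch_size=5, verbose=0)
    kfold = KFold(n_splits=5)
    scores = cross_val_score(estimator, X, y, cv=kfold, scoring='neg_mean_squared_error')
    print(scores.mean())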

How to set a weighted least-squares in r for heteroscedastic data?

风流意气都作罢 submitted on 2019-12-05 00:10:04

Question: I'm running a regression on census data where my dependent variable is life expectancy and I have eight independent variables. The data are aggregated by cities, so I have many thousands of observations. My model is somewhat heteroscedastic, though. I want to run a weighted least squares where each observation is weighted by the city's population. In this case, it would mean that I want to weight the observations by the inverse of the square root of the population. It's unclear to me, however, what…
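Weighted least squares simply multiplies each squared residual by a weight before minimizing, so the only practical question is which quantity to pass as the weight. The question itself is about R; as a hedged illustration of the idea in Python, statsmodels' WLS takes a weights argument directly (the variable names, toy data, and the choice of population as the weight are assumptions, not taken from the post):

    # Weighted least squares: observations with larger weights get more influence.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.DataFrame({
        "life_exp": np.random.normal(75, 5, 500),
        "income": np.random.normal(30000, 8000, 500),
        "population": np.random.randint(1000, 1_000_000, 500),
    })

    wls_fit = smf.wls("life_exp ~ income", data=df, weights=df["population"]).fit()
    print(wls_fit.summary())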

Missing object error when using step() within a user-defined function

送分小仙女□ submitted on 2019-12-04 23:47:10

Question: Five days and still no answer. As can be seen from Simon's comment, this is a reproducible and very strange issue. It seems that the issue only arises when a stepwise regression with very high predictive power is wrapped in a function. I have been struggling with this for a while and any help would be much appreciated. I am trying to write a function that runs several stepwise regressions and outputs all of them to a list. However, R is having trouble reading the dataset that I specify in my…
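The question text is cut off here, but the symptom (a missing-object error only when the code is wrapped in a function) typically comes from step() re-evaluating the model's call in an environment where the function's local data set is no longer visible. Python sidesteps this class of problem because the data are passed as explicit arguments; purely as a hedged sketch of the same "stepwise selection wrapped in a user function" workflow (not a fix for the R scoping issue; names and sizes are illustrative), here is forward selection with scikit-learn:

    # Forward feature selection inside a function; data are passed explicitly.
    import numpy as np
    from sklearn.feature_selection import SequentialFeatureSelector
    from sklearn.linear_model import LinearRegression

    def run_stepwise(X, y, n_features=3):
        selector = SequentialFeatureSelector(
            LinearRegression(), n_features_to_select=n_features, direction="forward"
        )
        selector.fit(X, y)
        chosen = selector.get_support(indices=True)
        final_model = LinearRegression().fit(X[:, chosen], y)
        return chosen, final_model

    X = np.random.rand(200, 10)
    y = X[:, 0] * 3 + X[:, 4] * -2 + np.random.normal(0, 0.1, 200)
    features, model = run_stepwise(X, y)
    print(features)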