regression

Rolling regression returning multiple objects

允我心安 submitted on 2019-12-04 19:17:45
I am trying to build a rolling regression function based on the example here, but in addition to returning the predicted values, I would like to return some rolling model diagnostics (i.e. coefficients, t-values, and maybe R^2). I would like the results to be returned in discrete objects based on the type of result. The example provided in the link above successfully creates the rolling predictions, but I need some assistance packaging and writing out the rolling model diagnostics. In the end, I would like the function to return separate objects for the predictions, coefficients, t-values, and R^2.
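A minimal sketch of one way to package these pieces in Python, using statsmodels' RollingOLS (my choice of tool here, not the asker's; the question itself works in R): the fitted result already exposes the rolling coefficients, t-values, and R^2 as separate objects, and one-step-ahead predictions can be built from the shifted coefficients.

import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.regression.rolling import RollingOLS

# Toy data (an assumption; the question uses its own dataset).
rng = np.random.default_rng(0)
x = pd.Series(rng.normal(size=200), name="x")
y = 2.0 + 1.5 * x + rng.normal(scale=0.5, size=200)

X = sm.add_constant(x)                       # design matrix with intercept
res = RollingOLS(y, X, window=60).fit()

coefs = res.params                           # object 1: rolling coefficients
tvals = res.tvalues                          # object 2: rolling t-values
r2 = res.rsquared                            # object 3: rolling R^2
# One-step-ahead predictions: apply yesterday's fit to today's regressors.
preds = (X * coefs.shift(1)).sum(axis=1, min_count=X.shape[1])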

Make regressions and predictions for groups in R

半腔热情 submitted on 2019-12-04 19:14:14
I have the following data.frame d from an experiment: variable y (response, continuous), factor f (500 levels), and time t (POSIXct). Over the last 8 years, y was measured roughly once a month (exact date in t) for each level of f. Sometimes there are 2 measures per month, and sometimes a couple of months passed without any measures. Sorry for not providing example data, but making up irregular time series goes beyond my R knowledge. ;) I'd like to do the following with this data: fit a regression using the loess() function (y ~ t) for each level of f, then make a prediction of y for the first day of
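A hedged Python sketch of the per-group smooth-and-predict step (the question is in R; here statsmodels' lowess stands in for loess(), and the smoothed curve is interpolated at the target date — note it will not extrapolate beyond the observed range):

import numpy as np
import pandas as pd
from statsmodels.nonparametric.smoothers_lowess import lowess

# Toy irregular series for three levels (an assumption; the real data
# has ~500 levels measured roughly monthly over 8 years).
rng = np.random.default_rng(1)
rows = []
for level in ["A", "B", "C"]:
    t = pd.date_range("2012-01-01", "2019-12-01", freq="MS")
    t = t + pd.to_timedelta(rng.integers(0, 25, len(t)), unit="D")
    rows.append(pd.DataFrame({"f": level, "t": t,
                              "y": rng.normal(size=len(t)).cumsum()}))
d = pd.concat(rows, ignore_index=True)

target = pd.Timestamp("2019-12-01").value   # the day to predict (ns since epoch)

def smooth_and_predict(g):
    x = g["t"].to_numpy().astype("int64")   # time as numeric
    fitted = lowess(g["y"], x, frac=0.3)    # columns: sorted x, smoothed y
    return np.interp(target, fitted[:, 0], fitted[:, 1])

preds = d.groupby("f").apply(smooth_and_predict)   # one prediction per level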

Applying lm() and predict() to multiple columns in a data frame

淺唱寂寞╮ submitted on 2019-12-04 17:26:19
I have an example dataset below. train <- data.frame(x1 = c(4,5,6,4,3,5), x2 = c(4,2,4,0,5,4), x3 = c(1,1,1,0,0,1), x4 = c(1,0,1,1,0,0), x5 = c(0,0,0,1,1,1)) Suppose I want to create separate models for columns x3, x4, x5 based on columns x1 and x2. For example: lm1 <- lm(x3 ~ x1 + x2) lm2 <- lm(x4 ~ x1 + x2) lm3 <- lm(x5 ~ x1 + x2) I then want to take these models and apply them to a testing set using predict(), and create a matrix that has each model's outcome as a column. test <- data.frame(x1 = c(4,3,2,1,5,6), x2 = c(4,2,1,6,8,5)) p1 <- predict(lm1, newdata = test) p2 <- predict(lm2,
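The same pattern in Python (a sketch using statsmodels OLS rather than R's lm, with the question's data): loop over the response columns, fit one model each, and collect the predictions as columns of one frame.

import pandas as pd
import statsmodels.api as sm

train = pd.DataFrame({"x1": [4, 5, 6, 4, 3, 5], "x2": [4, 2, 4, 0, 5, 4],
                      "x3": [1, 1, 1, 0, 0, 1], "x4": [1, 0, 1, 1, 0, 0],
                      "x5": [0, 0, 0, 1, 1, 1]})
test = pd.DataFrame({"x1": [4, 3, 2, 1, 5, 6], "x2": [4, 2, 1, 6, 8, 5]})

X_train = sm.add_constant(train[["x1", "x2"]])
X_test = sm.add_constant(test[["x1", "x2"]])

# One OLS fit per response column; predictions become one column each.
preds = pd.DataFrame({
    target: sm.OLS(train[target], X_train).fit().predict(X_test)
    for target in ["x3", "x4", "x5"]
})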

Get all models from leaps regsubsets

☆樱花仙子☆ submitted on 2019-12-04 16:54:04
I used regsubsets to search for models. Is it possible to automatically create all the lm fits from the list of parameter selections? library(leaps) leaps <- regsubsets(y ~ x1 + x2 + x3, data, nbest=1, method="exhaustive") summary(leaps)$which (Intercept) x1 x2 x3 1 TRUE FALSE FALSE TRUE 2 TRUE FALSE TRUE TRUE 3 TRUE TRUE TRUE TRUE Now I would manually do model_1 <- lm(y ~ x3) and so on. How can this be automated to have them in a list? I don't know why you want a list of all models; the summary and coef methods should serve you well. But I will first answer your question from a pure programming perspective, then
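For flavor, a Python sketch of the same idea (an assumption: statsmodels plus itertools standing in for leaps::regsubsets), keeping the best fitted model of each size in a dict — i.e. "all models in a list":

import itertools
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Toy data (an assumption; regsubsets ran on the asker's own data frame).
rng = np.random.default_rng(2)
data = pd.DataFrame(rng.normal(size=(100, 3)), columns=["x1", "x2", "x3"])
data["y"] = 1 + 2 * data["x1"] - data["x3"] + rng.normal(size=100)

predictors = ["x1", "x2", "x3"]
models = {}
for k in range(1, len(predictors) + 1):
    # Best subset of size k by R^2, like nbest=1 with method="exhaustive".
    models[k] = max(
        (sm.OLS(data["y"], sm.add_constant(data[list(c)])).fit()
         for c in itertools.combinations(predictors, k)),
        key=lambda fit: fit.rsquared,
    )

print({k: list(m.params.index) for k, m in models.items()})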

Plotting a 95% confidence interval for a lm object

。_饼干妹妹 submitted on 2019-12-04 16:29:44
How can I calculate and plot a confidence interval for my regression in R? So far I have two numerical vectors of equal length (x, y) and a regression object (lm.out). I have made a scatterplot of y against x and added the regression line to this plot. I am looking for a way to add a 95% prediction/confidence band for lm.out to the plot. I've tried using the predict function, but I don't even know where to start with that :/. Here is my code at the moment: x = c(1,2,3,4,5,6,7,8,9,0) y = c(13,28,43,35,96,84,101,110,108,13) lm.out <- lm(y ~ x) plot(x,y) regression.data = summary(lm.out) # save regression
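One way to do this in Python (a sketch with statsmodels and matplotlib; get_prediction returns both the confidence band for the mean and the wider prediction band for new observations):

import numpy as np
import matplotlib.pyplot as plt
import statsmodels.api as sm

x = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 0])
y = np.array([13, 28, 43, 35, 96, 84, 101, 110, 108, 13])

fit = sm.OLS(y, sm.add_constant(x)).fit()

grid = np.linspace(x.min(), x.max(), 100)     # fine grid for a smooth band
pred = fit.get_prediction(sm.add_constant(grid)).summary_frame(alpha=0.05)

plt.scatter(x, y)
plt.plot(grid, pred["mean"], color="black")   # fitted regression line
plt.fill_between(grid, pred["mean_ci_lower"], pred["mean_ci_upper"], alpha=0.3)
# For a 95% band on new observations, use obs_ci_lower / obs_ci_upper instead.
plt.show()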

Automation testing tool for Regression testing of desktop application [closed]

旧城冷巷雨未停 submitted on 2019-12-04 16:17:45
I am working on a desktop application which uses Infragistics grids. We need to automate the regression tests for it. QTP alone does not support this; we would need to buy a new plug-in, which my company is not very interested in. Is there any open-source tool for automating regression testing of a desktop application? The application is in .NET, but I do not think that makes much of a difference. Please

Unexpected standard errors with weighted least squares in Python Pandas

笑着哭i submitted on 2019-12-04 15:20:03
In the code for the main OLS class in Python Pandas, I am looking for help clarifying what conventions are used for the standard errors and t-stats reported when weighted OLS is performed. Here's my example data set, with some imports to use Pandas and to use the statsmodels WLS directly: import pandas as pd import numpy as np from statsmodels.regression.linear_model import WLS # Make some random data. np.random.seed(42) df = pd.DataFrame(np.random.randn(10, 3), columns=['a', 'b', 'weights']) # Add an intercept term for direct use in WLS df['intercept'] = 1 # Add a number (I picked 10) to
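One convention worth checking directly (a sketch against statsmodels' WLS, using the question's setup): with cov = s^2 (X'WX)^{-1}, rescaling all weights by a constant changes neither the coefficients nor the standard errors, because s^2 absorbs the factor that (X'WX)^{-1} loses.

import numpy as np
import pandas as pd
from statsmodels.regression.linear_model import WLS

np.random.seed(42)
df = pd.DataFrame(np.random.randn(10, 3), columns=['a', 'b', 'weights'])
df['weights'] += 10                 # shift so all weights are positive
df['intercept'] = 1                 # intercept term for direct use in WLS

X = df[['intercept', 'b']]
fit1 = WLS(df['a'], X, weights=df['weights']).fit()
fit2 = WLS(df['a'], X, weights=df['weights'] * 100).fit()

print(np.allclose(fit1.params, fit2.params))   # True: same coefficients
print(np.allclose(fit1.bse, fit2.bse))         # True: same standard errors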

Why can't a deep NN approximate a simple ln(x) function?

◇◆丶佛笑我妖孽 submitted on 2019-12-04 14:03:42
I have created an ANN with two ReLU hidden layers + a linear activation output layer and am trying to approximate the simple ln(x) function, and I can't get a good fit. I am confused, because ln(x) on the range x: [0.0, 1.0] should be approximated without problems (I am using learning rate 0.01 and basic gradient descent optimization). import tensorflow as tf import numpy as np def GetTargetResult(x): curY = np.log(x) return curY # Create model def multilayer_perceptron(x, weights, biases): # Hidden layer with RELU
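A minimal modern rewrite of the experiment (an assumption: tf.keras with Adam, instead of the question's TF 1.x graph code). Two things usually fix the fit: sample x away from 0, since ln(x) diverges there, and use an adaptive optimizer rather than plain gradient descent at 0.01.

import numpy as np
import tensorflow as tf

# Keep x away from 0: ln(x) -> -inf as x -> 0.
x = np.random.uniform(0.01, 1.0, size=(10000, 1)).astype("float32")
y = np.log(x)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(1,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1),                       # linear output layer
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3), loss="mse")
model.fit(x, y, epochs=50, batch_size=128, verbose=0)

print(model.predict(np.array([[0.5]], dtype="float32")))  # ~ ln(0.5) = -0.693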

Why is logistic regression called regression? [closed]

一笑奈何 submitted on 2019-12-04 13:52:08
From what I have understood, linear regression predicts an outcome which can have continuous values, whereas logistic regression predicts an outcome which is discrete. It seems to me that logistic regression is similar to a classification problem. So, why is it called regression? There is also a related

Difference in Differences in Python + Pandas

ぃ、小莉子 submitted on 2019-12-04 13:44:24
I'm trying to perform a difference-in-differences analysis (with panel data and fixed effects) using Python and Pandas. I have no background in economics and am just trying to filter the data and run the method I was told to. However, as far as I could learn, the basic diff-in-diffs model looks like this: [equation shown as an image in the original] I.e., I am dealing with a multivariable model. Here is a simple example in R: https://thetarzan.wordpress.com/2011/06/20/differences-in-differences-estimation
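For the basic two-group, two-period case, the DiD coefficient is simply the interaction term in an OLS; a hedged Python sketch with statsmodels formulas (toy data and variable names are my own; unit fixed effects could be added with C(unit) dummies):

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated panel rows (an assumption; the asker has real data).
rng = np.random.default_rng(3)
n = 400
df = pd.DataFrame({"treated": rng.integers(0, 2, n),   # 1 = treatment group
                   "post": rng.integers(0, 2, n)})     # 1 = after intervention
df["y"] = (1.0 + 0.5 * df["treated"] + 0.3 * df["post"]
           + 2.0 * df["treated"] * df["post"]          # true effect = 2.0
           + rng.normal(size=n))

# y ~ b0 + b1*treated + b2*post + b3*(treated x post); b3 is the DiD estimate.
fit = smf.ols("y ~ treated + post + treated:post", data=df).fit()
print(fit.params["treated:post"])                      # ~ 2.0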