linear-regression

How to add a linear regression line to a double logarithmic R plot?

江枫思渺然 submitted on 2019-12-20 01:09:28
Question: I have the following data:

    someFactor = 500
    x = c(1:250)
    y = x^-.25 * someFactor

which I show in a double logarithmic plot:

    plot(x, y, log="xy")

Now I "find out" the slope of the data using a linear model:

    model = lm(log(y) ~ log(x))
    model

which gives:

    Call:
    lm(formula = log(y) ~ log(x))

    Coefficients:
    (Intercept)       log(x)
          6.215       -0.250

Now I'd like to plot the linear regression as a red line, but abline does not work:

    abline(model, col="red")

What is the easiest way to add a regression line to the plot?
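
A minimal sketch of one common fix (an editor's addition, not taken from the thread): abline() draws straight lines in the plot's user coordinates, so draw the fit instead by back-transforming the fitted values and passing them to lines(), which maps data-scale coordinates onto the log axes itself.

    ## Sketch: reproduce the setup, then undo the log transform of the
    ## fitted response and draw it on the log-log plot.
    someFactor <- 500
    x <- 1:250
    y <- x^-0.25 * someFactor
    plot(x, y, log = "xy")
    model <- lm(log(y) ~ log(x))
    lines(x, exp(fitted(model)), col = "red")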

Linear regression in R without copying data in memory?

一曲冷凌霜 submitted on 2019-12-20 00:44:31
Question: The standard way of doing a linear regression is something like this:

    l <- lm(Sepal.Width ~ Petal.Length + Petal.Width, data=iris)

and then use predict(l, new_data) to make predictions, where new_data is a data frame with columns matching the formula. But lm() returns an lm object, which is a list that contains crap-loads of stuff that is mostly irrelevant in most situations. This includes a copy of the original data, and a bunch of named vectors and arrays the length/size of the data:

    R> str(l)
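
One direction worth sketching (my assumption, not necessarily the thread's accepted answer): lm() can be told not to keep the model frame, and the biglm package exists specifically for memory-frugal regression.

    ## Sketch: drop the model frame; predict() still works because it
    ## only needs the terms, coefficients and qr components.
    l_small <- lm(Sepal.Width ~ Petal.Length + Petal.Width,
                  data = iris, model = FALSE)
    predict(l_small, newdata = head(iris))
    ## Compare memory footprints against the default fit:
    l_full <- lm(Sepal.Width ~ Petal.Length + Petal.Width, data = iris)
    c(full = object.size(l_full), small = object.size(l_small))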

Rolling regression and prediction with lm() and predict()

六眼飞鱼酱① submitted on 2019-12-19 09:42:01
Question: I need to apply lm() to an enlarging subset of my data frame dat, while making a prediction for the next observation. For example, I am doing:

    fit model      predict
    ----------     -------
    dat[1:3, ]     dat[4, ]
    dat[1:4, ]     dat[5, ]
    .              .
    .              .
    dat[-1, ]      dat[nrow(dat), ]

I know what I should do for a particular subset (related to this question: predict() and newdata - How does this work?). For example, to predict the last row, I do

    dat1 = dat[1:(nrow(dat)-1), ]
    dat2 = dat[nrow(dat), ]
    fit = lm(log(clicks) ~ log
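
A minimal sketch of the enlarging-window loop (with hypothetical columns x and y, since the question's formula is cut off above): fit on rows 1..i, predict row i + 1.

    ## Hypothetical data frame standing in for dat:
    set.seed(1)
    dat <- data.frame(x = runif(20, 1, 10))
    dat$y <- 2 * dat$x + rnorm(20)
    preds <- rep(NA_real_, nrow(dat))
    for (i in 3:(nrow(dat) - 1)) {
      fit <- lm(y ~ x, data = dat[1:i, ])                  # enlarging window
      preds[i + 1] <- predict(fit, newdata = dat[i + 1, ]) # next observation
    }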

plot.lm Error: $ operator is invalid for atomic vectors

拟墨画扇 submitted on 2019-12-19 09:37:52
Question: I have the following regression model with transformations:

    fit <- lm(I(NewValue ^ (1 / 3)) ~ I(CurrentValue ^ (1 / 3)) + Age + Type - 1,
              data = dataReg)
    plot(fit)

But plot gives me the following error:

    Error: $ operator is invalid for atomic vectors

Any ideas about what I am doing wrong? Note: summary, predict, and residuals all work correctly.

Answer 1: This is actually quite an interesting observation. In fact, among all 6 plots supported by plot.lm, only the Q-Q plot fails in this case.
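
A workaround sketch (not the thread's diagnosis, and dataReg is simulated here because the original data are not shown): request the panels one at a time and build the failing Q-Q panel by hand from the standardized residuals.

    ## Simulated stand-in for dataReg:
    set.seed(1)
    dataReg <- data.frame(NewValue     = rexp(100, 1 / 50),
                          CurrentValue = rexp(100, 1 / 40),
                          Age          = sample(1:10, 100, replace = TRUE),
                          Type         = factor(sample(c("A", "B"), 100, replace = TRUE)))
    fit <- lm(I(NewValue ^ (1 / 3)) ~ I(CurrentValue ^ (1 / 3)) + Age + Type - 1,
              data = dataReg)
    plot(fit, which = 1)   # the non-Q-Q panels still work
    r <- rstandard(fit)    # standardized residuals
    qqnorm(r); qqline(r)   # manual normal Q-Q plot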

Get hat matrix from QR decomposition for weighted least squares regression

一笑奈何 submitted on 2019-12-18 17:15:33
Question: I am trying to extend the lwr() function of the package McSpatial, which fits weighted regressions for non-parametric estimation. In the core of the lwr() function, it inverts a matrix using solve() instead of a QR decomposition, resulting in numerical instability. I would like to change it but can't figure out how to get the hat matrix (or other derivatives) from the QR decomposition afterward. With data:

    set.seed(0)
    xmat <- matrix(rnorm(500), nrow=50)  ## model matrix
    y <- rowSums(rep(2:11
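
A sketch of the algebra in R (the weights w are invented, since the thread's code is truncated above): scale the rows of the design matrix by sqrt(w); with the thin QR of that scaled matrix, Q Q' is the hat matrix of the scaled problem, and undoing the scaling gives the weighted-least-squares hat matrix H = X (X'WX)^(-1) X' W.

    set.seed(0)
    xmat <- matrix(rnorm(500), nrow = 50)  # model matrix, as in the question
    w <- runif(50, 0.5, 2)                 # hypothetical weights
    Q <- qr.Q(qr(sqrt(w) * xmat))          # thin Q of the row-scaled design
    H <- (1 / sqrt(w)) * (Q %*% t(Q)) * rep(sqrt(w), each = nrow(Q))
    ## Leverages are the diagonal, available without forming H at all:
    all.equal(diag(H), rowSums(Q^2))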

Piecewise regression with a quadratic polynomial and a straight line joining smoothly at a break point

南楼画角 submitted on 2019-12-18 16:59:38
Question: I want to fit a piecewise regression with one break point xt, such that for x < xt we have a quadratic polynomial and for x >= xt we have a straight line. The two pieces should join smoothly, with continuity up to the 1st derivative at xt. Here's a picture of what it may look like: [figure omitted]. I have parametrized my piecewise regression function as:

    f(x) = a + b * x + c * (x - xt)^2 * I(x < xt)

where a, b, c and xt are parameters to be estimated. I want to compare this model with a quadratic polynomial regression over the whole range in terms of
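
A sketch of one standard estimation route (my choice of method, not necessarily the thread's answer; fit_piecewise is a hypothetical helper): for a fixed xt the model above is linear in a, b and c, so fit it with lm() and profile the residual sum of squares over a grid of candidate break points.

    ## For fixed xt the model is linear, so lm() estimates a, b, c:
    fit_piecewise <- function(xt, x, y) {
      lm(y ~ x + I((x - xt)^2 * (x < xt)))
    }
    ## Hypothetical data with a true break at x = 4:
    set.seed(42)
    x <- seq(0, 10, length.out = 200)
    y <- 1 + 0.5 * x + 0.3 * (x - 4)^2 * (x < 4) + rnorm(200, sd = 0.2)
    ## Profile the RSS over candidate break points:
    grid <- seq(1, 9, by = 0.1)
    rss <- sapply(grid, function(xt) deviance(fit_piecewise(xt, x, y)))
    xt_hat <- grid[which.min(rss)]
    best <- fit_piecewise(xt_hat, x, y)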

Multiple regression analysis in R using QR decomposition

我们两清 submitted on 2019-12-18 12:08:48
Question: I am trying to write a function for solving multiple regression using QR decomposition. Input: y vector and X matrix; output: b, e, R^2. So far I've got this and am terribly stuck; I think I have made everything way too complicated:

    QR.regression <- function(y, X) {
      X <- as.matrix(X)
      y <- as.vector(y)
      p <- as.integer(ncol(X))
      if (is.na(p)) stop("ncol(X) is invalid")
      n <- as.integer(nrow(X))
      if (is.na(n)) stop("nrow(X) is invalid")
      nr <- length(y)
      nc <- NCOL(X)
      # Householder
      for (j in seq_len
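
For comparison, a compact sketch that leans on base R's QR machinery instead of hand-rolled Householder steps (qr_regression is a hypothetical name, and the R^2 formula assumes X contains an intercept column):

    qr_regression <- function(y, X) {
      X <- as.matrix(X)
      QR <- qr(X)
      b <- qr.coef(QR, y)   # solves R b = Q'y, numerically stable
      e <- qr.resid(QR, y)  # residuals y - X b
      R2 <- 1 - sum(e^2) / sum((y - mean(y))^2)
      list(b = b, e = e, R2 = R2)
    }
    ## Example:
    X <- cbind(1, iris$Petal.Length, iris$Petal.Width)
    str(qr_regression(iris$Sepal.Width, X))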

Graphing perpendicular offsets in a least squares regression plot in R

一笑奈何 submitted on 2019-12-18 10:24:19
Question: I'm interested in making a plot with a least squares regression line and line segments connecting the data points to the regression line, as illustrated in the graphic called "perpendicular offsets" here: http://mathworld.wolfram.com/LeastSquaresFitting.html (from MathWorld - A Wolfram Web Resource: wolfram.com). I have the plot and regression line done here:

    ## Dataset from http://www.apsnet.org/education/advancedplantpath/topics/RModules/doc1/04_Linear_regression.html
    ## Disease severity as a
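
A sketch of the segment-drawing step (with invented data, since the question's dataset is cut off above). The foot of the perpendicular from a point (x0, y0) to the line y = a + b*x is xf = (x0 + b*(y0 - a)) / (1 + b^2), yf = a + b*xf; the segments only look perpendicular on screen when both axes use the same scale (asp = 1).

    set.seed(7)
    x <- 1:10
    y <- 2 + 0.8 * x + rnorm(10)
    fit <- lm(y ~ x)
    a <- coef(fit)[1]; b <- coef(fit)[2]
    plot(x, y, asp = 1)                  # equal axis scales
    abline(fit)
    xf <- (x + b * (y - a)) / (1 + b^2)  # foot of each perpendicular
    segments(x, y, xf, a + b * xf, col = "red")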