glm

Understanding glm$residuals and resid(glm)

Submitted by 独自空忆成欢 on 2019-11-28 16:45:40
Can you tell me what is returned by glm$residuals and resid(glm), where glm is a quasipoisson object? E.g. how would I create them using glm$y and glm$linear.predictors?

glm$residuals
    n  missing  unique     Mean      .05      .10      .25      .50     .75     .90     .95
37715    10042    2174  -0.2574  -2.7538  -2.2661  -1.4480  -0.4381  0.7542  1.9845  2.7749
lowest : -4.243 -3.552 -3.509 -3.481 -3.464
highest:  8.195  8.319  8.592  9.089  9.416

resid(glm)
    n  missing  unique        Mean      .05      .10      .25
37715        0    2048  -2.727e-10  -1.0000  -1.0000  -0.6276
    .50      .75     .90     .95
-0.2080   0.4106  1.1766  1.7333
lowest : -1.0000 -0.8415 -0.8350 -0.8333 -0.8288
highest:  7.2491 7
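For what it's worth, fit$residuals holds the working residuals, while resid(fit) defaults to deviance residuals, which is why the two summaries above differ. A minimal sketch of how one might reproduce both from the fitted object (on simulated quasipoisson data, since the questioner's data is not shown):

    set.seed(1)
    d <- data.frame(x = runif(200))
    d$y <- rpois(200, exp(1 + d$x))
    fit <- glm(y ~ x, family = quasipoisson, data = d)

    mu  <- fit$fitted.values          # equals exp(fit$linear.predictors) for the log link
    eta <- fit$linear.predictors

    # fit$residuals are the working residuals: (y - mu) / (d mu / d eta)
    work <- (fit$y - mu) / fit$family$mu.eta(eta)
    all.equal(unname(work), unname(fit$residuals))

    # resid(fit) defaults to deviance residuals
    dev <- sign(fit$y - mu) * sqrt(fit$family$dev.resids(fit$y, mu, rep(1, nrow(d))))
    all.equal(unname(dev), unname(resid(fit)))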

How to do a GLM when “contrasts can be applied only to factors with 2 or more levels”?

Submitted by 柔情痞子 on 2019-11-28 14:13:45
I want to do a regression in R using glm, but is there a way to do it since I get the contrasts error?

mydf <- data.frame(Group = c(1,1,2,2,3,3,4,4,5,5,6,6,7,7,8,8,9,9,10,10,11,11,12,12),
                   WL = rep(c(1,0), 12),
                   New.Runner = c("N","N","N","N","N","N","Y","N","N","N","N","N",
                                  "N","Y","N","N","N","Y","N","N","N","N","N","Y"),
                   Last.Run = c(1,5,2,6,5,4,NA,3,7,2,4,9,8,NA,3,5,1,NA,6,10,7,9,2,NA))

mod <- glm(formula = WL ~ New.Runner + Last.Run, family = binomial, data = mydf)
# Error in `contrasts<-`(`*tmp*`, value = contr.funs[1 + isOF[nn]]) :
#   contrasts can be applied only to factors with 2 or more levels

Using
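The usual cause (a guess based on the data shown, not a confirmed diagnosis): glm() silently drops the rows where Last.Run is NA, and in this data every "Y" value of New.Runner falls on such a row, so the complete cases contain only one level of New.Runner and contrasts cannot be built. A sketch of how one might confirm and work around this:

    mydf <- data.frame(
      Group      = rep(1:12, each = 2),
      WL         = rep(c(1, 0), 12),
      New.Runner = c("N","N","N","N","N","N","Y","N","N","N","N","N",
                     "N","Y","N","N","N","Y","N","N","N","N","N","Y"),
      Last.Run   = c(1,5,2,6,5,4,NA,3,7,2,4,9,8,NA,3,5,1,NA,6,10,7,9,2,NA)
    )

    # Check which factor levels survive once the NA rows are dropped
    cc <- na.omit(mydf)
    table(cc$New.Runner)      # only "N" remains, hence the contrasts error

    # One option: drop the predictor that has become single-level
    mod <- glm(WL ~ Last.Run, family = binomial, data = mydf)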

R predict glm fit on each column in data frame using column index number

Submitted by 痴心易碎 on 2019-11-28 08:04:11
I am trying to fit a BLR (binary logistic regression) model to each column in a data frame, and then predict on new data points. I have a lot of columns, so I cannot identify the columns by name, only by column number. Having reviewed the several examples of a similar nature on this site, I cannot figure out why this does not work.

df <- data.frame(x1 = runif(1000, -10, 10),
                 x2 = runif(1000, -2, 2),
                 x3 = runif(1000, -5, 5),
                 y = rbinom(1000, size = 1, prob = 0.40))

for (i in 1:length(df)-1) {
  fit <- glm (y ~ df[,i], data = df, family = binomial
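Two likely issues, offered as a sketch rather than a confirmed diagnosis: 1:length(df)-1 parses as (1:length(df)) - 1, i.e. 0 1 2 3, and referring to df[,i] inside a formula tends to misbehave at prediction time; building each formula from the column name is a common alternative:

    set.seed(42)
    df <- data.frame(x1 = runif(1000, -10, 10),
                     x2 = runif(1000, -2, 2),
                     x3 = runif(1000, -5, 5),
                     y  = rbinom(1000, size = 1, prob = 0.40))

    newdata <- data.frame(x1 = 0.5, x2 = 0.1, x3 = -1)   # hypothetical new points

    preds <- vector("list", ncol(df) - 1)
    for (i in seq_len(ncol(df) - 1)) {                   # columns 1..3, by index only
      f   <- reformulate(names(df)[i], response = "y")   # builds y ~ x1, y ~ x2, ...
      fit <- glm(f, data = df, family = binomial)
      preds[[i]] <- predict(fit, newdata = newdata, type = "response")
    }
    preds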

How to calculate the predicted probability of negative binomial regression model?

Submitted by 本秂侑毒 on 2019-11-28 08:02:35
I use the glm.nb() function in the R MASS package to estimate the parameters of a negative binomial regression model. How can I calculate the predicted probability (probability mass function) given new data, and which R function can I use? My dataset is as follows: y follows a negative binomial distribution and x is a covariate, and I use glm.nb(y ~ x, data=data) to estimate the model parameters. Given new x and y, how can I calculate the predicted probability? Is there a way to calculate it using Java?

y x
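One common approach, sketched here on simulated data since the original dataset is only hinted at: get the predicted mean with predict(..., type = "response") and plug it, together with the fitted dispersion theta, into dnbinom() for the probability mass. The same formula could then be re-coded in Java once mu and theta are known.

    library(MASS)

    set.seed(7)
    x   <- runif(500, 0, 2)
    y   <- rnbinom(500, size = 1.5, mu = exp(0.5 + 0.8 * x))
    dat <- data.frame(x, y)

    fit <- glm.nb(y ~ x, data = dat)

    newdat <- data.frame(x = 1.2)                  # hypothetical new covariate value
    mu_hat <- predict(fit, newdata = newdat, type = "response")

    # Predicted probability mass P(Y = 3 | x = 1.2) under the fitted model
    dnbinom(3, mu = mu_hat, size = fit$theta)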

model.matrix(): why do I lose control of contrast in this case

Submitted by 你。 on 2019-11-28 05:59:07
Suppose we have a toy data frame:

x <- data.frame(x1 = gl(3, 2, labels = letters[1:3]),
                x2 = gl(3, 2, labels = LETTERS[1:3]))

I would like to construct a model matrix

#   x1b x1c x2B x2C
# 1   0   0   0   0
# 2   0   0   0   0
# 3   1   0   1   0
# 4   1   0   1   0
# 5   0   1   0   1
# 6   0   1   0   1

by:

model.matrix(~ x1 + x2 - 1, data = x,
             contrasts.arg = list(x1 = contr.treatment(letters[1:3]),
                                  x2 = contr.treatment(LETTERS[1:3])))

but actually I get:

#   x1a x1b x1c x2B x2C
# 1   1   0   0   0   0
# 2   1   0   0   0   0
# 3   0   1   0   1   0
# 4   0   1   0   1   0
# 5   0
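What seems to be happening (hedged, since the question is cut off): once the intercept is removed with "- 1", model.matrix() gives the first factor a full set of dummy columns so that the column space is unchanged, and that overrides the treatment contrasts supplied for x1. One workaround is to keep the intercept while building the matrix and drop its column afterwards:

    x <- data.frame(x1 = gl(3, 2, labels = letters[1:3]),
                    x2 = gl(3, 2, labels = LETTERS[1:3]))

    # Build with the intercept so both factors keep treatment contrasts, then drop it
    mm <- model.matrix(~ x1 + x2, data = x)
    mm[, colnames(mm) != "(Intercept)", drop = FALSE]   # columns x1b, x1c, x2B, x2C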

Extract standard errors from glm

Submitted by 五迷三道 on 2019-11-27 22:18:57
I did a glm and I just want to extract the standard errors of each coefficient. I saw on the internet the function se.coef() but it doesn't work, it returns "Error: could not find function "se.coef"".

The information you're after is stored in the coefficients object returned by summary(). You can extract it thusly:

summary(glm.D93)$coefficients[, 2]

# Example from ?glm
counts <- c(18,17,15,20,10,20,25,13,12)
outcome <- gl(3,1,9)
treatment <- gl(3,3)
print(d.AD <- data.frame(treatment, outcome, counts))
glm.D93 <- glm(counts ~ outcome + treatment, family=poisson())
# coefficients has the data
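A couple of hedged additions to the quoted answer: the same standard errors can be read off the covariance matrix with sqrt(diag(vcov(fit))), and se.coef() itself is not in base R, it comes from the arm package.

    # Example from ?glm
    counts    <- c(18,17,15,20,10,20,25,13,12)
    outcome   <- gl(3, 1, 9)
    treatment <- gl(3, 3)
    glm.D93   <- glm(counts ~ outcome + treatment, family = poisson())

    summary(glm.D93)$coefficients[, 2]   # column 2 is "Std. Error"
    sqrt(diag(vcov(glm.D93)))            # same values from the covariance matrix

    # se.coef() lives in the arm package:
    # install.packages("arm"); arm::se.coef(glm.D93)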

modify glm function to adopt user-specified link function in R

Submitted by 我的梦境 on 2019-11-27 11:51:27
In glm in R, the default link functions for the Gamma family are inverse, identity and log. For my particular problem, I need to fit a gamma regression with response Y and a modified link function of the form log(E(Y) - 1). Thus, I am considering modifying some glm-related functions in R. There are several functions that may be relevant, and I am seeking help from anyone who has previous experience doing this. For example, the function Gamma is defined as

function (link = "inverse")
{
    linktemp <- substitute(link)
    if (!is.character(linktemp))
        linktemp <- deparse(linktemp)
    okLinks <- c(
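It may not be necessary to modify Gamma() at all: the family constructors accept an object of class "link-glm", so a custom link eta = log(mu - 1) can, tentatively, be supplied directly. A sketch with simulated data; explicit starting values are passed because the default start (mu = y) is undefined wherever y <= 1 under this link.

    # User-defined link: eta = log(mu - 1), mu = 1 + exp(eta)
    loglink_m1 <- structure(
      list(
        linkfun  = function(mu)  log(mu - 1),
        linkinv  = function(eta) 1 + exp(eta),
        mu.eta   = function(eta) exp(eta),      # d mu / d eta
        valideta = function(eta) TRUE,
        name     = "log(mu - 1)"
      ),
      class = "link-glm"
    )

    set.seed(3)
    x <- runif(200)
    y <- rgamma(200, shape = 5, rate = 5 / (1 + exp(0.5 + x)))   # E(Y) = 1 + exp(0.5 + x)

    fit <- glm(y ~ x, family = Gamma(link = loglink_m1), start = c(0.5, 1))
    coef(fit)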

R error which says “Models were not all fitted to the same size of dataset”

Submitted by 随声附和 on 2019-11-27 09:16:33
I have created two generalised linear models as follows:

glm1 <- glm(Y ~ X1 + X2 + X3, family = binomial(link = logit))
glm2 <- glm(Y ~ X1 + X2, family = binomial(link = logit))

I then use the anova function:

anova(glm2, glm1)

but get an error message:

"Error in anova.glmlist(c(list(object), dotargs), dispersion = dispersion, :
  models were not all fitted to the same size of dataset"

What does this mean and how can I fix it? I have attach()ed the dataset at the start of my code, so both models are working off of the same dataset.

The main cause of that error is when there are missing values in one or more
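A sketch of the usual fix, on simulated data since the questioner's data is not available: missing values in X3 make glm1 drop rows that glm2 keeps, so anova() refuses to compare the fits; fitting both models to the same complete cases resolves it.

    set.seed(11)
    dat <- data.frame(X1 = rnorm(100), X2 = rnorm(100), X3 = rnorm(100))
    dat$X3[sample(100, 10)] <- NA                      # missing values only in X3
    dat$Y <- rbinom(100, 1, plogis(dat$X1 - dat$X2))

    # Fit both models on the same complete cases so the sample sizes match
    cc <- dat[complete.cases(dat), ]
    glm1 <- glm(Y ~ X1 + X2 + X3, family = binomial(link = logit), data = cc)
    glm2 <- glm(Y ~ X1 + X2,      family = binomial(link = logit), data = cc)

    anova(glm2, glm1, test = "Chisq")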

Confidence intervals for predictions from logistic regression

Submitted by 二次信任 on 2019-11-27 09:00:15
In R, predict.lm computes predictions based on the results from linear regression and also offers to compute confidence intervals for these predictions. According to the manual, these intervals are based on the error variance of the fit, but not on the error intervals of the coefficients. On the other hand, predict.glm, which computes predictions based on logistic and Poisson regression (amongst a few others), doesn't have an option for confidence intervals. And I even have a hard time imagining
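The question is cut off, but a commonly used recipe is sketched below: ask predict.glm for standard errors on the link scale (se.fit = TRUE, type = "link"), form a Wald interval there, and back-transform with the inverse link (plogis for a logit model).

    set.seed(123)
    d   <- data.frame(x = rnorm(200))
    d$y <- rbinom(200, 1, plogis(-0.5 + 1.2 * d$x))
    fit <- glm(y ~ x, family = binomial, data = d)

    newd <- data.frame(x = seq(-3, 3, length.out = 50))
    pr   <- predict(fit, newdata = newd, type = "link", se.fit = TRUE)

    # 95% Wald interval on the linear-predictor scale, then back-transform
    crit  <- qnorm(0.975)
    lower <- plogis(pr$fit - crit * pr$se.fit)
    upper <- plogis(pr$fit + crit * pr$se.fit)
    head(data.frame(prob = plogis(pr$fit), lower, upper))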