regression

How to change the softmax layer to regression in MatConvNet

Submitted by 无人久伴 on 2020-01-05 07:33:21
Question: I am trying to train the MNIST data set with a single output. That is, when I give the model a 28*28 input image, it should output a single number: for an image of '5', it should return 4.9, 5.002, or something close to 5. I have read some documents which say the softmax layer has to be replaced with a regression layer. To do this, I am using the matconvnet library and its MNIST example. I have changed my network and written a regression-layer loss function. This is my code: net.layers = {} ; net
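The layer the question is after is usually a squared-error (Euclidean) loss. This is an illustrative Python sketch of that loss and its gradient, not MatConvNet code; the function names are mine:

```python
# Illustrative sketch only (not MatConvNet code): the squared-error
# (Euclidean) loss that typically replaces softmax for regression,
# together with its gradient with respect to the predictions.
def euclidean_loss(pred, target):
    """Forward pass: 0.5 * sum of squared errors over the batch."""
    return 0.5 * sum((p - t) ** 2 for p, t in zip(pred, target))

def euclidean_loss_grad(pred, target):
    """Backward pass: dL/dpred_i = pred_i - target_i."""
    return [p - t for p, t in zip(pred, target)]
```

In a MatConvNet layer these would become the layer's forward and backward functions; the key point is that the backward pass is simply the residual.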

How do I change colours of confidence interval lines when using `matlines` for prediction plot?

Submitted by 寵の児 on 2020-01-05 05:57:42
Question: I'm plotting a logarithmic regression's line of best fit as well as the confidence intervals around that line. The code I'm using works well enough, except I'd rather the confidence intervals both be "gray" (rather than the default "red" and "green"). Unfortunately, I'm not seeing a way to isolate them when specifying colour changes. I'd like the regression line to have lty = 1, col = "black", and the confidence intervals to have lty = 2, col = "gray". How can I achieve this? My code is of the
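The goal (black solid fit line, gray dashed interval lines) can be shown with a hypothetical matplotlib analogue; the data and band half-width here are made up for illustration, and this is not the R `matlines` call itself:

```python
# Hypothetical matplotlib analogue of the R goal: draw the fit line in
# black (solid) and both confidence-interval lines in gray (dashed).
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

x = [1, 2, 3, 4, 5]
fit = [2.0, 2.5, 3.0, 3.5, 4.0]   # assumed fitted values
half_width = 0.4                   # assumed constant band half-width
fig, ax = plt.subplots()
ax.plot(x, fit, color="black", linestyle="-")                             # lty = 1
ax.plot(x, [f - half_width for f in fit], color="gray", linestyle="--")   # lty = 2
ax.plot(x, [f + half_width for f in fit], color="gray", linestyle="--")
```

The idea carries over to `matlines`: the first column of the prediction matrix is the fit and the next two are the bounds, so colours can be given per column.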

How to create prediction line for Quadratic Model

Submitted by 蹲街弑〆低调 on 2020-01-05 04:19:10
Question: I am trying to create a quadratic prediction line for a quadratic model. I am using the Auto dataset that comes with R. I had no trouble creating the prediction line for a linear model, but the quadratic model yields crazy-looking lines. Here is my code. # Linear Model plot(Auto$horsepower, Auto$mpg, main = "MPG versus Horsepower", pch = 20) lin_mod = lm(mpg ~ horsepower, data = Auto) lin_pred = predict(lin_mod) lines( Auto$horsepower, lin_pred, col = "blue", lwd = 2 ) # The Quadratic
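The usual cause of the zig-zag is that `lines()` connects points in data order, and the predictor is not sorted, so the path doubles back on itself. A pure-Python sketch of the fix (the toy coefficients are assumed, not from the Auto data):

```python
# Sort the predictor and reorder the predictions to match before
# drawing the line; otherwise the line path zig-zags across the plot.
x = [150, 90, 200, 110, 60]                           # unsorted predictor
pred = [40 - 0.3 * xi + 0.001 * xi ** 2 for xi in x]  # toy quadratic fit
order = sorted(range(len(x)), key=lambda i: x[i])     # index order of x
x_sorted = [x[i] for i in order]
pred_sorted = [pred[i] for i in order]
# now draw the line through (x_sorted, pred_sorted)
```

In R this is the familiar `ord <- order(x); lines(x[ord], pred[ord])` pattern.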

Looping through many multiple regressions

Submitted by 点点圈 on 2020-01-05 03:51:11
Question: I am trying to run the code from this post: looping with iterations over two lists of variables for a multiple regression in R, with modified variable and data-frame names, because it seems to do exactly what I want and uses a very similar dataset. However, it keeps giving me an error and I don't know why, so I would really appreciate it if someone could help me understand the error or the corresponding line of code so I can figure out what's wrong. for(i in 1:n) { vars = names
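The underlying pattern (loop over several response variables, fitting the same simple regression each time) can be sketched in plain Python; the data and variable names are invented for illustration:

```python
# Fit y ~ x by ordinary least squares for each response column,
# collecting one (intercept, slope) pair per variable.
def ols(x, y):
    """Return (intercept, slope) of the least-squares fit of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return my - slope * mx, slope

x = [1, 2, 3, 4, 5]
responses = {"y1": [2, 4, 6, 8, 10], "y2": [5, 4, 3, 2, 1]}
fits = {name: ols(x, y) for name, y in responses.items()}
```

The R loop in the question does the same thing with `lm()` and a character vector of variable names.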

How to make group_by and lm fast?

Submitted by 限于喜欢 on 2020-01-04 21:33:34
Question: This is a sample. df <- tibble( subject = rep(letters[1:7], c(5, 6, 7, 5, 2, 5, 2)), day = c(3:7, 2:7, 1:7, 3:7, 6:7, 3:7, 6:7), x1 = runif(32), x2 = rpois(32, 3), x3 = rnorm(32), x4 = rnorm(32, 1, 5)) df %>% group_by(subject) %>% summarise( coef_x1 = lm(x1 ~ day)$coefficients[2], coef_x2 = lm(x2 ~ day)$coefficients[2], coef_x3 = lm(x3 ~ day)$coefficients[2], coef_x4 = lm(x4 ~ day)$coefficients[2]) This data is small, so performance is not a problem. But my data is much larger, approximately 1,000
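The speed problem comes from fitting a full `lm` per group per variable when only the slope is needed. A plain-Python sketch of the per-group slope computed directly from sums (the toy rows are invented):

```python
# Group rows by subject, then compute the day-slope of x1 per group
# directly from sums, without building a full model fit per group.
from collections import defaultdict

rows = [
    ("a", 1, 2.0), ("a", 2, 4.1), ("a", 3, 5.9),
    ("b", 1, 1.0), ("b", 2, 0.5), ("b", 3, 0.1),
]  # (subject, day, x1)

groups = defaultdict(list)
for subject, day, x1 in rows:
    groups[subject].append((day, x1))

def slope(pairs):
    """Least-squares slope of value on day for one group."""
    n = len(pairs)
    mx = sum(d for d, _ in pairs) / n
    my = sum(v for _, v in pairs) / n
    sxx = sum((d - mx) ** 2 for d, _ in pairs)
    sxy = sum((d - mx) * (v - my) for d, v in pairs)
    return sxy / sxx

coef_x1 = {s: slope(p) for s, p in groups.items()}
```

In R the analogous speed-up is a grouped, vectorised slope formula (or a specialised package) instead of one `lm()` call per cell.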

Fit many linear models in R with identical design matrices [duplicate]

Submitted by 笑着哭i on 2020-01-04 15:58:11
Question: This question already has an answer here: Fitting a linear model with multiple LHS (1 answer). Closed 3 years ago. For a neuroimaging application, I'm trying to fit many linear models by least squares in R (a standard call to lm). Imagine I have a design matrix X. This design matrix will be the same across all of the models. The data (Y) being fit will change, and as a result so will all of the fitted parameters (e.g. betas, p-values, residuals, etc.). At present, I'm just sticking it in
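The multiple-LHS idea is to do the work that depends only on X once and reuse it for every response. A plain-Python sketch for the simplest design matrix (intercept plus one predictor), with invented data:

```python
# Compute the X-only quantities (sums and the determinant of X'X) once,
# then reuse them for every response vector instead of refitting.
def fit_many(x, ys):
    """x: shared predictor column; ys: list of response vectors.
    Returns one (intercept, slope) pair per response."""
    n = len(x)
    sx, sxx = sum(x), sum(xi ** 2 for xi in x)
    det = n * sxx - sx ** 2  # determinant of X'X, shared by all fits
    coefs = []
    for y in ys:
        sy = sum(y)
        sxy = sum(xi * yi for xi, yi in zip(x, y))
        coefs.append(((sxx * sy - sx * sxy) / det,
                      (n * sxy - sx * sy) / det))
    return coefs
```

In R the same effect comes from giving `lm` a matrix response, `lm(Y ~ X)` with Y holding one column per model, which factorises X once.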

Knn regression in Matlab

Submitted by 穿精又带淫゛_ on 2020-01-03 16:57:43
Question: What is the k-nearest-neighbour regression function in Matlab? Is only the knn classification function available? Does anybody know of any useful literature regarding that? Regards, Farideh Answer 1: I don't believe the k-NN regression algorithm is directly implemented in Matlab, but if you do some googling you can find some valid implementations. The algorithm is fairly simple, though: find the k nearest elements using whatever distance metric is suitable. Convert the inverse distance weight of each
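The answer's recipe, sketched in Python for a one-dimensional predictor (the function name and the exact-match handling are my own choices, not a Matlab API):

```python
# k-NN regression: take the k training points nearest the query and
# average their targets with inverse-distance weights.
def knn_regress(train_x, train_y, query, k):
    neighbors = sorted(zip(train_x, train_y),
                       key=lambda p: abs(p[0] - query))[:k]
    # an exact match would give a zero distance; return its target
    # directly to avoid division by zero
    for x, y in neighbors:
        if x == query:
            return y
    weights = [1.0 / abs(x - query) for x, _ in neighbors]
    total = sum(weights)
    return sum(w * y for w, (_, y) in zip(weights, neighbors)) / total
```

For higher-dimensional inputs, only the distance function changes; the weighting scheme stays the same.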

Error in df(X0) : argument “df1” is missing, with no default--tracing R code

Submitted by 末鹿安然 on 2020-01-03 06:19:12
Question: I have written two gradient descent functions; in the second one I just have the alpha parameter, and the initial alpha is different. I receive a weird error and was unable to trace the reason for it. Here's the code: k=19000 rho.prime<-function(t,k) ifelse (abs(t)<=k,2*t,(2*k*sign(t))) dMMSE <- function(b,k=19000, y=farmland$farm, x=farmland$land){ n = length(y) a=0 d=0 for (i in 1:n) { a = a + rho.prime(y[i]-b[1]-b[2]*x[i],k) d = d + x[i]*rho.prime(y[i]-b[1]-b[2]*x[i],k) } a <- (-a/n) d <
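For reference, here is a plain-Python sketch of the question's gradient function. The truncated tail of the R code is assumed to mirror `a` (i.e. `d <- (-d/n)`, returning both components), and the argument order is simplified; both are my assumptions, not the asker's code:

```python
# rho_prime is the derivative of a Huber-style robust loss with
# threshold k; dMMSE is the resulting gradient of the mean loss with
# respect to the intercept b[0] and slope b[1].
def rho_prime(t, k):
    return 2 * t if abs(t) <= k else 2 * k * (1 if t > 0 else -1)

def dMMSE(b, x, y, k=19000):
    n = len(y)
    a = d = 0.0
    for xi, yi in zip(x, y):
        r = rho_prime(yi - b[0] - b[1] * xi, k)
        a += r
        d += xi * r
    return [-a / n, -d / n]  # gradient is zero at a perfect fit
```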

Time Series Projection in Google BigQuery

Submitted by ﹥>﹥吖頭↗ on 2020-01-03 04:32:06
Question: I was looking for a good way to do time-series projection in BigQuery and found a post (View Post) that works nicely for calculating correlations and slope. But it doesn't let you extend the timeline as far as you choose. Can anyone please suggest a complete solution where I can extend the timeline (x) according to my need and get projections of (Y) using a single query? Any help will be highly appreciated. Answer 1: The basic idea is to left join the model specs to a generated dates table and use
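The answer's idea, sketched in Python rather than SQL: once the trend's slope and intercept are known, generate future dates and evaluate the line at each one (the parameter values and origin date here are assumed for illustration):

```python
# Extend a fitted linear trend over generated future dates.
from datetime import date, timedelta

slope, intercept = 2.5, 100.0   # assumed fitted model parameters
origin = date(2020, 1, 1)       # day 0 of the fitted series
horizon = [origin + timedelta(days=i) for i in range(10, 13)]
projection = [(d, intercept + slope * (d - origin).days) for d in horizon]
```

In BigQuery the date generation step corresponds to `GENERATE_DATE_ARRAY`, left-joined to the single-row table of model coefficients.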