regression

Writing B-spline as Piecewise Cubic

Posted by 邮差的信 on 2019-12-23 21:15:07

Question: I'm using SciPy's SmoothBivariateSpline class to create a cubic B-spline on bivariate data. I now need to write the piecewise polynomial expression for this spline. My mathematical background isn't very strong, so I wasn't able to write my own algorithm for transforming the t, c, k output of SmoothBivariateSpline into a polynomial representation. If this is feasible, can you provide pointers on how to approach it? I noticed that SciPy has interpolate.ppform, but I can't find …
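For the univariate case, SciPy already ships this conversion as `PPoly.from_spline`, which takes the same `(t, c, k)` tuple and returns breakpoints plus per-interval polynomial coefficients; the bivariate case would need a tensor-product extension of the same idea. A minimal sketch on hypothetical 1-D data:

```python
import numpy as np
from scipy.interpolate import splrep, splev, PPoly

x = np.linspace(0, 10, 50)
y = np.sin(x)
tck = splrep(x, y, k=3)        # B-spline representation: knots t, coefficients c, degree k
pp = PPoly.from_spline(tck)    # piecewise cubics: pp.x are breakpoints, pp.c the coefficients

# both representations evaluate identically inside the data range
same = np.allclose(pp(5.0), splev(5.0, tck))
```

`pp.c` has shape `(k+1, n_intervals)`, i.e. four coefficients per interval for a cubic, which is exactly the piecewise polynomial form asked about.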

R Speed up vectorize for square matrix

Posted by 风格不统一 on 2019-12-23 13:32:56

Question: Anyone able to help me speed up some code? n = seq_len(ncol(mat)) # seq 1 to ncol(mat); sym.pr <- outer(n, n, Vectorize(function(a, b) { return(adf.test(LinReg(mat[, c(a, b)]), k = 0, alternative = "stationary")$p.value) })). Here mat is an N×M matrix of N observations on M objects (columns Obj1, Obj2, Obj3, …). LinReg is defined as: # Performs linear regression via OLS; LinReg = function(vals) { # regression analysis, forcing the intercept through y = 0; regline <- lm(vals[, 1] ~ as.matrix(vals[, 2:ncol(vals)]) …
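The `outer(n, n, Vectorize(...))` pattern still evaluates the statistic once per cell, so the usual speedup is to fill only half the matrix and mirror it. Note this is only valid when the pairwise statistic is symmetric in (a, b), which the ADF-on-regression-residuals statistic above is not in general, so treat this as a sketch of the pattern with a hypothetical symmetric placeholder statistic:

```python
import numpy as np

rng = np.random.default_rng(0)
mat = rng.standard_normal((100, 5))          # N observations x M objects

def pair_stat(a, b):
    # placeholder for the real per-pair statistic (e.g. an ADF p-value);
    # here: squared correlation, which happens to be symmetric in (a, b)
    return np.corrcoef(mat[:, a], mat[:, b])[0, 1] ** 2

m = mat.shape[1]
out = np.empty((m, m))
for i in range(m):
    for j in range(i, m):                    # visit only the upper triangle
        out[i, j] = pair_stat(i, j)
        out[j, i] = out[i, j]                # mirror; valid only for symmetric statistics
```

The same triangle-and-mirror loop ports directly to R; it halves the number of expensive `adf.test` calls when symmetry holds.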

Perfect (or near) multicollinearity in julia

Posted by 白昼怎懂夜的黑 on 2019-12-23 13:31:51

Question: Running a simple regression model in Julia in the presence of perfect multicollinearity produces an error. In R, the same model runs and produces NAs for the estimates of the affected covariates, which R reports as "not defined because of singularities"; those variables can be identified with R's alias() function. Is there any way to check for perfect multicollinearity in Julia before modeling, so the collinear variables can be dropped? Answer 1: Detecting Perfect Collinearity …
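The language-agnostic check is the rank of the design matrix: if it is below the number of columns, some column is an exact linear combination of the others, and a greedy rank test identifies a maximal independent subset to keep. A numpy sketch on hypothetical data (the same idea ports to Julia via `rank` and `qr` in LinearAlgebra):

```python
import numpy as np

# design matrix with a perfectly collinear column: col2 = col0 + col1
rng = np.random.default_rng(1)
X = rng.standard_normal((50, 2))
X = np.column_stack([X, X[:, 0] + X[:, 1]])

rank = np.linalg.matrix_rank(X)
collinear = rank < X.shape[1]          # True: the matrix is rank-deficient

# greedily keep a maximal linearly independent subset of columns
keep = []
for j in range(X.shape[1]):
    if np.linalg.matrix_rank(X[:, keep + [j]]) == len(keep) + 1:
        keep.append(j)
# columns not in `keep` are the ones to drop before fitting
```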

plotly regression line R

Posted by 会有一股神秘感。 on 2019-12-23 12:41:55

Question: I have a problem adding a regression line to a plotly scatter plot. I've written the following code: require(plotly); data(airquality); c <- plot_ly(data = airquality, x = Wind, y = Ozone, type = "scatter", mode = "markers"); c. Adding the regression line (here is the problem): g <- add_trace(c, x = Wind, y = fitted(lm(Ozone ~ Wind, airquality)), mode = "lines"); g. Answer 1: I reckon it's caused by the missing values: airq <- airquality %>% filter(!is.na(Ozone)); fit <- lm(Ozone ~ Wind, …
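The answer's diagnosis is that `fitted()` returns one value per non-missing row, so its length no longer matches the full `Wind` vector; filtering the NAs first fixes the mismatch. A numpy sketch of that filtering step on hypothetical values (the plotting layer is irrelevant to the bug):

```python
import numpy as np

wind  = np.array([7.4, 8.0, 12.6, 11.5, 14.3, 9.7])
ozone = np.array([41.0, 36.0, np.nan, 18.0, np.nan, 23.0])

mask = ~np.isnan(ozone)                      # drop rows with missing Ozone first
slope, intercept = np.polyfit(wind[mask], ozone[mask], 1)
fitted = slope * wind[mask] + intercept      # same length as the filtered x values
```

Plotting `fitted` against `wind[mask]` (rather than the full `wind`) keeps x and y aligned, which is exactly what the R `filter(!is.na(Ozone))` step achieves.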

Vectorizing the solution of a linear equation system in MATLAB

Posted by 为君一笑 on 2019-12-23 12:13:49

Question: Summary: this question deals with improving an algorithm for computing linear regressions. I have a 3D array (dlMAT) representing monochrome photographs of the same scene taken at different exposure times (the vector IT). Mathematically, every vector along the 3rd dimension of dlMAT represents a separate linear regression problem that needs to be solved. The equation whose coefficients need to be estimated is of the form DL = R*IT^P, where DL and IT are obtained …
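The model DL = R·IT^P log-linearizes to log DL = log R + P·log IT, so every pixel shares the same two-column design matrix and all the per-pixel regressions collapse into a single least-squares solve. A numpy sketch on synthetic noise-free data (MATLAB's `\` on the same stacked right-hand sides does the equivalent):

```python
import numpy as np

rng = np.random.default_rng(2)
H, W, K = 4, 5, 8
IT = np.logspace(-2, 0, K)                            # exposure times
R_true = rng.uniform(0.5, 2.0, (H, W))
P_true = rng.uniform(0.8, 1.2, (H, W))
dlMAT = R_true[..., None] * IT ** P_true[..., None]   # DL = R * IT^P, per pixel

# log-linearize: log DL = log R + P * log IT  ->  one lstsq for all pixels at once
A = np.column_stack([np.ones(K), np.log(IT)])         # shared K x 2 design matrix
B = np.log(dlMAT).reshape(-1, K).T                    # K x (H*W) right-hand sides
coef, *_ = np.linalg.lstsq(A, B, rcond=None)
R_est = np.exp(coef[0]).reshape(H, W)
P_est = coef[1].reshape(H, W)
```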

MATLAB fminunc() not completing for large datasets. Works for smaller ones

Posted by 十年热恋 on 2019-12-23 12:11:35

Question: I am performing logistic regression in MATLAB with L2 regularization on text data. My program works well for small datasets; for larger ones, it keeps running indefinitely. I have seen the potentially duplicate question (matlab fminunc not quitting (running indefinitely)). In that question, the cost for the initial theta was NaN and an error was printed to the console. In my implementation, I get a real-valued cost and there is no error, even with verbose parameters being passed to …
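A common culprit in this situation is a non-vectorized or numerically fragile cost function: `fminunc` calls it thousands of times, so per-call cost dominates on large datasets, and `log(0)` in the cross-entropy silently degrades convergence. A hedged numpy sketch of the objective `fminunc` would be minimizing, fully vectorized and with the intercept left unpenalized:

```python
import numpy as np

def cost_grad(theta, X, y, lam):
    """Vectorized L2-regularized logistic cost and gradient (intercept unpenalized)."""
    m = X.shape[0]
    h = 1.0 / (1.0 + np.exp(-(X @ theta)))        # sigmoid of the linear scores
    eps = 1e-12                                   # guards log(0), a common source of NaN costs
    J = -np.mean(y * np.log(h + eps) + (1 - y) * np.log(1 - h + eps))
    J += lam / (2 * m) * np.sum(theta[1:] ** 2)
    g = X.T @ (h - y) / m
    g[1:] += lam / m * theta[1:]
    return J, g

# synthetic check: the cost should fall under plain gradient descent
rng = np.random.default_rng(3)
X = np.column_stack([np.ones(200), rng.standard_normal((200, 2))])
y = (X[:, 1] > 0).astype(float)
theta = np.zeros(3)
J_start, _ = cost_grad(theta, X, y, lam=1.0)
for _ in range(200):
    _, g = cost_grad(theta, X, y, lam=1.0)
    theta -= 0.5 * g
J_end, _ = cost_grad(theta, X, y, lam=1.0)
```

Supplying the analytic gradient (as above) instead of letting the optimizer finite-difference it is usually the single biggest speedup for this kind of problem.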

How to use formula in R to exclude main effect but retain interaction

Posted by 北城以北 on 2019-12-23 09:37:56

Question: I do not want the main effect because it is collinear with a finer-grained factor fixed effect, so it is annoying to have these NAs. In this example, lm(y ~ x * z), I want the interaction of x (numeric) and z (factor), but not the main effect of z. Answer 1: Introduction. The R documentation of ?formula says: "The '*' operator denotes factor crossing: 'a * b' is interpreted as 'a + b + a:b'." So it sounds like dropping the main effect is straightforward, by just doing one of the following: a + a:b ## main effect on `b …
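Concretely, the formula trick amounts to choosing which columns enter the design matrix: keep an intercept and the `x` slope, add per-level slope offsets for `z`, but include no `z` dummy columns. A numpy sketch of one such parameterization on hypothetical data (`x` numeric, `z` a 3-level factor; level 0 taken as the reference whose slope is just `x`):

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.standard_normal(12)
z = np.repeat([0, 1, 2], 4)                  # 3-level factor, 4 observations per level

# full dummy coding of z (one column per level)
D = (z[:, None] == np.arange(3)).astype(float)

# design for "interaction without z main effect": intercept, common x slope,
# and x-by-level slope offsets for the non-reference levels; no bare z dummies
design = np.column_stack([np.ones_like(x), x, x[:, None] * D[:, 1:]])
```

Fitting against `design` estimates level-specific slopes without the collinear `z` main-effect columns that produce the NAs.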

Linear regression line in MATLAB scatter plot

Posted by 左心房为你撑大大i on 2019-12-23 09:12:05

Question: I am trying to get the residuals for a scatter plot of two variables. I can get the least-squares linear regression line using MATLAB's lsline function, but I also want the residuals. How can I get them in MATLAB? For that I need to know the parameters a and b of the linear regression line ax + b. Answer 1: Use the function polyfit to obtain the regression parameters. You can then evaluate the fitted values and calculate your residuals accordingly. Basically, polyfit performs least …
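The answer's recipe is: fit with `polyfit`, evaluate the line, subtract. numpy's `polyfit` mirrors MATLAB's, so the same three lines work in either language; a sketch on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(5)
x = np.linspace(0, 10, 30)
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, x.size)

a, b = np.polyfit(x, y, 1)        # degree-1 fit: slope a, intercept b of ax + b
fitted = a * x + b                # the line lsline would draw
residuals = y - fitted            # vertical distances from each point to the line
```

In MATLAB the equivalent is `p = polyfit(x, y, 1); residuals = y - polyval(p, x);`.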

How can I train a simple, non-linear regression model with tensor flow?

Posted by 一曲冷凌霜 on 2019-12-23 07:59:32

Question: I've seen this example for linear regression, and I would like to train a model where … What I've tried: #!/usr/bin/env python """Example for learning a regression.""" import tensorflow as tf; import numpy # Parameters: learning_rate = 0.01; training_epochs = 1000; display_step = 50 # Generate training data: train_X = []; train_Y = []; f = lambda x: x**2; for x in range(-20, 20): train_X.append(float(x)); train_Y.append(f(x)); train_X = numpy.asarray(train_X); train_Y = numpy.asarray(train_Y); n_samples = …
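The excerpt's training data come from f(x) = x², and a purely linear model y = W·x + b can never fit that, no matter how long it trains. The standard fix is to make the model class richer, either with a hidden layer or, most simply, with polynomial features, after which the problem is linear again. A numpy sketch of the feature-expansion route (in TensorFlow you would feed the same expanded features and learn the weights by gradient descent):

```python
import numpy as np

x = np.arange(-20, 20, dtype=float)
y = x ** 2                                   # the non-linear target from the question

# expanding the features to [x**2, x, 1] turns the task back into linear regression
X = np.column_stack([x ** 2, x, np.ones_like(x)])
w, *_ = np.linalg.lstsq(X, y, rcond=None)    # closed-form least-squares weights
```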