regression

Best fit plane by minimizing orthogonal distances

你离开我真会死。 submitted on 2020-01-13 11:49:10
Question: I have a set of points (in the form x1,y1,z1 … xn,yn,zn) obtained from a surface mesh. I want to find the best-fit 3D plane to these points by minimizing orthogonal distances. The x, y, z coordinates are independent; that is, I want to obtain the coefficients A, B, C, D of the plane equation Ax + By + Cz + D = 0. What would be the algorithm to obtain A, B, C, D? Note: a previous post discussed the best-fit plane in a least-squares sense, by considering the z coordinate a linear function of x and y…
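
The standard algorithm here is total least squares via PCA/SVD: the best-fit plane passes through the centroid of the points, and its normal (A, B, C) is the singular vector belonging to the smallest singular value of the centered point matrix. A minimal sketch in R (the helper name fit_plane and the example data are invented for illustration):

    fit_plane <- function(P) {           # P: n x 3 matrix of (x, y, z) points
      centroid <- colMeans(P)
      s <- svd(sweep(P, 2, centroid))    # SVD of the centered points
      normal <- s$v[, 3]                 # direction of least variance = plane normal
      c(A = normal[1], B = normal[2], C = normal[3],
        D = -sum(normal * centroid))     # makes the plane pass through the centroid
    }

    # Example: noisy points near the plane z = 0.5x - 0.2y + 3
    set.seed(42)
    xy <- matrix(runif(60), ncol = 2)
    P  <- cbind(xy, 0.5 * xy[, 1] - 0.2 * xy[, 2] + 3 + rnorm(30, sd = 0.01))
    fit_plane(P)

Unlike the z-as-a-function-of-x-and-y fit mentioned in the note, this treats the three coordinates symmetrically, so it also behaves well for near-vertical planes.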

Fitting a quadratic function in python without numpy polyfit

女生的网名这么多〃 submitted on 2020-01-13 09:32:56
Question: I am trying to fit a quadratic function to some data without using numpy's polyfit function. Mathematically I tried to follow this website https://neutrium.net/mathematics/least-squares-fitting-of-a-polynomial/ but somehow I don't think I'm doing it right. If anyone could assist me that would be great, or if you could suggest another way to do it that would also be awesome. What I've tried so far:

    import numpy as np
    import matplotlib.pyplot as plt
    import pandas…
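
The math on the linked page boils down to solving the normal equations (X'X) b = X'y, where X = [1, x, x²] is the Vandermonde (design) matrix. A minimal sketch of that algebra, written in R to match the other examples in this digest (the data are synthetic):

    set.seed(7)
    x <- seq(0, 10, length.out = 25)
    y <- 2 + 1.5 * x - 0.3 * x^2 + rnorm(25, sd = 0.5)   # synthetic example data

    X <- cbind(1, x, x^2)                # Vandermonde (design) matrix
    b <- solve(t(X) %*% X, t(X) %*% y)   # coefficients of c + b*x + a*x^2
    b

The same three steps translate directly to numpy: build the design matrix with np.column_stack and solve the system with np.linalg.solve.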

Drawing only boundaries of stat_smooth in ggplot2

扶醉桌前 submitted on 2020-01-13 08:58:52
Question: When using stat_smooth() with geom_point(), is there a way to remove the shaded confidence region but still draw its outer bounds? I know I can remove the shaded region with something like:

    geom_point(aes(x = x, y = y)) +
      stat_smooth(aes(x = x, y = y), alpha = 0)

but how can I make the outer bounds of it (the outer curves) still visible as faint black lines?

Answer 1: You can also use geom_ribbon with fill = NA.

    gg <- ggplot(mtcars, aes(qsec, wt)) +
      geom_point() +
      stat_smooth(alpha = 0, method = 'loess')
    rib_data <- ggplot…
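
The quoted answer is cut off; here is a self-contained sketch of the geom_ribbon(fill = NA) idea, assuming the smoother is the plot's second layer so its computed interval sits in ggplot_build(gg)$data[[2]]:

    library(ggplot2)

    gg <- ggplot(mtcars, aes(qsec, wt)) +
      geom_point() +
      stat_smooth(method = 'loess', alpha = 0)   # keep the line, hide the shading

    # Pull the smoother's computed ymin/ymax out of the built plot, then
    # redraw just the interval edges as an unfilled ribbon outline.
    rib_data <- ggplot_build(gg)$data[[2]]

    gg + geom_ribbon(data = rib_data,
                     aes(x = x, ymin = ymin, ymax = ymax),
                     fill = NA, colour = "black", size = 0.2,
                     inherit.aes = FALSE)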

annotate r squared to ggplot by using facet_wrap

岁酱吖の submitted on 2020-01-13 05:55:07
Question: I just joined the community and am looking forward to getting some help with the data analysis for my master's thesis. At the moment I have the following problem: I plotted 42 varieties with ggplot using facet_wrap:

    ggplot(sumfvvar, aes(x = TemperaturCmean, y = Fv.Fm, col = treatment)) +
      geom_point(shape = 1, size = 1) +
      geom_smooth(method = lm) +
      scale_color_brewer(palette = "Set1") +
      facet_wrap(. ~ Variety)

That works very well, but I would like to annotate the R-squared values for the regression lines. I have two…
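
One way to do this is to compute the per-facet R² values up front and hand them to geom_text() as a separate data frame keyed by Variety. A sketch, assuming sumfvvar has the columns used above:

    # One label row per facet: R^2 of the per-Variety linear fit
    r2_labels <- do.call(rbind, lapply(split(sumfvvar, sumfvvar$Variety), function(d) {
      data.frame(Variety = d$Variety[1],
                 label = sprintf("R^2 == %.2f",
                                 summary(lm(Fv.Fm ~ TemperaturCmean, data = d))$r.squared))
    }))

    ggplot(sumfvvar, aes(x = TemperaturCmean, y = Fv.Fm, col = treatment)) +
      geom_point(shape = 1, size = 1) +
      geom_smooth(method = lm) +
      scale_color_brewer(palette = "Set1") +
      facet_wrap(. ~ Variety) +
      geom_text(data = r2_labels, aes(x = -Inf, y = Inf, label = label),
                hjust = -0.1, vjust = 1.5, inherit.aes = FALSE, parse = TRUE)

With parse = TRUE the label is rendered as plotmath, so "R^2 == 0.42" shows up as R² = 0.42 in the corner of each panel.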

How do you remove an insignificant factor level from a regression using the lm() function in R?

别等时光非礼了梦想. submitted on 2020-01-13 04:28:25
Question: When I run a regression in R with a predictor of type factor, it saves me from having to set up the categorical (dummy) variables in the data myself. But how do I remove a factor level that is not significant from the regression, so that only significant variables are shown? For example:

    dependent <- c(1:10)
    independent1 <- as.factor(c('d','a','a','a','a','a','a','b','b','c'))
    independent2 <- c(-0.71, 0.30, 1.32, 0.30, 2.78, 0.85, -0.25, -1.08, -0.94, 1.33)
    output <- lm(dependent ~ independent1 + independent2)
    summary(output)

Which results in the…
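
lm() has no argument for dropping a single dummy, but a common approach is to merge the non-significant level into the reference level so it stops getting its own coefficient. A sketch continuing the example above (which level to merge is hypothetical):

    # Suppose the dummy for level 'c' is not significant: recode 'c' to the
    # baseline level 'a' so the two share the reference category.
    ind1_merged <- independent1
    levels(ind1_merged)[levels(ind1_merged) == 'c'] <- 'a'

    summary(lm(dependent ~ ind1_merged + independent2))

Note that this changes the model's meaning: observations with level 'c' are pooled with the baseline rather than excluded.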

Fminsearch Matlab (Non Linear Regression)

折月煮酒 submitted on 2020-01-12 10:53:09
Question: Can anyone explain how I can apply nonlinear regression to this equation to find out k, using the MATLAB command window?

    I = 10^-9 * (exp(38.68*V/k) - 1)

I have data values as follows:

    Voltage = [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0];
    Current = [0, 0, 0, 0, 0, 0, 0, 0.07, 0.92, 12.02, 158.29];

[NEW]: Now I used fminsearch as an alternative, and another error message appeared:

    Matrix dimensions must agree.
    Error in @(k)sum((I…
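
fminsearch is a Nelder-Mead simplex minimizer, and the usual recipe is to hand it the sum of squared residuals as a function of k; the "Matrix dimensions must agree" error typically comes from mixing matrix and elementwise operators (use .*, ./, .^ inside the objective) or from Voltage and Current having different lengths. A sketch of the equivalent fit in R, whose optim() with method = "Nelder-Mead" is the fminsearch analogue (the starting value 1 is a guess):

    V <- c(0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0)
    I <- c(0, 0, 0, 0, 0, 0, 0, 0.07, 0.92, 12.02, 158.29)

    # Sum of squared residuals of the diode model I = 1e-9 * (exp(38.68*V/k) - 1)
    sse <- function(k) sum((I - 1e-9 * (exp(38.68 * V / k) - 1))^2)

    fit <- optim(par = 1, fn = sse, method = "Nelder-Mead")
    fit$par   # estimated k

(optim() warns that Nelder-Mead is unreliable in one dimension; for a single parameter, optimize() over an interval is the cleaner choice.)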

Neural Network Ordinal Classification for Age

妖精的绣舞 submitted on 2020-01-12 07:22:13
Question: I have created a simple neural network (Python, Theano) to estimate a person's age based on their spending history from a selection of different stores. Unfortunately, it is not particularly accurate. The accuracy may be hurt by the fact that the network has no knowledge of ordinality: to the network there is no relationship between the age classes, and it currently just selects the age with the highest probability from the softmax output layer. I have considered changing the output…
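
One standard fix is to replace the one-hot class label with a cumulative ("age is at least class k") target vector, so that adjacent age classes share most of their targets and the outputs respect the ordering (see e.g. Cheng et al., "A Neural Network Approach to Ordinal Regression"). A language-agnostic sketch of the encoding, written in R like the other examples in this digest:

    # Class 3 of 5 becomes c(1, 1, 1, 0, 0): one sigmoid output per threshold
    encode_ordinal <- function(class_idx, n_classes) {
      as.integer(seq_len(n_classes) <= class_idx)
    }
    encode_ordinal(3, 5)                          # 1 1 1 0 0

    # Decoding a prediction: count how many outputs clear 0.5
    decode_ordinal <- function(probs) sum(probs > 0.5)
    decode_ordinal(c(0.9, 0.8, 0.6, 0.3, 0.1))    # 3

In the network itself this amounts to swapping the softmax output layer for independent sigmoid units trained against these targets.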

Plot “regression line” from multiple regression in R

大城市里の小女人 submitted on 2020-01-12 07:22:08
Question: I ran a multiple regression with several continuous predictors, a few of which came out significant, and I'd like to create a scatterplot (or scatter-like plot) of my DV against one of the predictors, including a "regression line". How can I do this? My plot looks like this:

    D = my.data
    plot(D$probCategorySame, D$posttestScore)

If it were simple regression, I could add a regression line like this:

    lmSimple <- lm(posttestScore ~ probCategorySame, data = D)
    abline(lmSimple)

But my actual…
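
With several predictors there is no single line to draw, but a common choice is to plot the fitted line for the focal predictor while holding the other predictors at their means. A sketch with made-up data standing in for my.data (the extra predictor otherPred is hypothetical):

    set.seed(1)   # invented stand-in for my.data
    D <- data.frame(probCategorySame = runif(50), otherPred = rnorm(50))
    D$posttestScore <- 1 + 2 * D$probCategorySame + 0.5 * D$otherPred + rnorm(50, sd = 0.3)

    fit <- lm(posttestScore ~ probCategorySame + otherPred, data = D)

    plot(D$probCategorySame, D$posttestScore)
    b <- coef(fit)
    # The intercept absorbs the other predictor at its mean; the slope is the
    # focal predictor's coefficient.
    abline(a = b["(Intercept)"] + b["otherPred"] * mean(D$otherPred),
           b = b["probCategorySame"])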