curve-fitting

Doing many iterations of curve_fit in one go for piecewise function

Question: I'm trying to perform many iterations of SciPy's curve_fit at once in order to avoid loops and thereby increase speed. This is very similar to this problem, which was solved. However, the fact that the functions are piecewise (discontinuous) means that that solution isn't applicable here. Consider this example:

    import numpy as np
    from numpy import random as rng
    from scipy.optimize import curve_fit

    rng.seed(0)
    N = 20
    X = np.logspace(-1, 1, N)
    Y = np.zeros((4, N))
    for i in range(0, 4):
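The snippet above is cut off before the model or the fitting loop is defined, so here is a minimal sketch of the plain-loop baseline the question is trying to avoid, with a hypothetical two-piece model and an assumed fixed breakpoint (none of these names or values come from the original post):

    import numpy as np
    from scipy.optimize import curve_fit

    X0 = 1.0  # assumed, fixed breakpoint used only for illustration

    def piecewise(x, a, b):
        # constant level `a` below the breakpoint, linear piece `b*x` above it
        return np.where(x < X0, a, b * x)

    # One curve_fit call per dataset (per row of Y): exactly the loop the
    # question wants to vectorise away. Assumes Y has been filled with data.
    fits = np.array([curve_fit(piecewise, X, row, p0=[1.0, 1.0])[0] for row in Y])
    # fits[i] holds the fitted (a, b) for the i-th dataset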

How to use Java Math Commons CurveFitter?

Question: How do I use Math Commons CurveFitter to fit a function to a set of data? I was told to use CurveFitter with LevenbergMarquardtOptimizer and ParametricUnivariateFunction, but I don't know what to write in the ParametricUnivariateFunction gradient and value methods. Besides, after writing them, how do I get the fitted function parameters? My function:

    public static double fnc(double t, double a, double b, double c){
        return a * Math.pow(t, b) * Math.exp(-c * t);
    }

Answer 1: So, this is an old
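Since the answer is cut off above, it may help to spell out what the two ParametricUnivariateFunction methods are expected to return for this model: value gives f(t) itself, and gradient gives the partial derivatives with respect to the parameters a, b and c (in that order). The derivatives below are straightforward calculus, not taken from the truncated answer:

    \[
    f(t) = a\,t^{b}e^{-ct},\qquad
    \frac{\partial f}{\partial a} = t^{b}e^{-ct},\qquad
    \frac{\partial f}{\partial b} = a\,t^{b}\ln(t)\,e^{-ct},\qquad
    \frac{\partial f}{\partial c} = -a\,t^{b+1}e^{-ct}
    \]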

Tricks for fitting data in nlme?

When I fit data in nlme, I never succeed on the first try, and after nlme(fit.model) I am accustomed to seeing things such as:

    Error in nlme.formula(model = mass ~ SSbgf(day, w.max, t.e, t.m), random = list( :
      step halving factor reduced below minimum in PNLS step

    Error in MEestimate(nlmeSt, grpShrunk) :
      Singularity in backsolve at level 0, block 1

So I go back and:
1) Change the units of the x-axis (e.g. from years to days, or days to growing degree days).
2) Make an x=0, y=0 measurement in my dataset.
3) Add a random=pdDiag().
4) Mess with what is random and what is fixed.
5) Chop up my dataset and

passing arguments to a function for fitting

I am trying to fit a function which takes as input 2 independent variables x, y and 3 parameters to be found a, b, c. This is my test code:

    import numpy as np
    from scipy.optimize import curve_fit

    def func(x, y, a, b, c):
        return a*np.exp(-b*(x+y)) + c

    y = x = np.linspace(0, 4, 50)
    z = func(x, y, 2.5, 1.3, 0.5)  # works ok

    # generate data to be fitted
    zn = z + 0.2*np.random.normal(size=len(x))
    popt, pcov = curve_fit(func, x, y, zn)  # <-------- Problem here!!!!!

But I am getting the error: "func() takes exactly 5 arguments (51 given)". How can I pass my arguments x, y correctly? A look at the documentation of
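For reference, a minimal sketch of the usual fix (assuming the same func and data as above): curve_fit passes a single "independent variable" object to the model, so the two variables can be packed into one tuple or stacked array and unpacked inside the model function, while the measured data zn goes in the ydata slot:

    import numpy as np
    from scipy.optimize import curve_fit

    def func(xy, a, b, c):
        x, y = xy                           # unpack the two independent variables
        return a * np.exp(-b * (x + y)) + c

    y = x = np.linspace(0, 4, 50)
    zn = func((x, y), 2.5, 1.3, 0.5) + 0.2 * np.random.normal(size=len(x))

    popt, pcov = curve_fit(func, (x, y), zn, p0=[2.0, 1.0, 0.0])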

Fitting a curve with a pivot point Python

I have the plot below and I want to fit it with 2 lines. Using Python I manage to fit the upper part:

    def func(x, a, b):
        x = np.array(x)
        return a*(x**b)

    popt, pcov = curve_fit(func, up_x, up_y)

And I want to fit the lower part with another line, but I want the line to pass through the point where the red one starts, so I can have a continuous function. So my question is: how can I use curve_fit by giving a point the function has to pass through, but leaving the slope of the line to be calculated by Python? (Or any other Python package able to do it.)

A possible stepwise parametrisation of your model in
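The answer text is truncated above; one simple way to force the lower line through the pivot, sketched here with assumed names (low_x, low_y for the lower branch, and the pivot taken as the first point of the fitted upper curve), is to hard-code the pivot into the model so that only the slope remains free:

    import numpy as np
    from scipy.optimize import curve_fit

    # Pivot: the point where the fitted upper (red) curve starts. Assumes up_x
    # is sorted so its first element marks the start of that branch.
    x0, y0 = up_x[0], func(up_x[0], *popt)

    def lower_line(x, m):
        # Line forced through (x0, y0); only the slope m is fitted.
        return y0 + m * (np.array(x) - x0)

    popt_low, pcov_low = curve_fit(lower_line, low_x, low_y)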

fitting two dimensional curves in matlab

There's a toolbox function for the Curve Fitting Toolbox called cftool that lets you fit curves to 1-d data. Is there anything for 2-d data?

Jerry suggested two very good choices. There are other options though, if you want a more formulaic form for the model. The Curve Fitting Toolbox, in the current version, allows you to fit surfaces to data, not just curves. Or fit a 2-d polynomial model, using a tool like polyfitn. Or you can use a nonlinear regression, if you have a model in mind. The Optimization Toolbox will help you there, with lsqnonlin or lsqcurvefit, either of which can fit 2-d (or

How to fit polynomial to data with error bars

Question: I am currently using numpy.polyfit(x, y, deg) to fit a polynomial to experimental data. I would however like to fit a polynomial that uses weighting based on the errors of the points. I have found scipy.optimize.curve_fit, which makes use of weights, and I suppose I could just set the function 'f' to the form of a polynomial of my desired order and put my weights in 'sigma', which should achieve my goal. I was wondering, is there another, better way of doing this? Many thanks.

Answer 1: Take a look at http:/
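For what it's worth, numpy.polyfit itself accepts a w keyword for per-point weights (its documentation suggests w = 1/sigma for Gaussian uncertainties), so both routes mentioned in the question look roughly like the sketch below, with illustrative data standing in for the real measurements:

    import numpy as np
    from scipy.optimize import curve_fit

    # Illustrative data (not from the question)
    x = np.linspace(0, 10, 30)
    y = 1.5*x**2 - 2.0*x + 3.0 + np.random.normal(scale=2.0, size=x.size)
    yerr = np.full_like(x, 2.0)              # per-point uncertainties

    # Weighted polyfit: weights are 1/sigma, per the numpy docs
    coeffs = np.polyfit(x, y, 2, w=1.0/yerr)

    # The curve_fit route from the question, with the errors passed as sigma
    def poly2(x, a, b, c):
        return a*x**2 + b*x + c

    popt, pcov = curve_fit(poly2, x, y, sigma=yerr, absolute_sigma=True)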

curve_fit failing on even a sine wave

I'm trying to use curve_fit to fit a simple sine wave (not even with any noise) as a test before I move on to more complex problems. Unfortunately it's not giving even remotely the right answer. Here's my syntax:

    x = linspace(0, 100, 300)
    y = sin(1.759*x)

    def mysine(x, a):
        return sin(a*x)

    popt, pcov = curve_fit(mysine, x, y)
    popt
    array([ 0.98679056])

And then if I try an initial guess (say 1.5):

    popt, pcov = curve_fit(mysine, x, y, p0=1.5)
    popt
    array([ 1.49153365])

... which is still nowhere near the right answer. I guess I'm surprised that, given how well the function is sampled, the fit doesn
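The question is cut off above, but the usual explanation is that the least-squares objective for sin(a*x) over x up to 100 is riddled with narrow local minima, so curve_fit only lands on 1.759 if p0 is already very close. A sketch of one common workaround (not taken from the original thread) is a coarse scan first, then a local refinement with curve_fit:

    import numpy as np
    from scipy.optimize import curve_fit

    x = np.linspace(0, 100, 300)
    y = np.sin(1.759 * x)

    def mysine(x, a):
        return np.sin(a * x)

    # Coarse scan of the residual to land inside the right basin of attraction
    a_grid = np.linspace(0.5, 3.0, 5000)
    sse = [np.sum((mysine(x, a) - y)**2) for a in a_grid]
    a0 = a_grid[np.argmin(sse)]

    popt, pcov = curve_fit(mysine, x, y, p0=[a0])   # local refinement, ~1.759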

class method as a model function for scipy.optimize.curve_fit

There is a statement in the manual of curve_fit that the model function, f(x, ...), "must take the independent variable as the first argument and the parameters to fit as separate remaining arguments." However, I would like to use as a model function a method of a class, which is defined as:

    def model_fun(self, x, par):

So, the first argument is not an independent variable, as you can see. Is there any way I can use the method of a class as a model function for curve_fit?

Answer 1 (shx2): Sure, create an instance and pass its bound method:

    class MyClass(object):
        ...
        def model_fun(self, x, par):
            ...

    obj =
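Since the quoted answer breaks off at obj =, here is a minimal, self-contained sketch along the same lines with an illustrative model (note that curve_fit also wants the fit parameters as separate scalar arguments, so the sketch uses a and b instead of a single par):

    import numpy as np
    from scipy.optimize import curve_fit

    class MyClass(object):
        def model_fun(self, x, a, b):
            # Once bound to an instance, `self` disappears from the call
            # signature, so x becomes the first argument, as curve_fit requires.
            return a * np.exp(-b * x)

    obj = MyClass()
    x = np.linspace(0, 5, 50)
    y = obj.model_fun(x, 2.0, 0.7) + 0.01 * np.random.normal(size=x.size)

    popt, pcov = curve_fit(obj.model_fun, x, y, p0=[1.0, 1.0])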

fit (triple-) gauss to data python

The short version of my problem is the following: I have a histogram of some data (density of planets) which seems to have 3 peaks. Now I want to fit 3 Gaussians to this histogram. I am expecting this outcome. I have used different methods to fit my Gaussians: curve_fit, least squares, and GaussianMixture from sklearn.mixture. With curve_fit I get a pretty good fit, but it isn't good enough if you compare it to my expected outcome. With least squares I get a "good fit", but my Gaussians are nonsense, and with GaussianMixture I don't get anywhere, because I can't really adapt the code I've seen in examples
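The post does not include code, so here is a minimal sketch of the curve_fit route with a sum of three Gaussians, using synthetic stand-in data and made-up initial guesses (a real fit would seed each component near one of the three visible peaks of the planet-density histogram):

    import numpy as np
    from scipy.optimize import curve_fit

    # Synthetic stand-in data with three overlapping populations
    data = np.concatenate([
        np.random.normal(0.5, 0.2, 300),
        np.random.normal(1.5, 0.3, 200),
        np.random.normal(3.0, 0.5, 100),
    ])

    def gauss(x, amp, mu, sigma):
        return amp * np.exp(-(x - mu)**2 / (2 * sigma**2))

    def triple_gauss(x, a1, m1, s1, a2, m2, s2, a3, m3, s3):
        return gauss(x, a1, m1, s1) + gauss(x, a2, m2, s2) + gauss(x, a3, m3, s3)

    counts, edges = np.histogram(data, bins=50)
    centres = 0.5 * (edges[1:] + edges[:-1])

    # Initial guesses matter a lot: one (amplitude, mean, width) triple per peak
    p0 = [50, 0.5, 0.2,   25, 1.5, 0.3,   10, 3.0, 0.5]
    popt, pcov = curve_fit(triple_gauss, centres, counts, p0=p0)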