Scipy.optimize - curve fitting with fixed parameters


Question


I'm performing curve fitting with scipy.optimize.leastsq, e.g. for a Gaussian:

import numpy as np
import scipy.optimize

def fitGaussian(x, y, init=[1.0, 0.0, 4.0, 0.1]):
    fitfunc = lambda p, x: p[0]*np.exp(-(x-p[1])**2/(2*p[2]**2)) + p[3]  # Target function
    errfunc = lambda p, x, y: fitfunc(p, x) - y  # Distance to the target function
    final, success = scipy.optimize.leastsq(errfunc, init[:], args=(x, y))
    return fitfunc, final

Now, I want to optionally fix the values of some of the parameters in the fit. The suggestions I found either point to a different package, lmfit, which I want to avoid, or are very general, like here. Since I need a solution which

  1. works with numpy/scipy (no further packages etc.)
  2. is independent of the parameters themselves,
  3. is flexible, in which parameters are fixed or not,

I came up with the following, using a condition on each of the parameters:

def fitGaussian2(x, y, init=[1.0,0.0,4.0,0.1], fix = [False, False, False, False]):
    fitfunc = lambda p, x: (p[0] if not fix[0] else init[0])*np.exp(-(x-(p[1] if not fix[1] else init[1]))**2/(2*(p[2] if not fix[2] else init[2])**2))+(p[3] if not fix[3] else init[3]) 
    errfunc = lambda p, x, y: fitfunc(p, x) - y # Distance to the target function
    final, success = scipy.optimize.leastsq(errfunc, init[:], args=(x, y))
    return fitfunc, final

While this works fine, it's neither practical nor beautiful. So my question is: are there better ways of performing curve fitting in scipy with fixed parameters? Or are there wrappers that already include such parameter fixing?


Answer 1:


Using scipy alone, there are no built-in options that I am aware of; you will always have to do a work-around like the one you already did.
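For completeness, here is a minimal sketch (not from the original answer) of a slightly more generic work-around that uses only numpy/scipy: optimize over the free parameters and re-insert the fixed values before evaluating the model. The helper name fit_with_fixed is hypothetical.

import numpy as np
import scipy.optimize

def fit_with_fixed(fitfunc, x, y, init, fix):
    # Hypothetical helper (not part of scipy): leastsq only sees the free
    # parameters; fixed ones are re-inserted from init before evaluation.
    init = np.asarray(init, dtype=float)
    fix = np.asarray(fix, dtype=bool)

    def errfunc(free_params, x, y):
        p = init.copy()
        p[~fix] = free_params  # rebuild the full parameter vector
        return fitfunc(p, x) - y

    best_free, success = scipy.optimize.leastsq(errfunc, init[~fix], args=(x, y))
    final = init.copy()
    final[~fix] = best_free
    return final

# Usage with the Gaussian from the question, fixing the width p[2] at 4.0:
# gauss = lambda p, x: p[0]*np.exp(-(x - p[1])**2/(2*p[2]**2)) + p[3]
# final = fit_with_fixed(gauss, xdata, ydata,
#                        init=[1.0, 0.0, 4.0, 0.1],
#                        fix=[False, False, True, False])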

If you are willing to use a wrapper package, however, may I recommend my own symfit? It is a wrapper around scipy with readability and minimal boilerplate as its core principles. In symfit, your problem would be solved as:

from symfit import parameters, variables, exp, Fit, Parameter

a, b, c, d = parameters('a, b, c, d')
x, y = variables('x, y')

model_dict = {y: a * exp(-(x - b)**2 / (2 * c**2)) + d}

fit = Fit(model_dict, x=xdata, y=ydata)
fit_result = fit.execute()

The line a, b, c, d = parameters('a, b, c, d') makes four Parameter objects. To fix e.g. the parameter c to its initial value, do the following anywhere before calling fit.execute():

c.value = 4.0
c.fixed = True

So a possible end result might be:

from symfit import parameters, variables, exp, Fit, Parameter

a, b, c, d = parameters('a, b, c, d')
x, y = variables('x, y')

c.value = 4.0
c.fixed = True

model_dict = {y: a * exp(-(x - b)**2 / (2 * c**2)) + d}

fit = Fit(model_dict, x=xdata, y=ydata)
fit_result = fit.execute()

If you want to be more dynamic in your code, you could make the Parameter objects straight away using:

c = Parameter(4.0, fixed=True)
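As an illustration (a hypothetical sketch, not part of the original answer), the choice of which parameters to fix can also be driven from a fix list like the one in the question, using only the parameters()/value/fixed calls shown above:

from symfit import parameters, variables, exp, Fit

init = [1.0, 0.0, 4.0, 0.1]
fix = [False, False, True, False]  # which parameters to hold at their init value

a, b, c, d = parameters('a, b, c, d')
x, y = variables('x, y')

for par, value, is_fixed in zip([a, b, c, d], init, fix):
    par.value = value
    par.fixed = is_fixed

model_dict = {y: a * exp(-(x - b)**2 / (2 * c**2)) + d}
# fit = Fit(model_dict, x=xdata, y=ydata)
# fit_result = fit.execute()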

For more info, check the docs: http://symfit.readthedocs.io/en/latest/tutorial.html#simple-example




Answer 2:


The above example using symfit would surely simplify the syntax of the fitting approach. However, does the example given really constrain the variable c?

If you look at fit_result.params, you get the following:

OrderedDict([('a', 16.374368575343127), ('b', 0.49201249437123556), ('c', 0.5337962977235504), ('d', -9.55593614465743)])

The parameter c is not 4.0.



Source: https://stackoverflow.com/questions/40091479/scipy-optimize-curve-fitting-with-fixed-parameters
