Question:
Given that the fitting function is of the form

    f(x) = Σ_i  r_i · (s_i·x)² / (1 + (s_i·x)²)

I intend to fit such a function to the experimental data (x, y = f(x)) that I have. But then I have some doubts:
How do I define my fitting function when there's a summation involved?
Once the function is defined, i.e.

    def func(...): return ...

is it still possible to use curve_fit from scipy.optimize? Because now there is a whole set of parameters s_i and r_i involved, compared to the usual fitting cases where one has a few scalar parameters. Finally, are such cases treated completely differently?
I feel a bit lost here, thanks for any help.
Answer 1:
This is very well within reach of scipy.optimize.curve_fit (or just scipy.optimize.leastsq). The fact that a sum is involved does not matter at all, nor that you have arrays of parameters. The only thing to note is that curve_fit wants to give your fit function the parameters as individual arguments, while leastsq gives it a single vector.
Here's a solution:
```python
import numpy as np
from scipy.optimize import curve_fit, leastsq

def f(x, r, s):
    """The fit function, applied to every x_k for the vectors r_i and s_i."""
    x = x[..., np.newaxis]  # add an axis for the summation
    # by virtue of numpy's fantastic broadcasting rules,
    # the following will be evaluated for every combination of k and i.
    x2s2 = (x*s)**2
    return np.sum(r * x2s2 / (1 + x2s2), axis=-1)

# fit using curve_fit
popt, pcov = curve_fit(
    lambda x, *params: f(x, params[:N], params[N:]),
    X, Y,
    np.r_[R0, S0],
)
R = popt[:N]
S = popt[N:]

# fit using leastsq
popt, ier = leastsq(
    lambda params: f(X, params[:N], params[N:]) - Y,
    np.r_[R0, S0],
)
R = popt[:N]
S = popt[N:]
```
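For completeness, here is one way to exercise the recipe above end to end. The values of `N`, `r_true`, `s_true`, the initial guesses `R0`, `S0`, and the synthetic noisy data are made up for illustration; they are not part of the original answer:

```python
import numpy as np
from scipy.optimize import curve_fit

def f(x, r, s):
    """Model: sum_i r_i*(s_i*x)^2 / (1 + (s_i*x)^2), broadcast over all x."""
    x = x[..., np.newaxis]  # add an axis for the summation over i
    x2s2 = (x*s)**2
    return np.sum(r * x2s2 / (1 + x2s2), axis=-1)

rng = np.random.default_rng(0)
N = 2
r_true = np.array([1.0, 2.0])   # illustrative "true" parameters
s_true = np.array([0.5, 3.0])

X = np.linspace(0.1, 10.0, 200)
Y = f(X, r_true, s_true) + 0.005 * rng.normal(size=X.size)

R0 = np.ones(N)                  # rough initial guesses
S0 = np.array([1.0, 2.0])
popt, pcov = curve_fit(lambda x, *p: f(x, p[:N], p[N:]), X, Y, p0=np.r_[R0, S0])
R, S = popt[:N], popt[N:]
print(R, S)   # should land close to r_true and s_true (up to the sign of s)
```

Note that only s_i² enters the model, so the sign of each fitted s_i is arbitrary, and permuting the (r_i, s_i) pairs gives an equivalent fit; reasonable initial guesses keep the optimizer near the intended solution.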
A few things to note:

- Upon start, we need the 1d arrays `X` and `Y` of measurements to fit to, the 1d arrays `R0` and `S0` as initial guesses, and `N`, the length of those two arrays.
- I separated the implementation of the actual model `f` from the objective functions supplied to the fitters. Those I implemented using lambda functions. Of course, one could also have ordinary `def ...` functions and combine them into one.
- The model function `f` uses numpy's broadcasting to simultaneously sum over a set of parameters (along the last axis) and calculate in parallel for many `x` (along any axes before the last, though both fit functions would complain if there is more than one... `.ravel()` to help there).
- We concatenate the fit parameters R and S into a single parameter vector using numpy's shorthand `np.r_[R,S]`.
- `curve_fit` supplies every single parameter as a distinct argument to the objective function. We want them as a vector, so we use `*params`: it catches all remaining arguments in a single tuple.
- `leastsq` gives a single params vector. However, it neither supplies `x`, nor does it compare the result to `y`. Those are bound directly into the objective function.
Answer 2:
In order to use `scipy.optimize.leastsq` to estimate multiple parameters, you need to pack them into an array and unpack them inside your function. You can then do anything you want with them. For example, if your s_i are the first three and your r_i are the next three parameters in your array `p`, you would just set `ssum = p[:3].sum()` and `rsum = p[3:6].sum()`. But again, your parameters are not identified (according to your comment), so estimation is pointless.
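As a minimal sketch of that pack/unpack pattern (the numeric values here are made up purely to illustrate the indexing):

```python
import numpy as np

# packed parameter vector: s_1..s_3 first, then r_1..r_3
p = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])

s = p[:3]             # unpack the s_i
r = p[3:6]            # unpack the r_i
ssum = p[:3].sum()    # -> 3.0
rsum = p[3:6].sum()   # -> 7.5
```

Inside a `leastsq` objective function, the same slicing would be the first thing you do with the parameter vector before evaluating the model.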
For an example of using leastsq
, see the Cookbook's Fitting Data example.
Source: https://stackoverflow.com/questions/30323573/fitting-a-sum-to-data-in-python