Simultaneous optimization of two different functions to provide a universal solution for both

Submitted by 爷,独闯天下 on 2021-02-11 15:38:45

Question


I asked a similar question in January that @Miłosz Wieczór was kind enough to answer. Now I am faced with a similar but different challenge: I need to fit two parameters (fc and alpha) simultaneously on two datasets (e_exp and iq_exp). In other words, I need to find the values of fc and alpha that best fit both e_exp and iq_exp at the same time.

import numpy as np
import math
from scipy.optimize import curve_fit, least_squares, minimize

f_exp  = np.array([1, 1.6, 2.7, 4.4, 7.3, 12, 20, 32, 56, 88, 144, 250000])
e_exp  = np.array([7.15, 7.30, 7.20, 7.25, 7.26, 7.28, 7.32, 7.25, 7.35, 7.34, 7.37, 11.55])
iq_exp = np.array([0.010, 0.009, 0.011, 0.011, 0.010, 0.012, 0.019, 0.027, 0.038, 0.044, 0.052, 0.005])

ezero  = np.min(e_exp)
einf   = np.max(e_exp)

ig_fc     = 500
ig_alpha  = 0.35

def CCRI(f_exp, fc, alpha):
    # Magnitude sqrt(R^2 + I^2) of the model's real and imaginary parts
    x  = np.log(f_exp / fc)
    R  = ezero + 1/2 * (einf - ezero) * (1 + np.sinh((1 - alpha) * x) / (np.cosh((1 - alpha) * x) + np.sin(alpha * np.pi / 2)))
    I  = 1/2 * (einf - ezero) * np.cos(alpha * np.pi / 2) / (np.cosh((1 - alpha) * x) + np.sin(alpha * np.pi / 2))
    RI = np.sqrt(R ** 2 + I ** 2)
    return RI

def CCiQ(f_exp, fc, alpha):
    # Ratio I / R of the same real and imaginary parts
    x  = np.log(f_exp / fc)
    R  = ezero + 1/2 * (einf - ezero) * (1 + np.sinh((1 - alpha) * x) / (np.cosh((1 - alpha) * x) + np.sin(alpha * np.pi / 2)))
    I  = 1/2 * (einf - ezero) * np.cos(alpha * np.pi / 2) / (np.cosh((1 - alpha) * x) + np.sin(alpha * np.pi / 2))
    iQ = I / R
    return iQ

poptRI, pcovRI = curve_fit(CCRI, f_exp, e_exp, p0=(ig_fc, ig_alpha))

poptiQ, pcoviQ = curve_fit(CCiQ, f_exp, iq_exp, p0=(ig_fc, ig_alpha))

einf, ezero, and f_exp are all constants, and the variables I need to optimize are fc and alpha (ig_fc and ig_alpha are their initial guesses, where ig stands for initial guess). In the code above I get two different (fc, alpha) pairs because I solve the two fits independently. I need to solve them simultaneously, however, so that fc and alpha are universal.

  • Is there a way to fit both functions simultaneously so that they yield a single, universal solution for fc and alpha?

Answer 1:


The docs say this about the second value returned by curve_fit:

pcov

The estimated covariance of popt. The diagonals provide the variance of the parameter estimate. To compute one standard deviation errors on the parameters use perr = np.sqrt(np.diag(pcov)).
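As a quick illustration of that formula on a toy problem (the straight-line model and the data points here are made up purely for demonstration):

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical linear model, just to show how pcov is used
def model(x, a, b):
    return a * x + b

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 7.1, 8.8])

popt, pcov = curve_fit(model, x, y)
perr = np.sqrt(np.diag(pcov))  # one-standard-deviation error per parameter
```

perr then has one entry per fitted parameter, in the same order as popt.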

So if you want to minimize the overall error, you need to combine the errors of both your fits.

def objective(what, ever):
    poptRI, pcovRI = curve_fit(CCRI, f_exp, e_exp, p0=(ig_fc, ig_alpha))
    poptiQ, pcoviQ = curve_fit(CCiQ, f_exp, iq_exp, p0=(ig_fc, ig_alpha))
    # not sure if this is the correct equation, but you can start with it
    err_total = np.sum(np.sqrt(np.diag(pcovRI))) + np.sum(np.sqrt(np.diag(pcoviQ)))
    return err_total

On total errors of 2d Gaussian functions:

https://www.visiondummy.com/2014/04/draw-error-ellipse-representing-covariance-matrix/

Update: Since you want poptRI and poptiQ to be the same, you need to minimize their distance.

This can be done like this:

from numpy import linalg

def objective(what, ever):
    poptRI, pcovRI = curve_fit(CCRI, f_exp, e_exp, p0=(ig_fc, ig_alpha))
    poptiQ, pcoviQ = curve_fit(CCiQ, f_exp, iq_exp, p0=(ig_fc, ig_alpha))
    delta = linalg.norm(poptiQ - poptRI)
    return delta

Minimizing this function will (should) result in similar values for poptRI and poptiQ: you treat the parameter sets as vectors and minimize the length of their difference vector.

However, this approach assumes that poptRI and poptiQ (and their coefficients) are roughly in the same range, since you are applying a metric to them. If, say, one of them is around 2000 and the other around 2, the optimizer will favour tuning the first one. But maybe this is fine.

If you somehow want to treat them the same you need to normalize them. One approach (assuming all coefficients are similar) could be

linalg.norm((poptiQ / linalg.norm(poptiQ)) - (poptRI / linalg.norm(poptRI)))

You normalize the results to unit vectors, then subtract them, then take the norm of the difference.
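A tiny numeric sketch of the scale problem (the parameter values below are made up): with fc around 2000 and alpha below 1, the raw norm is driven almost entirely by the fc difference, while the unit-vector version is not.

```python
import numpy as np

a = np.array([2000.0, 0.30])  # hypothetical (fc, alpha) from one fit
b = np.array([1900.0, 0.90])  # hypothetical (fc, alpha) from the other

raw  = np.linalg.norm(a - b)  # ~100, dominated by the fc difference
unit = np.linalg.norm(a / np.linalg.norm(a) - b / np.linalg.norm(b))
```

Note that with these numbers the unit-vector metric also shrinks the alpha difference heavily, so depending on the problem a per-parameter scaling may behave more predictably.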

The same is true for the inputs to the function, but it might not be that important there. See the links below.

But this strongly depends on the problem you are trying to solve. There is no general solution.

Some links related to this:

Is normalization useful/necessary in optimization?

Why do we have to normalize the input for an artificial neural network?

Another objective function:

Is this what you are trying to do? You want to find the best fc and alpha so that the fit results of both functions are as close as possible?

def objective(fc, alpha):
    poptRI, pcovRI = curve_fit(CCRI, f_exp, e_exp, p0=(fc, alpha))
    poptiQ, pcoviQ = curve_fit(CCiQ, f_exp, iq_exp, p0=(fc, alpha))
    delta = linalg.norm(poptiQ - poptRI)
    return delta
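An alternative sketch that enforces a single shared (fc, alpha) pair directly, rather than comparing two separate fits: stack the residuals of both models and run one fit with scipy.optimize.least_squares. The weighting factor w and the bounds are assumptions (w roughly balances the very different magnitudes of e_exp and iq_exp; the bounds keep fc positive and alpha inside [0, 1]):

```python
import numpy as np
from scipy.optimize import least_squares

f_exp  = np.array([1, 1.6, 2.7, 4.4, 7.3, 12, 20, 32, 56, 88, 144, 250000])
e_exp  = np.array([7.15, 7.30, 7.20, 7.25, 7.26, 7.28, 7.32, 7.25, 7.35, 7.34, 7.37, 11.55])
iq_exp = np.array([0.010, 0.009, 0.011, 0.011, 0.010, 0.012, 0.019, 0.027, 0.038, 0.044, 0.052, 0.005])

ezero, einf = e_exp.min(), e_exp.max()

def real_imag(f, fc, alpha):
    # Real and imaginary parts shared by both models in the question
    x = np.log(f / fc)
    denom = np.cosh((1 - alpha) * x) + np.sin(alpha * np.pi / 2)
    R = ezero + 0.5 * (einf - ezero) * (1 + np.sinh((1 - alpha) * x) / denom)
    I = 0.5 * (einf - ezero) * np.cos(alpha * np.pi / 2) / denom
    return R, I

# Weight to balance the two datasets' very different magnitudes (an assumption)
w = np.mean(np.abs(e_exp)) / np.mean(np.abs(iq_exp))

def residuals(p):
    fc, alpha = p
    R, I = real_imag(f_exp, fc, alpha)
    res_e  = np.hypot(R, I) - e_exp   # CCRI residuals
    res_iq = I / R - iq_exp           # CCiQ residuals
    return np.concatenate([res_e, w * res_iq])

fit = least_squares(residuals, x0=[500, 0.35], bounds=([1e-6, 0.0], [np.inf, 1.0]))
fc_opt, alpha_opt = fit.x
```

Because fc and alpha appear only once in this formulation, the fit cannot return two conflicting parameter sets.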


Source: https://stackoverflow.com/questions/61227277/simultaneous-optimization-of-two-different-functions-to-provide-a-universal-solu
