How do I use a minimization function in scipy with constraints

Happy的楠姐 2020-12-24 14:07

I need some help with the optimisation functions in Python (scipy). The problem is optimizing f(x), where x = [a, b, c, ..., n]. The constraints are that the values of x must lie between 0 and 1 and must sum to 1.

2 Answers
  • You can do a constrained optimization with COBYLA or SLSQP as it says in the docs.

    from scipy.optimize import minimize
    import numpy as np

    start_pos = np.ones(6)*(1/6.)  # or whatever

    # Equality constraint: one minus the sum of all variables must be zero
    cons = ({'type': 'eq', 'fun': lambda x: 1 - sum(x)})

    # Bounds: each variable must lie in [0, 1]
    bnds = tuple((0, 1) for x in start_pos)
    

    Then pass the bounds and the constraint to the minimization function:

    res = minimize(getSharpe, start_pos, method='SLSQP', bounds=bnds, constraints=cons)
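
    For reference, here is a self-contained sketch of the same pattern. The objective below is a hypothetical placeholder (a simple quadratic), since getSharpe comes from the asker's own code and is not defined here:

    import numpy as np
    from scipy.optimize import minimize

    def objective(x):
        # hypothetical stand-in for getSharpe: distance from a uniform allocation
        return np.sum((x - 1/6.)**2)

    start_pos = np.ones(6) * (1/6.)
    cons = ({'type': 'eq', 'fun': lambda x: 1 - sum(x)})
    bnds = tuple((0, 1) for x in start_pos)

    res = minimize(objective, start_pos, method='SLSQP', bounds=bnds, constraints=cons)
    print(res.x, res.fun)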
    
  • 2020-12-24 15:07

    Check the scipy.optimize.minimize docstring:

    scipy.optimize.minimize(fun, x0, args=(), method='BFGS', jac=None, hess=None, hessp=None, \
                  bounds=None, constraints=(), tol=None, callback=None, options=None)
    

    What matters most in your case are the bounds. When you want to constrain each parameter to [0,1] (or (0,1)?), you need to define a bound for each variable, such as:

    bounds = ((0, 1), (0, 1), .....)
    

    Now, the other part: sum(x) == 1. There may be more elegant ways to do it, but consider this: instead of minimizing f(x), you minimize h = lambda x: f(x) + g(x), a new function that is essentially f(x) + g(x), where g(x) is a function that reaches its minimum when sum(x) == 1, such as g = lambda x: (sum(x) - 1)**2.

    The minimum of h(x) is reached when f(x) is small and the penalty g(x) is (near) zero, i.e. when sum(x) is close to 1. This is loosely in the spirit of the Lagrange multiplier method: http://en.wikipedia.org/wiki/Lagrange_multiplier
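
    As a rough sketch of that penalty idea (the objective f below is a made-up placeholder; in practice you would plug in your own function):

    import numpy as np
    from scipy.optimize import minimize

    f = lambda x: np.sum(x**2)           # placeholder objective
    g = lambda x: (np.sum(x) - 1)**2     # penalty term, minimized when sum(x) == 1
    h = lambda x: f(x) + g(x)            # combined objective

    x0 = np.ones(6) / 6.
    bounds = tuple((0, 1) for _ in x0)   # keep each variable in [0, 1]

    res = minimize(h, x0, bounds=bounds)
    print(res.x, res.x.sum())

    Note that with an unweighted penalty the constraint is only approximately satisfied; multiplying g(x) by a large constant enforces sum(x) == 1 more tightly.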
