What is the fastest way to minimize a function in python?

孤城傲影 2020-12-31 10:00

So I have the following problem to minimize: I need to find a vector w that minimizes the following function:

import numpy as np

matrix = np.array([[1.0, 1.5, -2.],
                   [0.5, 3.0, 2.5],
                   [1.0, 0.25, 0.75]])

def fct(x):
    return x.dot(matrix).dot(x)

2 Answers
  •  庸人自扰
    2020-12-31 10:27

    Based on pylang's comments, I calculated the Jacobian of my function, which leads to the following:

    def fct_deriv(x):
        # gradient of x·M·x is (M + M.T)·x; it equals 2·M·x only when M is symmetric
        return (matrix + matrix.T).dot(x)
    

    The optimization problem becomes the following

    minimize(fct, x0, method='SLSQP', jac=fct_deriv, bounds=bnds, constraints=cons)['x']
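    Before trusting a hand-written Jacobian, it is worth validating it against finite differences. This is a minimal sketch (not part of the original answer) using `scipy.optimize.check_grad` on the same objective:

```python
import numpy as np
from scipy.optimize import check_grad

matrix = np.array([[1.0, 1.5, -2.],
                   [0.5, 3.0, 2.5],
                   [1.0, 0.25, 0.75]])

def fct(x):
    return x.dot(matrix).dot(x)

def fct_deriv(x):
    # gradient of x·M·x for a general (possibly non-symmetric) M
    return (matrix + matrix.T).dot(x)

x0 = np.ones(3) / 3
# check_grad returns the 2-norm of (analytic gradient - finite-difference gradient)
err = check_grad(fct, fct_deriv, x0)
print(err)  # should be tiny if the gradient is correct
```

    A large value here (say, anything above ~1e-4) usually means the analytic gradient is wrong.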
    

    However, that solution does not let me supply the Hessian, since the SLSQP method does not accept one. Other optimization methods exist, but SLSQP is the only classic method that accepts both bounds and constraints at the same time (which is central to my optimization problem); the newer trust-constr method, added in SciPy 1.1, also accepts both.

    See below for full code:

    import numpy as np
    from scipy.optimize import minimize
    
    matrix = np.array([[1.0, 1.5, -2.],
                       [0.5, 3.0, 2.5],
                       [1.0, 0.25, 0.75]])
    
    def fct(x):
        return x.dot(matrix).dot(x)
    
    def fct_deriv(x):
        # gradient of x·M·x is (M + M.T)·x; it equals 2·M·x only when M is symmetric
        return (matrix + matrix.T).dot(x)
    
    x0 = np.ones(3) / 3
    cons = ({'type': 'eq', 'fun': lambda x: x.sum() - 1.0})
    bnds = [(0, 1)] * 3
    
    w = minimize(fct, x0, method='SLSQP', jac=fct_deriv, bounds=bnds, constraints=cons)['x']
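    Since the question asks for the fastest way to minimize, it helps to see what the analytic Jacobian buys you: without `jac`, SLSQP approximates the gradient by finite differences, which costs extra function evaluations. A minimal sketch comparing the two (not part of the original answer):

```python
import numpy as np
from scipy.optimize import minimize

matrix = np.array([[1.0, 1.5, -2.],
                   [0.5, 3.0, 2.5],
                   [1.0, 0.25, 0.75]])

def fct(x):
    return x.dot(matrix).dot(x)

def fct_deriv(x):
    # general gradient of x·M·x
    return (matrix + matrix.T).dot(x)

x0 = np.ones(3) / 3
cons = {'type': 'eq', 'fun': lambda x: x.sum() - 1.0}
bnds = [(0, 1)] * 3

# with the analytic Jacobian
res_jac = minimize(fct, x0, method='SLSQP', jac=fct_deriv,
                   bounds=bnds, constraints=cons)
# without it, SLSQP falls back on finite-difference gradients
res_fd = minimize(fct, x0, method='SLSQP',
                  bounds=bnds, constraints=cons)

print(res_jac.nfev, res_fd.nfev)  # finite differences typically need more evaluations
```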
    

    Edit (added the Jacobian of the constraint):

    cons2 = ({'type': 'eq', 'fun': lambda x: x.sum() - 1.0, 'jac': lambda x: np.ones_like(x)})
    
    w = minimize(fct, x0, method='SLSQP', jac=fct_deriv, bounds=bnds, constraints=cons2)['x']
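    If the Hessian really matters, the trust-constr method does accept one, along with bounds and constraints. A minimal sketch under the same setup, assuming SciPy ≥ 1.1 (where `Bounds` and `LinearConstraint` were introduced):

```python
import numpy as np
from scipy.optimize import minimize, Bounds, LinearConstraint

matrix = np.array([[1.0, 1.5, -2.],
                   [0.5, 3.0, 2.5],
                   [1.0, 0.25, 0.75]])

def fct(x):
    return x.dot(matrix).dot(x)

def fct_deriv(x):
    return (matrix + matrix.T).dot(x)

def fct_hess(x):
    # Hessian of x·M·x is the constant matrix M + M.T
    return matrix + matrix.T

x0 = np.ones(3) / 3
res = minimize(fct, x0, method='trust-constr',
               jac=fct_deriv, hess=fct_hess,
               bounds=Bounds(0.0, 1.0),
               # sum(x) == 1 expressed as a linear constraint with equal lower/upper bounds
               constraints=LinearConstraint(np.ones(3), 1.0, 1.0))
w = res.x
```

    Whether this is actually faster than SLSQP depends on the problem; for a tiny quadratic like this one, both finish almost instantly.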
    
