How to display progress of scipy.optimize function?

醉酒成梦  2020-12-01 05:02

I use scipy.optimize to minimize a function of 12 arguments.

I started the optimization a while ago and am still waiting for results.

Is there a way to display the progress of the optimization while it runs?

7 Answers
  •  无人及你
    2020-12-01 05:40

    Following @joel's example, there is a neat and efficient way to do a similar thing. The following example shows how to get rid of global variables, callback functions, and re-evaluating the target function multiple times.

    import numpy as np
    from scipy.optimize import fmin_bfgs

    def rosen(X, info):  # 3-variable Rosenbrock function
        res = (1.0 - X[0])**2 + 100.0 * (X[1] - X[0]**2)**2 + \
              (1.0 - X[1])**2 + 100.0 * (X[2] - X[1]**2)**2

        # display information every 100 function evaluations
        if info['Nfeval'] % 100 == 0:
            print('{0:4d}   {1: 3.6f}   {2: 3.6f}   {3: 3.6f}   {4: 3.6f}'.format(
                info['Nfeval'], X[0], X[1], X[2], res))
        info['Nfeval'] += 1
        return res

    print('{0:4s}   {1:9s}   {2:9s}   {3:9s}   {4:9s}'.format('Iter', ' X1', ' X2', ' X3', 'f(X)'))
    x0 = np.array([1.1, 1.1, 1.1], dtype=np.double)
    # the mutable dict passed through args acts as the evaluation counter,
    # so no global variable or callback function is needed
    [xopt, fopt, gopt, Bopt, func_calls, grad_calls, warnflg] = \
        fmin_bfgs(rosen,
                  x0,
                  args=({'Nfeval': 0},),
                  maxiter=1000,
                  full_output=True,
                  retall=False)
    

    This will generate output like

    Iter    X1          X2          X3         f(X)     
       0    1.100000    1.100000    1.100000    2.440000
     100    1.000000    0.999999    0.999998    0.000000
     200    1.000000    0.999999    0.999998    0.000000
     300    1.000000    0.999999    0.999998    0.000000
     400    1.000000    0.999999    0.999998    0.000000
     500    1.000000    0.999999    0.999998    0.000000
    Warning: Desired error not necessarily achieved due to precision loss.
             Current function value: 0.000000
             Iterations: 12
             Function evaluations: 502
             Gradient evaluations: 98
    

    However, there is no free lunch: here the counter tracks the number of function evaluations rather than the number of algorithmic iterations, because some algorithms may evaluate the target function multiple times in a single iteration.
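
    For comparison, if you do want the counter to track algorithmic iterations, `scipy.optimize.minimize` accepts a `callback` argument that is invoked once per iteration with the current parameter vector. A minimal sketch of that alternative (the trade-off being that printing f(X) inside the callback costs one extra function evaluation per iteration):

    import numpy as np
    from scipy.optimize import minimize

    def rosen(X):  # same 3-variable Rosenbrock function as above
        return ((1.0 - X[0])**2 + 100.0 * (X[1] - X[0]**2)**2 +
                (1.0 - X[1])**2 + 100.0 * (X[2] - X[1]**2)**2)

    Niter = [0]  # mutable counter shared with the callback

    def report(xk):
        # scipy calls this once after each algorithmic iteration
        Niter[0] += 1
        # note: evaluating rosen(xk) here just for display is extra work
        print('{0:4d}   {1: 3.6f}   {2: 3.6f}   {3: 3.6f}   {4: 3.6f}'.format(
            Niter[0], xk[0], xk[1], xk[2], rosen(xk)))

    x0 = np.array([1.1, 1.1, 1.1], dtype=np.double)
    res = minimize(rosen, x0, method='BFGS', callback=report)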
