Normalization for optimization in Python

好久不见. Submitted on 2019-12-10 23:54:29

Question


During optimization, it is often helpful to normalize the input parameters so that they are on the same order of magnitude; convergence can then be much better. For example, if we want to minimize f(x), where a reasonable initial guess is x0 = [1e3, 1e-4], it might be helpful to normalize x0[0] and x0[1] so that they are about the same order of magnitude (often O(1)).
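
For concreteness, here is a minimal sketch of what I mean by normalizing manually (the objective f below is just a placeholder with the same scales as x0, not my real problem):

import numpy as np
from scipy.optimize import minimize

# Placeholder objective whose two parameters differ by ~7 orders of magnitude.
def f(x):
    return (x[0] - 1e3) ** 2 + (x[1] - 1e-4) ** 2

x0 = np.array([1e3, 1e-4])
scale = np.abs(x0)                      # characteristic magnitude of each parameter

# Let the optimizer work with z = x / scale, which is O(1) in every component,
# and undo the scaling inside the wrapped objective.
result = minimize(lambda z: f(z * scale), x0 / scale, method='L-BFGS-B')
x_opt = result.x * scale                # map the solution back to the original units
print(x_opt)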

My question is: I have been using scipy.optimize, specifically the L-BFGS-B algorithm. Do I need to normalize manually by writing a wrapper function like the one above, or does the algorithm already do it for me?

Thank you!


Answer 1:


I wrote a small program to test your question.

To summarize: if the parameters are within a couple of orders of magnitude of each other, the algorithm handles it (it converges successfully and does not need significantly more function evaluations).

On the other hand, once the scale factor reaches roughly 100000, the algorithm starts to break down: it either errors out with ABNORMAL_TERMINATION_IN_LNSRCH or, in a few runs, reports success at a point far from the true minimum.

Here is the program:

import scipy.optimize


def test_lbfgsb():
    def surface(x):
        # Paraboloid whose second coordinate is scaled by `factor`;
        # the true minimum is at (3, 4 / factor).
        return (x[0] - 3.0) ** 2 + (factor * x[1] - 4.0) ** 2

    # `factor` is captured by the closure above and reassigned in the loop below.
    factor = None
    for exponent in range(0, 9):
        magnitude = 10 ** exponent
        # Sweep factors 1, 3, 10, 30, ... up to 3e8.
        factors = [x * magnitude for x in [1, 3]]
        for factor in factors:
            optimization_result = scipy.optimize.minimize(surface, [0, 0], method='l-bfgs-b')
            desc = 'at factor %d' % factor
            if not optimization_result.success:
                print('%s FAILURE (%s)' % (desc, optimization_result.message))
            else:
                print('%s, found min at %s, after %d evaluations' % (
                    desc, optimization_result.x, optimization_result.nfev))


test_lbfgsb()

Here is its output:

at factor 1, found min at [ 3.00000048  4.00000013], after 12 evaluations
at factor 3, found min at [ 2.99999958  1.33333351], after 36 evaluations
at factor 10, found min at [ 3.00000059  0.39999999], after 28 evaluations
at factor 30, found min at [ 2.99999994  0.13333333], after 36 evaluations
at factor 100, found min at [ 3.00000013  0.03999999], after 40 evaluations
at factor 300, found min at [ 3.          0.01333333], after 52 evaluations
at factor 1000, found min at [ 3.          0.00399999], after 64 evaluations
at factor 3000, found min at [  3.00000006e+00   1.33332833e-03], after 72 evaluations
at factor 10000, found min at [  3.00002680e+00   3.99998309e-04], after 92 evaluations
at factor 30000, found min at [  3.00000002e+00   1.33328333e-04], after 104 evaluations
at factor 100000 FAILURE (ABNORMAL_TERMINATION_IN_LNSRCH)
at factor 300000, found min at [  3.00013621e+00   1.33292531e-05], after 196 evaluations
at factor 1000000, found min at [  3.00000348e-12   3.99500004e-06], after 60 evaluations
at factor 3000000 FAILURE (ABNORMAL_TERMINATION_IN_LNSRCH)
at factor 10000000 FAILURE (ABNORMAL_TERMINATION_IN_LNSRCH)
at factor 30000000 FAILURE (ABNORMAL_TERMINATION_IN_LNSRCH)
at factor 100000000 FAILURE (ABNORMAL_TERMINATION_IN_LNSRCH)
at factor 300000000, found min at [  3.33333330e-17   8.33333350e-09], after 72 evaluations
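
For the cases that fail above, the kind of manual normalization asked about in the question is a reasonable remedy: rescale the badly scaled variable so that the optimizer only ever sees O(1) parameters, then undo the scaling afterwards. Below is a minimal sketch using one of the failing factors (1e8); the function names and wrapper structure are mine, not part of the test program above. After the change of variables the problem is essentially the factor-1 case from the table, so it should converge in a handful of evaluations.

import scipy.optimize

def test_lbfgsb_rescaled(factor=1e8):
    def surface(x):
        # Same badly scaled paraboloid as above; the true minimum is at (3, 4 / factor).
        return (x[0] - 3.0) ** 2 + (factor * x[1] - 4.0) ** 2

    # Optimize over y = (x[0], factor * x[1]) so both variables are O(1),
    # and undo the scaling inside the wrapper.
    def scaled_surface(y):
        return surface([y[0], y[1] / factor])

    result = scipy.optimize.minimize(scaled_surface, [0, 0], method='l-bfgs-b')
    x_opt = [result.x[0], result.x[1] / factor]   # map back to the original units
    print('success=%s, found min at %s, after %d evaluations'
          % (result.success, x_opt, result.nfev))

test_lbfgsb_rescaled()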


Source: https://stackoverflow.com/questions/21369139/normalization-for-optimization-in-python
