How to initialise only optimizer variables in Tensorflow?

别跟我提以往 2020-12-13 04:27

I want to use MomentumOptimizer in Tensorflow. However, since this optimizer uses some internal variables, attempting to run it without initializing those variables raises a FailedPreconditionError.
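For reference, a minimal reproduction of the failure (a sketch assuming TF1 graph mode; written against `tf.compat.v1` so it also runs on a TF2 install — on plain TF1 use `import tensorflow as tf` instead):

```python
import tensorflow.compat.v1 as tf  # on TF1: import tensorflow as tf
tf.disable_v2_behavior()

x = tf.Variable(1.0)
loss = tf.square(x)
opt = tf.train.MomentumOptimizer(learning_rate=0.1, momentum=0.9)
train_op = opt.minimize(loss)  # this also creates the "momentum" slot variable

with tf.Session() as sess:
    sess.run(x.initializer)  # initialize only the model variable, not the slot
    try:
        sess.run(train_op)
        failed = False
    except tf.errors.FailedPreconditionError:
        failed = True  # the uninitialized momentum slot triggers this
```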

6 Answers
  •  一整个雨季
    2020-12-13 05:17

    Building off of LucasB's answer about AdamOptimizer, this function takes an AdamOptimizer instance adam_opt whose Variables have already been created (by calling either adam_opt.minimize(loss, var_list=var_list) or adam_opt.apply_gradients(zip(grads, var_list))). It returns an Op that, when run, re-initializes the optimizer's slot variables for the passed variables, as well as the global beta-power accumulators.

    def adam_variables_initializer(adam_opt, var_list):
        # Collect the Adam slot variables ("m" and "v") for each variable;
        # get_slot can return None for variables the optimizer never touched,
        # so filter the slots (not the input variables) for None.
        adam_vars = [adam_opt.get_slot(var, name)
                     for name in adam_opt.get_slot_names()
                     for var in var_list]
        adam_vars = [v for v in adam_vars if v is not None]
        # Also reset the global beta1^t / beta2^t power accumulators.
        adam_vars.extend(list(adam_opt._get_beta_accumulators()))
        return tf.variables_initializer(adam_vars)
    

    e.g.:

    opt = tf.train.AdamOptimizer(learning_rate=1e-4)
    fit_op = opt.minimize(loss, var_list=var_list)
    reset_opt_vars = adam_variables_initializer(opt, var_list)
    
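    Not part of the original answer — an end-to-end sketch of the above with a made-up variable w and loss, showing that running reset_opt_vars zeroes the optimizer state while w keeps its trained value (again written against `tf.compat.v1`; note that _get_beta_accumulators is a private TF1 API and may change):

```python
import tensorflow.compat.v1 as tf  # on TF1: import tensorflow as tf
tf.disable_v2_behavior()

def adam_variables_initializer(adam_opt, var_list):
    # Same function as in the answer above, with None slots filtered out.
    adam_vars = [adam_opt.get_slot(var, name)
                 for name in adam_opt.get_slot_names()
                 for var in var_list]
    adam_vars = [v for v in adam_vars if v is not None]
    adam_vars.extend(list(adam_opt._get_beta_accumulators()))
    return tf.variables_initializer(adam_vars)

w = tf.Variable([2.0, -3.0])           # hypothetical model variable
loss = tf.reduce_sum(tf.square(w))     # hypothetical loss
var_list = [w]

opt = tf.train.AdamOptimizer(learning_rate=1e-1)
fit_op = opt.minimize(loss, var_list=var_list)
reset_opt_vars = adam_variables_initializer(opt, var_list)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(fit_op)                            # one training step
    m_before = sess.run(opt.get_slot(w, "m"))   # first-moment slot is now nonzero
    sess.run(reset_opt_vars)                    # reset only the optimizer state
    m_after = sess.run(opt.get_slot(w, "m"))    # slot is back to zeros
    w_val = sess.run(w)                         # w keeps its trained value
```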
