I want to use MomentumOptimizer in Tensorflow. However, since this optimizer uses some internal variables, attempting to use it without initializing those variables yields an error.
Building off of LucasB's answer about AdamOptimizer, this function takes an AdamOptimizer instance adam_opt whose Variables have already been created (i.e., one of these two has been called: adam_opt.minimize(loss, var_list=var_list) or adam_opt.apply_gradients(zip(grads, var_list))). The function returns an Op that, when run, re-initializes the optimizer's slot variables for the passed variables, as well as the global counting state (the beta power accumulators).
import tensorflow as tf

def adam_variables_initializer(adam_opt, var_list):
    # Collect Adam's slot variables ("m" and "v") for each variable;
    # get_slot can return None, so filter those entries out.
    adam_vars = [adam_opt.get_slot(var, name)
                 for name in adam_opt.get_slot_names()
                 for var in var_list]
    adam_vars = [v for v in adam_vars if v is not None]
    # Also reset the beta1/beta2 power accumulators (the global counting state).
    adam_vars.extend(list(adam_opt._get_beta_accumulators()))
    return tf.variables_initializer(adam_vars)
e.g.:
opt = tf.train.AdamOptimizer(learning_rate=1e-4)
fit_op = opt.minimize(loss, var_list=var_list)
reset_opt_vars = adam_variables_initializer(opt, var_list)
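To actually reset the state, run the Op in your session. A minimal sketch, assuming a tf.Session named sess (the name is illustrative):

sess = tf.Session()
sess.run(tf.global_variables_initializer())
# ... run fit_op for some training steps ...
sess.run(reset_opt_vars)  # Adam's slots and beta accumulators are reset

The same pattern should carry over to the MomentumOptimizer from the question: it keeps a "momentum" slot per variable but, unlike Adam, has no beta accumulators, so collecting the slots alone should suffice. A sketch, with momentum_variables_initializer being a hypothetical helper named here for illustration:

def momentum_variables_initializer(momentum_opt, var_list):
    # MomentumOptimizer keeps a single slot ("momentum") per variable
    # and no extra non-slot state, so the slots are all we need to reset.
    mom_vars = [momentum_opt.get_slot(var, name)
                for name in momentum_opt.get_slot_names()
                for var in var_list]
    return tf.variables_initializer([v for v in mom_vars if v is not None])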