How does Adagrad work in Keras? What does self.weights mean in a Keras Optimizer?

馋奶兔 submitted on 2019-12-03 16:44:42

You are correct. For every optimizer in Keras, get_updates() implements the tensor logic for one step of updates. This function is called once per model.fit() from _make_train_function() here, which builds the training tensor function by passing the resulting update ops as updates= here. That same update rule is then applied from iteration to iteration to update the model parameters and the optimizer's own state variables.
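To make this concrete, here is a minimal, simplified sketch of an Adagrad-style get_updates() in the Keras 2.x optimizer API, paraphrasing (not reproducing) the library's implementation; learning-rate decay, gradient clipping, and weight constraints are omitted, and the class name SimpleAdagrad is my own:

```python
from keras import backend as K
from keras.optimizers import Optimizer


class SimpleAdagrad(Optimizer):
    """Sketch of an Adagrad-like optimizer (assumption: Keras 2.x Optimizer API)."""

    def __init__(self, lr=0.01, epsilon=1e-7, **kwargs):
        super(SimpleAdagrad, self).__init__(**kwargs)
        with K.name_scope(self.__class__.__name__):
            self.lr = K.variable(lr, name='lr')
            self.iterations = K.variable(0, dtype='int64', name='iterations')
        self.epsilon = epsilon

    def get_updates(self, loss, params):
        grads = self.get_gradients(loss, params)
        # One accumulator per parameter: running sum of squared gradients.
        accumulators = [K.zeros(K.int_shape(p), dtype=K.dtype(p)) for p in params]
        # Registering the accumulators as self.weights is what lets
        # get_weights()/set_weights() save and restore the optimizer state.
        self.weights = accumulators
        self.updates = [K.update_add(self.iterations, 1)]

        for p, g, a in zip(params, grads, accumulators):
            new_a = a + K.square(g)  # accumulate g^2
            self.updates.append(K.update(a, new_a))
            # Adagrad step: scale the gradient by 1 / sqrt(accumulated g^2).
            new_p = p - self.lr * g / (K.sqrt(new_a) + self.epsilon)
            self.updates.append(K.update(p, new_p))
        return self.updates
```

The list of ops returned here is exactly what _make_train_function() hands to the backend function, so each call of that function performs one Adagrad step on all parameters and accumulators.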

self.weights of an optimizer class holds its internal parameters. It is not used for training; it merely keeps the optimizer's state (a list of references to the parameter/accumulator tensors). When model.save is called, these are also saved via get_weights() here, and they are loaded back via set_weights() here when the model is loaded.
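A quick way to see this state being saved and restored (a sketch assuming the Keras 2.x HDF5 format with its default include_optimizer=True, and an arbitrary toy model/file name):

```python
import numpy as np
from keras.models import Sequential, load_model
from keras.layers import Dense

# Tiny model just to create some optimizer state; shapes are arbitrary.
model = Sequential([Dense(4, input_shape=(8,)), Dense(1)])
model.compile(optimizer='adagrad', loss='mse')
model.fit(np.random.rand(32, 8), np.random.rand(32, 1), epochs=1, verbose=0)

# The optimizer's self.weights (the Adagrad accumulators) are readable here...
state_before = model.optimizer.get_weights()

# ...and they travel with the saved model, so training can resume where it left off.
model.save('model_with_optimizer_state.h5')
restored = load_model('model_with_optimizer_state.h5')
state_after = restored.optimizer.get_weights()

assert all(np.allclose(a, b) for a, b in zip(state_before, state_after))
```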
