How to add variables to progress bar in Keras?

清酒与你 2020-12-04 17:09

I'd like to monitor, e.g., the learning rate during training in Keras, both in the progress bar and in TensorBoard. I figure there must be a way to specify which variables are shown.

3 Answers
  •  温柔的废话
    2020-12-04 17:34

    Another way (in fact the encouraged one) to pass custom values to TensorBoard is to subclass the keras.callbacks.TensorBoard class. This allows you to apply custom functions to compute the desired metrics and pass them directly to TensorBoard.

    Here is an example for the learning rate of the Adam optimizer:

    import numpy as np
    from keras import backend as K
    from keras.callbacks import TensorBoard

    class SubTensorBoard(TensorBoard):
        def __init__(self, *args, **kwargs):
            super(SubTensorBoard, self).__init__(*args, **kwargs)

        def lr_getter(self):
            # Read the optimizer's hyper-parameter tensors
            decay = self.model.optimizer.decay
            lr = self.model.optimizer.lr
            iters = self.model.optimizer.iterations  # the only value that changes during training
            beta_1 = self.model.optimizer.beta_1
            beta_2 = self.model.optimizer.beta_2
            # Reproduce Adam's effective learning rate at the current iteration
            lr = lr * (1. / (1. + decay * K.cast(iters, K.dtype(decay))))
            t = K.cast(iters, K.floatx()) + 1
            lr_t = lr * (K.sqrt(1. - K.pow(beta_2, t)) / (1. - K.pow(beta_1, t)))
            return np.float32(K.eval(lr_t))

        def on_epoch_end(self, epoch, logs=None):
            logs = logs or {}
            logs.update({"lr": self.lr_getter()})
            super(SubTensorBoard, self).on_epoch_end(epoch, logs)
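
    For completeness, here is a minimal usage sketch showing how such a callback could be attached to training; the model, data and log_dir below are purely illustrative:

    import numpy as np
    from keras.models import Sequential
    from keras.layers import Dense
    from keras.optimizers import Adam

    # Toy model and data, just to exercise the callback
    model = Sequential([Dense(1, input_dim=10)])
    model.compile(optimizer=Adam(lr=1e-3, decay=1e-4), loss="mse")

    x = np.random.rand(100, 10)
    y = np.random.rand(100, 1)

    # SubTensorBoard writes the computed "lr" value alongside the usual metrics
    tb = SubTensorBoard(log_dir="./logs")
    model.fit(x, y, epochs=5, callbacks=[tb])
    # Inspect the curve with: tensorboard --logdir ./logs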
