TensorBoard Distributions and Histograms with Keras and fit_generator

There is no easy way to plug this in with a single line of code; you have to write the summaries by hand.

The good news is that it isn't difficult, and you can use Keras's own TensorBoard callback code as a reference: https://github.com/fchollet/keras/blob/master/keras/callbacks.py#L537

Basically, write a function, e.g. write_summaries(model), and call it whenever you want to write your summaries (e.g. right after your fit_generator() call).

Inside your write_summaries(model) function, use tf.summary.histogram and the other tf.summary functions to log the data you want to see in TensorBoard.

If you don't know exactly how, check the official tutorial: https://www.tensorflow.org/get_started/summaries_and_tensorboard and this great MNIST-with-summaries example: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/examples/tutorials/mnist/mnist_with_summaries.py
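A minimal sketch of what such a function might look like, assuming TensorFlow 1.x with the Keras TF backend; write_summaries, log_dir, and step are illustrative names from the suggestion above, not part of the Keras API:

import tensorflow as tf
import keras.backend as K

def write_summaries(model, log_dir, step):
    # Hand-written summaries in TF 1.x style; a sketch, not the Keras API.
    sess = K.get_session()
    writer = tf.summary.FileWriter(log_dir, sess.graph)
    # One histogram op per weight tensor in the model
    # (':' is not allowed in summary names, so it is replaced).
    summary_ops = [tf.summary.histogram(w.name.replace(':', '_'), w)
                   for layer in model.layers
                   for w in layer.weights]
    merged = tf.summary.merge(summary_ops)
    # Evaluate the ops against the current variable values and write them out.
    writer.add_summary(sess.run(merged), step)
    writer.flush()
    writer.close()

Note that this sketch builds new summary ops on every call, which grows the graph; if you log repeatedly, build the ops once and reuse them.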

I believe bartgras's explanation is superseded in more recent versions of Keras (I'm using Keras 2.2.2). To get histograms in TensorBoard, all I did was the following (where bg is a data-wrangling class that exposes a generator via bg.training_batch(); bg.validation_batch(), however, is NOT a generator):

NAME = "Foo_{}".format(datetime.now().isoformat(timespec='seconds')).replace(':', '-')

tensorboard = keras.callbacks.TensorBoard(
    log_dir="logs/{}".format(NAME),
    histogram_freq=1,
    write_images=True)

callbacks = [
    tensorboard
]

history = model.fit_generator(
    bg.training_batch(),
    validation_data=bg.validation_batch(),
    epochs=EPOCHS,
    steps_per_epoch=bg.steps_per_epoch,
    validation_steps=bg.validation_steps,
    verbose=1,
    shuffle=False,
    callbacks=callbacks)
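With this in place, the Distributions and Histograms tabs should appear once you point TensorBoard at the same directory (assuming you launch it from the directory where the logs folder is created):

tensorboard --logdir logs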