TensorBoard - Plot training and validation losses on the same graph?

不思量自难忘° 2020-12-02 18:34

Is there a way to plot both the training losses and validation losses on the same graph?

It's easy to have two separate scalar summaries, one for each of them individually, but that puts them on separate graphs.

8 Answers
  •  既然无缘
    2020-12-02 18:56

    Here is an example that creates two tf.summary.FileWriters sharing the same root directory, plus a single tf.summary.scalar used by both. At every step, evaluate the merged summary and add it to each tf.summary.FileWriter; because both writers log the same tag, TensorBoard overlays the two curves on one graph.

    import os
    
    import tqdm
    import tensorflow as tf
    
    
    def tb_test():
        sess = tf.Session()
    
        # A single scalar summary; the placeholder feeds in whichever
        # value (train or eval) we want to log at each step.
        x = tf.placeholder(dtype=tf.float32)
        summary = tf.summary.scalar('Values', x)
        merged = tf.summary.merge_all()
    
        sess.run(tf.global_variables_initializer())
    
        # Two writers under the same root directory. TensorBoard treats
        # 'train' and 'eval' as separate runs and overlays their curves.
        writer_1 = tf.summary.FileWriter(os.path.join('tb_summary', 'train'))
        writer_2 = tf.summary.FileWriter(os.path.join('tb_summary', 'eval'))
    
        for i in tqdm.tqdm(range(200)):
            # train
            summary_1 = sess.run(merged, feed_dict={x: i - 10})
            writer_1.add_summary(summary_1, i)
            # eval
            summary_2 = sess.run(merged, feed_dict={x: i + 10})
            writer_2.add_summary(summary_2, i)
    
        writer_1.close()
        writer_2.close()
    
    
    if __name__ == '__main__':
        tb_test()

    Here is the result (TensorBoard screenshot omitted): both curves appear on the same graph, where the orange line shows the evaluation stage and the blue line shows the training stage.

    There is also a very useful post by the TF team that you can refer to.
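
    The example above uses the TF1 session API. A minimal sketch of the same idea in TensorFlow 2.x (assuming TF 2; the function name `tb_test_v2` and the `logdir` argument are illustrative, not from the original answer): one tf.summary.create_file_writer per run directory, both logging the same scalar tag so TensorBoard overlays the curves.

    ```python
    import os
    import tensorflow as tf


    def tb_test_v2(logdir='tb_summary'):
        # Two writers under the same root directory; TensorBoard shows
        # 'train' and 'eval' as separate runs overlaid on one graph
        # because they share the scalar tag 'Values'.
        writer_train = tf.summary.create_file_writer(os.path.join(logdir, 'train'))
        writer_eval = tf.summary.create_file_writer(os.path.join(logdir, 'eval'))

        for step in range(200):
            with writer_train.as_default():
                tf.summary.scalar('Values', step - 10, step=step)
            with writer_eval.as_default():
                tf.summary.scalar('Values', step + 10, step=step)

        writer_train.close()
        writer_eval.close()


    if __name__ == '__main__':
        tb_test_v2()
    ```

    Then run `tensorboard --logdir tb_summary` and both runs appear in the same Scalars chart.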
