tensorboard

Creating log directory in tensorboard

十年热恋 Submitted on 2019-12-20 17:26:48
Question: I am trying to learn how to use TensorBoard and I would like to have it run in my program, but I do not understand how to create a log directory. These are the lines I have for running TensorBoard: summary_writer = tf.train.SummaryWriter('/tensorflow/logdir', sess.graph_def) tensorboard --logdir=tensorflow/logdir The error message that I got was "Cannot assign to operator". Answer 1: This line needs to be in your code (the Python script), as it seems you put it: summary_writer = tf.train.SummaryWriter('
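
A minimal sketch of the split the answer describes, assuming the TF 1.x-era API the question uses (tf.train.SummaryWriter was later renamed tf.summary.FileWriter); the log directory path is illustrative:

    # train.py -- the SummaryWriter call belongs in the Python script
    import tensorflow as tf

    sess = tf.Session()
    # Writes the graph (and later summaries) into the log directory;
    # the directory is created automatically if it does not exist.
    summary_writer = tf.train.SummaryWriter('/tensorflow/logdir', sess.graph_def)

    # The tensorboard command, by contrast, is run in a shell, not inside Python:
    #   tensorboard --logdir=/tensorflow/logdir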

Tensorboard- superimpose 2 plots

拟墨画扇 Submitted on 2019-12-19 19:52:07
Question: In TensorBoard, I would like to superimpose 2 plots on the same graph (the training and validation losses of a neural network). I can see 2 separate plots, but not one plot with 2 superimposed curves; otherwise, I get one plot in zigzag. How can I do this? Answer 1: It is possible to superimpose two plots in Tensorboard. You'll have to satisfy both of the following: Create two separate tf.train.SummaryWriter objects so that they write to two different folders. Create two summaries (e.g. tf.scalar_summary ) with
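
A sketch of that two-writer pattern, assuming the TF 1.x-era API the answer names (tf.train.SummaryWriter, tf.scalar_summary); the folder names and loss values are illustrative:

    import tensorflow as tf

    loss = tf.placeholder(tf.float32, name='loss')
    # Both curves use the same tag, so TensorBoard overlays them in one chart.
    loss_summary = tf.scalar_summary('loss', loss)

    sess = tf.Session()
    train_writer = tf.train.SummaryWriter('logs/train')
    valid_writer = tf.train.SummaryWriter('logs/valid')

    for step in range(100):
        train_loss, valid_loss = 1.0 / (step + 1), 1.2 / (step + 1)  # dummy values
        train_writer.add_summary(sess.run(loss_summary, {loss: train_loss}), step)
        valid_writer.add_summary(sess.run(loss_summary, {loss: valid_loss}), step)

Pointing tensorboard --logdir at the parent logs/ directory then shows both runs as separate curves on the same chart.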

Tensorflow summary merge error: Shape [-1,784] has negative dimensions

我们两清 Submitted on 2019-12-19 06:08:07
Question: I am trying to get a summary of the training process of the neural net below. import tensorflow as tf import numpy as np from tensorflow.examples.tutorials.mnist import input_data mnist = input_data.read_data_sets(".\MNIST",one_hot=True) # Create the model def train_and_test(hidden1,hidden2, learning_rate, epochs, batch_size): with tf.name_scope("first_layer"): input_data = tf.placeholder(tf.float32, [batch_size, 784], name = "input") weights1 = tf.Variable( tf.random_normal(shape =[784, hidden1]
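
The answer is cut off above; as a hedged illustration only, this "negative dimensions" message in TF 1.x usually means a placeholder with an unknown (None / -1) batch dimension was evaluated without being fed, which often happens when tf.summary.merge_all() is run without the same feed_dict as the training op. A minimal sketch of the safe pattern (layer sizes and names are illustrative, not taken from the question):

    import tensorflow as tf
    from tensorflow.examples.tutorials.mnist import input_data

    mnist = input_data.read_data_sets("./MNIST", one_hot=True)

    x = tf.placeholder(tf.float32, [None, 784], name="input")
    y_ = tf.placeholder(tf.float32, [None, 10], name="labels")

    logits = tf.layers.dense(x, 10)
    loss = tf.losses.softmax_cross_entropy(y_, logits)
    tf.summary.scalar("loss", loss)

    merged = tf.summary.merge_all()
    train_op = tf.train.GradientDescentOptimizer(0.5).minimize(loss)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        writer = tf.summary.FileWriter("./logs", sess.graph)
        for step in range(100):
            xs, ys = mnist.train.next_batch(50)
            # Evaluate the merged summary with the same feed_dict as the
            # training op, so the placeholder is never left unfed.
            summary, _ = sess.run([merged, train_op], feed_dict={x: xs, y_: ys})
            writer.add_summary(summary, step)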

TensorFlow 2.0 Keras: How to write image summaries for TensorBoard

痞子三分冷 Submitted on 2019-12-18 03:46:05
Question: I'm trying to set up an image recognition CNN with TensorFlow 2.0. To be able to analyze my image augmentation I'd like to see the images I feed into the network in TensorBoard. Unfortunately, I cannot figure out how to do this with TensorFlow 2.0 and Keras, and I also didn't really find documentation on this. For simplicity, I'm showing the code of an MNIST example. How would I add the image summary here? import tensorflow as tf (x_train, y_train), _ = tf.keras.datasets.mnist.load_data() def
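
No answer body survives the truncation; a hedged sketch of one common TF 2.x route is to open a tf.summary file writer and log a batch of images with tf.summary.image (the log directory and tag are illustrative):

    import tensorflow as tf

    (x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
    # Add a channel axis and scale to [0, 1]: shape becomes (N, 28, 28, 1).
    x_train = (x_train[..., None] / 255.0).astype("float32")

    file_writer = tf.summary.create_file_writer("logs/images")
    with file_writer.as_default():
        # tf.summary.image expects a [k, height, width, channels] batch.
        tf.summary.image("training_samples", x_train[:25], step=0, max_outputs=25)

The same writer can be reused from a custom Keras callback if the (augmented) batches should be logged once per epoch.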

Tensorboard - visualize weights of LSTM

大城市里の小女人 Submitted on 2019-12-18 03:38:09
Question: I am using several LSTM layers to form a deep recurrent neural network. I would like to monitor the weights of each LSTM layer during training. However, I couldn't find out how to attach summaries of the LSTM layer weights to TensorBoard. Any suggestions on how this can be done? The code is as follows: cells = [] with tf.name_scope("cell_1"): cell1 = tf.contrib.rnn.LSTMCell(self.embd_size, state_is_tuple=True, initializer=self.initializer) cell1 = tf.contrib.rnn.DropoutWrapper(cell1, input
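
A hedged sketch (TF 1.x / tf.contrib era, matching the question's API): once the RNN has actually been built, its kernels and biases appear in tf.trainable_variables(), so a histogram summary can be attached to each one. The layer sizes and naming here are illustrative:

    import tensorflow as tf

    inputs = tf.placeholder(tf.float32, [None, 20, 64])  # (batch, time, features)

    cells = [tf.contrib.rnn.LSTMCell(128, state_is_tuple=True) for _ in range(2)]
    stacked = tf.contrib.rnn.MultiRNNCell(cells, state_is_tuple=True)
    outputs, state = tf.nn.dynamic_rnn(stacked, inputs, dtype=tf.float32)

    # The LSTM variables only exist after dynamic_rnn has created them.
    for var in tf.trainable_variables():
        tf.summary.histogram(var.name.replace(':', '_'), var)

    merged = tf.summary.merge_all()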

How do I add an arbitrary value to a TensorFlow summary?

↘锁芯ラ Submitted on 2019-12-17 19:20:22
Question: In order to log a simple value val to a TensorBoard summary I need to: val = 5 test_writer.add_summary(sess.run(tf.scalar_summary('test', val)), global_step) Is sess.run(tf.scalar_summary('test', val)) really necessary to get val added as a summary? Answer 1: You can construct the summary by yourself, like from tensorflow.core.framework import summary_pb2 value = summary_pb2.Summary.Value(tag="Accuracy", simple_value=0.95) summary = summary_pb2.Summary(value=[value]) you can then add summary using
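
Completing that pattern as a sketch, the hand-built protobuf is passed straight to the writer; no session run is involved (the writer, tag, and step here are illustrative):

    import tensorflow as tf
    from tensorflow.core.framework import summary_pb2

    test_writer = tf.summary.FileWriter('logs/test')

    # Build the Summary proto directly and hand it to the writer.
    value = summary_pb2.Summary.Value(tag="Accuracy", simple_value=0.95)
    summary = summary_pb2.Summary(value=[value])
    test_writer.add_summary(summary, global_step=10)
    test_writer.flush()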

View Tensorboard on Docker on Google Cloud

坚强是说给别人听的谎言 Submitted on 2019-12-17 17:54:52
Question: I am trying to display TensorBoard from TensorFlow on Docker on Google Cloud. http://tensorflow.org/how_tos/summaries_and_tensorboard/index.md tensorboard --logdir ./ I have Apache running on Google Cloud (it may be in my first container "ai-unicorn"; Docker made its own container "docker-playground"). I can see the default page from Google Cloud at http://104.197.119.57/ . I start TensorBoard on Google Cloud like this: root@6cf64fd299f0:/# tensorboard --logdir ./ Starting TensorBoard on port

Tensorflow: How to Display Custom Images in Tensorboard (e.g. Matplotlib Plots)

守給你的承諾、 Submitted on 2019-12-17 17:44:16
Question: The Image Dashboard section of the Tensorboard ReadMe says: Since the image dashboard supports arbitrary pngs, you can use this to embed custom visualizations (e.g. matplotlib scatterplots) into TensorBoard. I see how a pyplot image could be written to file, read back in as a tensor, and then used with tf.image_summary() to write it to TensorBoard, but this statement from the readme suggests there is a more direct way. Is there? If so, is there any further documentation and/or examples of how
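
No answer body survives the truncation; as a hedged sketch, the more direct route usually cited is to render the pyplot figure into an in-memory PNG and wrap the bytes in a Summary.Image proto, skipping both the intermediate file and the tensor round-trip (tag and log directory are illustrative, TF 1.x API):

    import io
    import matplotlib.pyplot as plt
    import tensorflow as tf

    fig = plt.figure()
    plt.scatter([1, 2, 3], [4, 1, 3])
    buf = io.BytesIO()
    fig.savefig(buf, format='png')   # encode the figure as PNG bytes in memory
    buf.seek(0)

    w, h = fig.canvas.get_width_height()
    image = tf.Summary.Image(encoded_image_string=buf.getvalue(), height=h, width=w)
    summary = tf.Summary(value=[tf.Summary.Value(tag='my_scatterplot', image=image)])

    writer = tf.summary.FileWriter('logs/plots')
    writer.add_summary(summary, global_step=0)
    writer.close()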

How do I use the Tensorboard callback of Keras?

不想你离开。 Submitted on 2019-12-17 15:00:57
Question: I have built a neural network with Keras. I would like to visualize its data with TensorBoard, so I have used: keras.callbacks.TensorBoard(log_dir='/Graph', histogram_freq=0, write_graph=True, write_images=True) as explained in keras.io. When I run the callback I get <keras.callbacks.TensorBoard at 0x7f9abb3898> , but I don't get any file in my folder "Graph". Is there something wrong in how I have used this callback? Answer 1: keras.callbacks.TensorBoard(log_dir='./Graph', histogram_freq=0,
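
A hedged sketch of the intended usage: constructing the callback only creates an object; it writes logs when it is passed to model.fit. The model, data, and log directory here are illustrative:

    import keras
    import numpy as np

    model = keras.models.Sequential([
        keras.layers.Dense(10, activation='relu', input_shape=(4,)),
        keras.layers.Dense(1),
    ])
    model.compile(optimizer='sgd', loss='mse')

    tb_callback = keras.callbacks.TensorBoard(log_dir='./Graph', histogram_freq=0,
                                              write_graph=True, write_images=True)

    x, y = np.random.rand(100, 4), np.random.rand(100, 1)
    model.fit(x, y, epochs=2, callbacks=[tb_callback])
    # Afterwards, launch TensorBoard from a shell:  tensorboard --logdir ./Graph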