tensorboard

TensorBoard Embedding Example?

I'm looking for a TensorBoard embedding example, with iris data for instance, similar to the embedding projector at http://projector.tensorflow.org/. Unfortunately I couldn't find one; there is only a little information about how to do it at https://www.tensorflow.org/how_tos/embedding_viz/. Does anyone know of a basic tutorial for this functionality?

Basics:

1) Set up a 2D tensor variable that holds your embedding(s):

embedding_var = tf.Variable(....)

2) Periodically save your embeddings in a LOG_DIR.

3) Associate metadata with your embedding.

It sounds like you want to get the Visualization section with …
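A minimal sketch of those three steps using the TF 1.x projector API; the variable name, shapes, log directory, and metadata.tsv file below are placeholders, not part of the original answer:

import os
import tensorflow as tf
from tensorflow.contrib.tensorboard.plugins import projector

LOG_DIR = '/tmp/emb_logs'  # placeholder log directory

# 1) a 2D tensor holding the embeddings (e.g. 150 iris samples x 4 features)
embedding_var = tf.Variable(tf.random_normal([150, 4]), name='iris_embedding')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())

    # 2) periodically save the embedding variable with a Saver
    saver = tf.train.Saver([embedding_var])
    saver.save(sess, os.path.join(LOG_DIR, 'model.ckpt'), 1)

    # 3) associate metadata (one label per row, assumed to exist) with it
    summary_writer = tf.summary.FileWriter(LOG_DIR)
    config = projector.ProjectorConfig()
    embedding = config.embeddings.add()
    embedding.tensor_name = embedding_var.name
    embedding.metadata_path = os.path.join(LOG_DIR, 'metadata.tsv')
    projector.visualize_embeddings(summary_writer, config)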

Is it possible to visualize a tensorflow graph without a training op?

Question: I know how to visualize a TensorFlow graph with TensorBoard after training. Now, is it possible to visualize just the forward part of the graph, i.e. with no training operator defined? The reason I'm asking is that I'm getting this error:

No gradients provided for any variable, check your graph for ops that do not support gradients, between variables [ ... list of model variables here ... ] and loss Tensor("Mean:0", dtype=float32).

I'd like to inspect the graph to find out where the …
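A FileWriter can serialize whatever graph has been built so far, so no optimizer is needed just to inspect it. A minimal sketch under that assumption (the layers and log directory are illustrative placeholders):

import tensorflow as tf

# build only the forward pass
x = tf.placeholder(tf.float32, [None, 4], name='x')
logits = tf.layers.dense(x, 3, name='logits')
loss = tf.reduce_mean(tf.square(logits), name='Mean')

# write the graph as it exists right now -- no training op required
writer = tf.summary.FileWriter('/tmp/fwd_graph', tf.get_default_graph())
writer.close()
# then: tensorboard --logdir=/tmp/fwd_graph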

How to display the average of multiple runs on tensorboard

Is there a way to display the average of multiple different runs on TensorBoard? I can only see them on the same graph (by passing the paths of the different runs), but I want to see their average on the graph.

Please follow issue 376 to see progress on this. It's an active feature request with some progress in the last month, but as of now there's not a way to do what you want. Yet.

As @dga mentioned, this is not implemented yet. Here is some code that uses EventAccumulator to combine scalar TensorFlow summary values. This can be extended to accommodate the other summary types.

import os
from …
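In the same spirit as the truncated snippet above, here is a hedged sketch (not the original answer's code) that reads one scalar tag from several run directories with EventAccumulator, averages the values per step, and writes the result back out as a new "average" run; the tag name and paths are placeholders:

import numpy as np
import tensorflow as tf
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

def average_runs(run_dirs, tag, out_dir):
    # collect {step: [values across runs]}
    values_by_step = {}
    for run_dir in run_dirs:
        acc = EventAccumulator(run_dir)
        acc.Reload()
        for event in acc.Scalars(tag):
            values_by_step.setdefault(event.step, []).append(event.value)

    # write the per-step mean as a new run that TensorBoard can plot
    writer = tf.summary.FileWriter(out_dir)
    for step in sorted(values_by_step):
        mean = float(np.mean(values_by_step[step]))
        summary = tf.Summary(value=[tf.Summary.Value(tag=tag, simple_value=mean)])
        writer.add_summary(summary, step)
    writer.close()

average_runs(['logs/run1', 'logs/run2'], 'loss', 'logs/average')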

Install Tensorflow with Quantization Support

This is a follow-up to another question of mine: Error with 8-bit Quantization in Tensorflow. Basically, I would like to install TensorFlow with 8-bit quantization support. Currently, I have TensorFlow 0.9 installed via pip on a CentOS 7 machine (without GPU support). I could compile and run the code given in Pete Warden's blog post, but I can't import the functions given in Pete Warden's reply. I would like to add the quantization support, but I couldn't find any details about the quantization part in the TensorFlow documentation either. Can anybody share the details on how to …
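In that era the quantization tooling shipped as a source-only bazel target, so a plain pip install was generally not enough. A sketch of the build steps from Pete Warden's blog, assuming a TensorFlow source checkout; the target path moved between versions, so treat it as version-dependent:

# build the quantization tool from a TensorFlow source checkout
bazel build tensorflow/tools/quantization:quantize_graph

# convert a frozen float graph to an 8-bit one
bazel-bin/tensorflow/tools/quantization/quantize_graph \
  --input=frozen_graph.pb \
  --output_node_names="softmax" \
  --output=quantized_graph.pb \
  --mode=eightbit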

How to create a Tensorflow Tensorboard Empty Graph

I launch TensorBoard with tensorboard --logdir=/home/vagrant/notebook. At tensorboard:6006 > graph, it says:

No graph definition files were found. To store a graph, create a tf.python.training.summary_io.SummaryWriter and pass the graph either via the constructor, or by calling its add_graph() method.

import tensorflow as tf
sess = tf.Session()
writer = tf.python.training.summary_io.SummaryWriter("/home/vagrant/notebook", sess.graph_def)

However the page is still empty. How can I start playing with TensorBoard? (Screenshot: current TensorBoard result.) Wanted: an empty, editable graph that nodes can be added to. Update: …
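Worth noting that tf.python.training.summary_io.SummaryWriter is an old internal path; in later TF 1.x releases the public equivalent is tf.summary.FileWriter, which takes the graph itself rather than graph_def. A minimal sketch for getting some graph to display (TensorBoard displays graphs, it does not edit them):

import tensorflow as tf

# a trivial graph so there is something to display
a = tf.constant(1.0, name='a')
b = tf.constant(2.0, name='b')
c = tf.add(a, b, name='c')

with tf.Session() as sess:
    # pass the graph itself (sess.graph), not graph_def
    writer = tf.summary.FileWriter('/home/vagrant/notebook', sess.graph)
    writer.close()
# then: tensorboard --logdir=/home/vagrant/notebook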

How can I use tensorboard with tf.estimator.Estimator

Question: I am considering moving my code base to tf.estimator.Estimator, but I cannot find an example of how to use it in combination with TensorBoard summaries. MWE:

import numpy as np
import tensorflow as tf

tf.logging.set_verbosity(tf.logging.INFO)

# Declare list of features, we only have one real-valued feature
def model(features, labels, mode):
    # Build a linear model and predict values
    W = tf.get_variable("W", [1], dtype=tf.float64)
    b = tf.get_variable("b", [1], dtype=tf.float64)
    y = W*features[ …
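For context, tf.estimator.Estimator writes its own event files to model_dir, and any tf.summary.* ops created inside the model_fn are merged and logged automatically during training. A hedged sketch completing a model_fn along those lines; the feature key "x" and the model_dir are assumptions, not from the original MWE:

import tensorflow as tf

def model_fn(features, labels, mode):
    W = tf.get_variable("W", [1], dtype=tf.float64)
    b = tf.get_variable("b", [1], dtype=tf.float64)
    y = W * features["x"] + b
    loss = tf.reduce_mean(tf.square(y - labels))
    tf.summary.scalar("my_loss", loss)  # picked up by the Estimator automatically
    train_op = tf.train.GradientDescentOptimizer(0.01).minimize(
        loss, global_step=tf.train.get_global_step())
    return tf.estimator.EstimatorSpec(mode, loss=loss, train_op=train_op)

# summaries land in model_dir; point tensorboard --logdir there
estimator = tf.estimator.Estimator(model_fn=model_fn, model_dir="/tmp/estimator_logs")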

Tensorboard scalars and graphs duplicated

I'm using TensorBoard to visualize network metrics and the graph. I create a session with sess = tf.InteractiveSession() and build the graph in a Jupyter notebook. In the graph, I include two summary scalars:

with tf.variable_scope('summary') as scope:
    loss_summary = tf.summary.scalar('Loss', cross_entropy)
    train_accuracy_summary = tf.summary.scalar('Train_accuracy', accuracy)

I then create summary_writer = tf.summary.FileWriter(logdir, sess.graph) and run:

_, loss_sum, train_accuracy_sum = sess.run([...], feed_dict=feed_dict)

I write the metrics:

summary_writer.add_summary(loss_sum, i)
summary_writer.add …
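A likely cause in notebooks (a sketch of the usual fix, assuming cells are re-run): each re-execution adds another copy of the summary ops and graph nodes to the same default graph, which shows up in TensorBoard as duplicated scalars and graphs. Resetting before rebuilding avoids it:

import tensorflow as tf

# re-running a cell keeps adding nodes to the same default graph;
# reset it before rebuilding the model and summary ops
tf.reset_default_graph()

sess = tf.InteractiveSession()
# ... rebuild the model and the summary ops here ...

# optionally run one merged op instead of each summary separately
merged = tf.summary.merge_all()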

How to visualize a tensor summary in tensorboard

I'm trying to visualize a tensor summary in TensorBoard. However, I can't see the tensor summary at all in the board. Here is my code:

out = tf.strided_slice(logits, begin=[self.args.uttWindowSize-1, 0],
                       end=[-self.args.uttWindowSize+1, self.args.numClasses],
                       strides=[1, 1], name='softmax_truncated')
tf.summary.tensor_summary('softmax_input', out)

where out is a multi-dimensional tensor. I guess there must be something wrong with my code. Probably I used the tensor_summary function incorrectly.

Michael Gygli: What you do is create a summary op, but you don't invoke it and don't write the …
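To finish the thought from that answer: a summary op only produces data once it is actually run and its output is handed to a writer. A minimal sketch of the missing steps; logdir, sess, feed_dict, and step are assumed to exist in the surrounding code:

# keep a handle on the op, then evaluate and write it
summary_op = tf.summary.tensor_summary('softmax_input', out)
writer = tf.summary.FileWriter(logdir, sess.graph)

summary_str = sess.run(summary_op, feed_dict=feed_dict)
writer.add_summary(summary_str, global_step=step)
writer.flush()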

Tensorboard - visualize weights of LSTM

I am using several LSTM layers to form a deep recurrent neural network. I would like to monitor the weights of each LSTM layer during training. However, I couldn't find out how to attach summaries of the LSTM layer weights to TensorBoard. Any suggestions on how this can be done? The code is as follows:

cells = []
with tf.name_scope("cell_1"):
    cell1 = tf.contrib.rnn.LSTMCell(self.embd_size, state_is_tuple=True,
                                    initializer=self.initializer)
    cell1 = tf.contrib.rnn.DropoutWrapper(cell1,
                                          input_keep_prob=self.input_dropout,
                                          output_keep_prob=self.output_dropout,
                                          state_keep_prob=self.recurrent …
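One common approach (a sketch, not necessarily what the original answer proposed): the LSTM kernels and biases only exist after the cell has been built, e.g. by tf.nn.dynamic_rnn, at which point they appear among the trainable variables and can be histogrammed:

# after the RNN has been built (e.g. via tf.nn.dynamic_rnn), the LSTM
# kernels/biases exist as trainable variables and can be summarized
for var in tf.trainable_variables():
    if 'cell_1' in var.name:  # filter to the scope of interest
        # ':' is not allowed in summary names, hence the replace
        tf.summary.histogram(var.name.replace(':', '_'), var)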

Keras - Save image embedding of the mnist data set

I've written the following simple MLP network for the MNIST DB:

from __future__ import print_function
import keras
from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Dense, Dropout
from keras import callbacks

batch_size = 100
num_classes = 10
epochs = 20

tb = callbacks.TensorBoard(log_dir='/Users/shlomi.shwartz/tensorflow/notebooks/logs/minist',
                           histogram_freq=10, batch_size=32,
                           write_graph=True, write_grads=True, write_images=True,
                           embeddings_freq=10, embeddings_layer_names=None,
                           embeddings_metadata=None)
early_stop = callbacks.EarlyStopping(monitor= …
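One detail worth noting about the old Keras TensorBoard callback (stated here as an assumption about its behavior, since the question is truncated): with embeddings_freq > 0 it saves the weights of the named layers to the projector, and histogram_freq/embeddings need validation data passed to fit. A sketch with a hypothetical layer name 'dense_embed':

# 'dense_embed' and './logs' are made-up placeholders for illustration
tb = callbacks.TensorBoard(log_dir='./logs',
                           embeddings_freq=10,
                           embeddings_layer_names=['dense_embed'],
                           embeddings_metadata='metadata.tsv')

model.fit(x_train, y_train,
          batch_size=batch_size,
          epochs=epochs,
          validation_data=(x_test, y_test),  # needed for histograms/embeddings
          callbacks=[tb, early_stop])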