tensorflow-layers

How to use TensorBoard and summary operations with the tf.layers module

只愿长相守 submitted on 2021-02-07 09:18:08
Question: I have followed the TensorFlow Layers tutorial to create a CNN for MNIST digit classification using TensorFlow's tf.layers module. Now I'm trying to learn how to use TensorBoard from TensorBoard: Visualizing Learning. Perhaps this tutorial hasn't been updated recently, because it says its example code is a modification of that tutorial's and links to it, but the code is completely different: it manually defines a single-hidden-layer fully-connected network. The TensorBoard tutorial shows how
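
For reference, below is a minimal sketch of how summary ops can be attached to a tf.layers-based Estimator model_fn. The layer names, sizes and the /tmp/mnist_model directory are illustrative rather than taken from the question; the point is that tf.summary.scalar and tf.summary.histogram simply take the tensors returned by tf.layers calls, and an Estimator writes them to its model_dir during training.

import tensorflow as tf

def cnn_model_fn(features, labels, mode):
    # A small tf.layers model with summary ops attached (sizes are illustrative).
    input_layer = tf.reshape(features["x"], [-1, 28, 28, 1])
    conv1 = tf.layers.conv2d(input_layer, filters=32, kernel_size=[5, 5],
                             padding="same", activation=tf.nn.relu)
    pool1 = tf.layers.max_pooling2d(conv1, pool_size=[2, 2], strides=2)
    flat = tf.reshape(pool1, [-1, 14 * 14 * 32])
    dense = tf.layers.dense(flat, units=128, activation=tf.nn.relu)
    logits = tf.layers.dense(dense, units=10)

    # Summary ops work exactly as in the low-level API: attach them to the
    # tensors produced by the tf.layers calls.
    tf.summary.histogram("conv1_activations", conv1)
    tf.summary.histogram("dense_activations", dense)

    predictions = {"classes": tf.argmax(logits, axis=1)}
    if mode == tf.estimator.ModeKeys.PREDICT:
        return tf.estimator.EstimatorSpec(mode=mode, predictions=predictions)

    loss = tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits)
    tf.summary.scalar("training_loss", loss)  # written to model_dir during training

    if mode == tf.estimator.ModeKeys.TRAIN:
        optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.001)
        train_op = optimizer.minimize(loss, global_step=tf.train.get_global_step())
        return tf.estimator.EstimatorSpec(mode=mode, loss=loss, train_op=train_op)

    eval_metric_ops = {"accuracy": tf.metrics.accuracy(labels, predictions["classes"])}
    return tf.estimator.EstimatorSpec(mode=mode, loss=loss, eval_metric_ops=eval_metric_ops)

# With tf.estimator.Estimator(model_fn=cnn_model_fn, model_dir="/tmp/mnist_model"),
# the summaries land in model_dir and can be viewed with:
#   tensorboard --logdir=/tmp/mnist_model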

Tensorflow : logits and labels must have the same first dimension

蹲街弑〆低调 submitted on 2020-08-01 09:10:29
Question: I am new to TensorFlow and I want to adapt the MNIST tutorial https://www.tensorflow.org/tutorials/layers to my own data (40x40 images). This is my model function: def cnn_model_fn(features, labels, mode): # Input Layer input_layer = tf.reshape(features, [-1, 40, 40, 1]) # Convolutional Layer #1 conv1 = tf.layers.conv2d( inputs=input_layer, filters=32, kernel_size=[5, 5], # To specify that the output tensor should have the same width and height values as the input tensor # value can be
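
For reference, a hedged sketch of the usual cause of this error when moving the tutorial from 28x28 to 40x40 inputs: the flatten step still uses the 28x28-derived size (7*7*64), so tf.reshape folds the mismatch into the batch dimension and the logits no longer line up with the labels. Filter counts and layer sizes below follow the tutorial and are not necessarily the asker's.

import tensorflow as tf

def logits_and_loss(features, labels):
    """Forward pass for 40x40 grayscale images; returns (logits, loss)."""
    input_layer = tf.reshape(features, [-1, 40, 40, 1])
    conv1 = tf.layers.conv2d(input_layer, filters=32, kernel_size=[5, 5],
                             padding="same", activation=tf.nn.relu)
    pool1 = tf.layers.max_pooling2d(conv1, pool_size=[2, 2], strides=2)  # 40 -> 20
    conv2 = tf.layers.conv2d(pool1, filters=64, kernel_size=[5, 5],
                             padding="same", activation=tf.nn.relu)
    pool2 = tf.layers.max_pooling2d(conv2, pool_size=[2, 2], strides=2)  # 20 -> 10
    # For 40x40 inputs the flattened size is 10*10*64; reusing the tutorial's
    # 7*7*64 here silently changes the batch dimension, which is what triggers
    # "logits and labels must have the same first dimension".
    pool2_flat = tf.reshape(pool2, [-1, 10 * 10 * 64])
    dense = tf.layers.dense(pool2_flat, units=1024, activation=tf.nn.relu)
    logits = tf.layers.dense(dense, units=10)
    # sparse_softmax_cross_entropy expects integer class labels of shape
    # [batch_size], matching the first dimension of logits.
    loss = tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits)
    return logits, loss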

Alternative to arg_scope when using tf.layers

此生再无相见时 submitted on 2019-12-14 02:37:01
Question: I'm rewriting tf.contrib.slim.nets.inception_v3 using tf.layers. Unfortunately, the new tf.layers module does not work with arg_scope, as it does not have the necessary decorators. Is there a better mechanism I should use to set default parameters for layers? Or should I simply add proper arguments to each layer and remove the arg_scope? Here is an example that uses arg_scope: with variable_scope.variable_scope(scope, 'InceptionV3', [inputs]): with arg_scope( [layers.conv2d
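
For reference, one commonly suggested substitute is functools.partial: bind the defaults you would have put into the arg_scope and call the resulting partials. The particular defaults below (initializer, regularizer) are illustrative, not taken from inception_v3.

import functools
import tensorflow as tf

# Bind shared defaults once; per-call keyword arguments still override them,
# much like arg_scope overrides.
conv2d = functools.partial(
    tf.layers.conv2d,
    padding="same",
    activation=tf.nn.relu,
    kernel_initializer=tf.variance_scaling_initializer(),
    kernel_regularizer=tf.contrib.layers.l2_regularizer(scale=1e-4))

def small_net(inputs):
    with tf.variable_scope("SmallNet"):
        net = conv2d(inputs, filters=32, kernel_size=[3, 3])
        net = conv2d(net, filters=64, kernel_size=[3, 3], strides=2)
        # Override a bound default for a single call.
        net = conv2d(net, filters=64, kernel_size=[1, 1], activation=None)
    return net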

How to use tf.contrib.model_pruning on MNIST?

别说谁变了你拦得住时间么 submitted on 2019-11-28 01:06:01
Question: I'm struggling to use TensorFlow's pruning library and haven't found many helpful examples, so I'm looking for help pruning a simple model trained on the MNIST dataset. If anyone can either help fix my attempt or provide an example of how to use the library on MNIST, I would be very grateful. The first half of my code is pretty standard, except that my model has 2 hidden layers, 300 units wide, using layers.masked_fully_connected for pruning. import tensorflow as tf from tensorflow.contrib.model
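
For reference, a rough sketch of how the tf.contrib.model_pruning pieces are usually wired together around masked_fully_connected layers. The hyperparameter values are illustrative, and the exact hparam names assume the TF 1.x contrib API.

import tensorflow as tf
from tensorflow.contrib import model_pruning

# Two 300-unit hidden layers built from masked_fully_connected, which creates
# the weight masks that the pruning op later updates.
x = tf.placeholder(tf.float32, [None, 784])
y = tf.placeholder(tf.int64, [None])

net = model_pruning.masked_fully_connected(x, 300)
net = model_pruning.masked_fully_connected(net, 300)
logits = model_pruning.masked_fully_connected(net, 10, activation_fn=None)

loss = tf.losses.sparse_softmax_cross_entropy(labels=y, logits=logits)
global_step = tf.train.get_or_create_global_step()
train_op = tf.train.AdamOptimizer(1e-3).minimize(loss, global_step=global_step)

# The Pruning object reads the masks created above and builds an op that
# gradually zeroes out small weights as training progresses.
hparams = model_pruning.get_pruning_hparams().parse(
    "name=mnist_pruning,begin_pruning_step=1000,end_pruning_step=20000,"
    "target_sparsity=0.9,pruning_frequency=100")
p = model_pruning.Pruning(hparams, global_step=global_step)
mask_update_op = p.conditional_mask_update_op()
p.add_pruning_summaries()

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Inside the training loop, run the mask update alongside the train op:
    # sess.run([train_op, mask_update_op], feed_dict={x: batch_x, y: batch_y})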