tensorflow-datasets

tf.data with multiple inputs / outputs in Keras

Submitted by 大憨熊 on 2019-12-03 11:25:42
For applications such as pairwise text similarity, the input data comes in two parts: pair_1, pair_2. In these problems we usually have multiple inputs. Previously, I trained my models successfully:

    model.fit([pair_1, pair_2], labels, epochs=50)

I decided to replace my input pipeline with the tf.data API. To this end, I create a Dataset similar to:

    dataset = tf.data.Dataset.from_tensor_slices((pair_1, pair_2, labels))

It compiles successfully, but when training starts it throws the following exception:

    AttributeError: 'tuple' object has no attribute 'ndim'

My Keras and Tensorflow version…
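
A likely cause is the flat three-element tuple: tf.keras expects each dataset element to be an (inputs, targets) pair, with multiple inputs grouped in an inner tuple. A minimal sketch of that structure, with toy arrays standing in for the question's data and assuming a TF version whose tf.keras accepts datasets in fit (roughly 1.9+):

```python
import numpy as np
import tensorflow as tf

# Toy stand-ins for the question's pair_1 / pair_2 / labels.
pair_1 = np.random.rand(100, 16).astype(np.float32)
pair_2 = np.random.rand(100, 16).astype(np.float32)
labels = np.random.randint(0, 2, size=(100, 1))

# Group the two inputs in an inner tuple so each element is (inputs, targets),
# mirroring model.fit([pair_1, pair_2], labels).
dataset = tf.data.Dataset.from_tensor_slices(((pair_1, pair_2), labels))
dataset = dataset.batch(32).repeat()

in_1 = tf.keras.Input(shape=(16,))
in_2 = tf.keras.Input(shape=(16,))
merged = tf.keras.layers.concatenate([in_1, in_2])
out = tf.keras.layers.Dense(1, activation='sigmoid')(merged)
model = tf.keras.Model([in_1, in_2], out)
model.compile(optimizer='adam', loss='binary_crossentropy')

model.fit(dataset, epochs=50, steps_per_epoch=3)  # 100 samples / 32 ≈ 3 steps
```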

How to switch between training and validation dataset with tf.MonitoredTrainingSession?

Submitted by 回眸只為那壹抹淺笑 on 2019-12-03 06:51:13
I want to use the feedable iterator design of the tensorflow Dataset API, so I can switch to validation data after some training steps. But once I switch to the validation data and it runs out, the whole session ends. The following code demonstrates what I want to do:

    import tensorflow as tf

    graph = tf.Graph()
    with graph.as_default():
        training_ds = tf.data.Dataset.range(32).batch(4)
        validation_ds = tf.data.Dataset.range(8).batch(4)
        handle = tf.placeholder(tf.string, shape=[])
        iterator = tf.data.Iterator.from_string_handle(
            handle, training_ds.output_types, training_ds.output_shapes)
        next_element = iterator.get_next…
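
A sketch of one common workaround, assuming the validation set size is known in advance: repeat the training dataset so it never raises OutOfRangeError, and run the validation pass for a fixed number of batches instead of draining the iterator, so no OutOfRangeError ever reaches the monitored session.

```python
import tensorflow as tf

graph = tf.Graph()
with graph.as_default():
    training_ds = tf.data.Dataset.range(32).batch(4).repeat()  # never exhausts
    validation_ds = tf.data.Dataset.range(8).batch(4)          # exactly 2 batches

    handle = tf.placeholder(tf.string, shape=[])
    iterator = tf.data.Iterator.from_string_handle(
        handle, training_ds.output_types, training_ds.output_shapes)
    next_element = iterator.get_next()

    training_iter = training_ds.make_one_shot_iterator()
    validation_iter = validation_ds.make_initializable_iterator()

    # Create the handle tensors *before* entering the session, because
    # MonitoredTrainingSession finalizes the graph.
    train_handle_t = training_iter.string_handle()
    val_handle_t = validation_iter.string_handle()

    with tf.train.MonitoredTrainingSession() as sess:
        train_handle = sess.run(train_handle_t)
        val_handle = sess.run(val_handle_t)
        for step in range(20):
            print('train', sess.run(next_element, feed_dict={handle: train_handle}))
            if step % 10 == 9:
                # Re-initialize and run exactly 2 validation batches, so the
                # iterator is never driven past its end.
                sess.run(validation_iter.initializer)
                for _ in range(2):
                    print('val', sess.run(next_element, feed_dict={handle: val_handle}))
```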

Tensorflow Data API - prefetch

Submitted by ╄→尐↘猪︶ㄣ on 2019-12-03 06:38:46
I am trying to use the new features of TF, namely the Data API, and I am not sure how prefetch works. In the code below:

    def dataset_input_fn(...):
        dataset = tf.data.TFRecordDataset(filenames, compression_type="ZLIB")
        dataset = dataset.map(lambda x: parser(...))
        dataset = dataset.map(lambda x, y: image_augmentation(...),
                              num_parallel_calls=num_threads)
        dataset = dataset.shuffle(buffer_size)
        dataset = dataset.batch(batch_size)
        dataset = dataset.repeat(num_epochs)
        iterator = dataset.make_one_shot_iterator()

does it matter between which of the lines above I put dataset = dataset.prefetch(batch_size)? Or maybe it…
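
A sketch of the placement the tf.data performance guide recommends: prefetch goes last, and its buffer size counts elements at that point in the pipeline, so after batch() the unit is whole batches and prefetch(1) already overlaps one full batch of preprocessing with training. parser and image_augmentation are the question's own helpers, assumed defined:

```python
import tensorflow as tf

def dataset_input_fn(filenames, batch_size, num_epochs, num_threads, buffer_size):
    dataset = tf.data.TFRecordDataset(filenames, compression_type="ZLIB")
    dataset = dataset.map(parser, num_parallel_calls=num_threads)
    dataset = dataset.map(image_augmentation, num_parallel_calls=num_threads)
    dataset = dataset.shuffle(buffer_size)
    dataset = dataset.repeat(num_epochs)
    dataset = dataset.batch(batch_size)
    # Last stage: keep one full batch ready while the model consumes the
    # current one; after batch(), prefetch counts batches, not examples.
    dataset = dataset.prefetch(1)
    return dataset.make_one_shot_iterator().get_next()
```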

How to generate an index file (table) from a database?

Submitted by 折月煮酒 on 2019-12-02 21:56:39
Question: I have recently been working on better indexing concepts (learned indexes with TensorFlow). As part of that, I am curious how to generate an index file (let's say a CSV for now) from a large database. The database changes dynamically every day, and I want to update the index file based on the new data that is added to the database, i.e. generate the index file from the database incrementally. After generating the index file, I will train the model using…
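
A minimal sketch of the incremental-export idea, with an entirely hypothetical schema (a records table carrying an updated_at timestamp; key and position are the columns a learned index would train on): each run appends only the rows newer than the last export to the CSV.

```python
import csv
import sqlite3

def append_new_rows_to_index(db_path, csv_path, last_export_ts):
    """Append rows added since last_export_ts to the index CSV (hypothetical schema)."""
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT key, position FROM records WHERE updated_at > ? ORDER BY key",
        (last_export_ts,),
    ).fetchall()
    conn.close()
    # Append mode keeps previous exports; the model retrains on the full file.
    with open(csv_path, "a", newline="") as f:
        csv.writer(f).writerows(rows)
```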

Using a created tensorflow model for predicting

Submitted by 眉间皱痕 on 2019-12-02 11:31:52
I'm looking at the source code from this Tensorflow article about how to create a wide-and-deep learning model: https://www.tensorflow.org/versions/r1.3/tutorials/wide_and_deep Here is the link to the python source code: https://github.com/tensorflow/tensorflow/blob/r1.3/tensorflow/examples/learn/wide_n_deep_tutorial.py The goal is to train a model that predicts whether someone makes more or less than $50k a year, given the census data. As instructed, I'm running this command to execute it:

    python wide_n_deep_tutorial.py --model_type=wide_n_deep

The result…
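
Since the question is about using the trained model for prediction: with a tf.estimator-based script like this tutorial's, the usual pattern is to rebuild the estimator over the same model_dir (which picks up the trained checkpoint) and call predict(). A sketch, where build_estimator and input_fn stand for the tutorial's helpers (their exact signatures in the r1.3 script may differ):

```python
# Rebuilding the estimator over the same model_dir loads the trained
# checkpoint, so predict() can score new census rows.
m = build_estimator(model_dir, model_type)  # tutorial helper (assumed)

predictions = m.predict(
    input_fn=lambda: input_fn(test_file, num_epochs=1, shuffle=False))
for pred in predictions:
    # Each element is a dict of outputs, e.g. 'classes' and 'probabilities'.
    print(pred['classes'], pred['probabilities'])
```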

How to feed .h5 files into a tf.data pipeline in a TensorFlow model

Submitted by 吃可爱长大的小学妹 on 2019-12-02 09:33:34
I'm trying to optimize the input pipeline for .h5 data with tf.data, but I encountered: TypeError: expected str, bytes or os.PathLike object, not Tensor. I did some research but couldn't find anything about converting a string tensor into a plain string. This simplified code is executable and returns the same error:

    batch_size = 1000
    conv_size = 3
    nb_conv = 32
    learning_rate = 0.0001

    # define parser function
    def parse_function(fname):
        with h5py.File(fname, 'r') as f:  # error comes from here
            X = f['X'].reshape(batch_size, patch_size, patch_size, 1)
            y = f['y'].reshape(batch_size, patch_size, patch_size, 1)
    …
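
The usual fix is to do the h5py reading inside tf.py_func, which hands the filename to the Python function as plain bytes rather than a Tensor. A sketch under the question's names (patch_size and the filename list are assumed, since the excerpt doesn't show them):

```python
import h5py
import numpy as np
import tensorflow as tf

batch_size = 1000
patch_size = 64          # assumed; not shown in the excerpt
filenames = ['a.h5']     # assumed list of .h5 paths

def _read_h5(fname):
    # Inside py_func the filename arrives as Python bytes, so h5py can open it.
    with h5py.File(fname.decode('utf-8'), 'r') as f:
        X = np.asarray(f['X']).reshape(batch_size, patch_size, patch_size, 1)
        y = np.asarray(f['y']).reshape(batch_size, patch_size, patch_size, 1)
    return X.astype(np.float32), y.astype(np.float32)

def parse_function(fname):
    X, y = tf.py_func(_read_h5, [fname], [tf.float32, tf.float32])
    # py_func loses static shape information, so restore it for later layers.
    X.set_shape([batch_size, patch_size, patch_size, 1])
    y.set_shape([batch_size, patch_size, patch_size, 1])
    return X, y

dataset = tf.data.Dataset.from_tensor_slices(filenames).map(parse_function)
```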

Using tf.data.Dataset as training input to Keras model NOT working

Submitted by 大兔子大兔子 on 2019-12-02 07:22:17
Question: I have a simple piece of code, which DOES work, for training a Keras model in Tensorflow using numpy arrays as features and labels. If I then wrap these numpy arrays with tf.data.Dataset.from_tensor_slices in order to train the same Keras model on a tensorflow dataset, I get an error. I haven't been able to figure out why (it may be a tensorflow or keras bug, but I may also be missing something). I'm on python 3, tensorflow is 1.10.0, numpy is 1.14.5, and no GPU is involved. OBS1: The possibility of…
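
For reference, a minimal sketch of the pattern that tf.keras around TF 1.10 accepts (assuming that release's requirement that the dataset be batched and repeated, with an explicit steps_per_epoch, since fit cannot infer a dataset's length):

```python
import numpy as np
import tensorflow as tf

features = np.random.rand(100, 8).astype(np.float32)
labels = np.random.randint(0, 2, size=(100, 1))

dataset = tf.data.Dataset.from_tensor_slices((features, labels))
# Batch and repeat: tf.keras in 1.x cannot infer a dataset's length, so it
# needs an endless stream plus an explicit steps_per_epoch.
dataset = dataset.batch(10).repeat()

model = tf.keras.Sequential([
    tf.keras.layers.Dense(4, activation='relu', input_shape=(8,)),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy')
model.fit(dataset, epochs=5, steps_per_epoch=10)  # 100 samples / batch of 10
```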

How to acquire a tf.data.Dataset's shape?

Submitted by 亡梦爱人 on 2019-12-02 00:04:34
Question: I know the dataset has output_shapes, but it shows up like below:

    data_set: DatasetV1Adapter
    shapes: {item_id_hist: (?, ?), tags: (?, ?), client_platform: (?,),
             entrance: (?,), item_id: (?,), lable: (?,), mode: (?,),
             time: (?,), user_id: (?,)},
    types:  {item_id_hist: tf.int64, tags: tf.int64, client_platform: tf.string,
             entrance: tf.string, item_id: tf.int64, lable: tf.int64,
             mode: tf.int64, time: tf.int64, user_id: tf.int64}

How can I get the total number of examples in my data?

Answer 1: Where the length is known…
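
When the length is not statically known, the output shapes alone cannot tell you. Two common options, sketched below; both assume a reasonably recent TF, and in graph mode both results are tensors to be evaluated in a session:

```python
import tensorflow as tf

dataset = tf.data.Dataset.range(100).batch(8)  # toy stand-in for data_set

# Option 1: ask for the statically known element count, if any (13 batches
# here); returns UNKNOWN_CARDINALITY or INFINITE_CARDINALITY otherwise.
n = tf.data.experimental.cardinality(dataset)

# Option 2: the fully general fallback, counting by reducing over the
# dataset, at the cost of one complete pass over the data.
count = dataset.reduce(tf.constant(0, tf.int64), lambda acc, _: acc + 1)
```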

Restoring a model trained with tf.estimator and feeding input through feed_dict

Submitted by 风格不统一 on 2019-12-01 23:42:26
I trained a resnet with tf.estimator; the model was saved during the training process. The saved files consist of .data, .index and .meta. I'd like to load this model back and get predictions for new images. The data was fed to the model during training using tf.data.Dataset. I have closely followed the resnet implementation given here. I would like to restore the model and feed inputs to the nodes using a feed_dict.

First attempt:

    # rebuild input pipeline
    images, labels = input_fn(data_dir, batch_size=32, num_epochs=1)
    # rebuild graph
    prediction = imagenet_model_fn(images, labels, {'batch_size'…
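
A sketch of the feed_dict route, with hypothetical names (build_network stands for whatever rebuilds the resnet forward pass on a placeholder; model_dir and batch_of_images are assumed): tf.train.Saver resolves the .data/.index/.meta trio from a single checkpoint prefix, so you restore once and then feed images directly.

```python
import tensorflow as tf

# Rebuild the forward pass on a placeholder instead of the tf.data input.
images = tf.placeholder(tf.float32, shape=[None, 224, 224, 3], name='images')
logits = build_network(images)  # hypothetical: same architecture as training

saver = tf.train.Saver()
with tf.Session() as sess:
    # latest_checkpoint returns the common prefix of the .data/.index/.meta files.
    ckpt = tf.train.latest_checkpoint(model_dir)
    saver.restore(sess, ckpt)
    preds = sess.run(logits, feed_dict={images: batch_of_images})
```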