tensorflow2.0

When is TensorFlow's ParameterServerStrategy preferable to its MultiWorkerMirroredStrategy?

家住魔仙堡 · submitted 2020-11-27 04:27:05
Question: When training a neural network across multiple servers and GPUs, I can't think of a scenario where ParameterServerStrategy would be preferable to MultiWorkerMirroredStrategy. What are ParameterServerStrategy's main use cases, and why would it be better than MultiWorkerMirroredStrategy?

Answer 1: MultiWorkerMirroredStrategy is intended for synchronous distributed training across multiple workers, each of which can have multiple GPUs. ParameterServerStrategy: Supports parameter…
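One concrete difference between the two strategies shows up in the cluster specification each one expects: parameter-server training adds dedicated `ps` tasks (and a `chief`/coordinator), while MultiWorkerMirroredStrategy needs only homogeneous `worker` tasks that all-reduce gradients synchronously. A minimal sketch of the `TF_CONFIG` environment variable for each setup (the host names and ports below are placeholders, not from the question):

```python
import json

# Cluster for MultiWorkerMirroredStrategy: workers only; every worker holds
# a full model replica and gradients are all-reduced synchronously.
mirrored_tf_config = {
    "cluster": {
        "worker": ["worker0.example.com:2222", "worker1.example.com:2222"],
    },
    "task": {"type": "worker", "index": 0},
}

# Cluster for ParameterServerStrategy: workers compute gradients and push
# updates (typically asynchronously) to dedicated parameter-server tasks
# that shard and hold the variables.
ps_tf_config = {
    "cluster": {
        "worker": ["worker0.example.com:2222", "worker1.example.com:2222"],
        "ps": ["ps0.example.com:2222"],
        "chief": ["chief0.example.com:2222"],
    },
    "task": {"type": "ps", "index": 0},
}

# Each process would export its own copy before creating the strategy, e.g.:
#   os.environ["TF_CONFIG"] = json.dumps(ps_tf_config)
print(json.dumps(ps_tf_config["cluster"], indent=2))
```

Because the `ps` tasks shard the variables rather than replicating them, this layout is what makes parameter-server training attractive for models too large to fit on every worker, which is one of its main use cases.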

How can I modify sequential data using the map, filter, or reduce methods of tf.data.Dataset objects?

牧云@^-^@ · submitted 2020-11-25 04:05:20
Question: I have a Python data generator:

```python
import numpy as np
import tensorflow as tf

vocab_size = 5

def create_generator():
    'generates sequences of varying lengths (5 to 7) with random numbers from 0 to vocab_size-1'
    count = 0
    while count < 5:
        sequence_len = np.random.randint(5, 8)  # length varies from 5 to 7
        seq = np.random.randint(0, vocab_size, (sequence_len,))
        yield seq
        count += 1

gen = tf.data.Dataset.from_generator(create_generator,
                                     args=[],
                                     output_types=tf.int32,
                                     output_shapes=(None,))

for g in
```