I'm having trouble using buckets in my TensorFlow model. When I run it with buckets = [(100, 100)], it works fine. When I run it with buckets = [(100, 10
This solution does not work for me. Is there any new solution?
These two solutions work for me:

1. Change seq2seq.py under /yourpath/tensorflow/contrib/legacy_seq2seq/python/ops/ so the cell is passed directly instead of being deep-copied:

    #encoder_cell = copy.deepcopy(cell)
    encoder_cell = core_rnn_cell.EmbeddingWrapper(
        cell,  # instead of encoder_cell
or
2. Feed one bucket per step:

    for nextBatch in tqdm(batches, desc="Training"):
        _, step_loss = model.step(...)
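A minimal sketch of that second workaround, assuming a toy stand-in for the real model (here `ToyModel.step` and the `batches` list are placeholders for your own seq2seq model and data pipeline):

```python
# Hypothetical sketch: train on one bucket's batch per step, so each
# bucket graph is used as-is and the deepcopy code path is never hit.

class ToyModel:
    """Stand-in for the seq2seq model; step() mimics model.step(...)."""
    def __init__(self):
        self.steps = 0

    def step(self, batch):
        # Real signature would be step(session, encoder_inputs, ...).
        self.steps += 1
        loss = 1.0 / self.steps  # fake, decreasing loss value
        return None, loss

model = ToyModel()
batches = [("bucket-0 batch",), ("bucket-1 batch",)]  # one bucket at a time

losses = []
for next_batch in batches:  # tqdm(batches, desc="Training") in the original
    _, step_loss = model.step(next_batch)
    losses.append(step_loss)
```

The point is that each call to `step` sees exactly one bucket's data, rather than the model trying to handle several bucket shapes inside a single step.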
The problem is with the latest changes in seq2seq.py. Add this to your script and it will avoid deep-copying the cells:
setattr(tf.contrib.rnn.GRUCell, '__deepcopy__', lambda self, _: self)
setattr(tf.contrib.rnn.BasicLSTMCell, '__deepcopy__', lambda self, _: self)
setattr(tf.contrib.rnn.MultiRNNCell, '__deepcopy__', lambda self, _: self)