TensorFlow `set_random_seed` not working

Asked by Anonymous (unverified), submitted 2019-12-03 02:49:01

Question:

Calling tf.set_random_seed(SEED) has no effect that I can tell...

For example, running the code below several times inside an IPython notebook produces different output each time:

import tensorflow as tf

tf.set_random_seed(42)
sess = tf.InteractiveSession()
a = tf.constant([1, 2, 3, 4, 5])
tf.initialize_all_variables().run()
a_shuf = tf.random_shuffle(a)
print(a.eval())
print(a_shuf.eval())
sess.close()

If I set the seed explicitly with a_shuf = tf.random_shuffle(a, seed=42), the output is the same on every run. But why do I need to set an operation-level seed if I already call tf.set_random_seed(42)?


The equivalent code using numpy just works:

import numpy as np

np.random.seed(42)
a = [1, 2, 3, 4, 5]
np.random.shuffle(a)
print(a)

Answer 1:

That only sets the graph-level random seed. If you execute this snippet several times in a row without resetting the graph, each run adds a new shuffle op to the same default graph, and each of those ops is assigned a different operation-level seed. The details are described in the docstring for tf.set_random_seed.
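
For illustration, here is a minimal sketch (assuming the TensorFlow 1.x API used above) of that behaviour: with only the graph-level seed set, each tf.random_shuffle op added to the same graph receives its own operation-level seed, so two identically constructed shuffle ops produce different permutations within a single run, even though the sequence is reproducible across fresh graphs.

import tensorflow as tf

tf.set_random_seed(42)                 # graph-level seed only
a = tf.constant([1, 2, 3, 4, 5])
shuf_1 = tf.random_shuffle(a)          # first op added to the graph
shuf_2 = tf.random_shuffle(a)          # second op, gets a different op-level seed

with tf.Session() as sess:
    print(sess.run(shuf_1))            # one permutation of a
    print(sess.run(shuf_2))            # usually a different permutation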

To get a deterministic a_shuf you can either:

  1. call tf.reset_default_graph() between invocations (see the sketch after this list), or
  2. set an operation-level seed for the shuffle: a_shuf = tf.random_shuffle(a, seed=42)
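
A rough sketch of remedy 1 (again assuming TensorFlow 1.x): resetting the default graph before rebuilding it means the shuffle op is created at the same position in a fresh graph each time, so it receives the same operation-level seed and the output is reproducible run after run.

import tensorflow as tf

def shuffled():
    tf.reset_default_graph()           # start from an empty default graph
    tf.set_random_seed(42)             # graph-level seed for the fresh graph
    a = tf.constant([1, 2, 3, 4, 5])
    a_shuf = tf.random_shuffle(a)      # same op-level seed on every call
    with tf.Session() as sess:
        return sess.run(a_shuf)

print(shuffled())
print(shuffled())                      # identical output on every call

The helper name shuffled is just for illustration; the point is that graph construction and tf.reset_default_graph() happen together on every invocation.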

