Tensorflow: Setting allow_growth to true still allocates memory on all my GPUs
I have several GPUs, but I only want to use one GPU for my training. I am using the following options:

```python
config = tf.ConfigProto(allow_soft_placement=True, log_device_placement=True)
config.gpu_options.allow_growth = True
with tf.Session(config=config) as sess:
    ...
```

Despite setting all of these options, every one of my GPUs allocates memory, and the number of processes equals the number of GPUs.

How can I prevent this from happening?

Note: I do not want to set the devices manually, and I do not want to set CUDA_VISIBLE_DEVICES, since I want
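For context, below is a minimal, self-contained sketch of the configuration described in the question, assuming TensorFlow 1.x (tf.ConfigProto / tf.Session); the matmul example stands in for the actual training code, which is not shown in the question. Note that allow_growth only controls how much memory is reserved on each device as it is needed; it does not change which GPUs the process attaches to.

```python
# Minimal sketch of the setup from the question, assuming TensorFlow 1.x.
# The constants and matmul below are illustrative placeholders, not the
# original training code.
import tensorflow as tf

config = tf.ConfigProto(
    allow_soft_placement=True,   # fall back to another device if an op has no kernel for the requested one
    log_device_placement=True,   # log which device each op is placed on
)
config.gpu_options.allow_growth = True  # allocate GPU memory on demand instead of grabbing it all up front

with tf.Session(config=config) as sess:
    a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
    b = tf.constant([[1.0, 0.0], [0.0, 1.0]])
    print(sess.run(tf.matmul(a, b)))
```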