When is global_variables_initializer() actually required?
```python
import tensorflow as tf
x = tf.constant(35, name='x')
y = tf.Variable(x + 5, name='y')
# model = tf.global_variables_initializer()
with tf.Session() as session:
    print("x = ", session.run(x))
    # session.run(model)
    print("y = ", session.run(y))
```

I don't understand when `global_variables_initializer()` is actually required. In the code above, if I uncomment lines 4 and 7, the code runs and prints both values. If I run it as-is, it crashes. My question is: which variables is it initializing? `x` is a constant, which does not need initialization, and `y` is a variable that is never explicitly initialized anywhere in my code.
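For reference, this is the variant with lines 4 and 7 uncommented that runs for me, with comments added to show what I think each step does. This is just a minimal sketch assuming TensorFlow 1.x, where `tf.Session()` and `tf.global_variables_initializer()` are available:

```python
import tensorflow as tf

x = tf.constant(35, name='x')               # constant: has its value baked into the graph
y = tf.Variable(x + 5, name='y')            # variable: holds state, must be initialized first
model = tf.global_variables_initializer()   # op that initializes all global variables

with tf.Session() as session:
    print("x = ", session.run(x))           # constants can be evaluated right away
    session.run(model)                      # run the initializer before reading y
    print("y = ", session.run(y))           # y now holds its initial value (x + 5 = 40)
```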