I'm running into trouble with TensorFlow; executing the following code raises an error.
import tensorflow as tf
import input_data

learning_rate = 0.01
training_epochs = 25
batch_size = 100
This problem is caused by the following line: tf.nn.softmax_cross_entropy_with_logits(labels=activation, logits=Y)
According to the documentation, you should have
labels: Each row labels[i] must be a valid probability distribution.
logits: Unscaled log probabilities.
So logits is supposed to be your hypothesis, which is activation,
and the valid probability distribution (the ground-truth labels) is Y.
So just change it to tf.nn.softmax_cross_entropy_with_logits(labels=Y, logits=activation).
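For context, here is a minimal sketch of how the corrected call fits into a cost computation. The shapes, placeholders, and the definition of activation below are assumptions based on the standard MNIST logistic-regression example, since the full script isn't shown:

import tensorflow as tf

# Assumed MNIST setup: 784 input features (28x28 images), 10 classes.
X = tf.placeholder(tf.float32, [None, 784])   # input images
Y = tf.placeholder(tf.float32, [None, 10])    # one-hot ground-truth labels

W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))

# activation is the model's unscaled output (the logits).
activation = tf.matmul(X, W) + b

# labels = the true distribution (Y), logits = the model output (activation).
cost = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=Y, logits=activation))

optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01).minimize(cost)

Note that both labels and logits must have the same shape, [batch_size, num_classes]; the op applies the softmax to logits internally, so activation should be the unscaled scores, not already-softmaxed probabilities.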