Question
I have been reading Theano's logistic regression tutorial and am trying to understand how the negative log-likelihood is calculated.
import theano.tensor as T

y = T.ivector('y')                  # class labels, one integer per example
W = T.dmatrix('W')                  # weight matrix
b = T.dvector('b')                  # bias vector
input = T.dmatrix('inp')            # input features, one row per example
p_y_given_x = T.nnet.softmax(T.dot(input, W) + b)    # class probabilities per row
logs = T.log(p_y_given_x)[T.arange(y.shape[0]), y]   # log-probability of the correct class per row
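My rough understanding is that the [T.arange(y.shape[0]), y] part behaves like NumPy fancy indexing, picking one entry per row. A minimal NumPy sketch with made-up numbers (not Theano, just to illustrate the indexing; p, y and logs here are my own placeholder names):

import numpy as np

# Hypothetical softmax output: 3 examples, 4 classes, each row sums to 1.
p = np.array([[0.1, 0.2, 0.3, 0.4],
              [0.7, 0.1, 0.1, 0.1],
              [0.2, 0.2, 0.5, 0.1]])
y = np.array([3, 0, 2])                      # true class of each example

# Indexing with (arange(N), y) selects p[i, y[i]] for every row i.
logs = np.log(p)[np.arange(y.shape[0]), y]   # == log([0.4, 0.7, 0.5])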
Pretty-printing it with theano.printing.pprint(logs) returned
'AdvancedSubtensor(log(Softmax(x)), ARange(TensorConstant{0}, Constant{0}[Shape(y)], TensorConstant{1}), y)'
Can anyone explain what this AdvancedSubtensor is doing, ideally with a small example?
After this, the tutorial computes -T.mean(logs).
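If I understand it correctly, that mean is the average log-probability of the correct class over the N examples in the batch, so negating it gives the mean negative log-likelihood:

NLL(W, b) = -(1/N) * sum_i log P(Y = y_i | x_i, W, b)

Continuing the NumPy sketch above, that would just be nll = -logs.mean().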
Any help is appreciated! :)
Source: https://stackoverflow.com/questions/34921847/what-does-negative-log-likelihood-of-logistic-regression-in-theano-look-like