What does negative log likelihood of logistic regression in theano look like?

Submitted by 社会主义新天地 on 2019-12-12 02:15:17

Question


I have been reading the Theano logistic regression tutorial, and I am trying to understand how the negative log-likelihood is calculated.

import theano.tensor as T
from theano.tensor import dmatrix, dvector, ivector

y = ivector('y')
W = dmatrix('W')
b = dvector('b')
input = dmatrix('inp')
p_y_given_x = T.nnet.softmax(T.dot(input, W) + b)
logs = T.log(p_y_given_x)[T.arange(y.shape[0]), y]  # (no `self.` outside the tutorial's class)

Pretty-printing this expression with theano.printing.pprint(logs) returned:

'AdvancedSubtensor(log(Softmax(x)), ARange(TensorConstant{0}, Constant{0}[Shape(y)], TensorConstant{1}), y)'

Can anyone explain what this AdvancedSubtensor is doing, with a small example?
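AdvancedSubtensor is the graph node Theano builds for NumPy-style "fancy" (integer-array) indexing, so its effect can be illustrated with a small NumPy sketch. The matrix and labels below are invented toy values: indexing a 2-D array with a pair of integer vectors pairs them up elementwise and picks one element per row — here, each example's log-probability of its correct class.

```python
import numpy as np

# A toy matrix of class probabilities: 3 examples, 4 classes.
log_p = np.log(np.array([
    [0.10, 0.20, 0.30, 0.40],
    [0.25, 0.25, 0.25, 0.25],
    [0.70, 0.10, 0.10, 0.10],
]))

# The correct class label for each example.
y = np.array([3, 0, 0])

# log_p[np.arange(3), y] is what AdvancedSubtensor computes for
# T.log(p_y_given_x)[T.arange(y.shape[0]), y]: the two index vectors are
# zipped together, giving [log_p[0, 3], log_p[1, 0], log_p[2, 0]].
picked = log_p[np.arange(y.shape[0]), y]
print(picked)  # [log(0.4), log(0.25), log(0.7)]
```

So the ARange(…) argument in the pretty-printed graph is just T.arange(y.shape[0]), supplying the row index that goes with each label in y.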

After this, the tutorial computes -T.mean(logs) as the loss.
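Putting the pieces together, -T.mean(logs) is the average negative log-likelihood over the minibatch. A NumPy sketch of the whole pipeline (softmax, log, per-row pick, negate and average) — W, b, and the inputs here are made-up toy values, not anything from the tutorial:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))   # 5 examples, 3 features (toy data)
W = rng.standard_normal((3, 4))   # 4 classes
b = np.zeros(4)
y = np.array([0, 1, 2, 3, 0])     # correct class per example

# Row-wise softmax, as in T.nnet.softmax(T.dot(input, W) + b).
z = X @ W + b
z -= z.max(axis=1, keepdims=True)  # subtract row max for numerical stability
p = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)

# Pick each row's log-probability of its true class, then negate and average.
nll = -np.mean(np.log(p)[np.arange(y.shape[0]), y])
print(nll)
```

Minimizing this quantity pushes each row's probability mass toward the correct class, which is exactly what the tutorial's gradient step does.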

Any help is appreciated! :)

Source: https://stackoverflow.com/questions/34921847/what-does-negative-log-likelihood-of-logistic-regression-in-theano-look-like
