How does tf.sign work with back propagation?

Submitted by 做~自己de王妃 on 2021-01-28 04:43:45

Question


I'm starting with TensorFlow, and I am trying to create a binary network. More specifically, I'm looking for a hashing network that would encode an image in a binary vector of size n. So I have a classic network, and I binarize the last layer before softmax.

TensorFlow provides a tf.sign function that can do just that, but I can't figure out how the back-propagation can work in practice, since it's not continuous.
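For context, the binarization step I mean can be sketched like this (a NumPy stand-in for tf.sign; the feature values are illustrative assumptions, not real activations):

```python
import numpy as np

# Hypothetical pre-softmax activations of the last layer.
features = np.array([0.7, -1.2, 0.0, 3.4])

# Element-wise sign, as tf.sign would compute it.
# Note: sign(0.0) is 0, so the "binary" code actually lives in {-1, 0, 1}.
binary_code = np.sign(features)
```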

Any thoughts?


Answer 1:


Unfortunately, the gradient of tf.sign() is always defined as 0 (probably because there is no good way to define it):

https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/ops/math_grad.py#L542

# Excerpt from tensorflow/python/ops/math_grad.py (imports added for context):
from tensorflow.python.framework import ops
from tensorflow.python.ops import array_ops

@ops.RegisterGradient("Sign")
def _SignGrad(op, _):
  """Returns 0."""
  x = op.inputs[0]
  return array_ops.zeros(array_ops.shape(x), dtype=x.dtype)
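Since the registered gradient is identically zero, nothing upstream of tf.sign receives a learning signal. A common workaround in the binary-network literature (not part of the answer above, and sketched here in NumPy under the assumption of a hand-written backward pass) is the "straight-through estimator": use sign(x) in the forward pass, but pretend the op was (clipped) identity in the backward pass so a nonzero gradient reaches earlier layers:

```python
import numpy as np

def ste_forward(x):
    # Forward pass: hard binarization, exactly like tf.sign.
    return np.sign(x)

def ste_backward(upstream_grad, x, clip=1.0):
    # Backward pass: pass the upstream gradient through unchanged where
    # |x| <= clip (the "hard tanh" variant of the straight-through
    # estimator), and zero it elsewhere, instead of returning 0 everywhere.
    return np.where(np.abs(x) <= clip, upstream_grad, 0.0)
```

In TensorFlow this kind of custom backward rule would typically be attached with tf.custom_gradient (or, in TF1, a graph-level gradient override); the NumPy version above only illustrates the idea.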


Source: https://stackoverflow.com/questions/43866275/how-does-tf-sign-work-with-back-propagation
