Hard limiting / threshold activation function in TensorFlow

Submitted by: Anonymous (unverified) on 2019-12-03 08:56:10

Question:

I'm trying to implement a basic, binary Hopfield network in TensorFlow 0.9. Unfortunately, I'm having a very hard time getting the activation function working. I'm looking for the very simple behavior: if net[i] < 0, output[i] = 0, else output[i] = 1. But everything I've tried seems to remove the gradient, i.e. I get the "No gradients provided for any variable" exception when trying to implement the training op.

For example, I tried casting the result of tf.less() to float, and I tried something along the lines of

tf.maximum(tf.minimum(net, 0) + 1, 0) 

but I forgot about small decimal values: inputs between -1 and 0 come out fractional. Finally I did

tf.maximum(tf.floor(tf.minimum(net, 0) + 1), 0) 

but tf.floor doesn't register gradients. I also tried replacing the floor with a cast to int and then a cast back to float, but it's the same deal.
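The failure of the clamp-based attempt can be checked numerically. This is a NumPy sketch (my own illustration, not from the original post) of max(min(net, 0) + 1, 0):

```python
import numpy as np

# Why the clamp-based attempt fails: for net values in (-1, 0),
# max(min(net, 0) + 1, 0) yields a fraction rather than 0 or 1.
net = np.array([-2.0, -0.5, 0.25, 1.5], dtype=np.float32)
clamped = np.maximum(np.minimum(net, 0.0) + 1.0, 0.0)
print(clamped.tolist())  # [0.0, 0.5, 1.0, 1.0] -- -0.5 maps to 0.5, not 0
```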

Any suggestions on what I could do?

Answer 1:

A bit late, but if anyone needs it, I used this definition:

import tensorflow as tf

def binary_activation(x):
    cond = tf.less(x, tf.zeros(tf.shape(x)))
    out = tf.where(cond, tf.zeros(tf.shape(x)), tf.ones(tf.shape(x)))
    return out

with x being a tensor.
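For reference, the same thresholding rule can be sketched in NumPy to check the expected input/output behavior (binary_activation_np is a hypothetical helper name, not part of the answer above):

```python
import numpy as np

# NumPy mirror of the tf.where-based definition: 0 where x < 0, else 1.
def binary_activation_np(x):
    x = np.asarray(x, dtype=np.float32)
    return np.where(x < 0, 0.0, 1.0).astype(np.float32)

print(binary_activation_np([-2.5, -0.1, 0.0, 0.1, 3.0]).tolist())
# [0.0, 0.0, 1.0, 1.0, 1.0] -- note that exactly 0 maps to 1
```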



Answer 2:

Just for the record, one can get the sign function via tf.sign. It outputs a float or integer (depending on the input) indicating the sign with -1 or 1. However, note that tf.sign(0) == 0!

For a hard-limiting activation function (also known as a binary threshold activation or Heaviside step function), see the other answer.
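The tf.sign(0) == 0 caveat matters if you try to remap the sign output to {0, 1}. A NumPy sketch (np.sign follows the same -1/0/1 convention as tf.sign; the remap is my own illustration):

```python
import numpy as np

x = np.array([-3.0, -0.5, 0.0, 0.5, 3.0], dtype=np.float32)
signs = np.sign(x)
print(signs.tolist())              # [-1.0, -1.0, 0.0, 1.0, 1.0]
# A naive remap of {-1, 1} to {0, 1} leaves 0.5 at exactly zero:
print(((signs + 1) / 2).tolist())  # [0.0, 0.0, 0.5, 1.0, 1.0]
```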


