How to make a piecewise activation function with Python in TensorFlow?


Question


The activation function in my CNN has the form:

|x| <  tou   f = 1.716 * tanh(0.667 * x)
x  >= tou    f = 1.716 * [tanh(2*tou/3) + tanh'(2*tou/3) * (x - tou)]
x  <= -tou   f = 1.716 * [tanh(-2*tou/3) + tanh'(-2*tou/3) * (x + tou)]

tou is a constant.
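
For reference, this is how I read the definition in plain NumPy for a scalar x (a rough sketch; tanh'(u) means the derivative of tanh, i.e. 1 - tanh(u)**2, and the function name is just for illustration):

import numpy as np

def piecewise_ref(x, tou):
    # reference implementation of the piecewise definition above
    # tanh'(u) = 1 - tanh(u)**2
    if abs(x) < tou:
        return 1.716 * np.tanh(0.667 * x)
    elif x >= tou:
        return 1.716 * (np.tanh(2 * tou / 3) + (1 - np.tanh(2 * tou / 3) ** 2) * (x - tou))
    else:  # x <= -tou
        return 1.716 * (np.tanh(-2 * tou / 3) + (1 - np.tanh(-2 * tou / 3) ** 2) * (x + tou))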

I know it is possible to define your own activation function in TensorFlow, but I don't want to write it in C++ and recompile the whole of TensorFlow.

How can I build it from the functions already available in TensorFlow?


Answer 1:


In TensorFlow it is easy to write your own activation function as long as it is built from ops that already exist; in your case you can use tf.case:

# tanh'(u) = 1 - tanh(u)**2 (derivative of tanh)
f = tf.case({tf.less(tf.abs(x), tou): lambda: 1.716 * tf.tanh(0.667 * x),
             tf.greater_equal(x, tou): lambda: 1.716 * (tf.tanh(2 * tou / 3) + (1 - tf.tanh(2 * tou / 3) ** 2) * (x - tou))},
            default=lambda: 1.716 * (tf.tanh(-2 * tou / 3) + (1 - tf.tanh(-2 * tou / 3) ** 2) * (x + tou)),
            exclusive=True)
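
Note that tf.case picks a single branch from scalar boolean predicates, so the snippet above is meant for a scalar x. To apply the function element-wise to a whole tensor (the usual situation for a CNN activation), a tf.where version is one option; a minimal sketch, assuming x is a float tensor, tou is a Python float, and the function name is just illustrative:

import tensorflow as tf

def piecewise_act(x, tou):
    # element-wise version of the activation, with tanh'(u) = 1 - tanh(u)**2
    inner = 1.716 * tf.tanh(0.667 * x)
    slope = 1 - tf.tanh(2 * tou / 3) ** 2   # tanh' is even, so the slope is the same on both sides
    upper = 1.716 * (tf.tanh(2 * tou / 3) + slope * (x - tou))
    lower = 1.716 * (tf.tanh(-2 * tou / 3) + slope * (x + tou))
    return tf.where(tf.abs(x) < tou, inner, tf.where(x >= tou, upper, lower))

It can then be applied like any other activation, e.g. h = piecewise_act(conv_out, tou=0.5) (the value 0.5 is only a placeholder).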


Source: https://stackoverflow.com/questions/45769719/how-to-make-a-piecewise-activation-function-with-python-in-tensorflow
