How to define the derivative of a custom activation function in Keras

Submitted by 一世执手 on 2019-12-23 06:52:08

Question


I have a custom activation function and its derivative. Although I can use the custom activation function, I don't know how to tell Keras what its derivative is.

Keras seems to find one by itself, but I have a parameter that has to be shared between the function and its derivative, so how can I do that?

I know there is a relatively easy way to do this in TensorFlow, but I have no idea how to implement it in Keras. Here is how you do it in TensorFlow.

Edit: based on the answer I got, maybe I wasn't clear enough. What I want is to implement a custom derivative for my activation function so that it uses my derivative during backpropagation. I know how to implement a custom activation function.


Answer 1:


Take a look at the source code where the activation functions of Keras are defined:

keras/activations.py

For example:

def relu(x, alpha=0., max_value=None):
    """Rectified Linear Unit.

    # Arguments
        x: Input tensor.
        alpha: Slope of the negative part. Defaults to zero.
        max_value: Maximum value for the output.

    # Returns
        The (leaky) rectified linear unit activation: `x` if `x > 0`,
        `alpha * x` if `x < 0`. If `max_value` is defined, the result
        is truncated to this value.
    """
    return K.relu(x, alpha=alpha, max_value=max_value)

Note also how Keras layers call the activation functions: self.activation = activations.get(activation), where activation can be a string or a callable.
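For illustration, both forms look like this (a minimal sketch using the standard relu activation):

from keras.layers import Dense
from keras import activations

# a string is resolved through activations.get(); a callable is used as-is
layer_a = Dense(64, activation='relu')
layer_b = Dense(64, activation=activations.relu)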

Thus, similarly, you can define your own activation function, for example:

def my_activ(x, p1, p2):
    ...
    return ...

Suppose you want to use this activation in a Dense layer. Since the layer expects a callable that takes only the input tensor, bind the extra parameters, for example with a lambda:

x = Dense(128, activation=lambda t: my_activ(t, p1, p2))(input)
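For concreteness, here is a minimal runnable sketch; the leaky-ReLU-style formula and the parameter names p1 and p2 are placeholders for illustration, not part of the original answer:

from keras.layers import Input, Dense
from keras.models import Model
from keras import backend as K

def my_activ(x, p1=0.1, p2=1.0):
    # hypothetical parameterized activation: a scaled leaky ReLU
    return p2 * K.maximum(x, p1 * x)

inputs = Input(shape=(64,))
# the lambda binds p1 and p2 so Dense receives a callable of the input tensor only
outputs = Dense(128, activation=lambda t: my_activ(t, p1=0.1, p2=1.0))(inputs)
model = Model(inputs, outputs)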

If you mean you want to implement your own derivative:

If your activation function is written in TensorFlow/Keras functions whose operations are differentiable (e.g. K.dot(), tf.matmul(), tf.concat(), etc.), then the derivatives are obtained by automatic differentiation (https://en.wikipedia.org/wiki/Automatic_differentiation). In that case you don't need to write your own derivative.
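As a small sketch of that case (graph-mode Keras backend; the activation body is again a hypothetical leaky ReLU):

from keras import backend as K

def my_activ(x, p1=0.1):
    # built only from differentiable backend ops
    return K.maximum(x, p1 * x)

x = K.placeholder(shape=(None, 1))
y = my_activ(x)
# the gradient is derived symbolically; no hand-written derivative is needed
grads = K.gradients(y, [x])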

If you still want to write the derivative yourself, check this document: https://www.tensorflow.org/extend/adding_an_op, where you need to register your gradients using tf.RegisterGradient.
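Alternatively, here is a minimal sketch of a fully custom derivative using tf.custom_gradient (a simpler mechanism than registering a gradient for a new op); the leaky-ReLU-style formula and the shared parameter p1 are assumptions for illustration:

import tensorflow as tf

def make_my_activ(p1):
    # p1 is shared between the forward pass and the hand-written derivative
    @tf.custom_gradient
    def my_activ(x):
        y = tf.where(x > 0.0, x, p1 * x)  # forward pass
        def grad(dy):
            # hand-written derivative: 1 where x > 0, p1 elsewhere
            return dy * tf.where(x > 0.0, tf.ones_like(x), p1 * tf.ones_like(x))
        return y, grad
    return my_activ

This returns a one-argument callable, so it can be passed straight to a layer, e.g. Dense(128, activation=make_my_activ(0.1)).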



Source: https://stackoverflow.com/questions/51754639/how-to-define-the-derivative-of-a-custom-activation-function-in-keras
