activation-function

How to implement RBF activation function in Keras?

Submitted by 时光怂恿深爱的人放手 on 2021-01-27 06:47:48
Question: I am creating a customized activation function, an RBF activation function in particular:

from keras import backend as K
from keras.layers import Lambda

l2_norm = lambda a, b: K.sqrt(K.sum(K.pow((a - b), 2), axis=0, keepdims=True))

def rbf2(x):
    X = # here I need the inputs that I receive from the previous layer
    Y = # here I need the weights that I should apply for this layer
    l2 = l2_norm(X, Y)
    res = K.exp(-1 * gamma * K.pow(l2, 2))
    return res

The function rbf2 receives the previous layer as input: #some keras
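A minimal sketch of the usual way to handle this, assuming the goal is an RBF layer whose centers are trainable weights: because the computation needs its own weights, it is typically written as a custom Keras Layer rather than a plain activation function. The names RBFLayer, units and gamma below are illustrative, not taken from the question.

from keras import backend as K
from keras.layers import Layer

class RBFLayer(Layer):
    def __init__(self, units, gamma, **kwargs):
        super(RBFLayer, self).__init__(**kwargs)
        self.units = units
        self.gamma = gamma

    def build(self, input_shape):
        # one trainable center per output unit
        self.centers = self.add_weight(name='centers',
                                       shape=(self.units, input_shape[1]),
                                       initializer='uniform',
                                       trainable=True)
        super(RBFLayer, self).build(input_shape)

    def call(self, inputs):
        # squared Euclidean distance between each input and each center
        diff = K.expand_dims(inputs, axis=1) - self.centers
        l2 = K.sum(K.pow(diff, 2), axis=-1)
        return K.exp(-self.gamma * l2)

    def compute_output_shape(self, input_shape):
        return (input_shape[0], self.units)

Such a layer would then be used like any other, e.g. model.add(RBFLayer(10, gamma=0.5)) after a Dense layer.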

How to implement the derivative of Leaky Relu in python?

Submitted by 半腔热情 on 2021-01-27 04:50:27
Question: How would I implement the derivative of Leaky ReLU in Python without using TensorFlow? Is there a better way than this? I want the function to return a numpy array.

def dlrelu(x, alpha=.01):
    # return alpha if x < 0 else 1
    return np.array([1 if i >= 0 else alpha for i in x])

Thanks in advance for the help.

Answer 1: The method you use works, but strictly speaking you are computing the derivative with respect to the loss, or lower layer, so it might be wise to also pass the value from lower layer to
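A vectorized sketch of the same derivative (an illustration, not the rest of the quoted answer), which avoids the Python list comprehension and also works for multi-dimensional arrays:

import numpy as np

def dlrelu(x, alpha=0.01):
    # derivative of Leaky ReLU: 1 where x >= 0, alpha elsewhere
    return np.where(x >= 0, 1.0, alpha)

print(dlrelu(np.array([-2.0, -0.5, 0.0, 3.0])))  # [0.01 0.01 1.   1.  ]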

How do you write a custom activation function in python for Keras?

Submitted by 天涯浪子 on 2020-06-01 07:32:26
Question: I'm trying to write a custom activation function for use with Keras. I cannot write it with TensorFlow primitives, as TensorFlow then does not properly compute the derivative. I followed How to make a custom activation function with only Python in Tensorflow? and it works very well for creating a TensorFlow function. However, when I tried putting it into Keras as an activation function for the classic MNIST demo, I got errors. I also tried the tf_spiky function from the above reference. Here is the sample code
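For reference, the standard wiring for an activation that can be expressed with backend primitives (so autodiff handles the derivative) looks roughly like the sketch below; the name custom_act and the layer sizes are illustrative. An op wrapped with py_func and a hand-written gradient, as in the tf_spiky example, would need the same registration step once the op itself works.

from keras import backend as K
from keras.layers import Dense
from keras.models import Sequential
from keras.utils.generic_utils import get_custom_objects

def custom_act(x):
    # placeholder activation built from backend ops; autodiff handles the gradient
    return K.sigmoid(x) * x

# register the function so layers can refer to it by name
get_custom_objects().update({'custom_act': custom_act})

model = Sequential([
    Dense(128, activation='custom_act', input_shape=(784,)),
    Dense(10, activation='softmax'),
])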

Custom activation with parameter

Submitted by 落花浮王杯 on 2020-05-27 07:24:46
Question: I'm trying to create an activation function in Keras that can take in a parameter beta, like so:

from keras import backend as K
from keras.utils.generic_utils import get_custom_objects
from keras.layers import Activation

class Swish(Activation):
    def __init__(self, activation, beta, **kwargs):
        super(Swish, self).__init__(activation, **kwargs)
        self.__name__ = 'swish'
        self.beta = beta

def swish(x):
    return (K.sigmoid(beta*x) * x)

get_custom_objects().update({'swish': Swish(swish, beta=1.)})

It
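One way to make beta actually visible to the function (a sketch of a common fix, not necessarily the answer given in this thread) is to bind it with a closure instead of relying on a global name:

from keras import backend as K
from keras.layers import Activation
from keras.utils.generic_utils import get_custom_objects

class Swish(Activation):
    def __init__(self, activation, **kwargs):
        super(Swish, self).__init__(activation, **kwargs)
        self.__name__ = 'swish'

def make_swish(beta=1.0):
    # beta is captured by the closure, so the activation sees it at call time
    def swish(x):
        return K.sigmoid(beta * x) * x
    return swish

get_custom_objects().update({'swish': Swish(make_swish(beta=1.5))})
# layers can then use activation='swish' as usual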

In simple multi-layer FFNN only ReLU activation function doesn't converge

Submitted by 允我心安 on 2020-01-11 13:43:28
Question: I'm learning TensorFlow and deep learning and experimenting with various kinds of activation functions. I created a multi-layer FFNN for the MNIST problem, mostly based on the tutorial from the official TensorFlow website, except that 3 hidden layers were added. The activation functions I have experimented with are: tf.sigmoid, tf.nn.tanh, tf.nn.softsign, tf.nn.softmax, tf.nn.relu. Only tf.nn.relu doesn't converge; the network outputs random noise (testing accuracy is about 10%). The following are my
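A common cause of exactly this symptom (a hedged suggestion, not the confirmed diagnosis for this post) is weight initialization that is too large for ReLU: units die or saturate and accuracy stays at chance. With the TF1-style API used in the question, He-style initialization for the hidden layers often fixes it, for example:

import tensorflow as tf

fan_in, fan_out = 784, 256  # illustrative layer sizes

# He initialization: stddev = sqrt(2 / fan_in), a good default for ReLU
W = tf.Variable(tf.truncated_normal([fan_in, fan_out],
                                    stddev=(2.0 / fan_in) ** 0.5))
b = tf.Variable(tf.zeros([fan_out]))

def hidden(x):
    # ReLU hidden layer with He-initialized weights
    return tf.nn.relu(tf.matmul(x, W) + b)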

Activation function for output layer for regression models in Neural Networks

Submitted by 泄露秘密 on 2019-12-30 07:53:16
Question: I have been experimenting with neural networks these days. I have come across a general question regarding the activation function to use. This might be a well-known fact, but I couldn't understand it properly. A lot of the examples and papers I have seen work on classification problems, and they use either sigmoid (in the binary case) or softmax (in the multi-class case) as the activation function in the output layer, which makes sense. But I haven't seen any activation function used in the
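For context (a general note, not quoted from this thread): regression targets are usually unbounded, so the output layer conventionally uses a linear, i.e. identity, activation paired with a loss such as mean squared error. A minimal Keras sketch with placeholder layer sizes:

from keras.models import Sequential
from keras.layers import Dense

model = Sequential([
    Dense(64, activation='relu', input_dim=10),
    Dense(64, activation='relu'),
    Dense(1, activation='linear'),  # identity output for an unbounded regression target
])
model.compile(optimizer='adam', loss='mse')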