How to implement RBF activation function in Keras?


Question


I am creating a customized activation function, in particular an RBF activation function:

from keras import backend as K
from keras.layers import Lambda

l2_norm = lambda a, b: K.sqrt(K.sum(K.pow((a - b), 2), axis=0, keepdims=True))

def rbf2(x):
    X = # here I need the inputs that I receive from the previous layer
    Y = # here I need the weights that I should apply for this layer
    l2 = l2_norm(X, Y)
    res = K.exp(-1 * gamma * K.pow(l2, 2))
    return res

The function rbf2 receives the output of the previous layer as its input:

#some keras layers
model.add(Dense(84, activation='tanh')) #layer1
model.add(Dense(10, activation = rbf2)) #layer2

What should I do to get the inputs from layer1 and the weights of layer2 in order to create this customized activation function?
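
(As far as I can tell, a custom activation in Keras receives a single tensor argument, the pre-activation value that Dense has already computed, so neither the layer's inputs nor its weights are visible inside it. A minimal illustration, with a made-up function name:)

from keras import backend as K

# a Keras activation only sees x = dot(inputs, kernel) + bias,
# which Dense has already computed before calling the activation
def my_activation(x):
    return K.tanh(x)  # no handle on the layer's inputs or weights here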

What I am actually trying to do is implement the output layer of the LeNet-5 neural network. The output layer of LeNet-5 is a bit special: instead of computing the dot product of the inputs and the weight vector, each neuron outputs the square of the Euclidean distance between its input vector and its weight vector.

For example, layer1 has 84 neurons and layer2 has 10 neurons. Normally, to calculate the output of each of the 10 neurons of layer2, we take the dot product of the 84 outputs of layer1 with the 84 weights connecting layer1 to layer2, and then apply the softmax activation function over the 10 results.

But here, instead of taking the dot product, each neuron of layer2 outputs the square of the Euclidean distance between its input vector and its weight vector (and I want to use this as my activation function).
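
In other words, writing x for the 84-dimensional input vector and w_j for the weight vector of output neuron j, each of the 10 neurons in layer2 computes

output_j = ||x - w_j||^2 = sum_{i=1}^{84} (x_i - w_{ij})^2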

Any help on creating an RBF activation function (one that computes the Euclidean distance between the inputs the layer receives and its weights) and using it in a layer would also be appreciated.


Answer 1:


You can simply define a custom layer for this purpose:

from keras.layers import Layer
from keras import backend as K

class RBFLayer(Layer):
    def __init__(self, units, gamma, **kwargs):
        super(RBFLayer, self).__init__(**kwargs)
        self.units = units                    # number of RBF neurons
        self.gamma = K.cast_to_floatx(gamma)  # width of the Gaussian

    def build(self, input_shape):
        # one trainable center per unit: shape (input_dim, units)
        self.mu = self.add_weight(name='mu',
                                  shape=(int(input_shape[1]), self.units),
                                  initializer='uniform',
                                  trainable=True)
        super(RBFLayer, self).build(input_shape)

    def call(self, inputs):
        # (batch, input_dim, 1) - (input_dim, units) -> (batch, input_dim, units)
        diff = K.expand_dims(inputs) - self.mu
        # squared Euclidean distance to each center: (batch, units)
        l2 = K.sum(K.pow(diff, 2), axis=1)
        res = K.exp(-1 * self.gamma * l2)  # Gaussian RBF
        return res

    def compute_output_shape(self, input_shape):
        return (input_shape[0], self.units)

Example usage:

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(20, input_shape=(100,)))
model.add(RBFLayer(10, 0.5))
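
And wiring it into the LeNet-5-style tail from the question (a sketch, using the imports above; the input_shape of 120 and the gamma value of 0.5 are placeholder assumptions):

model = Sequential()
model.add(Dense(84, activation='tanh', input_shape=(120,)))  # layer1
model.add(RBFLayer(10, gamma=0.5))                           # layer2: one RBF unit per class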



Answer 2:


There is no need to reinvent the wheel here. A custom RBF layer for Keras already exists.



Source: https://stackoverflow.com/questions/53855941/how-to-implement-rbf-activation-function-in-keras
