Keras Functional API and activations

Submitted by 徘徊边缘 on 2021-01-29 09:20:31

Question


I'm having problems when trying to use activations with the Keras Functional API. My initial goal was to have a choice between relu and leaky relu, so I came up with the following piece of code:

def activation(x, activation_type):
    if activation_type == 'leaky_relu':
        return activations.relu(x, alpha=0.3)
    else:
        return activations.get(activation_type)(x)


# building the model

inputs = keras.Input(input_shape, dtype='float32')
x = Conv2D(filters, (3, 3), padding='same')(inputs)
x = activation(x, 'relu')

but something like this gives an error: AttributeError: 'Tensor' object has no attribute '_keras_history'. I found out that this usually indicates that the inputs and outputs of the Model are not connected.

Is keras.advanced_activations the only way to achieve functionality like this in the functional API?

EDIT: here's the version of activation function that worked:

    def activation(self, x):
        if self.activation_type == 'leaky_relu':
            act = lambda x: activations.relu(x, alpha=0.3)
        else:
            act = activations.get(self.activation_type)
        return layers.Activation(act)(x)
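To show the working pattern end to end, here is a minimal runnable sketch (the Dense sizes and the final model are illustrative assumptions, not part of the question): wrapping the chosen activation function in a layers.Activation layer keeps the Keras graph connected, so Model() can trace from inputs to outputs without the _keras_history error.

```python
from tensorflow import keras
from tensorflow.keras import layers, activations

def activation(x, activation_type):
    if activation_type == 'leaky_relu':
        # alpha=0.3 as in the question; note Keras 3 renamed this
        # parameter of activations.relu to negative_slope
        act = lambda t: activations.relu(t, alpha=0.3)
    else:
        act = activations.get(activation_type)
    # Wrapping in a layers.Activation layer is what makes this a real
    # Keras layer in the graph, instead of a bare tensor operation
    return layers.Activation(act)(x)

inputs = keras.Input((8,), dtype='float32')
x = layers.Dense(16)(inputs)
x = activation(x, 'relu')
outputs = layers.Dense(1)(x)
model = keras.Model(inputs, outputs)  # connects cleanly, no _keras_history error
```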

Answer 1:


You want to add an activation to your model by means of an activation layer. Currently, you are adding an object that is not a Keras Layer, which is what causes your error. (In Keras, layer names always start with a capital letter.) Try something like this (minimal example):

from keras.layers import Input, Dense, Activation
from keras import activations

def activation(x, activation_type):
    if activation_type == 'leaky_relu':
        return activations.relu(x, alpha=0.3)
    else:
        return activations.get(activation_type)(x)


# building the model
inputs = Input((5,), dtype='float32')
x = Dense(128)(inputs)
# Wrap inside an Activation layer
x = Activation(lambda x: activation(x, 'sigmoid'))(x)
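As an alternative (my assumption, not part of the original answer), tf.keras also ships a dedicated LeakyReLU layer, historically found under keras.layers.advanced_activations, which avoids the lambda wrapper entirely for the leaky-relu case:

```python
from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input((5,), dtype='float32')
x = layers.Dense(128)(inputs)
# First positional argument is the negative slope
# (named alpha in Keras 2, negative_slope in Keras 3)
x = layers.LeakyReLU(0.3)(x)
model = keras.Model(inputs, x)
```

This is usually the simplest route when you only need leaky relu rather than a fully generic activation switch.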


Source: https://stackoverflow.com/questions/51948878/keras-functional-api-and-activations
