How to set the input of a Keras layer of a functional model with a TensorFlow tensor?


Question


I have two packages I'd like to use: one is written in Keras 1.2, the other in TensorFlow. I'd like to use part of an architecture built in TensorFlow inside a Keras model.

A partial solution is suggested here, but it's for a sequential model. The suggestion regarding functional models - wrapping the pre-processing in a Lambda layer - didn't work.

The following code worked:

from keras.layers import Input, Lambda, Flatten
from keras.models import Model

inp = Input(shape=input_shape)

def ID(x):
    # identity pass-through, just to have a Lambda layer in the graph
    return x

lam = Lambda(ID)
flatten = Flatten(name='flatten')
output = flatten(lam(inp))
Model(input=[inp], output=output)

But when I replace flatten(lam(inp)) with a pre-processed output tensor, flatten(lam(TF_processed_layer)), I get: "Output tensors to a Model must be Keras tensors. Found: Tensor("Reshape:0", shape=(?, ?), dtype=float32)"
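To make the failure concrete, here is a minimal sketch of the pattern I mean; the shape and the tf.reshape call are only stand-ins for the real pre-processing:

import tensorflow as tf
from keras.layers import Input, Lambda, Flatten
from keras.models import Model

input_shape = (4, 4)                      # placeholder shape
inp = Input(shape=input_shape)

# Stand-in for the TensorFlow pre-processing applied to the Keras input tensor
TF_processed_layer = tf.reshape(inp, (-1, 16))

lam = Lambda(lambda x: x)
flatten = Flatten(name='flatten')
output = flatten(lam(TF_processed_layer))

# Constructing the model is what raises:
# "Output tensors to a Model must be Keras tensors. Found: Tensor("Reshape:0", ...)"
Model(input=[inp], output=output)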


Answer 1:


You could try wrapping your input tensor in a Keras Input layer and carry on building your model from there, like so:

from keras.layers import Input, Lambda, Flatten
from keras.models import Model

# tftensor is the pre-processed TensorFlow tensor
inp = Input(tensor=tftensor, shape=input_shape)

def ID(x):
    return x

lam = Lambda(ID)
flatten = Flatten(name='flatten')
output = flatten(lam(inp))
Model(input=inp, output=output)
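A fuller sketch of that idea, assuming the TensorFlow side produces a tensor with a known shape; the placeholder, the rgb_to_grayscale step, and the layer sizes are purely illustrative:

import tensorflow as tf
from keras.layers import Input, Lambda, Flatten, Dense
from keras.models import Model

# Hypothetical TensorFlow pre-processing graph, standing in for the real package
raw = tf.placeholder(tf.float32, shape=(None, 8, 8, 3))
tf_tensor = tf.image.rgb_to_grayscale(raw)        # shape (None, 8, 8, 1)

# Wrap the TF tensor so Keras tracks it as a model input
inp = Input(tensor=tf_tensor, shape=(8, 8, 1))

x = Lambda(lambda t: t)(inp)                      # identity Lambda, as in the question
x = Flatten(name='flatten')(x)
output = Dense(10, activation='softmax')(x)

model = Model(input=inp, output=output)
model.summary()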



Answer 2:


You are not defining your Lambda correctly for Keras. Try something like this:

from keras import backend as K
from keras.layers import Lambda

def your_lambda_layer(x):
    # center each sample, then L2-normalize along the feature axis
    x -= K.mean(x, axis=1, keepdims=True)
    x = K.l2_normalize(x, axis=1)
    return x

....
model.add(Lambda(your_lambda_layer))

Or, since you are using the functional API, like this:

def your_lambda_layer(x):
    x -= K.mean(x, axis=1, keepdims=True)
    x = K.l2_normalize(x, axis=1)
    return x

....
x = SomeLayerBeforeLambda(options...)(x)
x = Lambda(your_lambda_layer)(x)

But even so, the Lambda layer's output may not be flattenable, so print the shape of the Lambda output and take a look at what it actually is.
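One way to inspect that shape, assuming the functional layout above; the input shape here is a placeholder:

from keras import backend as K
from keras.layers import Input, Lambda

def your_lambda_layer(x):
    x -= K.mean(x, axis=1, keepdims=True)
    x = K.l2_normalize(x, axis=1)
    return x

inp = Input(shape=(16,))                  # placeholder shape
lam_out = Lambda(your_lambda_layer)(inp)

# Static shape of the Lambda output as Keras sees it, e.g. (None, 16)
print(K.int_shape(lam_out))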



Source: https://stackoverflow.com/questions/44151823/how-to-set-the-input-of-a-keras-layer-of-a-functional-model-with-a-tensorflow-t
