Binarize tensor in Keras

Submitted by 落花浮王杯 on 2020-01-04 05:36:10

Question


I need to create a loss function for Keras that works only with binary values. I wanted to transform all values greater than 0.5 to 1.0, so I did this:

def MyLoss(y_true, y_pred):
    y_true_f = K.flatten(y_true)
    y_pred_f = K.flatten(K.cast(K.greater(y_pred, 0.5), 'float32'))
    #y_pred_f = K.flatten(K.cast(y_pred > 0.5), 'float32')
    #y_pred_f = K.flatten(y_pred > 0.5)
    return K.sum(y_true_f * y_pred_f)

The code compiles, but later it generates the following error:

ValueError: None values not supported.

I also tried the commented-out lines, with the same error. If I don't modify the values and simply use y_pred_f = K.flatten(y_pred), it runs.

What am I doing wrong?

How can I binarize a tensor?


Answer 1:


The solution for binarizing my logistic Dense layer was to use a custom lambda function as the activation. (I am working on a semantic hashing autoencoder (Hinton).) Keras throws a warning, but it proved to work anyway. Earlier attempts threw the error because the round function cannot be differentiated during the backpropagation phase's computation of the gradient. (It was the old ValueError: None values not supported.) Somehow, doing it in the activation instead of as a separate layer was the key here.

encoder_outputs = Dense(
    units=latent_vector_len,
    activation=k.layers.Lambda(
        lambda z: k.backend.round(k.layers.activations.sigmoid(x=z))
    ),
    kernel_initializer="lecun_normal",
)(x)

The real-valued outputs, normally in the range 0 to 1, were transformed into 0s and 1s as shown:

# Look, it works!

y = encoder_model.predict(x=x_in)
print(y)
>>> [[1. 0. 0. 1. 0. 1. 0. 0.]]

In other words, this way did not work:

decoder_outputs_bin = k.layers.Lambda(lambda z: k.backend.round(z))(decoder_outputs) # ERR at training time ValueError: None values not supported.
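To see why the separate Lambda layer fails at training time, one can inspect the gradient of round directly. A minimal sketch (not part of the original answer, assuming TensorFlow 2.x with its bundled Keras):

```python
import tensorflow as tf

x = tf.Variable([-1.0, 0.7, 0.9])
with tf.GradientTape() as tape:
    # Forward pass works fine and produces hard 0/1 values.
    y = tf.round(tf.sigmoid(x))  # [0., 1., 1.]

# Round has no registered gradient, so backprop yields None --
# the same "None values not supported" error seen at training time.
print(tape.gradient(y, x))  # None
```

So binarizing ops like round, greater, and cast are perfectly valid in a forward pass; the error only appears once the optimizer tries to differentiate through them.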


Source: https://stackoverflow.com/questions/48795910/binarize-tensor-in-keras
