Question:
I want to clip the regression output values. How can I do that?
I tried this:
from keras import backend as K
from keras.layers.core import Lambda
...
model.add(Dense(1))
model.add(Activation('linear'))
model.add(Lambda(lambda x: K.clip(x, min_value=200, max_value=1000)))
But no matter where I put the Lambda+clip, it does not seem to affect anything.
Answer 1:
It actually has to be implemented as a loss, at the model.compile step.
from keras import backend as K

def clipped_mse(y_true, y_pred):
    # Clip both targets and predictions to [0, 1900] before taking the MSE
    return K.mean(K.square(K.clip(y_pred, 0., 1900.) - K.clip(y_true, 0., 1900.)), axis=-1)

model.compile(loss=clipped_mse)
EDIT: Actually, in hindsight I think this might not be the right approach. It means we add no penalty at all for predictions that go too high, which is in a way the opposite of what we want.
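To see why, here is a NumPy sketch of the same clipped MSE (NumPy stands in for the Keras backend; the 0–1900 bounds are the ones from the answer). Once both the target and the prediction exceed the upper bound, clipping maps them to the same value and the loss for that element becomes zero, so wildly overshooting predictions go unpunished:

```python
import numpy as np

def clipped_mse(y_true, y_pred, min_value=0.0, max_value=1900.0):
    # Clip both arrays to [min_value, max_value], then take the mean squared error
    y_true_c = np.clip(y_true, min_value, max_value)
    y_pred_c = np.clip(y_pred, min_value, max_value)
    return np.mean(np.square(y_pred_c - y_true_c), axis=-1)

y_true = np.array([100.0, 2500.0])  # 2500 gets clipped to 1900
y_pred = np.array([100.0, 9000.0])  # 9000 also gets clipped to 1900
# Both out-of-range values collapse to 1900, so the error contribution vanishes
print(clipped_mse(y_true, y_pred))  # → 0.0
```

If the goal is to constrain the model's outputs rather than the loss, clipping the output (as in the question's Lambda layer) or using a bounded activation is the more direct route.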
Source: https://stackoverflow.com/questions/43099233/keras-regression-clip-values