Keras regression clip values

Submitted by 纵饮孤独 on 2020-01-11 09:25:10

Question


I want to clip the output values of my regression model. How can I do that?

I tried using this:

from keras import backend as K
from keras.layers.core import Lambda

...
model.add(Dense(1))
model.add(Activation('linear'))
model.add(Lambda(lambda x: K.clip(x, min_value=200, max_value=1000)))

But no matter where I put the Lambda with clip, it does not seem to affect anything.
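For reference, the Lambda above only applies elementwise clipping to the layer's forward output; a minimal numpy sketch of that operation (the values here are hypothetical):

```python
import numpy as np

# What Lambda(lambda x: K.clip(x, 200, 1000)) does to raw outputs:
# values below 200 are raised to 200, values above 1000 are lowered to 1000.
raw_outputs = np.array([-50.0, 500.0, 1500.0])
clipped = np.clip(raw_outputs, 200.0, 1000.0)
print(clipped.tolist())  # [200.0, 500.0, 1000.0]
```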


Answer 1:


It actually has to be implemented as a loss, at the model.compile step.

from keras import backend as K

def clipped_mse(y_true, y_pred):
    return K.mean(K.square(K.clip(y_pred, 0., 1900.) - K.clip(y_true, 0., 1900.)), axis=-1)

model.compile(optimizer='rmsprop', loss=clipped_mse)  # compile requires an optimizer

EDIT: Actually, in hindsight I think this might not be the right approach. It means we add no penalty for predictions that overshoot the upper bound too far - in a way the opposite of what we want.
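The concern in the EDIT can be checked numerically. A numpy sketch of what the clipped_mse loss above computes (the helper and values are hypothetical):

```python
import numpy as np

def clipped_mse(y_true, y_pred, lo=0.0, hi=1900.0):
    # numpy equivalent of the Keras loss above: clip both tensors
    # into [lo, hi], then take the mean squared error
    y_true_c = np.clip(y_true, lo, hi)
    y_pred_c = np.clip(y_pred, lo, hi)
    return np.mean(np.square(y_pred_c - y_true_c), axis=-1)

# A prediction of 2500 is clipped to 1900 before comparison, so
# overshooting the upper bound incurs zero penalty:
print(clipped_mse(np.array([1900.0]), np.array([2500.0])))  # 0.0
# while an undershoot of 100 inside the range is penalized normally:
print(clipped_mse(np.array([1900.0]), np.array([1800.0])))  # 10000.0
```

This shows why the loss is flat beyond the clip boundaries: gradients there are zero, so training cannot discourage extreme predictions.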



Source: https://stackoverflow.com/questions/43099233/keras-regression-clip-values
