Custom loss function implementation issue in keras


This happens because your function is not differentiable: argmax produces constant (integer) outputs, so the gradient is zero wherever it is defined and nothing can flow back to the weights.

There is simply no solution for this if you want argmax as the result.
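
You can verify this directly: differentiating through an argmax yields no gradient at all. A minimal sketch, assuming TensorFlow 2.x (the logits here are made up for illustration):

    import tensorflow as tf

    # Hypothetical logits for a single sample with 3 classes.
    logits = tf.Variable([[0.1, 2.0, 0.3]])

    with tf.GradientTape() as tape:
        # Cast so the target is a float tensor; argmax itself returns integers.
        preds = tf.cast(tf.argmax(logits, axis=-1), tf.float32)

    # Prints None: no gradient flows back through argmax.
    print(tape.gradient(preds, logits))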


An approach to test

Since you're using "softmax", only one class is correct at a time (the classes are mutually exclusive).

And since you want index differences, maybe you could work with a single continuous result (continuous values are differentiable).

Work with only one output ranging from -0.5 to 9.5, and take the classes by rounding the result.

That way, you can have the last layer with only one unit:

    lastLayer = Dense(1, activation='sigmoid')  # or another layer type if it's not dense

And change the range with a lambda layer:

lambdaLayer = Lambda(lambda x: 10*x - 0.5)

Now your loss can be a simple 'mae' (mean absolute error).
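
Putting the pieces together, a minimal end-to-end sketch might look like this (the input shape, hidden layer, and the 10-class range are assumptions for illustration; adapt them to your model):

    import numpy as np
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Dense, Lambda

    # Assumed setup: 20 input features, 10 ordered classes (0..9).
    model = Sequential([
        Dense(32, activation='relu', input_shape=(20,)),
        Dense(1, activation='sigmoid'),     # single continuous output in [0, 1]
        Lambda(lambda x: 10 * x - 0.5),     # rescale to [-0.5, 9.5]
    ])
    model.compile(optimizer='adam', loss='mae')

    # Train directly against the integer class labels (values 0..9):
    # model.fit(x_train, y_train, epochs=10)

    # At prediction time, recover the class by rounding:
    x = np.random.rand(5, 20).astype('float32')
    classes = np.clip(np.rint(model.predict(x)), 0, 9).astype(int).squeeze()

Because the targets are the raw class indices, 'mae' directly penalizes the index difference, which is exactly what you wanted.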

The downside of this approach is that the 'sigmoid' activation is not evenly distributed across the classes: it saturates near 0 and 1, so the extreme classes (0 and 9) are harder to reach than the middle ones, and some classes will end up more probable than others. But since it's important to keep the output bounded, this still seems like the best first idea.

This will only work if your classes follow a logical increasing sequence. (I guess they do, otherwise you wouldn't be trying that kind of loss, right?)
