loss, val_loss, acc and val_acc do not update at all over epochs

99封情书 · Submitted on 2019-12-05 01:35:31

The softmax activation ensures that the outputs sum to 1. It is useful when exactly one class among many should be predicted.

Since you have only 1 output unit (a single class), softmax is certainly a bad idea: softmax over a single value is always 1, so the model outputs 1 for every sample and the loss and metrics can never change.
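A quick way to see why: a sketch in plain Python comparing softmax over a single logit with sigmoid over the same logit (the logit values below are arbitrary illustrations):

```python
import math

def softmax(logits):
    # Subtract the max for numerical stability
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# With a single output unit, softmax is always exactly 1,
# regardless of the logit -- the network cannot express anything.
for z in (-3.0, 0.0, 5.0):
    print(softmax([z])[0])          # always 1.0

# Sigmoid maps the same logits to distinct probabilities.
for z in (-3.0, 0.0, 5.0):
    print(round(sigmoid(z), 4))     # 0.0474, 0.5, 0.9933
```

Because the softmax output is constant at 1, the gradient of the loss with respect to the logit is zero, which is why nothing updates across epochs.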

Use sigmoid instead; it pairs naturally with binary_crossentropy.
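A minimal sketch of what the output layer and compile step could look like with the Keras Sequential API (the hidden-layer size and input shape here are hypothetical placeholders, not from the question):

```python
from tensorflow import keras

model = keras.Sequential([
    # Hypothetical hidden layer; input_shape is a placeholder
    keras.layers.Dense(16, activation="relu", input_shape=(8,)),
    # One output unit with sigmoid, not softmax
    keras.layers.Dense(1, activation="sigmoid"),
])

# binary_crossentropy matches a single sigmoid output
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
```

Alternatively, you could keep softmax by using 2 output units and one-hot labels with categorical_crossentropy, but a single sigmoid unit is the simpler fix for binary classification.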
