Neural Network using Softmax with strange outputs


Your cost function applies a softmax on top of your model's output, which already ends in a softmax. You should remove the one in the loss function. Apart from that, your code seems fine. A few things to check: Are you sure the topology (dropout rate, number of layers, number of neurons per layer) is the same for both of your models? Are you sure you didn't swap the order of your classes? What are the loss and validation loss metrics after both trainings? A sketch of the softmax fix follows below.
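For reference, here is a minimal sketch of the fix in Keras/TensorFlow (an assumption, since the question's framework isn't shown; layer sizes and the input shape are hypothetical). The point is that either the model or the loss applies the softmax, never both.

```python
import tensorflow as tf

num_classes = 10  # hypothetical

# Option A: the model outputs raw logits (no activation on the last layer) ...
model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),  # assumed input shape
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(num_classes),  # no softmax here
])

# ... and the loss applies the softmax internally.
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.CategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)

# Option B: keep softmax as the final activation in the model and use the
# default from_logits=False in the loss. Applying softmax in the model AND
# again inside the loss squashes the outputs toward a near-uniform
# distribution, which matches the "strange outputs" described.
```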
