Keras train_on_batch loss/accuracy 0

Submitted by 喜你入骨 on 2019-12-04 19:03:47

Question


I am using a big dataset, so I'm trying to train with train_on_batch (or fit with nb_epoch = 1):

from keras.models import Sequential
from keras.layers import LSTM, Dense, Activation

model = Sequential()
model.add(LSTM(size, input_shape=input_shape, return_sequences=False))
model.add(Dense(output_dim))
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=["accuracy"])

for e in range(nb_epoch):
    for batch_X, batch_y in batches:
        model.train_on_batch(batch_X, batch_y)
        # or
        # model.fit(batch_X, batch_y, batch_size=batch_size, nb_epoch=1, verbose=1, shuffle=True)

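(The `batches` iterable is not shown in the question; a minimal sketch of what such a generator might look like, where the helper name and the dummy shapes are assumptions rather than anything from the original post:)

```python
import numpy as np

def iterate_batches(X, y, batch_size):
    """Yield successive (batch_X, batch_y) slices of the full arrays.
    A hypothetical stand-in for the `batches` iterable above."""
    for start in range(0, len(X), batch_size):
        yield X[start:start + batch_size], y[start:start + batch_size]

# Dummy data shaped (samples, timesteps, features) and one-hot labels:
X = np.zeros((300, 10, 4))
y = np.zeros((300, 3))

# The last batch is smaller when the dataset size is not a
# multiple of batch_size: sizes here are 128, 128, 44.
sizes = [bx.shape[0] for bx, _ in iterate_batches(X, y, 128)]
```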
But when training starts, this happens:

(0, 128)
Epoch 1/1
128/128 [==============================] - 2s - loss: 0.3262 - acc: 0.1130

(129, 257)
Epoch 1/1
128/128 [==============================] - 2s - loss: -0.0000e+00 - acc: 0.0000e+00

It doesn't matter how many epochs I wait; it doesn't change. Even if I change the batch size, the same thing happens: the first batch has good values, and then it just goes to "loss: -0.0000e+00 - acc: 0.0000e+00" again.
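(One plausible mechanism for this exact symptom, offered as an assumption since the post does not show how `output_dim` or the labels are built: if `output_dim` is 1, a softmax over a single unit always outputs exactly 1, so the categorical cross-entropy -Σ y·log(p) is identically zero regardless of the input. A quick numpy check:)

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(z - z.max())
    return e / e.sum()

# A softmax over a single logit is always exactly 1.0,
# no matter what the logit's value is.
p = softmax(np.array([3.7]))

# Categorical cross-entropy against the (only possible) label y = [1]
# is then -1 * log(1.0) = 0.0.
loss = -np.sum(np.array([1.0]) * np.log(p))
```

If that is what is happening, the usual fixes are to one-hot encode the labels (e.g. with `keras.utils.to_categorical`) so that `output_dim` equals the number of classes, or, for a two-class problem, to use a single sigmoid output with `binary_crossentropy` instead.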

Can someone maybe help in understanding what's happening here?

Thanks in advance

Source: https://stackoverflow.com/questions/37543132/keras-train-on-batch-loss-accuracy-0
