Question
Is my model overfitting? I would be sure it was overfitting if the validation loss increased sharply while the training loss decreased. However, the validation loss is nearly stable, so I am not sure. Can you please help?
Answer 1:
- I assume that you're using different hyperparameters? Perhaps save the parameters and resume with a different set of hyperparameters. This comment really depends on how you're doing hyperparameter optimization.
- Try different training/test splits. The result might be idiosyncratic, especially with so few epochs.
- Depending on how costly it is to train and evaluate the model, consider bagging your models, akin to how a random forest operates. In other words, fit your model to many different train/test splits and average the model outputs, either as a majority classification vote or as an average of the predicted probabilities. In this case, I'd err on the side of a slightly overfit model, because averaging can mitigate overfitting. But I wouldn't train to death either, unless you're going to fit very many neural nets and somehow ensure that you're decorrelating them, akin to the method of random subspaces from random forests.
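The bagging scheme described above can be sketched as follows. Note this is a minimal illustration, not the asker's setup: the toy dataset and the nearest-centroid "model" are placeholders standing in for an actual neural network, and only the resample-fit-average loop is the point.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_model(X, y):
    # Toy stand-in for "train a model": per-class mean (nearest centroid).
    classes = np.unique(y)
    return classes, np.stack([X[y == c].mean(axis=0) for c in classes])

def predict_proba(model, X):
    # Softmax over negative distances as a stand-in for class probabilities.
    classes, centroids = model
    d = -np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    e = np.exp(d - d.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Toy data: label depends on the sign of x0 + x1.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Bagging: fit the model on many bootstrap resamples,
# then average the predicted probabilities (soft vote).
probs = []
for _ in range(10):
    idx = rng.choice(len(X), size=len(X), replace=True)  # bootstrap sample
    model = fit_model(X[idx], y[idx])
    probs.append(predict_proba(model, X))

avg_proba = np.mean(probs, axis=0)        # averaged class probabilities
ensemble_pred = avg_proba.argmax(axis=1)  # or take a majority vote on hard labels
```

The same loop applies unchanged to a real network: replace `fit_model`/`predict_proba` with training and inference on each resample, and keep the final averaging step.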
Source: https://stackoverflow.com/questions/57741135/decreasing-training-loss-stable-validation-loss-is-the-model-overfitting