Training Loss and Validation Loss in Deep Learning

时光取名叫无心 2021-02-03 15:10

Could you please guide me on how to interpret the following results?

1) loss < validation_loss
2) loss > validation_loss

It seems that the training loss should always be lower than the validation loss, yet both of these cases occur in practice.

3 Answers
  •  轮回少年
    2021-02-03 15:57

    Really a fundamental question in machine learning.

    If validation loss >> training loss, you can call it overfitting.
    If validation loss >  training loss, you can call it some overfitting.
    If validation loss <  training loss, you can call it some underfitting.
    If validation loss << training loss, you can call it underfitting.
    
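    As a rough illustration, these four rules can be expressed as a small Python helper. This is a minimal sketch: the diagnose_fit name and the big_gap threshold that separates ">>" from ">" are my own assumptions, not a standard API, so tune the threshold for your problem.

        def diagnose_fit(train_loss: float, val_loss: float, big_gap: float = 0.5) -> str:
            """Label the fit regime from training and validation losses.

            big_gap is a hypothetical relative-gap threshold separating
            "some over/underfitting" from the strong ">>" / "<<" cases.
            """
            # Relative gap: positive when validation loss exceeds training loss.
            gap = (val_loss - train_loss) / max(train_loss, 1e-12)
            if gap > big_gap:
                return "overfitting"        # validation loss >> training loss
            if gap > 0:
                return "some overfitting"   # validation loss > training loss
            if gap > -big_gap:
                return "some underfitting"  # validation loss < training loss
            return "underfitting"           # validation loss << training loss

        print(diagnose_fit(train_loss=0.30, val_loss=0.36))  # -> "some overfitting"
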

    Your aim is to make the validation loss as low as possible. Some overfitting is nearly always a good thing; all that matters in the end is whether the validation loss is as low as you can get it, and that often happens when the training loss is quite a bit lower.
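    One common way to act on this advice is to monitor val_loss during training and keep the weights from the epoch where it was lowest. Below is a minimal Keras sketch, assuming TensorFlow is installed; the toy data and model are placeholders for your real dataset and architecture.

        import numpy as np
        import tensorflow as tf

        # Toy stand-in data; substitute your real dataset.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(1000, 20)).astype("float32")
        y = (X[:, 0] > 0).astype("float32")

        model = tf.keras.Sequential([
            tf.keras.Input(shape=(20,)),
            tf.keras.layers.Dense(32, activation="relu"),
            tf.keras.layers.Dense(1, activation="sigmoid"),
        ])
        model.compile(optimizer="adam", loss="binary_crossentropy")

        # Stop once val_loss stops improving and restore the best weights,
        # so training ends near the lowest validation loss seen.
        early_stop = tf.keras.callbacks.EarlyStopping(
            monitor="val_loss", patience=5, restore_best_weights=True)

        history = model.fit(X, y, validation_split=0.2, epochs=100,
                            callbacks=[early_stop], verbose=0)

        print("final training loss:", history.history["loss"][-1])
        print("best validation loss:", min(history.history["val_loss"]))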
