Dropout with densely connected layer
Question: I am using a DenseNet model for one of my projects and am having some difficulty with regularization. Without any regularization, both validation and training loss (MSE) decrease, though the training loss drops faster, so the final model overfits somewhat. I therefore decided to use dropout to avoid overfitting. With dropout, both validation and training loss fall to about 0.13 during the first epoch and then remain constant for about 10 epochs. After that both loss functions decrease
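For reference, the mechanism in question can be sketched without any framework. Below is a minimal NumPy illustration of inverted dropout (the variant used by modern deep-learning libraries) applied to a dense layer's activations; the function name, shapes, and rate are illustrative assumptions, not taken from the asker's model:

```python
import numpy as np

def dropout(x, rate, rng, training=True):
    # Inverted dropout: zero a fraction `rate` of units at random and
    # rescale the survivors by 1/(1-rate) so the expected activation
    # is unchanged; at inference time the layer is an identity.
    if not training or rate == 0.0:
        return x
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

rng = np.random.default_rng(0)
x = np.ones((4, 8))                    # activations of a dense layer
y = dropout(x, rate=0.5, rng=rng)      # surviving units become 2.0, dropped units 0.0
z = dropout(x, rate=0.5, rng=rng, training=False)  # inference: unchanged
```

Because the survivors are rescaled during training, no compensation is needed at test time, which is why the training-mode loss and the validation-mode loss of a dropout model are directly comparable in scale.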