I am using EfficientNet-B0 as the subnetwork in a Siamese network, with contrastive loss as the loss function, for an image-similarity task. My dataset is fairly large (27,550 training images) with 2 classes. After the first epoch, the training loss decreases dramatically while the validation loss is unstable. Can overfitting happen this early? Or is there something wrong with my data that is confusing the model? Here is the graph I get after training my model for 100 epochs.
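For reference, the contrastive loss mentioned above can be sketched in a framework-neutral way. This is a minimal NumPy version operating on precomputed pair distances; the margin value of 1.0 is an assumption, not something stated in the question:

```python
import numpy as np

def contrastive_loss(dist, label, margin=1.0):
    """Contrastive loss over a batch of embedding pairs.

    dist   -- Euclidean distance between the two embeddings of each pair
    label  -- 1 if the pair is similar, 0 if dissimilar
    margin -- how far apart dissimilar pairs are pushed (assumed value)
    """
    # Similar pairs: penalize any distance (pull them together).
    similar = label * dist ** 2
    # Dissimilar pairs: penalize only distances inside the margin (push apart).
    dissimilar = (1 - label) * np.maximum(margin - dist, 0.0) ** 2
    return np.mean(similar + dissimilar)

# Example: one similar pair at distance 0.2, one dissimilar pair at 1.5
print(contrastive_loss(np.array([0.2, 1.5]), np.array([1, 0])))
```

A dissimilar pair already beyond the margin contributes zero loss, which is why the loss saturates once the two classes are well separated in embedding space.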
- Do you have the same class ratio for train and test? – bottledmind Jan 12 '22 at 09:44
- Here are the details of my dataset: Training data (Class1: 14123, Class2: 13427, Total: 27550), Validation data (Class1: 2513, Class2: 3016, Total: 5529), Testing data (Class1: 1346, Class2: 3543, Total: 4889). – ELbafa Jan 12 '22 at 10:45
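The imbalance the comment asks about can be checked directly from the counts quoted above (plain Python; the dictionary layout is just illustrative). Note that while the training split is nearly balanced, the testing split is not:

```python
# Per-split class counts as quoted in the comment above
counts = {
    "train":      {"Class1": 14123, "Class2": 13427},
    "validation": {"Class1": 2513,  "Class2": 3016},
    "test":       {"Class1": 1346,  "Class2": 3543},
}

for split, c in counts.items():
    total = sum(c.values())
    # Fraction of the split belonging to Class1
    print(f"{split}: {c['Class1'] / total:.1%} Class1 of {total} images")
```

Train is roughly 51% Class1, but test is only about 28% Class1, so metrics computed on the test split are driven mostly by Class2 pairs.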
1 Answer
First, plot the training and validation loss after setting a lower, variable learning rate; the instability you see can be caused by a learning rate that is too high. Second, a model overfits when the training loss is much smaller than the validation/testing loss. Dropout, regularization, or a deeper backbone (VGG, ResNet) can improve it.
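The "lower and variable learning rate" suggestion can be sketched framework-independently. Deep-learning frameworks ship this as a built-in (e.g. Keras's `ReduceLROnPlateau` callback or PyTorch's `torch.optim.lr_scheduler.ReduceLROnPlateau`); the minimal version below, with assumed `factor`/`patience` values, shows the idea:

```python
def reduce_lr_on_plateau(lr, val_losses, factor=0.5, patience=3, min_lr=1e-6):
    """Reduce the learning rate when validation loss stops improving.

    lr         -- current learning rate
    val_losses -- list of validation losses, one per epoch so far
    factor     -- multiplier applied on plateau (assumed value)
    patience   -- epochs without improvement before reducing (assumed value)
    min_lr     -- floor for the learning rate
    """
    # A plateau: the best loss of the last `patience` epochs is no better
    # than the best loss seen before them.
    if len(val_losses) > patience and min(val_losses[-patience:]) >= min(val_losses[:-patience]):
        lr = max(lr * factor, min_lr)
    return lr

# Validation loss improved early, then stalled for 3 epochs -> halve the LR
print(reduce_lr_on_plateau(1e-3, [1.0, 0.9, 0.95, 0.96, 0.97]))
```

Called once per epoch, this keeps the learning rate high while the model is still improving and shrinks it when the validation curve flattens or oscillates, which typically smooths out an unstable validation loss.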

M.Naveed Riaz