If I understand correctly, a model overfits when its training accuracy is much higher than its validation accuracy (or its training loss is much lower than its validation loss). On the other hand, a model underfits if it is unable to reduce the training loss to its minimum value (or very close to it).
It doesn’t make sense for a model to overfit and underfit at the same time, but I often observe both when I train my model. Can anyone explain this?
Actually, your training loss will almost always be lower than your validation loss, and that’s expected: the model is evaluated on data it has already seen. So when do you actually overfit? When your training loss keeps going down while your validation loss, after initially going down, starts going up.
I usually train my models until they start overfitting, then choose hyperparameters that let me stop right before that point. You are technically underfitting if you don’t reach it.
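A minimal sketch of this idea in Python (the loss values and the `early_stop_epoch` helper are made up for illustration, not from a real training run): track the validation loss per epoch and stop once it has failed to improve for a few epochs in a row, then keep the checkpoint from the best epoch.

```python
def early_stop_epoch(val_losses, patience=2):
    """Return the index of the epoch with the lowest validation loss,
    scanning until the loss has not improved for `patience` epochs."""
    best_epoch, best_loss = 0, float("inf")
    for epoch, loss in enumerate(val_losses):
        if loss < best_loss:
            best_epoch, best_loss = epoch, loss
        elif epoch - best_epoch >= patience:
            break  # validation loss has been rising: overfitting has set in
    return best_epoch

# Hypothetical curves: training loss keeps falling,
# but validation loss bottoms out at epoch 3 and then rises.
train_losses = [0.90, 0.60, 0.40, 0.25, 0.15, 0.08]
val_losses   = [0.95, 0.70, 0.55, 0.50, 0.58, 0.65]
print(early_stop_epoch(val_losses))  # → 3
```

Most frameworks ship this as a callback (e.g. `EarlyStopping` in Keras), so you rarely need to write it yourself; the point is just that "the point right before overfitting" is the validation-loss minimum.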
Edit: To answer your question: no, you can’t overfit and underfit at the same time.