Can a model overfit and underfit at the same time?

If I understand correctly, a model overfits when its training accuracy is much higher than its validation accuracy (or its training loss is much lower than its validation loss). On the other hand, a model underfits when it is unable to reduce the training loss to the minimum achievable value (or very close to it).
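For concreteness, here is a minimal sketch of those two checks, assuming you have final loss values on hand (the gap threshold and the "achievable" loss are made-up numbers, not from any library):

```python
# Toy check of the two definitions above; all numbers are made up.
train_loss = 0.30       # final training loss
val_loss = 0.60         # final validation loss
achievable_loss = 0.05  # lowest loss you believe is reachable (assumption)
gap_threshold = 0.20    # how big a gap counts as "much lower/higher" (assumption)

overfitting = (val_loss - train_loss) > gap_threshold          # big train/val gap
underfitting = (train_loss - achievable_loss) > gap_threshold  # training loss still high

print(f"overfitting: {overfitting}, underfitting: {underfitting}")
```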

It doesn’t make sense for a model to overfit and underfit at the same time, yet I often observe both when I train my models. Can anyone explain this?

Actually, your training loss will almost always be lower than your validation loss, and it should be that way. So when do you actually overfit? When your training loss keeps going down while your validation loss, after initially decreasing, starts to go back up.
I usually train my models until they start overfitting, then choose hyperparameters that let me stop right before that point. You are technically underfitting if you don’t reach it. :man_shrugging:
Edit: To answer your question: no, you can’t overfit and underfit at the same time.
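In code, "train until it starts overfitting, then back off" is essentially early stopping on the validation loss. A rough, self-contained sketch, with fake loss curves standing in for real training:

```python
# Early-stopping sketch; the two functions below fake typical loss curves.

def train_one_epoch(epoch):
    # Stand-in for a real training pass: training loss keeps falling.
    return 1.0 / (epoch + 1)

def validate(epoch):
    # Stand-in for evaluation: validation loss falls, then rises after epoch 10.
    return 0.5 + 1.0 / (epoch + 1) + max(0, epoch - 10) * 0.05

best_val, bad_epochs, patience = float("inf"), 0, 3

for epoch in range(100):
    train_loss = train_one_epoch(epoch)
    val_loss = validate(epoch)
    if val_loss < best_val:
        best_val, bad_epochs = val_loss, 0   # still improving: keep going
    else:
        bad_epochs += 1                      # val loss rising: overfitting territory
        if bad_epochs >= patience:
            print(f"stopping at epoch {epoch}, best val loss {best_val:.3f}")
            break
```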

Both overfitting and underfitting are measured in relative terms, so yes, it is possible to have both at the same time.
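A worked example of that relative framing, using Andrew Ng’s "avoidable bias vs. variance" decomposition (the error rates below are made up):

```python
# Made-up error rates showing how a model can under- and overfit at once.
human_error = 0.02  # proxy for the best achievable error
train_error = 0.15
val_error = 0.30

avoidable_bias = train_error - human_error  # 0.13: far from achievable -> underfitting
variance = val_error - train_error          # 0.15: large train/val gap -> overfitting
print(f"avoidable bias: {avoidable_bias:.2f}, variance: {variance:.2f}")
```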

Little excerpt from Andrew Ng’s book:

Thanks. May I ask what the name of this book is?

It’s called Machine Learning Yearning.
You can get it for free here:

It basically contains most of the insights from his Coursera course (deeplearning.ai) in written form; pretty handy.

Thank you, sir.

Damn. You learn something every day.
