Validation loss and error rate staying steady no matter what... how to rectify?

I’m training an old model on a new dataset. I’ve tried a few learning rates and run roughly ten cycles of training, each of 10–15 epochs. However, after the initial drop in the first cycle, my validation loss and error rate have stayed flat, no matter what I try in terms of learning rate, momentum (moms), weight decay (wd), etc.

Does anyone know why? My training loss is still falling… does that just mean I need to train longer? I’ve probably trained for 5 hours collectively by now.

This is the last run:
[screenshot of the last training run]

Did you try a higher learning rate?
Maybe you’re stuck in a “flat region” of the loss landscape.
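
For example, here’s a rough sketch in plain PyTorch (a toy model and random batches just to illustrate, not your actual setup) of a one-cycle run with a higher max learning rate:

```python
import torch
import torch.nn as nn

# Hypothetical toy model standing in for your network.
model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
opt = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)

epochs, steps_per_epoch = 10, 50
# One-cycle schedule; max_lr here is ~10x a typical plateaued LR,
# but the right value depends on your model and data.
sched = torch.optim.lr_scheduler.OneCycleLR(
    opt, max_lr=1e-1, epochs=epochs, steps_per_epoch=steps_per_epoch)

loss_fn = nn.CrossEntropyLoss()
for epoch in range(epochs):
    for step in range(steps_per_epoch):
        x = torch.randn(32, 20)          # placeholder batch
        y = torch.randint(0, 2, (32,))   # placeholder labels
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
        sched.step()                     # LR updated every batch
```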

If that doesn’t help: since your training loss is still falling while your validation loss is flat, the model may be starting to overfit, so you could also try adding more regularization (weight decay, batch norm, dropout, or data augmentation).
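
As a sketch (same hypothetical toy setup as above), dropout goes into the model and weight decay into the optimizer; data augmentation would live in your data pipeline instead:

```python
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(20, 64),
    nn.BatchNorm1d(64),   # batch norm also has a mild regularizing effect
    nn.ReLU(),
    nn.Dropout(p=0.5),    # randomly drops activations during training
    nn.Linear(64, 2),
)
# weight_decay adds an L2-style penalty on the weights.
opt = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=1e-2)
```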

I’m curious whether these strategies help you.

Kind regards
Michael