I have trained a multi-label model for 20 epochs. Its current accuracy is 0.571. After saving the model as a .pth file, I decided to load it again to continue training. However, after another 20 epochs, the accuracy does not improve, remaining roughly constant around ~0.570. The loss function is a custom FocalLoss().
How do I fix this problem? Thank you for assisting me.
Attached are screenshots of the loss function, the training loop, the save method, the load method, and the accuracy measurement.
Have you tried different hyperparameters, e.g. a different architecture, a different loss function, or changing the learning rate(s)?
Please refer to this weblog:
If you interrupt training at epoch #10 of, say, 20 epochs and then restart for 9 more epochs, you will not get the same result as training uninterruptedly for 20 epochs. A new training run started from scratch, even if you load the weights from the last epoch, will employ a fresh learning rate and momentum policy and go through the schedule again from the beginning. What you want is to resume from the point in the schedule where you were interrupted.
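A minimal sketch of resumable checkpointing, under the assumption of a simple model, SGD with momentum, and a StepLR schedule (all placeholders for the poster's actual setup): save the optimizer and scheduler state dicts alongside the model weights, and restore all three before continuing, so the learning rate and momentum buffers pick up where they left off rather than resetting.

```python
import torch
import torch.nn as nn

# Hypothetical minimal setup standing in for the poster's actual model.
model = nn.Linear(10, 5)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=5, gamma=0.1)

# ... training loop would go here; step the scheduler once per epoch ...
num_trained_epochs = 10
for epoch in range(num_trained_epochs):
    # optimizer.step() would be called per batch inside the epoch
    scheduler.step()

# Save everything needed to resume, not just the model weights.
checkpoint = {
    "epoch": num_trained_epochs,
    "model_state_dict": model.state_dict(),
    "optimizer_state_dict": optimizer.state_dict(),
    "scheduler_state_dict": scheduler.state_dict(),
}
torch.save(checkpoint, "checkpoint.pth")

# --- later, in a new process, to resume training ---
model2 = nn.Linear(10, 5)
optimizer2 = torch.optim.SGD(model2.parameters(), lr=0.1, momentum=0.9)
scheduler2 = torch.optim.lr_scheduler.StepLR(optimizer2, step_size=5, gamma=0.1)

ckpt = torch.load("checkpoint.pth")
model2.load_state_dict(ckpt["model_state_dict"])
optimizer2.load_state_dict(ckpt["optimizer_state_dict"])
scheduler2.load_state_dict(ckpt["scheduler_state_dict"])
start_epoch = ckpt["epoch"]

# The restored optimizer now carries the decayed learning rate instead of
# the fresh initial one, so training continues mid-schedule.
resumed_lr = optimizer2.param_groups[0]["lr"]
```

If only `model.state_dict()` is saved, the resumed run silently restarts the learning-rate schedule at its initial value, which matches the symptom of the accuracy plateauing after reloading.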