Train_loss vs accuracy (newbie question)

Hi there,
Please excuse my ignorance; I am still a beginner in this field…

I am training an image classification model, and while the train_loss steadily goes down, the accuracy doesn’t improve by any meaningful amount.

Does that mean there is no reason to keep training the model further?
Or does the falling train_loss still indicate that there is ‘progress’?

thank you

Hey muff, :wave:

I’m not sure how far you have gotten in the course, but in Lecture 3 Jeremy talks more about learning rates and the number of epochs, which may answer some of your questions.

Here’s a link to the part of the lecture in question, at time 56:23.

Further, see the overfitting discussion in Lecture 4.
Lesson 4: Practical Deep Learning for Coders 2022 - YouTube at time 46:00

I hope this helps you! :slightly_smiling_face:

Hi Michael,
yes, I have been through several of the lessons already.
Funny enough I just re-watched lesson 3 yesterday.

I found my learning rate by using the lr_find function (twice), so that value is most likely fairly well optimised.

I was more wondering whether a model actually keeps learning, although the accuracy doesn’t seem to improve.

I guess the answer is: ‘if the accuracy doesn’t change, the model is finished learning’, or at least ‘the model won’t make significant improvements anymore’.
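One nuance worth noting: loss can keep falling with flat accuracy simply because the model grows more *confident* in predictions it already gets right. A tiny sketch (plain Python, hypothetical probabilities, no fastai needed) of how cross-entropy behaves in that situation:

```python
import math

def cross_entropy(p_correct):
    # Cross-entropy loss for a single example, given the probability
    # the model assigns to the correct class.
    return -math.log(p_correct)

# Hypothetical two-class example: the model already predicts the right
# class at epoch 1, and is merely more confident by epoch 5.
loss_epoch1 = cross_entropy(0.6)  # 60% sure of the correct class
loss_epoch5 = cross_entropy(0.9)  # same prediction, now 90% sure

print(f"epoch 1 loss: {loss_epoch1:.3f}")  # ~0.511
print(f"epoch 5 loss: {loss_epoch5:.3f}")  # ~0.105
# The argmax (and therefore the accuracy) is identical in both cases,
# yet the loss dropped substantially.
```

So a falling train_loss with flat accuracy is still ‘progress’ of a sort, but whether it is *useful* progress depends on the validation metrics; if valid_loss starts rising while train_loss falls, that is the overfitting case from Lecture 4.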

thank you, and thanks for those links
