I’ve been working on the dog breed notebook. I’m using “arch = resnet34” and my model achieves an accuracy of 0.88. However, when I submit to Kaggle I get a score of “0.41768.”
Next I tried running the “lesson1-breeds.ipynb” notebook with “arch = resnet34.” Since I figured I’d made a typo somewhere, I thought running fast.ai’s own notebook would show me where I went wrong. After running Jeremy’s notebook with NO changes other than the “arch” (to save time), I received a Kaggle score of “0.42490.”
I also tested with “arch = resnet50” and received an even lower Kaggle score, even though I was seeing 86% accuracy during training/validation.
I strongly suspect the file names are getting shuffled or misaligned somewhere when they’re combined with the class probabilities in the submission file.
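For what it’s worth, here’s a toy illustration of how that misalignment can happen (names are made up for the example): predictions come back in the order of the dataset’s file list, so zipping them with a separately obtained listing (e.g. from `os.listdir`, which can return a different order) silently attaches the wrong rows to the wrong files.

```python
# Hypothetical example: rows of `probs` are in the order the model saw the files.
dataset_fnames = ["test/b.jpg", "test/a.jpg"]   # order the predictions were made in
probs = [[0.1, 0.9], [0.8, 0.2]]                # row i corresponds to dataset_fnames[i]

# A separately sorted/listed directory has a DIFFERENT order:
listed_fnames = sorted(dataset_fnames)          # ["test/a.jpg", "test/b.jpg"]

# Pairing probs with the wrong ordering attaches b.jpg's row to a.jpg:
wrong = dict(zip(listed_fnames, probs))
right = dict(zip(dataset_fnames, probs))
```

If something like this is happening, the validation accuracy would still look fine (validation labels stay aligned) while the submission file is scrambled.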
Has anyone else seen this issue? If so, how did you solve it?
Or, how do you all combine the image file names with the probabilities?
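For comparison, here’s the kind of thing I mean, as a minimal sketch with plain pandas/NumPy. The file names, class names, and probability array are stand-ins for whatever comes out of the learner (in my case the file names from the test dataset and the exponentiated log-predictions); the key point is that the rows of the probability array must stay in the same order as the file-name list:

```python
import os
import numpy as np
import pandas as pd

# Stand-in values for illustration; in a real run these come from the
# test dataset's file list, the class list, and the model's predictions.
test_files = ["test/abc123.jpg", "test/def456.jpg"]
class_names = ["affenpinscher", "afghan_hound"]
probs = np.array([[0.9, 0.1],    # row 0 must correspond to test_files[0]
                  [0.2, 0.8]])   # row 1 must correspond to test_files[1]

# Kaggle's dog-breed submission uses the bare file id (no path, no extension).
ids = [os.path.splitext(os.path.basename(f))[0] for f in test_files]

# One probability column per breed, plus the id column in front.
sub = pd.DataFrame(probs, columns=class_names)
sub.insert(0, "id", ids)
sub.to_csv("submission.csv", index=False)
```

Is this roughly what you all do, or is there a step I’m missing that keeps the ordering intact?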
Again, I ran Jeremy’s dog breed notebook exactly, with no changes other than the architecture, and I had the same issue as with my own notebook.