Dog Breed Classification Kaggle

Hey Everyone,
I started working on the dog breed classification challenge on Kaggle and wanted to share my results in case anyone else is interested:
https://colab.research.google.com/drive/1ncy45Yiu9JDYQ_4GLzW0c-MfA0EdpPu2

The biggest issue I had was formatting the data correctly, so I reorganized it to match the ImageNet-style folder structure and then used the ImageDataBunch.from_folder method.
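
Roughly what that looks like (a sketch, assuming fastai v1; the path and parameters here are placeholders, not my actual notebook):

```python
from fastai.vision import *

# ImageNet-style layout: one sub-folder per breed, e.g.
#   data/dog-breeds/train/affenpinscher/xxx.jpg
#   data/dog-breeds/train/afghan_hound/yyy.jpg
path = Path('data/dog-breeds')  # placeholder path

data = ImageDataBunch.from_folder(
    path,
    train='train',
    valid_pct=0.2,             # hold out 20% of the training images for validation
    ds_tfms=get_transforms(),  # default augmentations
    size=224,
    bs=64,
).normalize(imagenet_stats)
```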

Hope this helps someone.


Hey @tbass134, I was also working on the same dataset after week 1. I got an accuracy of 90% in 8 epochs using resnet-50. What's your score?

@ady_anr I’m getting 85% using resnet-50.


Has anyone figured out how to get the learner to predict on the test data?

No @harinsa, I tried but couldn’t find a solution. I searched the docs and found a predict function, but it takes a single image as input. With this many images in the test set, I don’t think that’s the best way to do it. Do share here if you find another solution.
Here’s a link to the predict function in the docs: https://docs.fast.ai/vision.learner.html#Get-predictions
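
Edit: one possible approach I haven’t verified myself (a sketch assuming fastai v1; the path and parameters are placeholders) would be to pass the test folder when building the DataBunch and then get predictions for the whole test set at once:

```python
from fastai.vision import *

path = Path('data/dog-breeds')  # placeholder path

# include the unlabelled test folder when building the DataBunch
data = ImageDataBunch.from_folder(
    path, train='train', valid_pct=0.2, test='test',
    ds_tfms=get_transforms(), size=224, bs=64,
).normalize(imagenet_stats)

learn = cnn_learner(data, models.resnet50, metrics=accuracy)
learn.fit_one_cycle(4)

# predictions for every test image in one call
preds, _ = learn.get_preds(ds_type=DatasetType.Test)
# preds: (n_test_images, n_classes) tensor of probabilities;
# data.test_ds.items gives the matching file names for the submission
```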


I just tried it after looking at your post and got good results. I was able to achieve ~96% accuracy with resnet18.

96%? Wow! How did you get that kind of accuracy?

This is my notebook: https://drive.google.com/file/d/1E18g3_KuafvBCPsEFT7ziNPIKeEbDFwD/view?usp=sharing. It’s on Google Colab, but it won’t run there; you can run the file in Kaggle. Thanks.

@mayur can you describe what you did differently, apart from what Jeremy did in week 1?

I just turned off pretrained weights and used a single epoch with a lower learning rate.
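
Something along these lines (a rough sketch, assuming fastai v1; the architecture, size/bs, and exact learning rate are guesses, not copied from my notebook):

```python
from fastai.vision import *

path = Path('data/dog-breeds')  # placeholder path
data = ImageDataBunch.from_folder(path, train='train', valid_pct=0.2,
                                  ds_tfms=get_transforms(), size=128, bs=16
                                  ).normalize(imagenet_stats)

# pretrained=False -> random weights instead of the ImageNet ones
learn = cnn_learner(data, models.resnet18, pretrained=False, metrics=accuracy)
learn.fit_one_cycle(1, max_lr=1e-3)  # a single epoch at a lower learning rate
```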

I am also getting an accuracy of 83% using ResNet-50. Maybe if I train it for a few more epochs I will be able to squeeze out 85% accuracy. Below is my kernel; any help will be appreciated.

https://www.kaggle.com/jinudaniel/dog-breed-classification-with-fastai

Pardon me, but based on my current understanding of the fastai library (not great, not terrible), that’s impossible. pretrained=False means random initialization of the model’s weights (instead of using the ones obtained from several days of training on ImageNet), so it’s totally unrealistic that such a network could converge in just one epoch.

I’m writing this because, no matter what I try, I can’t get past 90% accuracy with a ResNet-50. So your strategy intrigued me (and I admit it could pay off if done properly), but not in one epoch. Moreover, in your notebook you train for one epoch (with size=128 and bs=16), then call unfreeze() on an already unfrozen model (because of pretrained=False), and then train again for just two epochs with a very low max_lr. So it’s simply impossible that the notebook you linked yields 96% accuracy, sorry. I’m also writing this to prevent other people from wasting time on it.
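
For comparison, the usual week-1 style transfer-learning setup looks roughly like this (a sketch assuming fastai v1; paths, epoch counts, and learning rates are illustrative):

```python
from fastai.vision import *

path = Path('data/dog-breeds')  # placeholder path
data = ImageDataBunch.from_folder(path, train='train', valid_pct=0.2,
                                  ds_tfms=get_transforms(), size=224, bs=64
                                  ).normalize(imagenet_stats)

# start from the ImageNet weights; only the new head is trained at first
learn = cnn_learner(data, models.resnet50, metrics=accuracy)
learn.fit_one_cycle(4)

# then unfreeze the whole network and fine-tune with discriminative learning rates
learn.unfreeze()
learn.fit_one_cycle(2, max_lr=slice(1e-6, 1e-4))
```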

OK, I finally understand.

In your notebook you use error_rate as the metric, not accuracy. You achieved a 96% error rate, i.e. 4% accuracy, which makes perfect sense.
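
To make the relationship concrete (a toy example, not taken from the notebook): in fastai v1, error_rate is literally 1 - accuracy.

```python
import torch
from fastai.metrics import accuracy, error_rate

# toy batch: 3 classes, 4 samples, the model gets only 1 of 4 right
preds = torch.tensor([[0.9, 0.05, 0.05],
                      [0.8, 0.10, 0.10],
                      [0.7, 0.20, 0.10],
                      [0.1, 0.80, 0.10]])
targs = torch.tensor([0, 1, 2, 2])

accuracy(preds, targs)    # tensor(0.2500)
error_rate(preds, targs)  # tensor(0.7500) == 1 - accuracy
```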

OMG!! Thanks for pointing out my silly mistake.

I’m sorry to have pointed this out, but I spent a lot of time on this problem and not being able to replicate your results drove me crazy. Until I realized what was wrong…
