Kaggle Comp: Plant Seedlings Classification

(RobG) #238

I was able to do well, with success down to: 1. ensembling what I found to be the strongest-performing architectures (resnet50 and nasnet), 2. spending time fine-tuning hyperparameters and image sizes, and 3. running k-fold cross-validations, more than once. I think these are good steps for any serious attempt at leaderboard climbing in any similar competition, and it was a good starter learning experience. The competition has closed, but it remains a good one to practise these skills.
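The k-fold plus ensembling recipe above can be sketched as follows. This is a minimal illustration using scikit-learn on synthetic data in place of the actual CNNs (resnet50/nasnet); the fold-averaging logic is the same idea regardless of the underlying model.

```python
# Sketch: train one model per fold, average their test-set probabilities.
import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))        # stand-in for image features
y = rng.integers(0, 3, size=200)     # 3 toy classes
X_test = rng.normal(size=(50, 8))

skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
test_probs = np.zeros((len(X_test), 3))

for train_idx, val_idx in skf.split(X, y):
    model = LogisticRegression(max_iter=1000)
    model.fit(X[train_idx], y[train_idx])
    # average each fold's test predictions -- a simple ensemble
    test_probs += model.predict_proba(X_test) / skf.get_n_splits()

final_pred = test_probs.argmax(axis=1)
```

With real CNNs you would average the softmax outputs of each fold's model (and of each architecture) in exactly the same way before taking the argmax.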

(Sharwon Pius) #239

Did you come across an error while running nasnet? I faced a size error.

(Benjamin DeKoven) #240

@digitalspecialists could you share the code for how you performed the cross validations? Thank you!

(Sharwon Pius) #241

This is brilliant!

(Benjamin DeKoven) #242

@SHAR1 thank you so much for this information!

(sergii makarevych) #243

Please be careful:

(Sharwon Pius) #244

Here is the notebook snippet from my first attempt at this competition. I think it's a good place to start.
Just vanilla fastai tips. No cross-validation, ensembling, or segmentation of any sort. I just kept an eye on the losses, nothing more. I haven't added any documentation because I only followed Jeremy's tips, nothing more. If you need an explanation, just ping me and I'll add it.

0.988 accuracy. Around 0.97 on the public leaderboard.

(Sharwon Pius) #245

There were two classes in this problem that were correlated with each other, and most of the errors were due to this. If I want my model to concentrate more on classifying these two classes, how should I approach the problem?

Some intuition that I had:
Can I train a model specifically for these two classes and ensemble it with my main predictions (updating only these two classes)? Or do you recommend another approach?
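One way to sketch the specialist-model idea above: keep the main model's probabilities, but for rows where it predicts one of the two confused classes, redistribute that pair's probability mass according to a binary specialist. The class pair indices and array shapes here are assumptions for illustration, not anything from the thread.

```python
import numpy as np

def merge_specialist(main_probs, spec_probs, pair=(0, 2)):
    """Where the main model predicts one of `pair`, split that pair's
    probability mass according to the 2-column specialist output."""
    a, b = pair
    merged = main_probs.copy()
    rows = np.where(np.isin(main_probs.argmax(axis=1), pair))[0]
    mass = merged[rows][:, [a, b]].sum(axis=1, keepdims=True)
    merged[rows[:, None], [a, b]] = spec_probs[rows] * mass
    return merged

# Toy demo: row 0 predicts a confused class, row 1 does not.
main = np.array([[0.5, 0.1, 0.4],
                 [0.1, 0.8, 0.1]])
spec = np.array([[0.2, 0.8],
                 [0.5, 0.5]])
merged = merge_specialist(main, spec, pair=(0, 2))
```

Rescaling by the pair's original mass keeps each row a valid probability distribution, and rows the main model was already confident about are left untouched.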

(sergii makarevych) #246

I did not do that; I just blended multiple models, each predicting all classes at once. I have no idea whether your approach will work, though, so just give it a try.

(James) #247

@SHAR1 Try oversampling (duplicating all the images of) Black-grass in your training set. It has half the number of samples of Loose Silky-bent, which is the other class it gets confused with.

This may be the wrong approach, but I got 0.98740 with vanilla Resnet50, top-down augmentation, and incorporating the validation set at the end.
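The oversampling suggestion above can be sketched like this: duplicate the minority class's file paths until the class counts match, then train on the balanced list. The file names and counts here are toy placeholders, not the real dataset layout.

```python
import random

def oversample(paths_by_class, minority):
    """Return a training list in which `minority` is duplicated up to
    the size of the largest class."""
    target = max(len(p) for p in paths_by_class.values())
    out = []
    for cls, paths in paths_by_class.items():
        out += paths
        if cls == minority:
            # randomly duplicate existing paths to reach the target count
            out += random.choices(paths, k=target - len(paths))
    return out

files = {"Black-grass": ["bg_{}.png".format(i) for i in range(3)],
         "Loose Silky-bent": ["lsb_{}.png".format(i) for i in range(6)]}
train = oversample(files, "Black-grass")
```

Because the duplicates go through the same random augmentation pipeline at training time, they are not literally identical inputs to the network.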


My first Kaggle competition. Got 0.97858 with Resnet50. No cross-validation.
I didn't do anything special, but it's nice to get good results with so little experience. It gives me motivation to move forward, and it was fun )

(Walter Vanzella) #249

In this competition, my first on Kaggle, I obtained 0.98614 (on the public leaderboard, for what that's worth)!! Thanks to Jeremy's tips and the fastai software.

I used only Resnet50; resnext gave me a memory error and I was not able to load nasnet.
I wanted to use all the examples but did not find any solution other than reducing val_pct. I still do not know how to train on all the examples.
I performed about 5-6 trainings, then checked the wrongly classified patterns. Most of the time the errors came from misclassification between class 0 and class 6 (sorry, I don't remember the class names now), but one model was different: it worked better on the 0-6 pair and worse on another pair. So, finally, I created an ensemble of just two classifiers.
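The error analysis described above (finding which class pair a model confuses most) can be done mechanically from a confusion matrix. A toy sketch with made-up labels, using scikit-learn:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Toy true/predicted labels; in practice these come from the validation set.
y_true = np.array([0, 0, 6, 6, 6, 3, 3, 0, 6, 3])
y_pred = np.array([6, 0, 0, 6, 6, 3, 3, 6, 6, 3])

labels = np.unique(np.concatenate([y_true, y_pred]))
cm = confusion_matrix(y_true, y_pred, labels=labels)
off = cm - np.diag(np.diag(cm))          # zero out correct predictions
i, j = np.unravel_index(off.argmax(), off.shape)
print("true class", labels[i], "most often predicted as", labels[j])
```

Once the worst pair is known, you can decide whether to oversample, build a specialist model, or pick ensemble members that compensate for each other on that pair, as discussed earlier in the thread.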

I would really like to know how you debug the code. I'm working with Spyder and this is really a pain.
It seems impossible to set a breakpoint, check some values, and continue.
Is there any IDE that handles debugging decently?

(sergii makarevych) #251

Just pixels.

(saurabh) #252

Here is what worked for me:

PATH = "data/plant-seedlings-classification/"

Change the code:
from glob2 import glob --> from glob import glob
for image in glob("train/**/*.png"): --> for image in glob("{}/train/**/*.png".format(PATH)):

Hope this helps!
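A quick runnable check of the pattern fix above, using a temporary directory in place of the real dataset. Note that with the standard-library glob, `**` without `recursive=True` matches exactly one directory level (like `*`), which happens to fit the `train/<class>/*.png` layout; pass `recursive=True` if your images sit deeper.

```python
import glob
import os
import tempfile

# Build a tiny fake dataset layout: train/<class>/<image>.png
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "train", "Black-grass"))
open(os.path.join(root, "train", "Black-grass", "0.png"), "w").close()

images = glob.glob("{}/train/**/*.png".format(root))
print(len(images))
```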

(Sangeetha James) #253

Thank you. Your recommendation worked perfectly. Appreciate it.

I needed to change only the following line after installing glob2:
for image in glob("{}/train/**/*.jpg".format(PATH)):