With fast.ai, I got 2 Silver medals at Kaggle in 4 months

Just dropping a note to say Thank you to fast.ai team,

I learned fast.ai in late 2019, as I was not able to make much progress using TF for Deep Learning.

Then the results came after 3 months:

  1. I got Silver for Bengali.ai (Computer Vision competition)
    Public LB: Rank 1264 ; Private LB: Rank 102

  2. I got my 2nd Silver for recently concluded M5 sales forecast (Time series forecast)
    Public LB: Rank 3762 ; Private LB: Rank 80
    You can see my Kaggle post here: https://www.kaggle.com/c/m5-forecasting-accuracy/discussion/163203

I used a Computer Vision model for 1) and a Tabular model for 2).
fast.ai really helped me a lot by NOT overfitting & generalizing well !!

Once again, thanks a lot & I am convinced fast.ai is the way to go !

33 Likes

Well done! It sounds like you used the standard fastai model? I figured there was untapped potential there in the right hands :wink: Very well done, congrats!

1 Like

Thanks Zach !

I used Rossmann as a guide, though the sales forecast here is more challenging than Rossmann:

  1. much more sales data (hierarchical sales data)
  2. the sales pattern is sporadic (spikes followed by long periods of 0)
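To make the "sporadic" point concrete, here is a minimal stdlib-only sketch of two ways to quantify intermittency in a daily sales series: the share of zero-sales days and the longest unbroken run of zeros. The series and function names below are illustrative, not taken from the actual M5 data or solution.

```python
# Hypothetical sketch: measure how sporadic a daily sales series is.
# The example series is made-up illustrative data, not M5 data.

def zero_share(sales):
    """Fraction of days with zero sales."""
    return sum(1 for s in sales if s == 0) / len(sales)

def longest_zero_run(sales):
    """Length of the longest consecutive stretch of zero-sales days."""
    best = run = 0
    for s in sales:
        run = run + 1 if s == 0 else 0
        best = max(best, run)
    return best

series = [0, 0, 5, 0, 0, 0, 12, 0, 0, 0, 0, 3]
print(zero_share(series))        # 0.75
print(longest_zero_run(series))  # 4
```

A series like this (spikes separated by long zero runs) behaves very differently from Rossmann's comparatively smooth store-level sales, which is part of what makes M5 harder.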

Anyway, I still have a long way to go (I am also a follower of your work :sunglasses: )

cheers
sid

2 Likes

Hi,
My kernel : https://www.kaggle.com/sidneyng/k-m5-final-public/output?scriptVersionId=37866057

Pls take a look & upvote if you find it useful :wink:

5 Likes

great job! =)

Thanks DrHB ! You know how I feel :sunglasses:

1 Like

Great job @SidNg! Impressive jumps from public to private leaderboards, especially in the M5 sales forecast. Could you maybe share your experience on what helped your model to avoid overfitting on the public leaderboard and improve generalization?

1 Like

Thanks Stefan ! Just my opinion:

  1. M5 is very competitive. The 1st and 100th placings differ by only 0.08 in LB score, so a small improvement in your score can jump you a few hundred places on the LB

  2. Many participants use very similar techniques (LGBM + multiplier) & features (lag, window, calendar, …). I’m one of the few who used a NN (fast.ai Tabular), so I would either do very well or very badly in the final placing

  3. The competition metric is quite tricky, as predicting 0 sales does not count (the organizers want large sales with high sell prices). It’s really difficult to judge how you fare based on validation loss alone. Visually inspecting the plots helped a lot

  4. M5 only allows 1 final submission. I just submitted a safe, conservative entry. Nothing fancy, just following the principles in Jeremy’s Rossmann lesson
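The three feature families mentioned in point 2 can be sketched with the standard library alone. This is a minimal, hypothetical illustration of what lag, window, and calendar features look like for a daily series; the function names, window sizes, and example data are assumptions, not from the actual solution.

```python
from datetime import date, timedelta

# Hypothetical sketch of three common time-series feature families:
# lag features, trailing-window statistics, and calendar features.

def lag_feature(sales, lag):
    """sales shifted back by `lag` days; None where no history exists."""
    return [None] * lag + sales[:-lag]

def rolling_mean(sales, window):
    """Trailing mean over the previous `window` days (excluding today)."""
    out = []
    for t in range(len(sales)):
        past = sales[max(0, t - window):t]
        out.append(sum(past) / len(past) if past else None)
    return out

def calendar_features(start, n_days):
    """(day-of-week, month) pairs for each day, starting at `start`."""
    return [((start + timedelta(days=i)).weekday(),
             (start + timedelta(days=i)).month) for i in range(n_days)]

sales = [3, 0, 0, 7, 2, 0, 4]
print(lag_feature(sales, 2))   # [None, None, 3, 0, 0, 7, 2]
print(rolling_mean(sales, 3))  # starts [None, 3.0, 1.5, 1.0, ...]
print(calendar_features(date(2016, 4, 25), 3))
```

In a real pipeline these columns would be built per item/store with a dataframe library, but the underlying idea is exactly this: give the model explicit views of recent history and the calendar.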

PS: To see how my Tabular model fares for time-series forecasting, let’s wait for the organizers’ summary report. In the previous M4 competition, only a few Kaggle entries beat the organizers’ benchmark models

3 Likes

What is Rossmann? Can you share the link/content as well?
I really want to earn a medal but unfortunately I don’t know how to tweak the fastai models, so I decided to learn the basics in PyTorch, but I am encountering difficulties: on one hand, the PyTorch tutorials show a way to use transfer learning; on the other hand, developers such as Abhishek write their own stuff.
I am a bit lost, do you have a suggestion?

Lesson 6 of Practical Deep Learning for Coders part 1 shows tabular regression and feature engineering for time series
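A core idea in that lesson's tabular approach is giving each categorical variable (store id, day of week, …) a learned embedding. fastai picks a default embedding size from the number of distinct levels; the sketch below reproduces that heuristic as I understand it from fastai v2's `emb_sz_rule`, so treat the exact constants as an assumption of this sketch.

```python
# Sketch of fastai v2's default embedding-size heuristic for a
# categorical variable with n_cat distinct levels (emb_sz_rule).
# Constants (1.6, 0.56, cap of 600) are quoted from memory.

def emb_sz_rule(n_cat: int) -> int:
    return min(600, round(1.6 * n_cat ** 0.56))

print(emb_sz_rule(7))          # day-of-week: a small embedding
print(emb_sz_rule(3049))       # an item-id-sized cardinality
print(emb_sz_rule(1_000_000))  # huge cardinalities are capped at 600
```

The point of the heuristic is that high-cardinality columns get richer embeddings, but never unboundedly large ones; you can always override the sizes per column if you know better.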

1 Like

Rossmann is a kaggle competition from 5 years ago: https://www.kaggle.com/c/rossmann-store-sales/overview

Thanks I will take a look

Congrats Sid ! I just started with fastai course and thanks to you, I’m more motivated than ever!

Thanks @Legnica1241 ! fast.ai provides a good framework to submit “competitive” Kaggle entries.

There is no guarantee using fast.ai will get a medal, I still need some luck :wink:

Focus on the process (learning fast.ai), not the outcome (winning). Hopefully, with experience & some luck, odds will increase in your favour !

cheers
sid

1 Like

Congrats @SidNg on your great progress!

Would you be open to sharing your BengaliAI notebook? 102 is impressive and I’m interested in seeing how fastai was used. :slightly_smiling_face:

Hi,
I have summarized what I learned in this Medium blog post:

cheers
sid

2 Likes